Dec 03 21:47:16.144390 master-0 systemd[1]: Starting Kubernetes Kubelet... Dec 03 21:47:16.372199 master-0 kubenswrapper[4754]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 21:47:16.372199 master-0 kubenswrapper[4754]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 21:47:16.372199 master-0 kubenswrapper[4754]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 21:47:16.372199 master-0 kubenswrapper[4754]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 21:47:16.372199 master-0 kubenswrapper[4754]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 21:47:16.372199 master-0 kubenswrapper[4754]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 21:47:16.373718 master-0 kubenswrapper[4754]: I1203 21:47:16.372697 4754 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376415 4754 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376437 4754 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376442 4754 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376447 4754 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376451 4754 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376456 4754 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376462 4754 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376466 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376472 4754 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 21:47:16.376458 master-0 kubenswrapper[4754]: W1203 21:47:16.376476 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376488 4754 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 
21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376494 4754 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376498 4754 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376502 4754 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376506 4754 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376510 4754 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376514 4754 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376517 4754 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376521 4754 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376525 4754 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376530 4754 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376535 4754 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376540 4754 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376546 4754 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376550 4754 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376555 4754 feature_gate.go:330] unrecognized feature gate: Example Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376558 4754 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376562 4754 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376566 4754 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 21:47:16.376963 master-0 kubenswrapper[4754]: W1203 21:47:16.376570 4754 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376574 4754 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376578 4754 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376582 4754 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376586 4754 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 
21:47:16.376590 4754 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376594 4754 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376597 4754 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376601 4754 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376606 4754 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376610 4754 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376615 4754 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376619 4754 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376623 4754 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376630 4754 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376636 4754 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376640 4754 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376644 4754 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376647 4754 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376651 4754 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 21:47:16.377912 master-0 kubenswrapper[4754]: W1203 21:47:16.376655 4754 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376658 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376663 4754 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376667 4754 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376671 4754 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376676 4754 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376680 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376684 4754 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376688 4754 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376692 4754 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376695 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376699 4754 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376704 4754 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376708 4754 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376712 4754 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376715 4754 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376718 4754 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376722 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376725 4754 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 21:47:16.378830 master-0 kubenswrapper[4754]: W1203 21:47:16.376729 4754 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: W1203 21:47:16.376733 4754 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: W1203 21:47:16.376743 4754 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: W1203 21:47:16.376748 4754 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376871 4754 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376882 4754 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376892 4754 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376899 4754 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 21:47:16.379819 
master-0 kubenswrapper[4754]: I1203 21:47:16.376905 4754 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376911 4754 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376917 4754 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376922 4754 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376927 4754 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376932 4754 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376936 4754 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376940 4754 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376945 4754 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376949 4754 flags.go:64] FLAG: --cgroup-root="" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376952 4754 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376956 4754 flags.go:64] FLAG: --client-ca-file="" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376961 4754 flags.go:64] FLAG: --cloud-config="" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376966 4754 flags.go:64] FLAG: --cloud-provider="" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376970 4754 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376978 4754 flags.go:64] FLAG: --cluster-domain="" Dec 03 21:47:16.379819 master-0 kubenswrapper[4754]: I1203 21:47:16.376983 4754 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.376988 4754 flags.go:64] FLAG: --config-dir="" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.376993 4754 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.376998 4754 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377005 4754 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377010 4754 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377015 4754 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377019 4754 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377024 4754 flags.go:64] FLAG: --contention-profiling="false" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377028 4754 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377032 4754 flags.go:64] FLAG: 
--cpu-cfs-quota-period="100ms" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377037 4754 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377041 4754 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377046 4754 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377050 4754 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377056 4754 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377060 4754 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377064 4754 flags.go:64] FLAG: --enable-server="true" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377069 4754 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377075 4754 flags.go:64] FLAG: --event-burst="100" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377079 4754 flags.go:64] FLAG: --event-qps="50" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377084 4754 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377088 4754 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377092 4754 flags.go:64] FLAG: --eviction-hard="" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377097 4754 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 21:47:16.380888 master-0 kubenswrapper[4754]: I1203 21:47:16.377101 4754 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377106 4754 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377110 4754 flags.go:64] FLAG: --eviction-soft="" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377115 4754 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377119 4754 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377123 4754 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377127 4754 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377131 4754 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377135 4754 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377139 4754 flags.go:64] FLAG: --feature-gates="" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377145 4754 flags.go:64] FLAG: --file-check-frequency="20s" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377149 4754 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377153 4754 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 
21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377157 4754 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377163 4754 flags.go:64] FLAG: --healthz-port="10248" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377167 4754 flags.go:64] FLAG: --help="false" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377171 4754 flags.go:64] FLAG: --hostname-override="" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377176 4754 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377180 4754 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377184 4754 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377188 4754 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377192 4754 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377197 4754 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377201 4754 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377205 4754 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 03 21:47:16.382041 master-0 kubenswrapper[4754]: I1203 21:47:16.377209 4754 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377213 4754 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377218 4754 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377222 4754 flags.go:64] FLAG: --kube-reserved="" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377227 4754 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377231 4754 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377235 4754 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377239 4754 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377244 4754 flags.go:64] FLAG: --lock-file="" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377248 4754 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377253 4754 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377257 4754 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377263 4754 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377279 4754 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377283 4754 flags.go:64] FLAG: --log-text-split-stream="false" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377287 
4754 flags.go:64] FLAG: --logging-format="text" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377291 4754 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377296 4754 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377301 4754 flags.go:64] FLAG: --manifest-url="" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377305 4754 flags.go:64] FLAG: --manifest-url-header="" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377310 4754 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377314 4754 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377320 4754 flags.go:64] FLAG: --max-pods="110" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377324 4754 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377328 4754 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 21:47:16.383155 master-0 kubenswrapper[4754]: I1203 21:47:16.377333 4754 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377337 4754 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377341 4754 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377346 4754 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377350 4754 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377361 4754 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377366 4754 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377370 4754 flags.go:64] FLAG: --oom-score-adj="-999" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377374 4754 flags.go:64] FLAG: --pod-cidr="" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377378 4754 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fff930cf757e23d388d86d05942b76e44d3bda5e387b299c239e4d12545d26dd" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377388 4754 flags.go:64] FLAG: --pod-manifest-path="" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377393 4754 flags.go:64] FLAG: --pod-max-pids="-1" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377397 4754 flags.go:64] FLAG: --pods-per-core="0" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377402 4754 flags.go:64] FLAG: --port="10250" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377406 4754 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377411 4754 flags.go:64] FLAG: --provider-id="" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377415 4754 flags.go:64] FLAG: 
--qos-reserved="" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377419 4754 flags.go:64] FLAG: --read-only-port="10255" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377431 4754 flags.go:64] FLAG: --register-node="true" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377435 4754 flags.go:64] FLAG: --register-schedulable="true" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377439 4754 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377446 4754 flags.go:64] FLAG: --registry-burst="10" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377450 4754 flags.go:64] FLAG: --registry-qps="5" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377454 4754 flags.go:64] FLAG: --reserved-cpus="" Dec 03 21:47:16.384265 master-0 kubenswrapper[4754]: I1203 21:47:16.377459 4754 flags.go:64] FLAG: --reserved-memory="" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377464 4754 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377468 4754 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377472 4754 flags.go:64] FLAG: --rotate-certificates="false" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377476 4754 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377480 4754 flags.go:64] FLAG: --runonce="false" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377484 4754 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377488 4754 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377493 4754 flags.go:64] FLAG: --seccomp-default="false" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377497 4754 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377501 4754 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377505 4754 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377509 4754 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377513 4754 flags.go:64] FLAG: --storage-driver-password="root" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377517 4754 flags.go:64] FLAG: --storage-driver-secure="false" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377521 4754 flags.go:64] FLAG: --storage-driver-table="stats" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377525 4754 flags.go:64] FLAG: --storage-driver-user="root" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377529 4754 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377533 4754 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377538 4754 flags.go:64] FLAG: --system-cgroups="" Dec 03 21:47:16.385376 
master-0 kubenswrapper[4754]: I1203 21:47:16.377542 4754 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377548 4754 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377552 4754 flags.go:64] FLAG: --tls-cert-file="" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377556 4754 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377562 4754 flags.go:64] FLAG: --tls-min-version="" Dec 03 21:47:16.385376 master-0 kubenswrapper[4754]: I1203 21:47:16.377566 4754 flags.go:64] FLAG: --tls-private-key-file="" Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: I1203 21:47:16.377573 4754 flags.go:64] FLAG: --topology-manager-policy="none" Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: I1203 21:47:16.377577 4754 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: I1203 21:47:16.377581 4754 flags.go:64] FLAG: --topology-manager-scope="container" Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: I1203 21:47:16.377586 4754 flags.go:64] FLAG: --v="2" Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: I1203 21:47:16.377591 4754 flags.go:64] FLAG: --version="false" Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: I1203 21:47:16.377597 4754 flags.go:64] FLAG: --vmodule="" Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: I1203 21:47:16.377602 4754 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: I1203 21:47:16.377606 4754 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377711 4754 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377716 4754 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377721 4754 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377726 4754 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377730 4754 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377735 4754 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377739 4754 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377743 4754 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377747 4754 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377751 4754 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377755 4754 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377759 4754 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 21:47:16.386520 master-0 kubenswrapper[4754]: W1203 21:47:16.377763 4754 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377781 4754 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377785 4754 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377789 4754 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377793 4754 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377796 4754 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377800 4754 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377804 4754 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377809 4754 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377813 4754 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377817 4754 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377821 4754 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377827 4754 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377831 4754 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377835 4754 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377839 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377843 4754 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377848 4754 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377852 4754 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377856 4754 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 21:47:16.387532 master-0 kubenswrapper[4754]: W1203 21:47:16.377860 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377864 4754 feature_gate.go:330] unrecognized feature gate: Example Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377867 4754 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377871 4754 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377876 4754 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377883 4754 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377887 4754 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377891 4754 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377894 4754 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377898 4754 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377902 4754 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377906 4754 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377910 4754 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377915 4754 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377919 4754 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377924 4754 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377928 4754 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377932 4754 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377937 4754 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377941 4754 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 21:47:16.388436 master-0 kubenswrapper[4754]: W1203 21:47:16.377945 4754 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377949 4754 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377953 4754 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377957 4754 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377964 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377968 4754 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377971 4754 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377975 4754 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377980 4754 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377985 4754 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377989 4754 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377993 4754 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.377998 4754 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.378003 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.378007 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.378011 4754 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.378016 4754 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.378021 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.378025 4754 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 21:47:16.389351 master-0 kubenswrapper[4754]: W1203 21:47:16.378029 4754 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 21:47:16.390222 master-0 kubenswrapper[4754]: I1203 21:47:16.378042 4754 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 21:47:16.390916 master-0 kubenswrapper[4754]: I1203 21:47:16.390837 4754 server.go:491] "Kubelet version" kubeletVersion="v1.31.13" Dec 03 21:47:16.390916 master-0 kubenswrapper[4754]: I1203 21:47:16.390908 4754 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 03 21:47:16.391071 master-0 kubenswrapper[4754]: W1203 21:47:16.391045 4754 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 21:47:16.391071 master-0 kubenswrapper[4754]: W1203 21:47:16.391066 4754 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391078 4754 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391088 4754 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391098 4754 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391106 4754 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391115 4754 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391124 4754 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391132 4754 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391140 4754 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391148 4754 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391156 4754 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391164 4754 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391172 4754 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391180 4754 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391188 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391196 4754 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391205 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 21:47:16.391189 master-0 kubenswrapper[4754]: W1203 21:47:16.391214 4754 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391222 4754 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391231 4754 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391240 4754 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391247 4754 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391258 4754 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391273 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391283 4754 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391291 4754 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391299 4754 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391307 4754 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391315 4754 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391322 4754 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391331 4754 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391341 4754 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391351 4754 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391361 4754 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391369 4754 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 21:47:16.392216 master-0 kubenswrapper[4754]: W1203 21:47:16.391379 4754 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391389 4754 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391397 4754 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391406 4754 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391416 4754 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391424 4754 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391431 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391439 4754 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391447 4754 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391455 4754 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391463 4754 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391471 4754 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391479 4754 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391487 4754 feature_gate.go:330] unrecognized feature gate: Example Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391494 4754 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391503 4754 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391511 4754 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391522 4754 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391532 4754 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 21:47:16.393060 master-0 kubenswrapper[4754]: W1203 21:47:16.391540 4754 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391549 4754 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391558 4754 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391567 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391575 4754 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391583 4754 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391591 4754 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391599 4754 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391607 4754 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391614 4754 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391622 4754 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391630 4754 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391638 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391645 4754 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391653 4754 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391662 4754 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 21:47:16.394304 master-0 kubenswrapper[4754]: W1203 21:47:16.391669 4754 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: I1203 21:47:16.391683 4754 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.391955 4754 feature_gate.go:330] unrecognized feature gate: Example Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.391977 4754 feature_gate.go:330] 
unrecognized feature gate: ClusterMonitoringConfig Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.391989 4754 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392002 4754 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392012 4754 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392022 4754 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392032 4754 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392041 4754 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392051 4754 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392060 4754 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392070 4754 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392083 4754 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392094 4754 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 21:47:16.395110 master-0 kubenswrapper[4754]: W1203 21:47:16.392107 4754 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392122 4754 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392135 4754 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392146 4754 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392158 4754 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392168 4754 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392178 4754 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392190 4754 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392199 4754 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392207 4754 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392216 4754 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392224 4754 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392233 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392243 4754 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392253 4754 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392263 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392271 4754 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392280 4754 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392288 4754 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 21:47:16.395933 master-0 kubenswrapper[4754]: W1203 21:47:16.392296 4754 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392305 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392312 4754 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392322 4754 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392332 4754 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392341 4754 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392352 4754 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392364 4754 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392376 4754 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392386 4754 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392396 4754 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392409 4754 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392418 4754 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392427 4754 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392435 4754 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392444 4754 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392451 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392461 4754 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392471 4754 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392480 4754 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 21:47:16.396793 master-0 kubenswrapper[4754]: W1203 21:47:16.392490 4754 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392500 4754 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392510 4754 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392520 4754 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392530 4754 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392539 4754 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392549 4754 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392558 4754 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392565 4754 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392573 4754 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392580 4754 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392591 4754 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392601 4754 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392609 4754 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392617 4754 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392625 4754 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392634 4754 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392644 4754 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392652 4754 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 21:47:16.397706 master-0 kubenswrapper[4754]: W1203 21:47:16.392661 4754 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 21:47:16.398614 master-0 kubenswrapper[4754]: I1203 21:47:16.392675 4754 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 21:47:16.398614 master-0 kubenswrapper[4754]: I1203 21:47:16.393029 4754 server.go:940] "Client rotation is on, will bootstrap in background" Dec 03 21:47:16.399101 master-0 kubenswrapper[4754]: I1203 21:47:16.399046 4754 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Dec 03 21:47:16.400201 master-0 kubenswrapper[4754]: I1203 21:47:16.400153 4754 server.go:997] "Starting client certificate rotation" Dec 03 21:47:16.400201 master-0 kubenswrapper[4754]: I1203 21:47:16.400196 4754 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 03 21:47:16.400460 master-0 kubenswrapper[4754]: I1203 21:47:16.400381 4754 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 21:47:16.409400 master-0 kubenswrapper[4754]: I1203 21:47:16.409352 4754 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 21:47:16.413617 master-0 kubenswrapper[4754]: I1203 21:47:16.413533 4754 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 21:47:16.413709 master-0 kubenswrapper[4754]: E1203 21:47:16.413656 4754 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:16.430011 master-0 kubenswrapper[4754]: I1203 21:47:16.429924 4754 log.go:25] "Validated CRI v1 runtime API" Dec 03 21:47:16.436697 master-0 kubenswrapper[4754]: I1203 21:47:16.436609 4754 log.go:25] "Validated CRI v1 image API" Dec 03 21:47:16.440172 master-0 kubenswrapper[4754]: I1203 21:47:16.440105 4754 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 03 21:47:16.444469 master-0 kubenswrapper[4754]: I1203 21:47:16.444384 4754 fs.go:135] Filesystem UUIDs: map[3c671a63-22b6-47f8-bf0c-b9acbe18afb0:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Dec 03 21:47:16.444469 master-0 kubenswrapper[4754]: I1203 21:47:16.444448 4754 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Dec 03 21:47:16.485028 master-0 kubenswrapper[4754]: I1203 21:47:16.484582 4754 manager.go:217] Machine: {Timestamp:2025-12-03 21:47:16.482993223 +0000 UTC m=+0.236090878 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2800000 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:4df8bb8220024c58895f94eb9cdab694 SystemUUID:4df8bb82-2002-4c58-895f-94eb9cdab694 BootID:a203903b-0841-4def-8d3c-caca39fd1aed Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:95:f0:47 Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:56:de:4c:1d:30:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 21:47:16.485028 master-0 kubenswrapper[4754]: I1203 21:47:16.484947 4754 
manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 03 21:47:16.485339 master-0 kubenswrapper[4754]: I1203 21:47:16.485220 4754 manager.go:233] Version: {KernelVersion:5.14.0-427.97.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511041748-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 03 21:47:16.485935 master-0 kubenswrapper[4754]: I1203 21:47:16.485896 4754 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 03 21:47:16.486184 master-0 kubenswrapper[4754]: I1203 21:47:16.486125 4754 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 03 21:47:16.486539 master-0 kubenswrapper[4754]: I1203 21:47:16.486176 4754 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 03 21:47:16.486633 master-0 kubenswrapper[4754]: I1203 21:47:16.486558 4754 topology_manager.go:138] "Creating topology manager with none policy" Dec 03 21:47:16.486633 master-0 kubenswrapper[4754]: I1203 21:47:16.486574 4754 container_manager_linux.go:303] "Creating device plugin manager" Dec 03 21:47:16.486943 master-0 kubenswrapper[4754]: I1203 21:47:16.486907 4754 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 21:47:16.487010 master-0 kubenswrapper[4754]: I1203 21:47:16.486949 4754 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 21:47:16.487155 master-0 kubenswrapper[4754]: I1203 21:47:16.487123 4754 state_mem.go:36] "Initialized new in-memory state store" Dec 03 21:47:16.487251 master-0 kubenswrapper[4754]: I1203 21:47:16.487232 4754 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 03 21:47:16.488970 
master-0 kubenswrapper[4754]: I1203 21:47:16.488927 4754 kubelet.go:418] "Attempting to sync node with API server" Dec 03 21:47:16.489055 master-0 kubenswrapper[4754]: I1203 21:47:16.488974 4754 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 03 21:47:16.489055 master-0 kubenswrapper[4754]: I1203 21:47:16.489026 4754 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 03 21:47:16.489055 master-0 kubenswrapper[4754]: I1203 21:47:16.489052 4754 kubelet.go:324] "Adding apiserver pod source" Dec 03 21:47:16.489204 master-0 kubenswrapper[4754]: I1203 21:47:16.489087 4754 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 03 21:47:16.491797 master-0 kubenswrapper[4754]: W1203 21:47:16.491622 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:16.491975 master-0 kubenswrapper[4754]: E1203 21:47:16.491869 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:16.491975 master-0 kubenswrapper[4754]: W1203 21:47:16.491660 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:16.491975 master-0 kubenswrapper[4754]: I1203 21:47:16.491943 4754 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 03 21:47:16.492175 master-0 kubenswrapper[4754]: E1203 21:47:16.491980 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:16.493654 master-0 kubenswrapper[4754]: I1203 21:47:16.493606 4754 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 03 21:47:16.494079 master-0 kubenswrapper[4754]: I1203 21:47:16.494037 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 21:47:16.494143 master-0 kubenswrapper[4754]: I1203 21:47:16.494091 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 21:47:16.494143 master-0 kubenswrapper[4754]: I1203 21:47:16.494107 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 21:47:16.494143 master-0 kubenswrapper[4754]: I1203 21:47:16.494122 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 21:47:16.494143 master-0 kubenswrapper[4754]: I1203 21:47:16.494137 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 03 21:47:16.494344 master-0 kubenswrapper[4754]: I1203 21:47:16.494153 4754 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/secret" Dec 03 21:47:16.494344 master-0 kubenswrapper[4754]: I1203 21:47:16.494168 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 21:47:16.494344 master-0 kubenswrapper[4754]: I1203 21:47:16.494181 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 21:47:16.494344 master-0 kubenswrapper[4754]: I1203 21:47:16.494199 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 21:47:16.494344 master-0 kubenswrapper[4754]: I1203 21:47:16.494214 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 21:47:16.494344 master-0 kubenswrapper[4754]: I1203 21:47:16.494263 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 21:47:16.494344 master-0 kubenswrapper[4754]: I1203 21:47:16.494287 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 21:47:16.494921 master-0 kubenswrapper[4754]: I1203 21:47:16.494862 4754 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 03 21:47:16.496045 master-0 kubenswrapper[4754]: I1203 21:47:16.495872 4754 server.go:1280] "Started kubelet" Dec 03 21:47:16.496480 master-0 kubenswrapper[4754]: I1203 21:47:16.496358 4754 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 21:47:16.496700 master-0 kubenswrapper[4754]: I1203 21:47:16.496383 4754 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 21:47:16.496899 master-0 kubenswrapper[4754]: I1203 21:47:16.496840 4754 server_v1.go:47] "podresources" method="list" useActivePods=true Dec 03 21:47:16.497082 master-0 kubenswrapper[4754]: I1203 21:47:16.497022 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:16.497991 master-0 kubenswrapper[4754]: I1203 21:47:16.497942 4754 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 21:47:16.498515 master-0 systemd[1]: Started Kubernetes Kubelet. 
Dec 03 21:47:16.499181 master-0 kubenswrapper[4754]: I1203 21:47:16.499146 4754 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 21:47:16.499181 master-0 kubenswrapper[4754]: I1203 21:47:16.499180 4754 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 21:47:16.499528 master-0 kubenswrapper[4754]: I1203 21:47:16.499483 4754 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 21:47:16.499528 master-0 kubenswrapper[4754]: I1203 21:47:16.499520 4754 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 21:47:16.499609 master-0 kubenswrapper[4754]: I1203 21:47:16.499564 4754 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 03 21:47:16.499869 master-0 kubenswrapper[4754]: E1203 21:47:16.499481 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:16.502335 master-0 kubenswrapper[4754]: I1203 21:47:16.502288 4754 server.go:449] "Adding debug handlers to kubelet server" Dec 03 21:47:16.502439 master-0 kubenswrapper[4754]: I1203 21:47:16.502290 4754 reconstruct.go:97] "Volume reconstruction finished" Dec 03 21:47:16.502480 master-0 kubenswrapper[4754]: I1203 21:47:16.502466 4754 reconciler.go:26] "Reconciler: start to sync state" Dec 03 21:47:16.502675 master-0 kubenswrapper[4754]: E1203 21:47:16.499635 4754 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.187dd2d3e37f4649 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.495803977 +0000 UTC m=+0.248901682,LastTimestamp:2025-12-03 21:47:16.495803977 +0000 UTC m=+0.248901682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:16.503354 master-0 kubenswrapper[4754]: I1203 21:47:16.503319 4754 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 21:47:16.503400 master-0 kubenswrapper[4754]: I1203 21:47:16.503357 4754 factory.go:55] Registering systemd factory Dec 03 21:47:16.503400 master-0 kubenswrapper[4754]: I1203 21:47:16.503380 4754 factory.go:221] Registration of the systemd container factory successfully Dec 03 21:47:16.503481 master-0 kubenswrapper[4754]: E1203 21:47:16.503423 4754 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Dec 03 21:47:16.503517 master-0 kubenswrapper[4754]: W1203 21:47:16.503435 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 
21:47:16.503646 master-0 kubenswrapper[4754]: E1203 21:47:16.503535 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:16.507933 master-0 kubenswrapper[4754]: I1203 21:47:16.507892 4754 factory.go:153] Registering CRI-O factory Dec 03 21:47:16.507998 master-0 kubenswrapper[4754]: I1203 21:47:16.507938 4754 factory.go:221] Registration of the crio container factory successfully Dec 03 21:47:16.507998 master-0 kubenswrapper[4754]: I1203 21:47:16.507984 4754 factory.go:103] Registering Raw factory Dec 03 21:47:16.508060 master-0 kubenswrapper[4754]: I1203 21:47:16.508012 4754 manager.go:1196] Started watching for new ooms in manager Dec 03 21:47:16.509210 master-0 kubenswrapper[4754]: I1203 21:47:16.509181 4754 manager.go:319] Starting recovery of all containers Dec 03 21:47:16.510967 master-0 kubenswrapper[4754]: E1203 21:47:16.510940 4754 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Dec 03 21:47:16.542589 master-0 kubenswrapper[4754]: I1203 21:47:16.541969 4754 manager.go:324] Recovery completed Dec 03 21:47:16.554989 master-0 kubenswrapper[4754]: I1203 21:47:16.554939 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.556417 master-0 kubenswrapper[4754]: I1203 21:47:16.556376 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.556417 master-0 kubenswrapper[4754]: I1203 21:47:16.556418 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.556488 master-0 kubenswrapper[4754]: I1203 21:47:16.556430 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.557717 master-0 kubenswrapper[4754]: I1203 21:47:16.557683 4754 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 21:47:16.557717 master-0 kubenswrapper[4754]: I1203 21:47:16.557699 4754 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 21:47:16.557717 master-0 kubenswrapper[4754]: I1203 21:47:16.557720 4754 state_mem.go:36] "Initialized new in-memory state store" Dec 03 21:47:16.560440 master-0 kubenswrapper[4754]: I1203 21:47:16.560408 4754 policy_none.go:49] "None policy: Start" Dec 03 21:47:16.561431 master-0 kubenswrapper[4754]: I1203 21:47:16.561394 4754 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 21:47:16.561431 master-0 kubenswrapper[4754]: I1203 21:47:16.561423 4754 state_mem.go:35] "Initializing new in-memory state store" Dec 03 21:47:16.600255 master-0 kubenswrapper[4754]: E1203 21:47:16.600169 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:16.638813 master-0 kubenswrapper[4754]: I1203 21:47:16.638751 4754 manager.go:334] "Starting Device Plugin manager" Dec 03 21:47:16.638951 master-0 kubenswrapper[4754]: I1203 21:47:16.638838 4754 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 21:47:16.638951 master-0 
kubenswrapper[4754]: I1203 21:47:16.638853 4754 server.go:79] "Starting device plugin registration server" Dec 03 21:47:16.639467 master-0 kubenswrapper[4754]: I1203 21:47:16.639431 4754 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 21:47:16.639544 master-0 kubenswrapper[4754]: I1203 21:47:16.639452 4754 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 21:47:16.639748 master-0 kubenswrapper[4754]: I1203 21:47:16.639679 4754 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 21:47:16.640059 master-0 kubenswrapper[4754]: I1203 21:47:16.639992 4754 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 21:47:16.640123 master-0 kubenswrapper[4754]: I1203 21:47:16.640056 4754 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 21:47:16.642064 master-0 kubenswrapper[4754]: E1203 21:47:16.641973 4754 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 21:47:16.682200 master-0 kubenswrapper[4754]: I1203 21:47:16.682063 4754 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 21:47:16.684893 master-0 kubenswrapper[4754]: I1203 21:47:16.684825 4754 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 03 21:47:16.684959 master-0 kubenswrapper[4754]: I1203 21:47:16.684933 4754 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 21:47:16.684994 master-0 kubenswrapper[4754]: I1203 21:47:16.684976 4754 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 21:47:16.685140 master-0 kubenswrapper[4754]: E1203 21:47:16.685094 4754 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Dec 03 21:47:16.687044 master-0 kubenswrapper[4754]: W1203 21:47:16.686922 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:16.687106 master-0 kubenswrapper[4754]: E1203 21:47:16.687068 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:16.705682 master-0 kubenswrapper[4754]: E1203 21:47:16.705583 4754 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Dec 03 21:47:16.739733 master-0 kubenswrapper[4754]: I1203 21:47:16.739643 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.741444 master-0 kubenswrapper[4754]: I1203 21:47:16.741401 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.741493 master-0 kubenswrapper[4754]: I1203 21:47:16.741455 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Dec 03 21:47:16.741493 master-0 kubenswrapper[4754]: I1203 21:47:16.741468 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.741571 master-0 kubenswrapper[4754]: I1203 21:47:16.741510 4754 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:47:16.742850 master-0 kubenswrapper[4754]: E1203 21:47:16.742755 4754 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 21:47:16.786195 master-0 kubenswrapper[4754]: I1203 21:47:16.786001 4754 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0"] Dec 03 21:47:16.786195 master-0 kubenswrapper[4754]: I1203 21:47:16.786168 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.787736 master-0 kubenswrapper[4754]: I1203 21:47:16.787677 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.787871 master-0 kubenswrapper[4754]: I1203 21:47:16.787759 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.787871 master-0 kubenswrapper[4754]: I1203 21:47:16.787795 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.787991 master-0 kubenswrapper[4754]: I1203 21:47:16.787937 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.788445 master-0 kubenswrapper[4754]: I1203 21:47:16.788349 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:47:16.788538 master-0 kubenswrapper[4754]: I1203 21:47:16.788465 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.789346 master-0 kubenswrapper[4754]: I1203 21:47:16.789295 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.789431 master-0 kubenswrapper[4754]: I1203 21:47:16.789357 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.789431 master-0 kubenswrapper[4754]: I1203 21:47:16.789392 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.789606 master-0 kubenswrapper[4754]: I1203 21:47:16.789561 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.789606 master-0 kubenswrapper[4754]: I1203 21:47:16.789591 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.789725 master-0 kubenswrapper[4754]: I1203 21:47:16.789600 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.789725 master-0 kubenswrapper[4754]: I1203 21:47:16.789689 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:47:16.789725 master-0 kubenswrapper[4754]: I1203 21:47:16.789705 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.789725 master-0 kubenswrapper[4754]: I1203 21:47:16.789722 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.790589 master-0 kubenswrapper[4754]: I1203 21:47:16.790545 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.790589 master-0 kubenswrapper[4754]: I1203 21:47:16.790571 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.790589 master-0 kubenswrapper[4754]: I1203 21:47:16.790580 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.790797 master-0 kubenswrapper[4754]: I1203 21:47:16.790669 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.791001 master-0 kubenswrapper[4754]: I1203 21:47:16.790958 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:47:16.791001 master-0 kubenswrapper[4754]: I1203 21:47:16.790990 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.791254 master-0 kubenswrapper[4754]: I1203 21:47:16.791232 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.791254 master-0 kubenswrapper[4754]: I1203 21:47:16.791251 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.791391 master-0 kubenswrapper[4754]: I1203 21:47:16.791260 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.791482 master-0 kubenswrapper[4754]: I1203 21:47:16.791424 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.791482 master-0 kubenswrapper[4754]: I1203 21:47:16.791437 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.791482 master-0 kubenswrapper[4754]: I1203 21:47:16.791445 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.791688 master-0 kubenswrapper[4754]: I1203 21:47:16.791551 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.791688 master-0 kubenswrapper[4754]: I1203 21:47:16.791566 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.791688 master-0 kubenswrapper[4754]: I1203 21:47:16.791573 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.791688 master-0 kubenswrapper[4754]: I1203 21:47:16.791638 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.791976 master-0 kubenswrapper[4754]: I1203 21:47:16.791905 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.791976 master-0 kubenswrapper[4754]: I1203 21:47:16.791964 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.792269 master-0 kubenswrapper[4754]: I1203 21:47:16.792216 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.792269 master-0 kubenswrapper[4754]: I1203 21:47:16.792266 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.792394 master-0 kubenswrapper[4754]: I1203 21:47:16.792286 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.792608 master-0 kubenswrapper[4754]: I1203 21:47:16.792564 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.792703 master-0 kubenswrapper[4754]: I1203 21:47:16.792614 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.793101 master-0 kubenswrapper[4754]: I1203 21:47:16.793056 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.793101 master-0 kubenswrapper[4754]: I1203 21:47:16.793090 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.793232 master-0 kubenswrapper[4754]: I1203 21:47:16.793108 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.793444 master-0 kubenswrapper[4754]: I1203 21:47:16.793396 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.793444 master-0 kubenswrapper[4754]: I1203 21:47:16.793439 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.793571 master-0 kubenswrapper[4754]: I1203 21:47:16.793455 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.805314 master-0 kubenswrapper[4754]: I1203 21:47:16.805261 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.805314 master-0 kubenswrapper[4754]: I1203 21:47:16.805304 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.805522 master-0 kubenswrapper[4754]: I1203 21:47:16.805329 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.805522 master-0 kubenswrapper[4754]: I1203 21:47:16.805358 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.805522 master-0 kubenswrapper[4754]: I1203 21:47:16.805379 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:47:16.805522 master-0 kubenswrapper[4754]: I1203 
21:47:16.805405 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.805522 master-0 kubenswrapper[4754]: I1203 21:47:16.805430 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.805522 master-0 kubenswrapper[4754]: I1203 21:47:16.805484 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.805522 master-0 kubenswrapper[4754]: I1203 21:47:16.805516 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.806022 master-0 kubenswrapper[4754]: I1203 21:47:16.805543 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.806022 master-0 kubenswrapper[4754]: I1203 21:47:16.805572 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.806022 master-0 kubenswrapper[4754]: I1203 21:47:16.805588 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:47:16.806022 master-0 kubenswrapper[4754]: I1203 21:47:16.805612 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:47:16.806022 master-0 kubenswrapper[4754]: I1203 21:47:16.805637 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" 
(UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.806022 master-0 kubenswrapper[4754]: I1203 21:47:16.805681 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:47:16.806022 master-0 kubenswrapper[4754]: I1203 21:47:16.805811 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:47:16.806022 master-0 kubenswrapper[4754]: I1203 21:47:16.805886 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:47:16.906740 master-0 kubenswrapper[4754]: I1203 21:47:16.906623 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.906740 master-0 kubenswrapper[4754]: I1203 21:47:16.906691 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.906740 master-0 kubenswrapper[4754]: I1203 21:47:16.906715 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.906740 master-0 kubenswrapper[4754]: I1203 21:47:16.906735 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.906740 master-0 kubenswrapper[4754]: I1203 21:47:16.906754 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.907243 
master-0 kubenswrapper[4754]: I1203 21:47:16.906804 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906837 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906862 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906858 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906917 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906909 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906967 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906980 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906942 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906909 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.907024 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.907000 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.906982 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.907114 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.907126 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.907143 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.907243 master-0 kubenswrapper[4754]: I1203 21:47:16.907259 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907278 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") 
pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907340 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907346 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907372 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907395 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907418 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907442 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907450 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907484 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907500 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907515 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:47:16.908280 master-0 kubenswrapper[4754]: I1203 21:47:16.907569 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:47:16.943517 master-0 kubenswrapper[4754]: I1203 21:47:16.943288 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:16.945433 master-0 kubenswrapper[4754]: I1203 21:47:16.945372 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:16.945433 master-0 kubenswrapper[4754]: I1203 21:47:16.945428 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:16.945433 master-0 kubenswrapper[4754]: I1203 21:47:16.945441 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:16.945703 master-0 kubenswrapper[4754]: I1203 21:47:16.945511 4754 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:47:16.947004 master-0 kubenswrapper[4754]: E1203 21:47:16.946929 4754 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 21:47:17.005012 master-0 kubenswrapper[4754]: E1203 21:47:17.004629 4754 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.187dd2d3e37f4649 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.495803977 +0000 UTC m=+0.248901682,LastTimestamp:2025-12-03 21:47:16.495803977 +0000 UTC m=+0.248901682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:17.108198 master-0 kubenswrapper[4754]: E1203 21:47:17.108050 4754 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Dec 03 21:47:17.114429 master-0 kubenswrapper[4754]: I1203 21:47:17.114359 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:47:17.121239 master-0 kubenswrapper[4754]: I1203 21:47:17.121177 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:47:17.136624 master-0 kubenswrapper[4754]: I1203 21:47:17.136493 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:47:17.144727 master-0 kubenswrapper[4754]: I1203 21:47:17.144673 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:17.149825 master-0 kubenswrapper[4754]: I1203 21:47:17.149761 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:17.339592 master-0 kubenswrapper[4754]: W1203 21:47:17.339304 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:17.339592 master-0 kubenswrapper[4754]: E1203 21:47:17.339493 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:17.348055 master-0 kubenswrapper[4754]: I1203 21:47:17.347949 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:17.349614 master-0 kubenswrapper[4754]: I1203 21:47:17.349546 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:17.349614 master-0 kubenswrapper[4754]: I1203 21:47:17.349607 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:17.349614 master-0 kubenswrapper[4754]: I1203 21:47:17.349628 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:17.350080 master-0 kubenswrapper[4754]: I1203 21:47:17.349721 4754 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:47:17.350943 master-0 kubenswrapper[4754]: E1203 21:47:17.350884 4754 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 21:47:17.498987 master-0 kubenswrapper[4754]: I1203 21:47:17.498890 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:17.653384 master-0 kubenswrapper[4754]: W1203 21:47:17.653163 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 
21:47:17.653384 master-0 kubenswrapper[4754]: E1203 21:47:17.653295 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:17.782098 master-0 kubenswrapper[4754]: W1203 21:47:17.782000 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41b95a38663dd6fe34e183818a475977.slice/crio-671a5ab15340eb87443f5b304e7164498b0964947204dd22840f55587629291e WatchSource:0}: Error finding container 671a5ab15340eb87443f5b304e7164498b0964947204dd22840f55587629291e: Status 404 returned error can't find the container with id 671a5ab15340eb87443f5b304e7164498b0964947204dd22840f55587629291e Dec 03 21:47:17.783209 master-0 kubenswrapper[4754]: W1203 21:47:17.783151 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13238af3704fe583f617f61e755cf4c2.slice/crio-2d6870e12c7cebea9481ca6d5d2722804205f20e81d572c38201523c901437c2 WatchSource:0}: Error finding container 2d6870e12c7cebea9481ca6d5d2722804205f20e81d572c38201523c901437c2: Status 404 returned error can't find the container with id 2d6870e12c7cebea9481ca6d5d2722804205f20e81d572c38201523c901437c2 Dec 03 21:47:17.793750 master-0 kubenswrapper[4754]: I1203 21:47:17.793701 4754 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 21:47:17.808313 master-0 kubenswrapper[4754]: W1203 21:47:17.808212 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78739a7694769882b7e47ea5ac08a10.slice/crio-ddfda6c4073bc139d7276df8107e2ee077a6d14642bd3a1ecd7361207cad43d9 WatchSource:0}: Error finding container ddfda6c4073bc139d7276df8107e2ee077a6d14642bd3a1ecd7361207cad43d9: Status 404 returned error can't find the container with id ddfda6c4073bc139d7276df8107e2ee077a6d14642bd3a1ecd7361207cad43d9 Dec 03 21:47:17.837445 master-0 kubenswrapper[4754]: W1203 21:47:17.837347 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:17.837580 master-0 kubenswrapper[4754]: E1203 21:47:17.837451 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:17.838412 master-0 kubenswrapper[4754]: W1203 21:47:17.838365 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb495b0c38f2c54e7cc46282c5f92aab5.slice/crio-f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57 WatchSource:0}: Error finding container f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57: Status 404 returned error can't find the container with id 
f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57 Dec 03 21:47:17.866067 master-0 kubenswrapper[4754]: W1203 21:47:17.865970 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bce50c457ac1f4721bc81a570dd238a.slice/crio-4d7935b96cb72a5574553c21dccf400a98363b1a22ca7819f889a07cdcea7c8a WatchSource:0}: Error finding container 4d7935b96cb72a5574553c21dccf400a98363b1a22ca7819f889a07cdcea7c8a: Status 404 returned error can't find the container with id 4d7935b96cb72a5574553c21dccf400a98363b1a22ca7819f889a07cdcea7c8a Dec 03 21:47:17.909952 master-0 kubenswrapper[4754]: E1203 21:47:17.909702 4754 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Dec 03 21:47:18.066601 master-0 kubenswrapper[4754]: W1203 21:47:18.066477 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:18.066601 master-0 kubenswrapper[4754]: E1203 21:47:18.066567 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:18.152073 master-0 kubenswrapper[4754]: I1203 21:47:18.151946 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:18.154181 master-0 kubenswrapper[4754]: I1203 21:47:18.154134 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:18.154312 master-0 kubenswrapper[4754]: I1203 21:47:18.154187 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:18.154312 master-0 kubenswrapper[4754]: I1203 21:47:18.154203 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:18.154312 master-0 kubenswrapper[4754]: I1203 21:47:18.154266 4754 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:47:18.155603 master-0 kubenswrapper[4754]: E1203 21:47:18.155506 4754 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 21:47:18.499035 master-0 kubenswrapper[4754]: I1203 21:47:18.498917 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:18.543496 master-0 kubenswrapper[4754]: I1203 21:47:18.543401 4754 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 21:47:18.545086 master-0 kubenswrapper[4754]: E1203 21:47:18.545013 4754 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:18.693152 master-0 kubenswrapper[4754]: I1203 21:47:18.692841 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"4d7935b96cb72a5574553c21dccf400a98363b1a22ca7819f889a07cdcea7c8a"} Dec 03 21:47:18.693952 master-0 kubenswrapper[4754]: I1203 21:47:18.693914 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57"} Dec 03 21:47:18.695296 master-0 kubenswrapper[4754]: I1203 21:47:18.695180 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"ddfda6c4073bc139d7276df8107e2ee077a6d14642bd3a1ecd7361207cad43d9"} Dec 03 21:47:18.696551 master-0 kubenswrapper[4754]: I1203 21:47:18.696524 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"2d6870e12c7cebea9481ca6d5d2722804205f20e81d572c38201523c901437c2"} Dec 03 21:47:18.697707 master-0 kubenswrapper[4754]: I1203 21:47:18.697671 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"671a5ab15340eb87443f5b304e7164498b0964947204dd22840f55587629291e"} Dec 03 21:47:19.498750 master-0 kubenswrapper[4754]: I1203 21:47:19.498693 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:19.510763 master-0 kubenswrapper[4754]: E1203 21:47:19.510715 4754 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Dec 03 21:47:19.701844 master-0 kubenswrapper[4754]: I1203 21:47:19.701350 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"4fd6e7beed435c2febdb58b61912386340be61628332e2e84d31efc5c9a5d3f4"} Dec 03 21:47:19.701932 master-0 kubenswrapper[4754]: I1203 21:47:19.701426 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:19.702882 master-0 kubenswrapper[4754]: I1203 21:47:19.702833 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:19.702882 master-0 kubenswrapper[4754]: I1203 21:47:19.702870 4754 kubelet_node_status.go:724] "Recording event 
message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:19.702882 master-0 kubenswrapper[4754]: I1203 21:47:19.702880 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:19.756619 master-0 kubenswrapper[4754]: I1203 21:47:19.756509 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:19.758073 master-0 kubenswrapper[4754]: I1203 21:47:19.758046 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:19.758160 master-0 kubenswrapper[4754]: I1203 21:47:19.758090 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:19.758160 master-0 kubenswrapper[4754]: I1203 21:47:19.758101 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:19.758232 master-0 kubenswrapper[4754]: I1203 21:47:19.758176 4754 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:47:19.759063 master-0 kubenswrapper[4754]: E1203 21:47:19.759023 4754 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 21:47:20.046564 master-0 kubenswrapper[4754]: W1203 21:47:20.046394 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:20.046564 master-0 kubenswrapper[4754]: E1203 21:47:20.046510 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:20.052717 master-0 kubenswrapper[4754]: W1203 21:47:20.052610 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:20.052717 master-0 kubenswrapper[4754]: E1203 21:47:20.052666 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:20.449628 master-0 kubenswrapper[4754]: W1203 21:47:20.449412 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:20.449628 master-0 kubenswrapper[4754]: E1203 21:47:20.449533 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:20.498936 master-0 kubenswrapper[4754]: I1203 21:47:20.498877 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:20.705021 master-0 kubenswrapper[4754]: I1203 21:47:20.704914 4754 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="4fd6e7beed435c2febdb58b61912386340be61628332e2e84d31efc5c9a5d3f4" exitCode=0 Dec 03 21:47:20.705021 master-0 kubenswrapper[4754]: I1203 21:47:20.704973 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"4fd6e7beed435c2febdb58b61912386340be61628332e2e84d31efc5c9a5d3f4"} Dec 03 21:47:20.705609 master-0 kubenswrapper[4754]: I1203 21:47:20.705088 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:20.705977 master-0 kubenswrapper[4754]: I1203 21:47:20.705945 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:20.706030 master-0 kubenswrapper[4754]: I1203 21:47:20.705989 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:20.706030 master-0 kubenswrapper[4754]: I1203 21:47:20.706006 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:20.983044 master-0 kubenswrapper[4754]: W1203 21:47:20.982888 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:20.983044 master-0 kubenswrapper[4754]: E1203 21:47:20.982973 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:21.498547 master-0 kubenswrapper[4754]: I1203 21:47:21.498475 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:22.500118 master-0 kubenswrapper[4754]: I1203 21:47:22.499966 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:22.711512 master-0 kubenswrapper[4754]: I1203 21:47:22.711449 4754 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/0.log" Dec 03 21:47:22.712215 master-0 kubenswrapper[4754]: E1203 21:47:22.711865 4754 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Dec 03 21:47:22.712215 master-0 kubenswrapper[4754]: I1203 21:47:22.712108 4754 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="7f961cff72f991803cbef5b391404f85e1bc7bfe6cbdb42ea68a02ace90aa826" exitCode=1 Dec 03 21:47:22.712215 master-0 kubenswrapper[4754]: I1203 21:47:22.712162 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"7f961cff72f991803cbef5b391404f85e1bc7bfe6cbdb42ea68a02ace90aa826"} Dec 03 21:47:22.712215 master-0 kubenswrapper[4754]: I1203 21:47:22.712191 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:22.713124 master-0 kubenswrapper[4754]: I1203 21:47:22.713054 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:22.713124 master-0 kubenswrapper[4754]: I1203 21:47:22.713118 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:22.713124 master-0 kubenswrapper[4754]: I1203 21:47:22.713132 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:22.713615 master-0 kubenswrapper[4754]: I1203 21:47:22.713571 4754 scope.go:117] "RemoveContainer" containerID="7f961cff72f991803cbef5b391404f85e1bc7bfe6cbdb42ea68a02ace90aa826" Dec 03 21:47:22.765615 master-0 kubenswrapper[4754]: I1203 21:47:22.765497 4754 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 21:47:22.766887 master-0 kubenswrapper[4754]: E1203 21:47:22.766856 4754 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:22.959439 master-0 kubenswrapper[4754]: I1203 21:47:22.959344 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:22.960655 master-0 kubenswrapper[4754]: I1203 21:47:22.960628 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:22.960717 master-0 kubenswrapper[4754]: I1203 21:47:22.960658 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:22.960717 master-0 kubenswrapper[4754]: I1203 21:47:22.960667 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:22.960795 master-0 kubenswrapper[4754]: I1203 21:47:22.960753 4754 kubelet_node_status.go:76] "Attempting 
to register node" node="master-0" Dec 03 21:47:22.961567 master-0 kubenswrapper[4754]: E1203 21:47:22.961528 4754 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Dec 03 21:47:23.499134 master-0 kubenswrapper[4754]: I1203 21:47:23.499079 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:24.499522 master-0 kubenswrapper[4754]: I1203 21:47:24.499434 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:25.489812 master-0 kubenswrapper[4754]: W1203 21:47:25.489527 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:25.489812 master-0 kubenswrapper[4754]: E1203 21:47:25.489625 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:25.499188 master-0 kubenswrapper[4754]: I1203 21:47:25.499125 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:25.720722 master-0 kubenswrapper[4754]: I1203 21:47:25.720633 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/1.log" Dec 03 21:47:25.721376 master-0 kubenswrapper[4754]: I1203 21:47:25.721266 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/0.log" Dec 03 21:47:25.721376 master-0 kubenswrapper[4754]: W1203 21:47:25.721291 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Dec 03 21:47:25.721376 master-0 kubenswrapper[4754]: E1203 21:47:25.721334 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Dec 03 21:47:25.722221 master-0 kubenswrapper[4754]: I1203 21:47:25.721805 4754 generic.go:334] "Generic (PLEG): container finished" 
podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="b2e433bb0d812f8998f9c3d94046039e4ad63092afce7ea8f56800703f9ed872" exitCode=1 Dec 03 21:47:25.722221 master-0 kubenswrapper[4754]: I1203 21:47:25.721886 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"b2e433bb0d812f8998f9c3d94046039e4ad63092afce7ea8f56800703f9ed872"} Dec 03 21:47:25.722221 master-0 kubenswrapper[4754]: I1203 21:47:25.721907 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:25.722221 master-0 kubenswrapper[4754]: I1203 21:47:25.721927 4754 scope.go:117] "RemoveContainer" containerID="7f961cff72f991803cbef5b391404f85e1bc7bfe6cbdb42ea68a02ace90aa826" Dec 03 21:47:25.723155 master-0 kubenswrapper[4754]: I1203 21:47:25.722575 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:25.723155 master-0 kubenswrapper[4754]: I1203 21:47:25.722600 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:25.723155 master-0 kubenswrapper[4754]: I1203 21:47:25.722607 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:25.723155 master-0 kubenswrapper[4754]: I1203 21:47:25.722882 4754 scope.go:117] "RemoveContainer" containerID="b2e433bb0d812f8998f9c3d94046039e4ad63092afce7ea8f56800703f9ed872" Dec 03 21:47:25.723155 master-0 kubenswrapper[4754]: E1203 21:47:25.723117 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 21:47:25.724801 master-0 kubenswrapper[4754]: I1203 21:47:25.724755 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:25.724847 master-0 kubenswrapper[4754]: I1203 21:47:25.724804 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"a5db70d8c04a9fdadcb37e94b566026a87f2170f5e692e5d3ce48ba377b79800"} Dec 03 21:47:25.725514 master-0 kubenswrapper[4754]: I1203 21:47:25.725478 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:25.725560 master-0 kubenswrapper[4754]: I1203 21:47:25.725532 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:25.725588 master-0 kubenswrapper[4754]: I1203 21:47:25.725556 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:25.726856 master-0 kubenswrapper[4754]: I1203 21:47:25.726752 4754 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="ebd2204fe249e4ff0cd7edb5edf639c7a4971288017457fe2c22236d90e0112a" exitCode=0 Dec 03 21:47:25.726966 master-0 kubenswrapper[4754]: I1203 21:47:25.726916 4754 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Dec 03 21:47:25.727042 master-0 kubenswrapper[4754]: I1203 21:47:25.726912 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerDied","Data":"ebd2204fe249e4ff0cd7edb5edf639c7a4971288017457fe2c22236d90e0112a"} Dec 03 21:47:25.728516 master-0 kubenswrapper[4754]: I1203 21:47:25.728475 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:25.728561 master-0 kubenswrapper[4754]: I1203 21:47:25.728545 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:25.728596 master-0 kubenswrapper[4754]: I1203 21:47:25.728573 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:25.729607 master-0 kubenswrapper[4754]: I1203 21:47:25.729562 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"6c365c0b34d496df5874092d2be756d0f4503f99c75f8af416569e515090fd7c"} Dec 03 21:47:25.729607 master-0 kubenswrapper[4754]: I1203 21:47:25.729591 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"d089dbd1d00b21007f2d7f87058c4506fbc7b6ac8e4051768c22497e2ce3a3f4"} Dec 03 21:47:25.729690 master-0 kubenswrapper[4754]: I1203 21:47:25.729612 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:25.730473 master-0 kubenswrapper[4754]: I1203 21:47:25.730449 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:25.730520 master-0 kubenswrapper[4754]: I1203 21:47:25.730475 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:25.730520 master-0 kubenswrapper[4754]: I1203 21:47:25.730488 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:25.731359 master-0 kubenswrapper[4754]: I1203 21:47:25.731324 4754 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="0d4f1a1f19ffb465b7ec7db2a36e7c6f6dc0dc2fd64d69d37c5c1b12205b376b" exitCode=1 Dec 03 21:47:25.731396 master-0 kubenswrapper[4754]: I1203 21:47:25.731361 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"0d4f1a1f19ffb465b7ec7db2a36e7c6f6dc0dc2fd64d69d37c5c1b12205b376b"} Dec 03 21:47:25.733666 master-0 kubenswrapper[4754]: I1203 21:47:25.733630 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:25.734399 master-0 kubenswrapper[4754]: I1203 21:47:25.734363 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:25.734399 master-0 kubenswrapper[4754]: I1203 21:47:25.734391 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:25.734505 master-0 
kubenswrapper[4754]: I1203 21:47:25.734408 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:26.642418 master-0 kubenswrapper[4754]: E1203 21:47:26.642303 4754 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 21:47:26.735957 master-0 kubenswrapper[4754]: I1203 21:47:26.735881 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/1.log" Dec 03 21:47:26.737357 master-0 kubenswrapper[4754]: I1203 21:47:26.737323 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:26.738388 master-0 kubenswrapper[4754]: I1203 21:47:26.738334 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:26.738388 master-0 kubenswrapper[4754]: I1203 21:47:26.738379 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:26.738388 master-0 kubenswrapper[4754]: I1203 21:47:26.738395 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:26.738850 master-0 kubenswrapper[4754]: I1203 21:47:26.738817 4754 scope.go:117] "RemoveContainer" containerID="b2e433bb0d812f8998f9c3d94046039e4ad63092afce7ea8f56800703f9ed872" Dec 03 21:47:26.739102 master-0 kubenswrapper[4754]: E1203 21:47:26.739036 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 21:47:26.739387 master-0 kubenswrapper[4754]: I1203 21:47:26.739215 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:26.739387 master-0 kubenswrapper[4754]: I1203 21:47:26.739296 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"70791961c08656ffa1017f6f966c1f2ba603d16610929ed1a69c881343d1bfec"} Dec 03 21:47:26.739593 master-0 kubenswrapper[4754]: I1203 21:47:26.739437 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:26.739744 master-0 kubenswrapper[4754]: I1203 21:47:26.739703 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:26.739744 master-0 kubenswrapper[4754]: I1203 21:47:26.739740 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:26.739947 master-0 kubenswrapper[4754]: I1203 21:47:26.739760 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:26.740479 master-0 kubenswrapper[4754]: I1203 21:47:26.740444 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:26.740479 master-0 
kubenswrapper[4754]: I1203 21:47:26.740474 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:26.740479 master-0 kubenswrapper[4754]: I1203 21:47:26.740485 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:27.763985 master-0 kubenswrapper[4754]: I1203 21:47:27.758751 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"f211b15e8d62153f4deaa1bf7dfc87de0781805cce6c2aabbe56ed6f61fa1aa7"} Dec 03 21:47:27.763985 master-0 kubenswrapper[4754]: I1203 21:47:27.758883 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:27.763985 master-0 kubenswrapper[4754]: I1203 21:47:27.759657 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:27.763985 master-0 kubenswrapper[4754]: I1203 21:47:27.759676 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:27.763985 master-0 kubenswrapper[4754]: I1203 21:47:27.759684 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:27.763985 master-0 kubenswrapper[4754]: I1203 21:47:27.759923 4754 scope.go:117] "RemoveContainer" containerID="0d4f1a1f19ffb465b7ec7db2a36e7c6f6dc0dc2fd64d69d37c5c1b12205b376b" Dec 03 21:47:28.265507 master-0 kubenswrapper[4754]: E1203 21:47:28.264553 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e37f4649 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.495803977 +0000 UTC m=+0.248901682,LastTimestamp:2025-12-03 21:47:16.495803977 +0000 UTC m=+0.248901682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.270860 master-0 kubenswrapper[4754]: W1203 21:47:28.269691 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:28.270860 master-0 kubenswrapper[4754]: I1203 21:47:28.269739 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:28.270860 master-0 kubenswrapper[4754]: E1203 21:47:28.269799 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" 
Dec 03 21:47:28.270860 master-0 kubenswrapper[4754]: W1203 21:47:28.269856 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Dec 03 21:47:28.270860 master-0 kubenswrapper[4754]: E1203 21:47:28.269908 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Dec 03 21:47:28.270860 master-0 kubenswrapper[4754]: E1203 21:47:28.269913 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71bf521 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556404001 +0000 UTC m=+0.309501616,LastTimestamp:2025-12-03 21:47:16.556404001 +0000 UTC m=+0.309501616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.279484 master-0 kubenswrapper[4754]: E1203 21:47:28.279106 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]" event="&Event{ObjectMeta:{master-0.187dd2d3e71c4978 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556425592 +0000 UTC m=+0.309523207,LastTimestamp:2025-12-03 21:47:16.556425592 +0000 UTC m=+0.309523207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.281864 master-0 kubenswrapper[4754]: E1203 21:47:28.281548 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]" event="&Event{ObjectMeta:{master-0.187dd2d3e71c7327 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556436263 +0000 UTC m=+0.309533878,LastTimestamp:2025-12-03 21:47:16.556436263 +0000 UTC m=+0.309533878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.288918 master-0 kubenswrapper[4754]: E1203 21:47:28.288298 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3ec43cb0a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.642900746 +0000 UTC m=+0.395998381,LastTimestamp:2025-12-03 21:47:16.642900746 +0000 UTC m=+0.395998381,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.302691 master-0 kubenswrapper[4754]: E1203 21:47:28.302142 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71bf521\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71bf521 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556404001 +0000 UTC m=+0.309501616,LastTimestamp:2025-12-03 21:47:16.741437209 +0000 UTC m=+0.494534834,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.314703 master-0 kubenswrapper[4754]: E1203 21:47:28.314511 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c4978\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c4978 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556425592 +0000 UTC m=+0.309523207,LastTimestamp:2025-12-03 21:47:16.74146392 +0000 UTC m=+0.494561545,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.325647 master-0 kubenswrapper[4754]: E1203 21:47:28.325502 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c7327\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{master-0.187dd2d3e71c7327 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556436263 +0000 UTC m=+0.309533878,LastTimestamp:2025-12-03 21:47:16.741474601 +0000 UTC m=+0.494572236,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.331040 master-0 kubenswrapper[4754]: E1203 21:47:28.330440 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71bf521\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71bf521 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556404001 +0000 UTC m=+0.309501616,LastTimestamp:2025-12-03 21:47:16.787735731 +0000 UTC m=+0.540833346,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.336206 master-0 kubenswrapper[4754]: E1203 21:47:28.336010 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c4978\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c4978 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556425592 +0000 UTC m=+0.309523207,LastTimestamp:2025-12-03 21:47:16.787789113 +0000 UTC m=+0.540886728,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.341487 master-0 kubenswrapper[4754]: E1203 21:47:28.341234 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c7327\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c7327 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556436263 +0000 UTC m=+0.309533878,LastTimestamp:2025-12-03 21:47:16.787802974 +0000 UTC m=+0.540900589,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" 
Dec 03 21:47:28.346162 master-0 kubenswrapper[4754]: E1203 21:47:28.346076 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71bf521\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71bf521 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556404001 +0000 UTC m=+0.309501616,LastTimestamp:2025-12-03 21:47:16.789333993 +0000 UTC m=+0.542431648,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.350117 master-0 kubenswrapper[4754]: E1203 21:47:28.350015 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c4978\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c4978 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556425592 +0000 UTC m=+0.309523207,LastTimestamp:2025-12-03 21:47:16.789369645 +0000 UTC m=+0.542467290,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.355811 master-0 kubenswrapper[4754]: E1203 21:47:28.355705 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c7327\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c7327 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556436263 +0000 UTC m=+0.309533878,LastTimestamp:2025-12-03 21:47:16.789402596 +0000 UTC m=+0.542500241,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.361011 master-0 kubenswrapper[4754]: E1203 21:47:28.360742 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71bf521\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71bf521 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556404001 +0000 UTC m=+0.309501616,LastTimestamp:2025-12-03 21:47:16.789587955 +0000 UTC m=+0.542685570,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.366291 master-0 kubenswrapper[4754]: E1203 21:47:28.366140 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c4978\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c4978 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556425592 +0000 UTC m=+0.309523207,LastTimestamp:2025-12-03 21:47:16.789686069 +0000 UTC m=+0.542783694,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.371048 master-0 kubenswrapper[4754]: E1203 21:47:28.370893 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c7327\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c7327 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556436263 +0000 UTC m=+0.309533878,LastTimestamp:2025-12-03 21:47:16.789716951 +0000 UTC m=+0.542814566,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.376218 master-0 kubenswrapper[4754]: E1203 21:47:28.376045 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71bf521\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71bf521 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556404001 +0000 UTC m=+0.309501616,LastTimestamp:2025-12-03 21:47:16.79056499 +0000 UTC m=+0.543662605,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.382001 master-0 kubenswrapper[4754]: E1203 21:47:28.381838 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c4978\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{master-0.187dd2d3e71c4978 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556425592 +0000 UTC m=+0.309523207,LastTimestamp:2025-12-03 21:47:16.7905767 +0000 UTC m=+0.543674315,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.386914 master-0 kubenswrapper[4754]: E1203 21:47:28.386791 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c7327\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c7327 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556436263 +0000 UTC m=+0.309533878,LastTimestamp:2025-12-03 21:47:16.790585011 +0000 UTC m=+0.543682626,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.391834 master-0 kubenswrapper[4754]: E1203 21:47:28.391635 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71bf521\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71bf521 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556404001 +0000 UTC m=+0.309501616,LastTimestamp:2025-12-03 21:47:16.791245731 +0000 UTC m=+0.544343346,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.396154 master-0 kubenswrapper[4754]: I1203 21:47:28.396031 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:28.397434 master-0 kubenswrapper[4754]: E1203 21:47:28.397139 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c4978\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c4978 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556425592 +0000 UTC 
m=+0.309523207,LastTimestamp:2025-12-03 21:47:16.791256882 +0000 UTC m=+0.544354487,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.402171 master-0 kubenswrapper[4754]: E1203 21:47:28.402061 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c7327\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c7327 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556436263 +0000 UTC m=+0.309533878,LastTimestamp:2025-12-03 21:47:16.791267152 +0000 UTC m=+0.544364757,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.410375 master-0 kubenswrapper[4754]: E1203 21:47:28.409397 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71bf521\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71bf521 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556404001 +0000 UTC m=+0.309501616,LastTimestamp:2025-12-03 21:47:16.79143403 +0000 UTC m=+0.544531645,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.416109 master-0 kubenswrapper[4754]: E1203 21:47:28.414750 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.187dd2d3e71c4978\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.187dd2d3e71c4978 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:16.556425592 +0000 UTC m=+0.309523207,LastTimestamp:2025-12-03 21:47:16.79144248 +0000 UTC m=+0.544540095,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.427130 master-0 kubenswrapper[4754]: E1203 21:47:28.426965 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d430da8c2c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:17.793631276 +0000 UTC m=+1.546728931,LastTimestamp:2025-12-03 21:47:17.793631276 +0000 UTC m=+1.546728931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.432966 master-0 kubenswrapper[4754]: E1203 21:47:28.432745 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dd2d430dbb1c8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:17.79370644 +0000 UTC m=+1.546804095,LastTimestamp:2025-12-03 21:47:17.79370644 +0000 UTC m=+1.546804095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.438288 master-0 kubenswrapper[4754]: E1203 21:47:28.438135 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dd2d4325f2abc kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:17.819099836 +0000 UTC m=+1.572197481,LastTimestamp:2025-12-03 21:47:17.819099836 +0000 UTC m=+1.572197481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.443728 master-0 kubenswrapper[4754]: E1203 21:47:28.443605 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d433b22888 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:17.841315976 +0000 UTC m=+1.594413601,LastTimestamp:2025-12-03 21:47:17.841315976 +0000 UTC m=+1.594413601,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.449010 master-0 kubenswrapper[4754]: E1203 21:47:28.448844 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d4355348bf kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:17.868652735 +0000 UTC m=+1.621750360,LastTimestamp:2025-12-03 21:47:17.868652735 +0000 UTC m=+1.621750360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.456273 master-0 kubenswrapper[4754]: E1203 21:47:28.456096 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d494435775 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\" in 1.62s (1.62s including waiting). 
Image size: 459566623 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:19.461443445 +0000 UTC m=+3.214541100,LastTimestamp:2025-12-03 21:47:19.461443445 +0000 UTC m=+3.214541100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.485162 master-0 kubenswrapper[4754]: E1203 21:47:28.480988 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d4a0eaf0a5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:19.673753765 +0000 UTC m=+3.426851380,LastTimestamp:2025-12-03 21:47:19.673753765 +0000 UTC m=+3.426851380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.496320 master-0 kubenswrapper[4754]: E1203 21:47:28.496179 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d4a1de3db1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:19.689698737 +0000 UTC m=+3.442796362,LastTimestamp:2025-12-03 21:47:19.689698737 +0000 UTC m=+3.442796362,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.511262 master-0 kubenswrapper[4754]: E1203 21:47:28.511101 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d516a1a5f9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:21.648662009 +0000 UTC m=+5.401759624,LastTimestamp:2025-12-03 
21:47:21.648662009 +0000 UTC m=+5.401759624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.511638 master-0 kubenswrapper[4754]: I1203 21:47:28.511595 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:28.517916 master-0 kubenswrapper[4754]: E1203 21:47:28.517740 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d535d748f0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:22.172270832 +0000 UTC m=+5.925368467,LastTimestamp:2025-12-03 21:47:22.172270832 +0000 UTC m=+5.925368467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.523518 master-0 kubenswrapper[4754]: E1203 21:47:28.523376 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d536f78b45 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:22.191162181 +0000 UTC m=+5.944259826,LastTimestamp:2025-12-03 21:47:22.191162181 +0000 UTC m=+5.944259826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.530094 master-0 kubenswrapper[4754]: E1203 21:47:28.529783 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dd2d5d4e77c13 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\" in 7.047s (7.047s including waiting). 
Image size: 532668041 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:24.840909843 +0000 UTC m=+8.594007488,LastTimestamp:2025-12-03 21:47:24.840909843 +0000 UTC m=+8.594007488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.534625 master-0 kubenswrapper[4754]: E1203 21:47:28.534481 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dd2d516a1a5f9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d516a1a5f9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:21.648662009 +0000 UTC m=+5.401759624,LastTimestamp:2025-12-03 21:47:24.841437542 +0000 UTC m=+8.594535157,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.540209 master-0 kubenswrapper[4754]: E1203 21:47:28.540047 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d5da97450d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" in 7.067s (7.067s including waiting). 
Image size: 938321573 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:24.936316173 +0000 UTC m=+8.689413798,LastTimestamp:2025-12-03 21:47:24.936316173 +0000 UTC m=+8.689413798,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.544853 master-0 kubenswrapper[4754]: E1203 21:47:28.544681 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d5db7a97f1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" in 7.157s (7.157s including waiting). Image size: 938321573 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:24.951214065 +0000 UTC m=+8.704311690,LastTimestamp:2025-12-03 21:47:24.951214065 +0000 UTC m=+8.704311690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.550665 master-0 kubenswrapper[4754]: E1203 21:47:28.550513 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dd2d5dd7a6018 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" in 7.165s (7.165s including waiting). 
Image size: 938321573 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:24.9847542 +0000 UTC m=+8.737851825,LastTimestamp:2025-12-03 21:47:24.9847542 +0000 UTC m=+8.737851825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.556034 master-0 kubenswrapper[4754]: E1203 21:47:28.555906 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dd2d535d748f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d535d748f0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:22.172270832 +0000 UTC m=+5.925368467,LastTimestamp:2025-12-03 21:47:25.085242345 +0000 UTC m=+8.838339960,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.561001 master-0 kubenswrapper[4754]: E1203 21:47:28.560908 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dd2d5e3a555fb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.088232955 +0000 UTC m=+8.841330570,LastTimestamp:2025-12-03 21:47:25.088232955 +0000 UTC m=+8.841330570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.565969 master-0 kubenswrapper[4754]: E1203 21:47:28.565823 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dd2d536f78b45\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d536f78b45 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:22.191162181 +0000 UTC m=+5.944259826,LastTimestamp:2025-12-03 21:47:25.098810954 +0000 UTC m=+8.851908579,Count:2,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.571887 master-0 kubenswrapper[4754]: E1203 21:47:28.571703 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dd2d5e458ae84 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.099986564 +0000 UTC m=+8.853084189,LastTimestamp:2025-12-03 21:47:25.099986564 +0000 UTC m=+8.853084189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.593379 master-0 kubenswrapper[4754]: E1203 21:47:28.593085 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dd2d5e4a602d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.105054419 +0000 UTC m=+8.858152034,LastTimestamp:2025-12-03 21:47:25.105054419 +0000 UTC m=+8.858152034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.597547 master-0 kubenswrapper[4754]: E1203 21:47:28.597353 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d5e504288f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.111224463 +0000 UTC m=+8.864322078,LastTimestamp:2025-12-03 21:47:25.111224463 +0000 UTC m=+8.864322078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.601310 master-0 kubenswrapper[4754]: E1203 21:47:28.601237 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d5e589be53 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.119979091 +0000 UTC m=+8.873076706,LastTimestamp:2025-12-03 21:47:25.119979091 +0000 UTC m=+8.873076706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.604930 master-0 kubenswrapper[4754]: E1203 21:47:28.604845 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d5e59f8b8d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.121407885 +0000 UTC m=+8.874505500,LastTimestamp:2025-12-03 21:47:25.121407885 +0000 UTC m=+8.874505500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.609240 master-0 kubenswrapper[4754]: E1203 21:47:28.609166 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d5e69c845e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.137986654 +0000 UTC m=+8.891084269,LastTimestamp:2025-12-03 21:47:25.137986654 +0000 UTC m=+8.891084269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.613376 master-0 kubenswrapper[4754]: E1203 21:47:28.613307 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d5e7c71eca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.157555914 +0000 UTC m=+8.910653529,LastTimestamp:2025-12-03 21:47:25.157555914 +0000 UTC m=+8.910653529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.617401 master-0 kubenswrapper[4754]: E1203 21:47:28.617317 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dd2d5e812185b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.162469467 +0000 UTC m=+8.915567082,LastTimestamp:2025-12-03 21:47:25.162469467 +0000 UTC m=+8.915567082,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.621098 master-0 kubenswrapper[4754]: E1203 21:47:28.621033 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dd2d5e8b4aa68 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.173123688 +0000 UTC m=+8.926221303,LastTimestamp:2025-12-03 21:47:25.173123688 +0000 UTC m=+8.926221303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.624685 master-0 kubenswrapper[4754]: E1203 21:47:28.624613 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dd2d5f0b99866 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.307664486 +0000 UTC m=+9.060762111,LastTimestamp:2025-12-03 21:47:25.307664486 +0000 UTC m=+9.060762111,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.628100 master-0 kubenswrapper[4754]: E1203 21:47:28.628025 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dd2d5f18a2e73 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.321334387 +0000 UTC m=+9.074432002,LastTimestamp:2025-12-03 21:47:25.321334387 +0000 UTC m=+9.074432002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.631935 master-0 kubenswrapper[4754]: E1203 21:47:28.631867 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d6097c01f1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.723058673 +0000 UTC m=+9.476156278,LastTimestamp:2025-12-03 21:47:25.723058673 +0000 UTC m=+9.476156278,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.635351 master-0 kubenswrapper[4754]: E1203 21:47:28.635285 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d60a1c2c78 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.73355532 +0000 UTC m=+9.486652945,LastTimestamp:2025-12-03 21:47:25.73355532 +0000 UTC m=+9.486652945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.638954 master-0 
kubenswrapper[4754]: E1203 21:47:28.638806 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d615f5f53b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.932377403 +0000 UTC m=+9.685475058,LastTimestamp:2025-12-03 21:47:25.932377403 +0000 UTC m=+9.685475058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.642490 master-0 kubenswrapper[4754]: E1203 21:47:28.642409 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d6169432dc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.942747868 +0000 UTC m=+9.695845493,LastTimestamp:2025-12-03 21:47:25.942747868 +0000 UTC m=+9.695845493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.645545 master-0 kubenswrapper[4754]: E1203 21:47:28.645474 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d616a62f14 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.943926548 +0000 UTC m=+9.697024173,LastTimestamp:2025-12-03 21:47:25.943926548 +0000 UTC m=+9.697024173,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.649555 master-0 kubenswrapper[4754]: E1203 21:47:28.649484 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dd2d6097c01f1\" is forbidden: User \"system:anonymous\" cannot patch resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d6097c01f1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.723058673 +0000 UTC m=+9.476156278,LastTimestamp:2025-12-03 21:47:26.738998668 +0000 UTC m=+10.492096303,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.652953 master-0 kubenswrapper[4754]: E1203 21:47:28.652889 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d65590053b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\" in 1.877s (1.878s including waiting). 
Image size: 499719811 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:26.999438651 +0000 UTC m=+10.752536266,LastTimestamp:2025-12-03 21:47:26.999438651 +0000 UTC m=+10.752536266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.655865 master-0 kubenswrapper[4754]: E1203 21:47:28.655783 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d661f3ee6f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:27.207313007 +0000 UTC m=+10.960410632,LastTimestamp:2025-12-03 21:47:27.207313007 +0000 UTC m=+10.960410632,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.659040 master-0 kubenswrapper[4754]: E1203 21:47:28.658982 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d66299a93e kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:27.21817427 +0000 UTC m=+10.971271895,LastTimestamp:2025-12-03 21:47:27.21817427 +0000 UTC m=+10.971271895,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.662283 master-0 kubenswrapper[4754]: E1203 21:47:28.662211 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d6a5ba40ba kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:28.344383674 +0000 UTC m=+12.097481289,LastTimestamp:2025-12-03 21:47:28.344383674 +0000 UTC 
m=+12.097481289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:28.665970 master-0 kubenswrapper[4754]: E1203 21:47:28.665906 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d6a855572f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\" in 2.444s (2.444s including waiting). Image size: 509451797 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:28.388101935 +0000 UTC m=+12.141199550,LastTimestamp:2025-12-03 21:47:28.388101935 +0000 UTC m=+12.141199550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:29.125085 master-0 kubenswrapper[4754]: E1203 21:47:29.124931 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.187dd2d5e504288f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d5e504288f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.111224463 +0000 UTC m=+8.864322078,LastTimestamp:2025-12-03 21:47:29.111290326 +0000 UTC m=+12.864387941,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:29.125085 master-0 kubenswrapper[4754]: E1203 21:47:29.125051 4754 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 03 21:47:29.218401 master-0 kubenswrapper[4754]: E1203 21:47:29.218144 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d6d932ee20 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:29.2079304 +0000 UTC m=+12.961028005,LastTimestamp:2025-12-03 21:47:29.2079304 +0000 UTC m=+12.961028005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:29.253084 master-0 kubenswrapper[4754]: E1203 21:47:29.252762 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.187dd2d5e589be53\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd2d5e589be53 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.119979091 +0000 UTC m=+8.873076706,LastTimestamp:2025-12-03 21:47:29.244137259 +0000 UTC m=+12.997234914,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:29.348462 master-0 kubenswrapper[4754]: E1203 21:47:29.348205 4754 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.187dd2d6db716cf4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:13238af3704fe583f617f61e755cf4c2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:29.245580532 +0000 UTC m=+12.998678147,LastTimestamp:2025-12-03 21:47:29.245580532 +0000 UTC m=+12.998678147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:29.362692 master-0 kubenswrapper[4754]: I1203 21:47:29.362622 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:29.364027 master-0 kubenswrapper[4754]: I1203 21:47:29.363988 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:29.364027 master-0 kubenswrapper[4754]: I1203 21:47:29.364029 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:29.364152 master-0 kubenswrapper[4754]: I1203 21:47:29.364042 4754 kubelet_node_status.go:724] 
"Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:29.364152 master-0 kubenswrapper[4754]: I1203 21:47:29.364096 4754 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:47:29.369309 master-0 kubenswrapper[4754]: E1203 21:47:29.369263 4754 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 03 21:47:29.503805 master-0 kubenswrapper[4754]: I1203 21:47:29.503720 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:29.771985 master-0 kubenswrapper[4754]: I1203 21:47:29.771862 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133"} Dec 03 21:47:29.771985 master-0 kubenswrapper[4754]: I1203 21:47:29.771898 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:29.773364 master-0 kubenswrapper[4754]: I1203 21:47:29.773314 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:29.773364 master-0 kubenswrapper[4754]: I1203 21:47:29.773357 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:29.773364 master-0 kubenswrapper[4754]: I1203 21:47:29.773368 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:29.775401 master-0 kubenswrapper[4754]: I1203 21:47:29.775309 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"6c639bc69c380d96355863b219dedf37b52fea28072ae5913704ecc349c5d8c1"} Dec 03 21:47:29.775662 master-0 kubenswrapper[4754]: I1203 21:47:29.775457 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:29.776509 master-0 kubenswrapper[4754]: I1203 21:47:29.776476 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:29.776652 master-0 kubenswrapper[4754]: I1203 21:47:29.776527 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:29.776652 master-0 kubenswrapper[4754]: I1203 21:47:29.776546 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:30.045856 master-0 kubenswrapper[4754]: I1203 21:47:30.045669 4754 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:30.053684 master-0 kubenswrapper[4754]: I1203 21:47:30.053639 4754 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:30.505019 master-0 kubenswrapper[4754]: I1203 21:47:30.504970 4754 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:30.777732 master-0 kubenswrapper[4754]: I1203 21:47:30.777587 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:30.777732 master-0 kubenswrapper[4754]: I1203 21:47:30.777635 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:30.777732 master-0 kubenswrapper[4754]: I1203 21:47:30.777594 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:30.778741 master-0 kubenswrapper[4754]: I1203 21:47:30.778673 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:30.778741 master-0 kubenswrapper[4754]: I1203 21:47:30.778704 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:30.778741 master-0 kubenswrapper[4754]: I1203 21:47:30.778716 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:30.779105 master-0 kubenswrapper[4754]: I1203 21:47:30.779074 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:30.779184 master-0 kubenswrapper[4754]: I1203 21:47:30.779112 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:30.779184 master-0 kubenswrapper[4754]: I1203 21:47:30.779129 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:31.229490 master-0 kubenswrapper[4754]: I1203 21:47:31.229362 4754 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 21:47:31.251362 master-0 kubenswrapper[4754]: I1203 21:47:31.251294 4754 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 03 21:47:31.507197 master-0 kubenswrapper[4754]: I1203 21:47:31.507038 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:31.781537 master-0 kubenswrapper[4754]: I1203 21:47:31.781299 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:31.782642 master-0 kubenswrapper[4754]: I1203 21:47:31.782576 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:31.782642 master-0 kubenswrapper[4754]: I1203 21:47:31.782636 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:31.782925 master-0 kubenswrapper[4754]: I1203 21:47:31.782658 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:32.376327 master-0 kubenswrapper[4754]: I1203 21:47:32.376231 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 
21:47:32.376642 master-0 kubenswrapper[4754]: I1203 21:47:32.376444 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:32.378196 master-0 kubenswrapper[4754]: I1203 21:47:32.378116 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:32.378358 master-0 kubenswrapper[4754]: I1203 21:47:32.378203 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:32.378358 master-0 kubenswrapper[4754]: I1203 21:47:32.378242 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:32.506365 master-0 kubenswrapper[4754]: I1203 21:47:32.506264 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:33.198156 master-0 kubenswrapper[4754]: I1203 21:47:33.197995 4754 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:33.198850 master-0 kubenswrapper[4754]: I1203 21:47:33.198384 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:33.199934 master-0 kubenswrapper[4754]: I1203 21:47:33.199892 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:33.199934 master-0 kubenswrapper[4754]: I1203 21:47:33.199930 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:33.200044 master-0 kubenswrapper[4754]: I1203 21:47:33.199944 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:33.205600 master-0 kubenswrapper[4754]: I1203 21:47:33.205546 4754 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:33.507942 master-0 kubenswrapper[4754]: I1203 21:47:33.507734 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:33.787728 master-0 kubenswrapper[4754]: I1203 21:47:33.787535 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:33.788495 master-0 kubenswrapper[4754]: I1203 21:47:33.788449 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:33.788557 master-0 kubenswrapper[4754]: I1203 21:47:33.788509 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:33.788557 master-0 kubenswrapper[4754]: I1203 21:47:33.788529 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:33.794633 master-0 kubenswrapper[4754]: I1203 21:47:33.794594 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:47:34.505414 master-0 
kubenswrapper[4754]: I1203 21:47:34.505320 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:34.716312 master-0 kubenswrapper[4754]: W1203 21:47:34.716205 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Dec 03 21:47:34.716312 master-0 kubenswrapper[4754]: E1203 21:47:34.716311 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 03 21:47:34.791050 master-0 kubenswrapper[4754]: I1203 21:47:34.790812 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:34.792128 master-0 kubenswrapper[4754]: I1203 21:47:34.792063 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:34.792218 master-0 kubenswrapper[4754]: I1203 21:47:34.792145 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:34.792218 master-0 kubenswrapper[4754]: I1203 21:47:34.792175 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:35.505725 master-0 kubenswrapper[4754]: I1203 21:47:35.505477 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:35.816891 master-0 kubenswrapper[4754]: W1203 21:47:35.816823 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Dec 03 21:47:35.816891 master-0 kubenswrapper[4754]: E1203 21:47:35.816880 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Dec 03 21:47:36.132392 master-0 kubenswrapper[4754]: E1203 21:47:36.132300 4754 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 03 21:47:36.356173 master-0 kubenswrapper[4754]: W1203 21:47:36.355987 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Dec 03 21:47:36.356173 master-0 kubenswrapper[4754]: E1203 21:47:36.356065 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 03 21:47:36.369699 master-0 kubenswrapper[4754]: I1203 21:47:36.369635 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:36.371203 master-0 kubenswrapper[4754]: I1203 21:47:36.371135 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:36.371203 master-0 kubenswrapper[4754]: I1203 21:47:36.371186 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:36.371203 master-0 kubenswrapper[4754]: I1203 21:47:36.371195 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:36.371429 master-0 kubenswrapper[4754]: I1203 21:47:36.371250 4754 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:47:36.379283 master-0 kubenswrapper[4754]: E1203 21:47:36.379236 4754 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 03 21:47:36.509244 master-0 kubenswrapper[4754]: I1203 21:47:36.509153 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:36.592844 master-0 kubenswrapper[4754]: I1203 21:47:36.592725 4754 csr.go:261] certificate signing request csr-n74v4 is approved, waiting to be issued Dec 03 21:47:36.643432 master-0 kubenswrapper[4754]: E1203 21:47:36.643316 4754 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 21:47:37.038853 master-0 kubenswrapper[4754]: I1203 21:47:37.038637 4754 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:37.039091 master-0 kubenswrapper[4754]: I1203 21:47:37.038955 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:37.040569 master-0 kubenswrapper[4754]: I1203 21:47:37.040511 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:37.040569 master-0 kubenswrapper[4754]: I1203 21:47:37.040567 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:37.040708 master-0 kubenswrapper[4754]: I1203 21:47:37.040587 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:37.046273 master-0 kubenswrapper[4754]: I1203 21:47:37.046206 4754 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:37.046361 master-0 kubenswrapper[4754]: I1203 21:47:37.046329 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:37.502016 master-0 kubenswrapper[4754]: I1203 21:47:37.501936 4754 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:37.797519 master-0 kubenswrapper[4754]: I1203 21:47:37.797365 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:37.797519 master-0 kubenswrapper[4754]: I1203 21:47:37.797474 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:37.798806 master-0 kubenswrapper[4754]: I1203 21:47:37.798719 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:37.798912 master-0 kubenswrapper[4754]: I1203 21:47:37.798822 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:37.798912 master-0 kubenswrapper[4754]: I1203 21:47:37.798847 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:38.505398 master-0 kubenswrapper[4754]: I1203 21:47:38.505307 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:38.800440 master-0 kubenswrapper[4754]: I1203 21:47:38.800226 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:38.802030 master-0 kubenswrapper[4754]: I1203 21:47:38.801953 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:38.802171 master-0 kubenswrapper[4754]: I1203 21:47:38.802053 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:38.802171 master-0 kubenswrapper[4754]: I1203 21:47:38.802083 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:39.505549 master-0 kubenswrapper[4754]: I1203 21:47:39.505484 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:39.520762 master-0 kubenswrapper[4754]: W1203 21:47:39.520708 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:39.520964 master-0 kubenswrapper[4754]: E1203 21:47:39.520820 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Dec 03 21:47:40.508116 master-0 kubenswrapper[4754]: I1203 21:47:40.508000 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API 
group "storage.k8s.io" at the cluster scope Dec 03 21:47:41.505167 master-0 kubenswrapper[4754]: I1203 21:47:41.505076 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:41.685562 master-0 kubenswrapper[4754]: I1203 21:47:41.685352 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:41.687276 master-0 kubenswrapper[4754]: I1203 21:47:41.687213 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:41.687276 master-0 kubenswrapper[4754]: I1203 21:47:41.687263 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:41.687276 master-0 kubenswrapper[4754]: I1203 21:47:41.687280 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:41.687891 master-0 kubenswrapper[4754]: I1203 21:47:41.687857 4754 scope.go:117] "RemoveContainer" containerID="b2e433bb0d812f8998f9c3d94046039e4ad63092afce7ea8f56800703f9ed872" Dec 03 21:47:41.697765 master-0 kubenswrapper[4754]: E1203 21:47:41.697541 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dd2d516a1a5f9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d516a1a5f9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:21.648662009 +0000 UTC m=+5.401759624,LastTimestamp:2025-12-03 21:47:41.691434912 +0000 UTC m=+25.444532577,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:41.976483 master-0 kubenswrapper[4754]: E1203 21:47:41.976314 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dd2d535d748f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d535d748f0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:22.172270832 +0000 UTC m=+5.925368467,LastTimestamp:2025-12-03 21:47:41.967820645 +0000 
UTC m=+25.720918300,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:42.000865 master-0 kubenswrapper[4754]: E1203 21:47:41.999419 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dd2d536f78b45\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d536f78b45 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:22.191162181 +0000 UTC m=+5.944259826,LastTimestamp:2025-12-03 21:47:41.99066077 +0000 UTC m=+25.743758435,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:42.505494 master-0 kubenswrapper[4754]: I1203 21:47:42.505399 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:42.812317 master-0 kubenswrapper[4754]: I1203 21:47:42.812127 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/2.log" Dec 03 21:47:42.813960 master-0 kubenswrapper[4754]: I1203 21:47:42.813910 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/1.log" Dec 03 21:47:42.814576 master-0 kubenswrapper[4754]: I1203 21:47:42.814523 4754 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864" exitCode=1 Dec 03 21:47:42.814576 master-0 kubenswrapper[4754]: I1203 21:47:42.814571 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864"} Dec 03 21:47:42.814724 master-0 kubenswrapper[4754]: I1203 21:47:42.814615 4754 scope.go:117] "RemoveContainer" containerID="b2e433bb0d812f8998f9c3d94046039e4ad63092afce7ea8f56800703f9ed872" Dec 03 21:47:42.814830 master-0 kubenswrapper[4754]: I1203 21:47:42.814809 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:42.816933 master-0 kubenswrapper[4754]: I1203 21:47:42.816874 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:42.817050 master-0 kubenswrapper[4754]: I1203 21:47:42.816949 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:42.817050 master-0 
kubenswrapper[4754]: I1203 21:47:42.816969 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:42.818011 master-0 kubenswrapper[4754]: I1203 21:47:42.817530 4754 scope.go:117] "RemoveContainer" containerID="a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864" Dec 03 21:47:42.818011 master-0 kubenswrapper[4754]: E1203 21:47:42.817824 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 21:47:42.825732 master-0 kubenswrapper[4754]: E1203 21:47:42.825515 4754 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.187dd2d6097c01f1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.187dd2d6097c01f1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:b495b0c38f2c54e7cc46282c5f92aab5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:47:25.723058673 +0000 UTC m=+9.476156278,LastTimestamp:2025-12-03 21:47:42.817731769 +0000 UTC m=+26.570829414,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:47:43.141360 master-0 kubenswrapper[4754]: E1203 21:47:43.141252 4754 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Dec 03 21:47:43.380411 master-0 kubenswrapper[4754]: I1203 21:47:43.380292 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:43.381993 master-0 kubenswrapper[4754]: I1203 21:47:43.381917 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:43.382100 master-0 kubenswrapper[4754]: I1203 21:47:43.382000 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:43.382100 master-0 kubenswrapper[4754]: I1203 21:47:43.382020 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:43.382100 master-0 kubenswrapper[4754]: I1203 21:47:43.382077 4754 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:47:43.389301 master-0 kubenswrapper[4754]: E1203 21:47:43.389245 4754 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User 
\"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Dec 03 21:47:43.506148 master-0 kubenswrapper[4754]: I1203 21:47:43.505945 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:43.821938 master-0 kubenswrapper[4754]: I1203 21:47:43.821725 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/2.log" Dec 03 21:47:44.505943 master-0 kubenswrapper[4754]: I1203 21:47:44.505841 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:45.505423 master-0 kubenswrapper[4754]: I1203 21:47:45.505361 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:46.506699 master-0 kubenswrapper[4754]: I1203 21:47:46.506583 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:46.643743 master-0 kubenswrapper[4754]: E1203 21:47:46.643612 4754 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Dec 03 21:47:47.508253 master-0 kubenswrapper[4754]: I1203 21:47:47.508163 4754 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Dec 03 21:47:47.627455 master-0 kubenswrapper[4754]: W1203 21:47:47.627388 4754 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Dec 03 21:47:47.627825 master-0 kubenswrapper[4754]: E1203 21:47:47.627478 4754 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Dec 03 21:47:48.100520 master-0 kubenswrapper[4754]: I1203 21:47:48.100454 4754 csr.go:257] certificate signing request csr-n74v4 is issued Dec 03 21:47:48.400881 master-0 kubenswrapper[4754]: I1203 21:47:48.400814 4754 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 03 21:47:48.407233 master-0 kubenswrapper[4754]: I1203 21:47:48.403898 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:47:48.407233 master-0 kubenswrapper[4754]: I1203 21:47:48.405761 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 03 21:47:48.410029 master-0 kubenswrapper[4754]: I1203 21:47:48.409973 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:48.410391 master-0 kubenswrapper[4754]: I1203 21:47:48.410375 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:48.410474 master-0 kubenswrapper[4754]: I1203 21:47:48.410463 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:48.513376 master-0 kubenswrapper[4754]: I1203 21:47:48.513302 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:48.529304 master-0 kubenswrapper[4754]: I1203 21:47:48.529232 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:48.589678 master-0 kubenswrapper[4754]: I1203 21:47:48.589625 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:48.857928 master-0 kubenswrapper[4754]: I1203 21:47:48.857752 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:48.857928 master-0 kubenswrapper[4754]: E1203 21:47:48.857812 4754 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 03 21:47:48.879638 master-0 kubenswrapper[4754]: I1203 21:47:48.879568 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:48.894870 master-0 kubenswrapper[4754]: I1203 21:47:48.894824 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:48.953513 master-0 kubenswrapper[4754]: I1203 21:47:48.953422 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:49.102442 master-0 kubenswrapper[4754]: I1203 21:47:49.102360 4754 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-04 21:39:22 +0000 UTC, rotation deadline is 2025-12-04 15:30:44.002290781 +0000 UTC Dec 03 21:47:49.102815 master-0 kubenswrapper[4754]: I1203 21:47:49.102755 4754 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h42m54.899545116s for next certificate rotation Dec 03 21:47:49.237609 master-0 kubenswrapper[4754]: I1203 21:47:49.237549 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:49.238053 master-0 kubenswrapper[4754]: E1203 21:47:49.238028 4754 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 03 21:47:49.344602 master-0 kubenswrapper[4754]: I1203 21:47:49.344534 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:49.364317 master-0 kubenswrapper[4754]: I1203 21:47:49.364273 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:49.423513 master-0 kubenswrapper[4754]: I1203 21:47:49.423451 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:49.680133 master-0 kubenswrapper[4754]: I1203 21:47:49.680084 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:49.680947 master-0 kubenswrapper[4754]: E1203 21:47:49.680712 4754 csi_plugin.go:305] Failed to 
initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Dec 03 21:47:49.861069 master-0 kubenswrapper[4754]: I1203 21:47:49.860950 4754 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 21:47:50.148140 master-0 kubenswrapper[4754]: E1203 21:47:50.148039 4754 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Dec 03 21:47:50.230968 master-0 kubenswrapper[4754]: I1203 21:47:50.230839 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:50.248817 master-0 kubenswrapper[4754]: I1203 21:47:50.248724 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:50.306518 master-0 kubenswrapper[4754]: I1203 21:47:50.306441 4754 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Dec 03 21:47:50.390112 master-0 kubenswrapper[4754]: I1203 21:47:50.390002 4754 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:47:50.391889 master-0 kubenswrapper[4754]: I1203 21:47:50.391744 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:47:50.392014 master-0 kubenswrapper[4754]: I1203 21:47:50.391904 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:47:50.392014 master-0 kubenswrapper[4754]: I1203 21:47:50.391931 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:47:50.392141 master-0 kubenswrapper[4754]: I1203 21:47:50.392047 4754 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:47:50.402575 master-0 kubenswrapper[4754]: I1203 21:47:50.402438 4754 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 03 21:47:50.402575 master-0 kubenswrapper[4754]: E1203 21:47:50.402517 4754 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Dec 03 21:47:50.418669 master-0 kubenswrapper[4754]: E1203 21:47:50.418627 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:50.519897 master-0 kubenswrapper[4754]: I1203 21:47:50.519821 4754 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Dec 03 21:47:50.520170 master-0 kubenswrapper[4754]: E1203 21:47:50.519998 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:50.538740 master-0 kubenswrapper[4754]: I1203 21:47:50.538635 4754 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 03 21:47:50.621001 master-0 kubenswrapper[4754]: E1203 21:47:50.620866 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:50.721906 master-0 kubenswrapper[4754]: E1203 21:47:50.721623 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:50.822591 master-0 kubenswrapper[4754]: E1203 21:47:50.822482 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Dec 03 21:47:50.923515 master-0 kubenswrapper[4754]: E1203 21:47:50.923426 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.024678 master-0 kubenswrapper[4754]: E1203 21:47:51.024483 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.125450 master-0 kubenswrapper[4754]: E1203 21:47:51.125380 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.226091 master-0 kubenswrapper[4754]: E1203 21:47:51.225997 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.327072 master-0 kubenswrapper[4754]: E1203 21:47:51.326868 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.427510 master-0 kubenswrapper[4754]: E1203 21:47:51.427395 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.527721 master-0 kubenswrapper[4754]: E1203 21:47:51.527633 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.627921 master-0 kubenswrapper[4754]: E1203 21:47:51.627839 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.728489 master-0 kubenswrapper[4754]: E1203 21:47:51.728412 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.829358 master-0 kubenswrapper[4754]: E1203 21:47:51.829235 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:51.930547 master-0 kubenswrapper[4754]: E1203 21:47:51.930372 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:52.031593 master-0 kubenswrapper[4754]: E1203 21:47:52.031525 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:52.132547 master-0 kubenswrapper[4754]: E1203 21:47:52.132438 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:52.233467 master-0 kubenswrapper[4754]: E1203 21:47:52.233275 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:52.334062 master-0 kubenswrapper[4754]: E1203 21:47:52.333967 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:52.434578 master-0 kubenswrapper[4754]: E1203 21:47:52.434475 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:52.535358 master-0 kubenswrapper[4754]: E1203 21:47:52.535193 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:52.635442 master-0 kubenswrapper[4754]: E1203 21:47:52.635329 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:52.735807 master-0 kubenswrapper[4754]: E1203 21:47:52.735682 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Dec 03 21:47:52.836851 master-0 kubenswrapper[4754]: E1203 21:47:52.836672 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:52.937237 master-0 kubenswrapper[4754]: E1203 21:47:52.937118 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.038126 master-0 kubenswrapper[4754]: E1203 21:47:53.038026 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.139026 master-0 kubenswrapper[4754]: E1203 21:47:53.138921 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.239759 master-0 kubenswrapper[4754]: E1203 21:47:53.239685 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.340946 master-0 kubenswrapper[4754]: E1203 21:47:53.340856 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.441605 master-0 kubenswrapper[4754]: E1203 21:47:53.441412 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.542009 master-0 kubenswrapper[4754]: E1203 21:47:53.541872 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.642853 master-0 kubenswrapper[4754]: E1203 21:47:53.642735 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.744071 master-0 kubenswrapper[4754]: E1203 21:47:53.743851 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.844737 master-0 kubenswrapper[4754]: E1203 21:47:53.844665 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:53.944974 master-0 kubenswrapper[4754]: E1203 21:47:53.944844 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:54.046021 master-0 kubenswrapper[4754]: E1203 21:47:54.045822 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:54.146612 master-0 kubenswrapper[4754]: E1203 21:47:54.146511 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:54.247400 master-0 kubenswrapper[4754]: E1203 21:47:54.247289 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:54.348510 master-0 kubenswrapper[4754]: E1203 21:47:54.348310 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:54.449551 master-0 kubenswrapper[4754]: E1203 21:47:54.449446 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:54.550129 master-0 kubenswrapper[4754]: E1203 21:47:54.550006 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:54.651326 master-0 kubenswrapper[4754]: E1203 21:47:54.651222 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Dec 03 21:47:54.752161 master-0 kubenswrapper[4754]: E1203 21:47:54.752015 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:54.852241 master-0 kubenswrapper[4754]: E1203 21:47:54.852137 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:54.953009 master-0 kubenswrapper[4754]: E1203 21:47:54.952748 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.053230 master-0 kubenswrapper[4754]: E1203 21:47:55.053140 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.154067 master-0 kubenswrapper[4754]: E1203 21:47:55.153908 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.254842 master-0 kubenswrapper[4754]: E1203 21:47:55.254639 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.354975 master-0 kubenswrapper[4754]: E1203 21:47:55.354868 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.456071 master-0 kubenswrapper[4754]: E1203 21:47:55.455972 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.556669 master-0 kubenswrapper[4754]: E1203 21:47:55.556510 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.657263 master-0 kubenswrapper[4754]: E1203 21:47:55.657160 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.758467 master-0 kubenswrapper[4754]: E1203 21:47:55.758339 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.859333 master-0 kubenswrapper[4754]: E1203 21:47:55.859149 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:55.960216 master-0 kubenswrapper[4754]: E1203 21:47:55.960072 4754 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Dec 03 21:47:56.048594 master-0 kubenswrapper[4754]: I1203 21:47:56.048473 4754 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 21:47:56.512810 master-0 kubenswrapper[4754]: I1203 21:47:56.512679 4754 apiserver.go:52] "Watching apiserver" Dec 03 21:47:56.516640 master-0 kubenswrapper[4754]: I1203 21:47:56.516585 4754 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 21:47:56.517134 master-0 kubenswrapper[4754]: I1203 21:47:56.517045 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-q7jjz","openshift-cluster-version/cluster-version-operator-869c786959-2bnjf","openshift-network-operator/network-operator-6cbf58c977-zk7jw"] Dec 03 21:47:56.517597 master-0 kubenswrapper[4754]: I1203 21:47:56.517563 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.517744 master-0 kubenswrapper[4754]: I1203 21:47:56.517700 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.517881 master-0 kubenswrapper[4754]: I1203 21:47:56.517700 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.521234 master-0 kubenswrapper[4754]: I1203 21:47:56.521188 4754 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Dec 03 21:47:56.521487 master-0 kubenswrapper[4754]: I1203 21:47:56.521449 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 21:47:56.521537 master-0 kubenswrapper[4754]: I1203 21:47:56.521521 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 21:47:56.521695 master-0 kubenswrapper[4754]: I1203 21:47:56.521663 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Dec 03 21:47:56.522437 master-0 kubenswrapper[4754]: I1203 21:47:56.522369 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 21:47:56.522437 master-0 kubenswrapper[4754]: I1203 21:47:56.522371 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Dec 03 21:47:56.523252 master-0 kubenswrapper[4754]: I1203 21:47:56.522507 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 21:47:56.523252 master-0 kubenswrapper[4754]: I1203 21:47:56.522631 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 21:47:56.523252 master-0 kubenswrapper[4754]: I1203 21:47:56.523010 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 21:47:56.523949 master-0 kubenswrapper[4754]: I1203 21:47:56.523762 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Dec 03 21:47:56.601124 master-0 kubenswrapper[4754]: I1203 21:47:56.601038 4754 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 03 21:47:56.606409 master-0 kubenswrapper[4754]: I1203 21:47:56.606349 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-var-run-resolv-conf\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.606513 master-0 kubenswrapper[4754]: I1203 21:47:56.606414 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-ca-bundle\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " 
pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.606513 master-0 kubenswrapper[4754]: I1203 21:47:56.606451 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-kube-api-access\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.606513 master-0 kubenswrapper[4754]: I1203 21:47:56.606485 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/892d5611-debf-402f-abc5-3f99aa080159-host-etc-kube\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.606513 master-0 kubenswrapper[4754]: I1203 21:47:56.606522 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k829m\" (UniqueName: \"kubernetes.io/projected/87c3edb2-12e8-45b0-99ac-9a794dd2881d-kube-api-access-k829m\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.607099 master-0 kubenswrapper[4754]: I1203 21:47:56.606556 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.607099 master-0 kubenswrapper[4754]: I1203 21:47:56.606646 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-service-ca\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.607099 master-0 kubenswrapper[4754]: I1203 21:47:56.606737 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvmxp\" (UniqueName: \"kubernetes.io/projected/892d5611-debf-402f-abc5-3f99aa080159-kube-api-access-bvmxp\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.607099 master-0 kubenswrapper[4754]: I1203 21:47:56.606833 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-resolv-conf\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.607099 master-0 kubenswrapper[4754]: I1203 21:47:56.606912 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-sno-bootstrap-files\") pod 
\"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.607099 master-0 kubenswrapper[4754]: I1203 21:47:56.606991 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.607099 master-0 kubenswrapper[4754]: I1203 21:47:56.607091 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.607496 master-0 kubenswrapper[4754]: I1203 21:47:56.607189 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/892d5611-debf-402f-abc5-3f99aa080159-metrics-tls\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.708019 master-0 kubenswrapper[4754]: I1203 21:47:56.707913 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.708019 master-0 kubenswrapper[4754]: I1203 21:47:56.707980 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.708019 master-0 kubenswrapper[4754]: I1203 21:47:56.708015 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/892d5611-debf-402f-abc5-3f99aa080159-metrics-tls\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.708019 master-0 kubenswrapper[4754]: I1203 21:47:56.708044 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-var-run-resolv-conf\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708069 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: 
\"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-ca-bundle\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708095 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-kube-api-access\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708100 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708117 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/892d5611-debf-402f-abc5-3f99aa080159-host-etc-kube\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708195 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/892d5611-debf-402f-abc5-3f99aa080159-host-etc-kube\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708201 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k829m\" (UniqueName: \"kubernetes.io/projected/87c3edb2-12e8-45b0-99ac-9a794dd2881d-kube-api-access-k829m\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708231 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708263 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-service-ca\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708289 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvmxp\" (UniqueName: 
\"kubernetes.io/projected/892d5611-debf-402f-abc5-3f99aa080159-kube-api-access-bvmxp\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708317 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-resolv-conf\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708341 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-sno-bootstrap-files\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.708539 master-0 kubenswrapper[4754]: I1203 21:47:56.708392 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-sno-bootstrap-files\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.709504 master-0 kubenswrapper[4754]: E1203 21:47:56.708836 4754 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:47:56.709504 master-0 kubenswrapper[4754]: I1203 21:47:56.708868 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.709504 master-0 kubenswrapper[4754]: E1203 21:47:56.708915 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:47:57.208884806 +0000 UTC m=+40.961982431 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:47:56.709504 master-0 kubenswrapper[4754]: I1203 21:47:56.708943 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-ca-bundle\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.709504 master-0 kubenswrapper[4754]: I1203 21:47:56.708979 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-var-run-resolv-conf\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.709504 master-0 kubenswrapper[4754]: I1203 21:47:56.709496 4754 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 21:47:56.710029 master-0 kubenswrapper[4754]: I1203 21:47:56.709588 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-resolv-conf\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.710029 master-0 kubenswrapper[4754]: I1203 21:47:56.709694 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-service-ca\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.717615 master-0 kubenswrapper[4754]: I1203 21:47:56.717564 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/892d5611-debf-402f-abc5-3f99aa080159-metrics-tls\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.728012 master-0 kubenswrapper[4754]: I1203 21:47:56.727966 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvmxp\" (UniqueName: \"kubernetes.io/projected/892d5611-debf-402f-abc5-3f99aa080159-kube-api-access-bvmxp\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.730132 master-0 kubenswrapper[4754]: I1203 21:47:56.730065 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-kube-api-access\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " 
pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:56.730726 master-0 kubenswrapper[4754]: I1203 21:47:56.730681 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k829m\" (UniqueName: \"kubernetes.io/projected/87c3edb2-12e8-45b0-99ac-9a794dd2881d-kube-api-access-k829m\") pod \"assisted-installer-controller-q7jjz\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.852890 master-0 kubenswrapper[4754]: I1203 21:47:56.852669 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:47:56.874910 master-0 kubenswrapper[4754]: I1203 21:47:56.874853 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:47:56.890435 master-0 kubenswrapper[4754]: W1203 21:47:56.890383 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod892d5611_debf_402f_abc5_3f99aa080159.slice/crio-865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00 WatchSource:0}: Error finding container 865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00: Status 404 returned error can't find the container with id 865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00 Dec 03 21:47:57.212528 master-0 kubenswrapper[4754]: I1203 21:47:57.212439 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:57.212812 master-0 kubenswrapper[4754]: E1203 21:47:57.212702 4754 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:47:57.212915 master-0 kubenswrapper[4754]: E1203 21:47:57.212887 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:47:58.212845912 +0000 UTC m=+41.965943567 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:47:57.434760 master-0 kubenswrapper[4754]: I1203 21:47:57.434609 4754 csr.go:261] certificate signing request csr-znqts is approved, waiting to be issued Dec 03 21:47:57.449058 master-0 kubenswrapper[4754]: I1203 21:47:57.448993 4754 csr.go:257] certificate signing request csr-znqts is issued Dec 03 21:47:57.700431 master-0 kubenswrapper[4754]: I1203 21:47:57.700359 4754 scope.go:117] "RemoveContainer" containerID="a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864" Dec 03 21:47:57.700688 master-0 kubenswrapper[4754]: I1203 21:47:57.700459 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Dec 03 21:47:57.701040 master-0 kubenswrapper[4754]: E1203 21:47:57.701002 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 21:47:57.863132 master-0 kubenswrapper[4754]: I1203 21:47:57.863044 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerStarted","Data":"865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00"} Dec 03 21:47:57.864264 master-0 kubenswrapper[4754]: I1203 21:47:57.864170 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-q7jjz" event={"ID":"87c3edb2-12e8-45b0-99ac-9a794dd2881d","Type":"ContainerStarted","Data":"dd41923c7718b750c80c184d862535442462bd815561e9bbfb7bb52e77b97884"} Dec 03 21:47:57.865249 master-0 kubenswrapper[4754]: I1203 21:47:57.865209 4754 scope.go:117] "RemoveContainer" containerID="a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864" Dec 03 21:47:57.865480 master-0 kubenswrapper[4754]: E1203 21:47:57.865434 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(b495b0c38f2c54e7cc46282c5f92aab5)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="b495b0c38f2c54e7cc46282c5f92aab5" Dec 03 21:47:58.219353 master-0 kubenswrapper[4754]: I1203 21:47:58.219276 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:47:58.219675 master-0 kubenswrapper[4754]: E1203 21:47:58.219588 4754 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 
21:47:58.219799 master-0 kubenswrapper[4754]: E1203 21:47:58.219748 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:48:00.219709071 +0000 UTC m=+43.972806706 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:47:58.451292 master-0 kubenswrapper[4754]: I1203 21:47:58.451231 4754 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-04 21:39:22 +0000 UTC, rotation deadline is 2025-12-04 18:05:05.127122633 +0000 UTC Dec 03 21:47:58.451292 master-0 kubenswrapper[4754]: I1203 21:47:58.451270 4754 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 20h17m6.675855822s for next certificate rotation Dec 03 21:47:58.982299 master-0 kubenswrapper[4754]: I1203 21:47:58.982223 4754 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 21:47:59.451735 master-0 kubenswrapper[4754]: I1203 21:47:59.451672 4754 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-04 21:39:22 +0000 UTC, rotation deadline is 2025-12-04 15:16:44.737326845 +0000 UTC Dec 03 21:47:59.451735 master-0 kubenswrapper[4754]: I1203 21:47:59.451704 4754 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h28m45.285625656s for next certificate rotation Dec 03 21:48:00.235547 master-0 kubenswrapper[4754]: I1203 21:48:00.235479 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:48:00.236229 master-0 kubenswrapper[4754]: E1203 21:48:00.235692 4754 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:48:00.236229 master-0 kubenswrapper[4754]: E1203 21:48:00.235802 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:48:04.235761078 +0000 UTC m=+47.988858693 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: E1203 21:48:02.169912 4754 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,Command:[/bin/bash -c #!/bin/bash Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: set -o allexport Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: source /etc/kubernetes/apiserver-url.env Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: else Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: exit 1 Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: fi Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.28,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c16e0847bd9ae0470e9702e5cfb4ccd5551a42ff062bd507f267ed55d1c31b42,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eac937aae64688cb47b38ad2cbba5aa7e6d41c691df1f3ca4ff81e5117084d1e,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee896bce586a3fcd37b4be8165cf1b4a83e88b5d47667de10475ec43e31b7926,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f86d9ffe13cbab06ff676496b50a26bbc4819d8b81b98fbacca6aee9b56792f,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9e597b928c0bdcdebea19f093353a7ada98f5164601abf23aa97f0065c6e293,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Nam
e:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7825952834ade266ce08d1a9eb0665e4661dea0a40647d3e1de2cf6266665e9d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2dc42ec15e3ecccc0942415ec68b27c2c10f53f084b6fa23caa1e81fc70f3629,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4629a2d090ecc0b613a9e6b50601fd2cdb99cb2e511f1fed6d335106f2789baf,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:25b69045d961dc26719bc4cbb3a854737938b6e97375c04197e9cbc932541b17,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bvmxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-6cbf58c977-zk7jw_openshift-network-operator(892d5611-debf-402f-abc5-3f99aa080159): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 03 21:48:02.170017 master-0 kubenswrapper[4754]: > logger="UnhandledError" Dec 03 21:48:02.171647 master-0 kubenswrapper[4754]: E1203 21:48:02.171238 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" podUID="892d5611-debf-402f-abc5-3f99aa080159" Dec 03 21:48:02.206339 master-0 kubenswrapper[4754]: E1203 21:48:02.206201 4754 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:assisted-installer-controller,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLUSTER_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:cluster-id,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:INVENTORY_URL,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:inventory-url,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:PULL_SECRET_TOKEN,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-secret,},Key:pull-secret-token,Optional:nil,},},},EnvVar{Name:CA_CERT_PATH,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:ca-cert-path,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SKIP_CERT_VERIFICATION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:skip-cert-verification,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:OPENSHIFT_VERSION,Value:4.18.28,ValueFrom:nil,},EnvVar{Name:NOTIFY_NUM_REBOOTS,Value:true,ValueFrom:nil,},EnvVar{Name:HIGH_AVAILABILITY_MODE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:high-availability-mode,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:CHECK_CLUSTER_VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:check-cluster-version,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:MUST_GATHER_IMAGE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:must-gather-image,Optional:*true,},SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-ca-bundle,ReadOnly:false,MountPath:/etc/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-run-resolv-conf,ReadOnly:false,MountPath:/tmp/var-run-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-resolv-conf,ReadOnly:false,MountPath:/tmp/host-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sno-bootstrap-files,ReadOnly:false,MountPath:/tmp/bootstrap-secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,}
,VolumeMount{Name:kube-api-access-k829m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000120000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod assisted-installer-controller-q7jjz_assisted-installer(87c3edb2-12e8-45b0-99ac-9a794dd2881d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 03 21:48:02.207618 master-0 kubenswrapper[4754]: E1203 21:48:02.207528 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"assisted-installer-controller\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="assisted-installer/assisted-installer-controller-q7jjz" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" Dec 03 21:48:02.879722 master-0 kubenswrapper[4754]: E1203 21:48:02.879512 4754 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:assisted-installer-controller,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLUSTER_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:cluster-id,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:INVENTORY_URL,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:inventory-url,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:PULL_SECRET_TOKEN,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-secret,},Key:pull-secret-token,Optional:nil,},},},EnvVar{Name:CA_CERT_PATH,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:ca-cert-path,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SKIP_CERT_VERIFICATION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:skip-cert-verification,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:OPENSHIFT_VERSION,Value:4.18.28,ValueFrom:nil,},EnvVar{Name:NOTIFY_NUM_REBOOTS,Value:true,ValueFrom:nil,},EnvVar{Name:HIGH_AVAILABILITY_MODE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-cont
roller-config,},Key:high-availability-mode,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:CHECK_CLUSTER_VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:check-cluster-version,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:MUST_GATHER_IMAGE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:must-gather-image,Optional:*true,},SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-ca-bundle,ReadOnly:false,MountPath:/etc/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-run-resolv-conf,ReadOnly:false,MountPath:/tmp/var-run-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-resolv-conf,ReadOnly:false,MountPath:/tmp/host-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sno-bootstrap-files,ReadOnly:false,MountPath:/tmp/bootstrap-secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k829m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000120000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod assisted-installer-controller-q7jjz_assisted-installer(87c3edb2-12e8-45b0-99ac-9a794dd2881d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: E1203 21:48:02.880930 4754 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,Command:[/bin/bash -c #!/bin/bash Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: set -o allexport Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: source /etc/kubernetes/apiserver-url.env Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: else Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: exit 1 Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: fi Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.28,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c16e0847bd9ae0470e9702e5cfb4ccd5551a42ff062bd507f267ed55d1c31b42,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eac937aae64688cb47b38ad2cbba5aa7e6d41c691df1f3ca4ff81e5117084d1e,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee896bce586a3fcd37b4be8165cf1b4a83e88b5d47667de10475ec43e31b7926,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f86d9ffe13cbab06ff676496b50a26bbc4819d8b81b98fbacca6aee9b56792f,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9e597b928c0bdcdebea19f093353a7ada98f5164601abf23aa97f0065c6e293,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7825952834ade266ce08d1a9eb0665e4661dea0a40647d3e1de2cf6266665e9d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2dc42ec15e3ecccc0942415ec68b27c2c10f53f084b6fa23caa1e81fc70f3629,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:4629a2d090ecc0b613a9e6b50601fd2cdb99cb2e511f1fed6d335106f2789baf,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:25b69045d961dc26719bc4cbb3a854737938b6e97375c04197e9cbc932541b17,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bvmxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-6cbf58c977-zk7jw_openshift-network-operator(892d5611-debf-402f-abc5-3f99aa080159): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 03 21:48:02.881007 master-0 kubenswrapper[4754]: > logger="UnhandledError" Dec 03 21:48:02.881657 master-0 kubenswrapper[4754]: E1203 21:48:02.881038 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"assisted-installer-controller\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="assisted-installer/assisted-installer-controller-q7jjz" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" Dec 03 21:48:02.883012 master-0 kubenswrapper[4754]: E1203 21:48:02.882936 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" podUID="892d5611-debf-402f-abc5-3f99aa080159" Dec 03 21:48:04.266537 master-0 kubenswrapper[4754]: I1203 21:48:04.266435 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:48:04.267454 master-0 kubenswrapper[4754]: E1203 21:48:04.266723 4754 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:48:04.267454 master-0 kubenswrapper[4754]: E1203 21:48:04.266858 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert 
podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:48:12.266825882 +0000 UTC m=+56.019923537 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:48:12.328329 master-0 kubenswrapper[4754]: I1203 21:48:12.328233 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:48:12.329395 master-0 kubenswrapper[4754]: E1203 21:48:12.328470 4754 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:48:12.329395 master-0 kubenswrapper[4754]: E1203 21:48:12.328592 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:48:28.328559487 +0000 UTC m=+72.081657132 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:48:13.686664 master-0 kubenswrapper[4754]: I1203 21:48:13.686554 4754 scope.go:117] "RemoveContainer" containerID="a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864" Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: E1203 21:48:14.691873 4754 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,Command:[/bin/bash -c #!/bin/bash Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: set -o allexport Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: source /etc/kubernetes/apiserver-url.env Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: else Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: exit 1 Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: fi Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.28,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c16e0847bd9ae0470e9702e5cfb4ccd5551a42ff062bd507f267ed55d1c31b42,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eac937aae64688cb47b38ad2cbba5aa7e6d41c691df1f3ca4ff81e5117084d1e,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee896bce586a3fcd37b4be8165cf1b4a83e88b5d47667de10475ec43e31b7926,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f86d9ffe13cbab06ff676496b50a26bbc4819d8b81b98fbacca6aee9b56792f,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9e597b928c0bdcdebea19f093353a7ada98f5164601abf23aa97f0065c6e293,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7825952834ade266ce08d1a9eb0665e4661dea0a40647d3e1de2cf6266665e9d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2dc42ec15e3ecccc0942415ec68b27c2c10f53f084b6fa23caa1e81fc70f3629,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:4629a2d090ecc0b613a9e6b50601fd2cdb99cb2e511f1fed6d335106f2789baf,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:25b69045d961dc26719bc4cbb3a854737938b6e97375c04197e9cbc932541b17,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bvmxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-6cbf58c977-zk7jw_openshift-network-operator(892d5611-debf-402f-abc5-3f99aa080159): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 03 21:48:14.691945 master-0 kubenswrapper[4754]: > logger="UnhandledError" Dec 03 21:48:14.693513 master-0 kubenswrapper[4754]: E1203 21:48:14.693460 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" podUID="892d5611-debf-402f-abc5-3f99aa080159" Dec 03 21:48:14.909823 master-0 kubenswrapper[4754]: I1203 21:48:14.909702 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/2.log" Dec 03 21:48:14.910594 master-0 kubenswrapper[4754]: I1203 21:48:14.910481 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"a66c1588269465b26d7305b3f03876e778d4729a5541123141a6ed0d5a7e9d38"} Dec 03 21:48:18.688300 master-0 kubenswrapper[4754]: E1203 21:48:18.687745 4754 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:assisted-installer-controller,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLUSTER_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:cluster-id,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:INVENTORY_URL,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:inventory-url,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:PULL_SECRET_TOKEN,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-secret,},Key:pull-secret-token,Optional:nil,},},},EnvVar{Name:CA_CERT_PATH,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:ca-cert-path,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SKIP_CERT_VERIFICATION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:skip-cert-verification,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:OPENSHIFT_VERSION,Value:4.18.28,ValueFrom:nil,},EnvVar{Name:NOTIFY_NUM_REBOOTS,Value:true,ValueFrom:nil,},EnvVar{Name:HIGH_AVAILABILITY_MODE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:high-availability-mode,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:CHECK_CLUSTER_VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:check-cluster-version,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:MUST_GATHER_IMAGE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:must-gather-image,Optional:*true,},SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-ca-bundle,ReadOnly:false,MountPath:/etc/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-run-resolv-conf,ReadOnly:false,MountPath:/tmp/var-run-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-resolv-conf,ReadOnly:false,MountPath:/tmp/host-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sno-bootstrap-files,ReadOnly:false,MountPath:/tmp/bootstrap-secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k829m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/d
ev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000120000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod assisted-installer-controller-q7jjz_assisted-installer(87c3edb2-12e8-45b0-99ac-9a794dd2881d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 03 21:48:18.689143 master-0 kubenswrapper[4754]: E1203 21:48:18.689081 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"assisted-installer-controller\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="assisted-installer/assisted-installer-controller-q7jjz" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: E1203 21:48:25.688120 4754 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,Command:[/bin/bash -c #!/bin/bash Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: set -o allexport Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: source /etc/kubernetes/apiserver-url.env Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: else Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: exit 1 Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: fi Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.28,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c16e0847bd9ae0470e9702e5cfb4ccd5551a42ff062bd507f267ed55d1c31b42,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eac937aae64688cb47b38ad2cbba5aa7e6d41c691df1f3ca4ff81e5117084d1e,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee896bce586a3fcd37b4be8165cf1b4a83e88b5d47667de10475ec43e31b7926,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f86d9ffe13cbab06ff676496b50a26bbc4819d8b81b98fbacca6aee9b56792f,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9e597b928c0bdcdebea19f093353a7ada98f5164601abf23aa97f0065c6e293,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7825952834ade266ce08d1a9eb0665e4661dea0a40647d3e1de2cf6266665e9d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2dc42ec15e3ecccc0942415ec68b27c2c10f53f084b6fa23caa1e81fc70f3629,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:4629a2d090ecc0b613a9e6b50601fd2cdb99cb2e511f1fed6d335106f2789baf,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:25b69045d961dc26719bc4cbb3a854737938b6e97375c04197e9cbc932541b17,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bvmxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-6cbf58c977-zk7jw_openshift-network-operator(892d5611-debf-402f-abc5-3f99aa080159): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Dec 03 21:48:25.688213 master-0 kubenswrapper[4754]: > logger="UnhandledError" Dec 03 21:48:25.689739 master-0 kubenswrapper[4754]: E1203 21:48:25.689672 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" podUID="892d5611-debf-402f-abc5-3f99aa080159" Dec 03 21:48:28.353126 master-0 kubenswrapper[4754]: I1203 21:48:28.353032 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:48:28.353933 master-0 kubenswrapper[4754]: E1203 21:48:28.353265 4754 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:48:28.353933 master-0 kubenswrapper[4754]: E1203 21:48:28.353396 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:00.353346455 +0000 UTC m=+104.106444120 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:48:30.688291 master-0 kubenswrapper[4754]: E1203 21:48:30.688070 4754 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:assisted-installer-controller,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLUSTER_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:cluster-id,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:INVENTORY_URL,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:inventory-url,Optional:nil,},SecretKeyRef:nil,},},EnvVar{Name:PULL_SECRET_TOKEN,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-secret,},Key:pull-secret-token,Optional:nil,},},},EnvVar{Name:CA_CERT_PATH,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:ca-cert-path,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:SKIP_CERT_VERIFICATION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:skip-cert-verification,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:OPENSHIFT_VERSION,Value:4.18.28,ValueFrom:nil,},EnvVar{Name:NOTIFY_NUM_REBOOTS,Value:true,ValueFrom:nil,},EnvVar{Name:HIGH_AVAILABILITY_MODE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:high-availability-mode,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:CHECK_CLUSTER_VERSION,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:check-cluster-version,Optional:*true,},SecretKeyRef:nil,},},EnvVar{Name:MUST_GATHER_IMAGE,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:&ConfigMapKeySelector{LocalObjectReference:LocalObjectReference{Name:assisted-installer-controller-config,},Key:must-gather-image,Optional:*true,},SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-ca-bundle,ReadOnly:false,MountPath:/etc/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-run-resolv-conf,ReadOnly:false,MountPath:/tmp/var-run-resolv.conf,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-resolv-conf,ReadOnly:false,MountPath:/tmp/host-resolv.conf,SubPath:,MountPropagation:nil,Sub
PathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sno-bootstrap-files,ReadOnly:false,MountPath:/tmp/bootstrap-secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k829m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[KILL MKNOD SETGID SETUID],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000120000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod assisted-installer-controller-q7jjz_assisted-installer(87c3edb2-12e8-45b0-99ac-9a794dd2881d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Dec 03 21:48:30.689559 master-0 kubenswrapper[4754]: E1203 21:48:30.689476 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"assisted-installer-controller\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="assisted-installer/assisted-installer-controller-q7jjz" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" Dec 03 21:48:37.576115 master-0 kubenswrapper[4754]: I1203 21:48:37.576049 4754 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 21:48:39.703131 master-0 kubenswrapper[4754]: I1203 21:48:39.702994 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=42.702934466 podStartE2EDuration="42.702934466s" podCreationTimestamp="2025-12-03 21:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:48:14.928088267 +0000 UTC m=+58.681185922" watchObservedRunningTime="2025-12-03 21:48:39.702934466 +0000 UTC m=+83.456032131" Dec 03 21:48:39.703919 master-0 kubenswrapper[4754]: I1203 21:48:39.703319 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 03 21:48:40.979710 master-0 kubenswrapper[4754]: I1203 21:48:40.979626 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerStarted","Data":"d1b75fc955087be48690894945186bc731e2bed0635b14d30a5226a6cb7dbae4"} Dec 03 21:48:41.022635 master-0 kubenswrapper[4754]: I1203 21:48:41.022521 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" podStartSLOduration=43.746530906 podStartE2EDuration="49.02248729s" podCreationTimestamp="2025-12-03 21:47:52 +0000 UTC" firstStartedPulling="2025-12-03 21:47:56.8935376 +0000 UTC m=+40.646635215" lastFinishedPulling="2025-12-03 21:48:02.169493964 +0000 UTC m=+45.922591599" observedRunningTime="2025-12-03 21:48:41.002419657 +0000 UTC 
m=+84.755517332" watchObservedRunningTime="2025-12-03 21:48:41.02248729 +0000 UTC m=+84.775584935" Dec 03 21:48:42.553633 master-0 kubenswrapper[4754]: I1203 21:48:42.553547 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=3.553521222 podStartE2EDuration="3.553521222s" podCreationTimestamp="2025-12-03 21:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:48:41.023583633 +0000 UTC m=+84.776681338" watchObservedRunningTime="2025-12-03 21:48:42.553521222 +0000 UTC m=+86.306618847" Dec 03 21:48:42.554528 master-0 kubenswrapper[4754]: I1203 21:48:42.553709 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-b9r5p"] Dec 03 21:48:42.554528 master-0 kubenswrapper[4754]: I1203 21:48:42.554114 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-b9r5p" Dec 03 21:48:42.665498 master-0 kubenswrapper[4754]: I1203 21:48:42.665419 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vrd9\" (UniqueName: \"kubernetes.io/projected/ffdea166-7cd6-4319-966e-43579d960fc1-kube-api-access-7vrd9\") pod \"mtu-prober-b9r5p\" (UID: \"ffdea166-7cd6-4319-966e-43579d960fc1\") " pod="openshift-network-operator/mtu-prober-b9r5p" Dec 03 21:48:42.691466 master-0 kubenswrapper[4754]: I1203 21:48:42.691410 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Dec 03 21:48:42.700976 master-0 kubenswrapper[4754]: I1203 21:48:42.700926 4754 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Dec 03 21:48:42.765993 master-0 kubenswrapper[4754]: I1203 21:48:42.765901 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vrd9\" (UniqueName: \"kubernetes.io/projected/ffdea166-7cd6-4319-966e-43579d960fc1-kube-api-access-7vrd9\") pod \"mtu-prober-b9r5p\" (UID: \"ffdea166-7cd6-4319-966e-43579d960fc1\") " pod="openshift-network-operator/mtu-prober-b9r5p" Dec 03 21:48:42.794243 master-0 kubenswrapper[4754]: I1203 21:48:42.794155 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vrd9\" (UniqueName: \"kubernetes.io/projected/ffdea166-7cd6-4319-966e-43579d960fc1-kube-api-access-7vrd9\") pod \"mtu-prober-b9r5p\" (UID: \"ffdea166-7cd6-4319-966e-43579d960fc1\") " pod="openshift-network-operator/mtu-prober-b9r5p" Dec 03 21:48:42.870604 master-0 kubenswrapper[4754]: I1203 21:48:42.870549 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-b9r5p" Dec 03 21:48:42.887904 master-0 kubenswrapper[4754]: W1203 21:48:42.887854 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffdea166_7cd6_4319_966e_43579d960fc1.slice/crio-7dedfd07e611933a48dc94c229db18e3af516a0acc63d3043dc43da21b7e099f WatchSource:0}: Error finding container 7dedfd07e611933a48dc94c229db18e3af516a0acc63d3043dc43da21b7e099f: Status 404 returned error can't find the container with id 7dedfd07e611933a48dc94c229db18e3af516a0acc63d3043dc43da21b7e099f Dec 03 21:48:42.988392 master-0 kubenswrapper[4754]: I1203 21:48:42.988317 4754 generic.go:334] "Generic (PLEG): container finished" podID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerID="a2582bc7c1924ecfbd971e3c3eba302e8128d7abf1b76de440d5c3eb71052837" exitCode=0 Dec 03 21:48:42.988691 master-0 kubenswrapper[4754]: I1203 21:48:42.988407 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-q7jjz" event={"ID":"87c3edb2-12e8-45b0-99ac-9a794dd2881d","Type":"ContainerDied","Data":"a2582bc7c1924ecfbd971e3c3eba302e8128d7abf1b76de440d5c3eb71052837"} Dec 03 21:48:42.990828 master-0 kubenswrapper[4754]: I1203 21:48:42.990725 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-b9r5p" event={"ID":"ffdea166-7cd6-4319-966e-43579d960fc1","Type":"ContainerStarted","Data":"7dedfd07e611933a48dc94c229db18e3af516a0acc63d3043dc43da21b7e099f"} Dec 03 21:48:43.996475 master-0 kubenswrapper[4754]: I1203 21:48:43.996362 4754 generic.go:334] "Generic (PLEG): container finished" podID="ffdea166-7cd6-4319-966e-43579d960fc1" containerID="5e573ccc03a6c280986237abcd7396968c1017f2190b689fe94d1f1000629079" exitCode=0 Dec 03 21:48:43.997748 master-0 kubenswrapper[4754]: I1203 21:48:43.996500 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-b9r5p" event={"ID":"ffdea166-7cd6-4319-966e-43579d960fc1","Type":"ContainerDied","Data":"5e573ccc03a6c280986237abcd7396968c1017f2190b689fe94d1f1000629079"} Dec 03 21:48:44.026299 master-0 kubenswrapper[4754]: I1203 21:48:44.026207 4754 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:48:44.078846 master-0 kubenswrapper[4754]: I1203 21:48:44.078742 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-var-run-resolv-conf\") pod \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " Dec 03 21:48:44.078846 master-0 kubenswrapper[4754]: I1203 21:48:44.078814 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-resolv-conf\") pod \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " Dec 03 21:48:44.078846 master-0 kubenswrapper[4754]: I1203 21:48:44.078842 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-sno-bootstrap-files\") pod \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " Dec 03 21:48:44.079138 master-0 kubenswrapper[4754]: I1203 21:48:44.078861 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-ca-bundle\") pod \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " Dec 03 21:48:44.079138 master-0 kubenswrapper[4754]: I1203 21:48:44.078897 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k829m\" (UniqueName: \"kubernetes.io/projected/87c3edb2-12e8-45b0-99ac-9a794dd2881d-kube-api-access-k829m\") pod \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\" (UID: \"87c3edb2-12e8-45b0-99ac-9a794dd2881d\") " Dec 03 21:48:44.079138 master-0 kubenswrapper[4754]: I1203 21:48:44.078950 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "87c3edb2-12e8-45b0-99ac-9a794dd2881d" (UID: "87c3edb2-12e8-45b0-99ac-9a794dd2881d"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:48:44.079138 master-0 kubenswrapper[4754]: I1203 21:48:44.078932 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "87c3edb2-12e8-45b0-99ac-9a794dd2881d" (UID: "87c3edb2-12e8-45b0-99ac-9a794dd2881d"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:48:44.079138 master-0 kubenswrapper[4754]: I1203 21:48:44.079010 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "87c3edb2-12e8-45b0-99ac-9a794dd2881d" (UID: "87c3edb2-12e8-45b0-99ac-9a794dd2881d"). InnerVolumeSpecName "host-ca-bundle". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:48:44.079138 master-0 kubenswrapper[4754]: I1203 21:48:44.079070 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "87c3edb2-12e8-45b0-99ac-9a794dd2881d" (UID: "87c3edb2-12e8-45b0-99ac-9a794dd2881d"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:48:44.083271 master-0 kubenswrapper[4754]: I1203 21:48:44.083198 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c3edb2-12e8-45b0-99ac-9a794dd2881d-kube-api-access-k829m" (OuterVolumeSpecName: "kube-api-access-k829m") pod "87c3edb2-12e8-45b0-99ac-9a794dd2881d" (UID: "87c3edb2-12e8-45b0-99ac-9a794dd2881d"). InnerVolumeSpecName "kube-api-access-k829m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:48:44.180008 master-0 kubenswrapper[4754]: I1203 21:48:44.179932 4754 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Dec 03 21:48:44.180008 master-0 kubenswrapper[4754]: I1203 21:48:44.179978 4754 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Dec 03 21:48:44.180008 master-0 kubenswrapper[4754]: I1203 21:48:44.179990 4754 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 21:48:44.180008 master-0 kubenswrapper[4754]: I1203 21:48:44.180004 4754 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k829m\" (UniqueName: \"kubernetes.io/projected/87c3edb2-12e8-45b0-99ac-9a794dd2881d-kube-api-access-k829m\") on node \"master-0\" DevicePath \"\"" Dec 03 21:48:44.180008 master-0 kubenswrapper[4754]: I1203 21:48:44.180015 4754 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/87c3edb2-12e8-45b0-99ac-9a794dd2881d-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Dec 03 21:48:45.002904 master-0 kubenswrapper[4754]: I1203 21:48:45.002650 4754 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:48:45.002904 master-0 kubenswrapper[4754]: I1203 21:48:45.002675 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-q7jjz" event={"ID":"87c3edb2-12e8-45b0-99ac-9a794dd2881d","Type":"ContainerDied","Data":"dd41923c7718b750c80c184d862535442462bd815561e9bbfb7bb52e77b97884"} Dec 03 21:48:45.002904 master-0 kubenswrapper[4754]: I1203 21:48:45.002750 4754 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd41923c7718b750c80c184d862535442462bd815561e9bbfb7bb52e77b97884" Dec 03 21:48:45.031516 master-0 kubenswrapper[4754]: I1203 21:48:45.031426 4754 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-b9r5p" Dec 03 21:48:45.088231 master-0 kubenswrapper[4754]: I1203 21:48:45.088176 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vrd9\" (UniqueName: \"kubernetes.io/projected/ffdea166-7cd6-4319-966e-43579d960fc1-kube-api-access-7vrd9\") pod \"ffdea166-7cd6-4319-966e-43579d960fc1\" (UID: \"ffdea166-7cd6-4319-966e-43579d960fc1\") " Dec 03 21:48:45.093471 master-0 kubenswrapper[4754]: I1203 21:48:45.093325 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffdea166-7cd6-4319-966e-43579d960fc1-kube-api-access-7vrd9" (OuterVolumeSpecName: "kube-api-access-7vrd9") pod "ffdea166-7cd6-4319-966e-43579d960fc1" (UID: "ffdea166-7cd6-4319-966e-43579d960fc1"). InnerVolumeSpecName "kube-api-access-7vrd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:48:45.188823 master-0 kubenswrapper[4754]: I1203 21:48:45.188698 4754 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vrd9\" (UniqueName: \"kubernetes.io/projected/ffdea166-7cd6-4319-966e-43579d960fc1-kube-api-access-7vrd9\") on node \"master-0\" DevicePath \"\"" Dec 03 21:48:46.007108 master-0 kubenswrapper[4754]: I1203 21:48:46.007029 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-b9r5p" event={"ID":"ffdea166-7cd6-4319-966e-43579d960fc1","Type":"ContainerDied","Data":"7dedfd07e611933a48dc94c229db18e3af516a0acc63d3043dc43da21b7e099f"} Dec 03 21:48:46.007108 master-0 kubenswrapper[4754]: I1203 21:48:46.007103 4754 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dedfd07e611933a48dc94c229db18e3af516a0acc63d3043dc43da21b7e099f" Dec 03 21:48:46.007633 master-0 kubenswrapper[4754]: I1203 21:48:46.007121 4754 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/mtu-prober-b9r5p" Dec 03 21:48:47.571136 master-0 kubenswrapper[4754]: I1203 21:48:47.571047 4754 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-b9r5p"] Dec 03 21:48:47.576764 master-0 kubenswrapper[4754]: I1203 21:48:47.576718 4754 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-b9r5p"] Dec 03 21:48:47.697629 master-0 kubenswrapper[4754]: I1203 21:48:47.697512 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 03 21:48:47.699637 master-0 kubenswrapper[4754]: I1203 21:48:47.699572 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 03 21:48:47.699637 master-0 kubenswrapper[4754]: W1203 21:48:47.699617 4754 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Dec 03 21:48:48.690800 master-0 kubenswrapper[4754]: I1203 21:48:48.690686 4754 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffdea166-7cd6-4319-966e-43579d960fc1" path="/var/lib/kubelet/pods/ffdea166-7cd6-4319-966e-43579d960fc1/volumes" Dec 03 21:48:52.447745 master-0 kubenswrapper[4754]: I1203 21:48:52.447113 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6jlh8"] Dec 03 21:48:52.449112 master-0 kubenswrapper[4754]: E1203 21:48:52.447812 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerName="assisted-installer-controller" Dec 03 21:48:52.449112 master-0 kubenswrapper[4754]: I1203 21:48:52.447845 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerName="assisted-installer-controller" Dec 03 21:48:52.449112 master-0 kubenswrapper[4754]: E1203 21:48:52.447865 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffdea166-7cd6-4319-966e-43579d960fc1" containerName="prober" Dec 03 21:48:52.449112 master-0 kubenswrapper[4754]: I1203 21:48:52.447884 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdea166-7cd6-4319-966e-43579d960fc1" containerName="prober" Dec 03 21:48:52.449112 master-0 kubenswrapper[4754]: I1203 21:48:52.447939 4754 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffdea166-7cd6-4319-966e-43579d960fc1" containerName="prober" Dec 03 21:48:52.449112 master-0 kubenswrapper[4754]: I1203 21:48:52.447957 4754 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerName="assisted-installer-controller" Dec 03 21:48:52.449112 master-0 kubenswrapper[4754]: I1203 21:48:52.448325 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.451586 master-0 kubenswrapper[4754]: I1203 21:48:52.451532 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 21:48:52.453101 master-0 kubenswrapper[4754]: I1203 21:48:52.453050 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 21:48:52.453348 master-0 kubenswrapper[4754]: I1203 21:48:52.453134 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 21:48:52.454329 master-0 kubenswrapper[4754]: I1203 21:48:52.454293 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 21:48:52.474428 master-0 kubenswrapper[4754]: I1203 21:48:52.474324 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=5.474289018 podStartE2EDuration="5.474289018s" podCreationTimestamp="2025-12-03 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:48:52.474137713 +0000 UTC m=+96.227235348" watchObservedRunningTime="2025-12-03 21:48:52.474289018 +0000 UTC m=+96.227386663" Dec 03 21:48:52.491459 master-0 kubenswrapper[4754]: I1203 21:48:52.491356 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=5.491313591 podStartE2EDuration="5.491313591s" podCreationTimestamp="2025-12-03 21:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:48:52.491089514 +0000 UTC m=+96.244187169" watchObservedRunningTime="2025-12-03 21:48:52.491313591 +0000 UTC m=+96.244411226" Dec 03 21:48:52.541950 master-0 kubenswrapper[4754]: I1203 21:48:52.541845 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-kubelet\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.541950 master-0 kubenswrapper[4754]: I1203 21:48:52.541916 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.541950 master-0 kubenswrapper[4754]: I1203 21:48:52.541966 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-cnibin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542378 master-0 kubenswrapper[4754]: I1203 21:48:52.542016 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-socket-dir-parent\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " 
pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542378 master-0 kubenswrapper[4754]: I1203 21:48:52.542069 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-netns\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542378 master-0 kubenswrapper[4754]: I1203 21:48:52.542114 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-multus-certs\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542378 master-0 kubenswrapper[4754]: I1203 21:48:52.542152 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd9vn\" (UniqueName: \"kubernetes.io/projected/b8194009-3743-4da7-baf1-f9bb0afd6187-kube-api-access-rd9vn\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542378 master-0 kubenswrapper[4754]: I1203 21:48:52.542187 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-os-release\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542378 master-0 kubenswrapper[4754]: I1203 21:48:52.542237 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-cni-binary-copy\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542378 master-0 kubenswrapper[4754]: I1203 21:48:52.542266 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-daemon-config\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542378 master-0 kubenswrapper[4754]: I1203 21:48:52.542298 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-multus\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542937 master-0 kubenswrapper[4754]: I1203 21:48:52.542447 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-bin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542937 master-0 kubenswrapper[4754]: I1203 21:48:52.542524 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-etc-kubernetes\") pod 
\"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542937 master-0 kubenswrapper[4754]: I1203 21:48:52.542554 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-system-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542937 master-0 kubenswrapper[4754]: I1203 21:48:52.542583 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-conf-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542937 master-0 kubenswrapper[4754]: I1203 21:48:52.542621 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-k8s-cni-cncf-io\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.542937 master-0 kubenswrapper[4754]: I1203 21:48:52.542647 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-hostroot\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.643826 master-0 kubenswrapper[4754]: I1203 21:48:52.643722 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-kubelet\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.643826 master-0 kubenswrapper[4754]: I1203 21:48:52.643820 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-socket-dir-parent\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.643879 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-netns\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.643913 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.643944 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-cnibin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " 
pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.643976 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-multus-certs\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.643980 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-kubelet\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.644046 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-multus-certs\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.644019 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-socket-dir-parent\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.644005 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-os-release\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.644096 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-os-release\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.644112 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-cni-binary-copy\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.644150 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9vn\" (UniqueName: \"kubernetes.io/projected/b8194009-3743-4da7-baf1-f9bb0afd6187-kube-api-access-rd9vn\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.644144 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-netns\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.644232 4754 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-multus\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.644233 master-0 kubenswrapper[4754]: I1203 21:48:52.644195 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.645263 master-0 kubenswrapper[4754]: I1203 21:48:52.644190 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-cnibin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.645263 master-0 kubenswrapper[4754]: I1203 21:48:52.644188 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-multus\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.645263 master-0 kubenswrapper[4754]: I1203 21:48:52.644343 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-daemon-config\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.645263 master-0 kubenswrapper[4754]: I1203 21:48:52.644422 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-etc-kubernetes\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.645263 master-0 kubenswrapper[4754]: I1203 21:48:52.644485 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-system-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.645263 master-0 kubenswrapper[4754]: I1203 21:48:52.644599 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-etc-kubernetes\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.645263 master-0 kubenswrapper[4754]: I1203 21:48:52.644818 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-bin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.645263 master-0 kubenswrapper[4754]: I1203 21:48:52.644834 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-system-cni-dir\") pod 
\"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.645263 master-0 kubenswrapper[4754]: I1203 21:48:52.644961 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-cni-binary-copy\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.646145 master-0 kubenswrapper[4754]: I1203 21:48:52.645656 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-daemon-config\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.646145 master-0 kubenswrapper[4754]: I1203 21:48:52.645871 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-bin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.646334 master-0 kubenswrapper[4754]: I1203 21:48:52.646017 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-conf-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.646334 master-0 kubenswrapper[4754]: I1203 21:48:52.646130 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-conf-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.646506 master-0 kubenswrapper[4754]: I1203 21:48:52.646422 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-k8s-cni-cncf-io\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.646665 master-0 kubenswrapper[4754]: I1203 21:48:52.646592 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-k8s-cni-cncf-io\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.646737 master-0 kubenswrapper[4754]: I1203 21:48:52.646694 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-hostroot\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.646945 master-0 kubenswrapper[4754]: I1203 21:48:52.646862 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-hostroot\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.647835 master-0 kubenswrapper[4754]: I1203 
21:48:52.647747 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qz5vh"] Dec 03 21:48:52.648582 master-0 kubenswrapper[4754]: I1203 21:48:52.648500 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.651733 master-0 kubenswrapper[4754]: I1203 21:48:52.651663 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 03 21:48:52.652570 master-0 kubenswrapper[4754]: I1203 21:48:52.652516 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 21:48:52.683130 master-0 kubenswrapper[4754]: I1203 21:48:52.683004 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9vn\" (UniqueName: \"kubernetes.io/projected/b8194009-3743-4da7-baf1-f9bb0afd6187-kube-api-access-rd9vn\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.747810 master-0 kubenswrapper[4754]: I1203 21:48:52.747647 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-os-release\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.747810 master-0 kubenswrapper[4754]: I1203 21:48:52.747716 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-binary-copy\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.747810 master-0 kubenswrapper[4754]: I1203 21:48:52.747749 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.747810 master-0 kubenswrapper[4754]: I1203 21:48:52.747806 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-cnibin\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.748133 master-0 kubenswrapper[4754]: I1203 21:48:52.747868 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-system-cni-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.748133 master-0 kubenswrapper[4754]: I1203 21:48:52.747899 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.748133 master-0 kubenswrapper[4754]: I1203 21:48:52.747934 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-whereabouts-configmap\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.748133 master-0 kubenswrapper[4754]: I1203 21:48:52.747972 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcq9j\" (UniqueName: \"kubernetes.io/projected/ffad8fc8-4378-44de-8864-dd2f666ade68-kube-api-access-xcq9j\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.777985 master-0 kubenswrapper[4754]: I1203 21:48:52.777897 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6jlh8" Dec 03 21:48:52.793553 master-0 kubenswrapper[4754]: W1203 21:48:52.793467 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8194009_3743_4da7_baf1_f9bb0afd6187.slice/crio-e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95 WatchSource:0}: Error finding container e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95: Status 404 returned error can't find the container with id e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95 Dec 03 21:48:52.849112 master-0 kubenswrapper[4754]: I1203 21:48:52.849019 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.849112 master-0 kubenswrapper[4754]: I1203 21:48:52.849090 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-system-cni-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.849618 master-0 kubenswrapper[4754]: I1203 21:48:52.849131 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-whereabouts-configmap\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.849618 master-0 kubenswrapper[4754]: I1203 21:48:52.849165 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcq9j\" (UniqueName: \"kubernetes.io/projected/ffad8fc8-4378-44de-8864-dd2f666ade68-kube-api-access-xcq9j\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") 
" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.849618 master-0 kubenswrapper[4754]: I1203 21:48:52.849209 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.849618 master-0 kubenswrapper[4754]: I1203 21:48:52.849242 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-cnibin\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.849618 master-0 kubenswrapper[4754]: I1203 21:48:52.849279 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-os-release\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.849618 master-0 kubenswrapper[4754]: I1203 21:48:52.849316 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-binary-copy\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.849618 master-0 kubenswrapper[4754]: I1203 21:48:52.849338 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-system-cni-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.849618 master-0 kubenswrapper[4754]: I1203 21:48:52.849509 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-cnibin\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.850675 master-0 kubenswrapper[4754]: I1203 21:48:52.849763 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.850675 master-0 kubenswrapper[4754]: I1203 21:48:52.849897 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-os-release\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.850675 master-0 kubenswrapper[4754]: I1203 21:48:52.850429 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-binary-copy\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.850675 master-0 kubenswrapper[4754]: I1203 21:48:52.850430 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.851164 master-0 kubenswrapper[4754]: I1203 21:48:52.850924 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-whereabouts-configmap\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.879801 master-0 kubenswrapper[4754]: I1203 21:48:52.879694 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcq9j\" (UniqueName: \"kubernetes.io/projected/ffad8fc8-4378-44de-8864-dd2f666ade68-kube-api-access-xcq9j\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.968936 master-0 kubenswrapper[4754]: I1203 21:48:52.968822 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:48:52.986255 master-0 kubenswrapper[4754]: W1203 21:48:52.986023 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffad8fc8_4378_44de_8864_dd2f666ade68.slice/crio-df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e WatchSource:0}: Error finding container df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e: Status 404 returned error can't find the container with id df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e Dec 03 21:48:53.032556 master-0 kubenswrapper[4754]: I1203 21:48:53.032380 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" event={"ID":"ffad8fc8-4378-44de-8864-dd2f666ade68","Type":"ContainerStarted","Data":"df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e"} Dec 03 21:48:53.034804 master-0 kubenswrapper[4754]: I1203 21:48:53.034756 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6jlh8" event={"ID":"b8194009-3743-4da7-baf1-f9bb0afd6187","Type":"ContainerStarted","Data":"e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95"} Dec 03 21:48:53.434629 master-0 kubenswrapper[4754]: I1203 21:48:53.434526 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-h6569"] Dec 03 21:48:53.435088 master-0 kubenswrapper[4754]: I1203 21:48:53.435065 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:53.435206 master-0 kubenswrapper[4754]: E1203 21:48:53.435157 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:48:53.556820 master-0 kubenswrapper[4754]: I1203 21:48:53.556684 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxl5r\" (UniqueName: \"kubernetes.io/projected/812401c0-d1ac-4857-b939-217b7b07f8bc-kube-api-access-mxl5r\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:53.556820 master-0 kubenswrapper[4754]: I1203 21:48:53.556803 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:53.657898 master-0 kubenswrapper[4754]: I1203 21:48:53.657848 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxl5r\" (UniqueName: \"kubernetes.io/projected/812401c0-d1ac-4857-b939-217b7b07f8bc-kube-api-access-mxl5r\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:53.658156 master-0 kubenswrapper[4754]: I1203 21:48:53.657918 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:53.658156 master-0 kubenswrapper[4754]: E1203 21:48:53.658058 4754 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:48:53.658156 master-0 kubenswrapper[4754]: E1203 21:48:53.658122 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:48:54.158102203 +0000 UTC m=+97.911199818 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:48:53.689605 master-0 kubenswrapper[4754]: I1203 21:48:53.689520 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxl5r\" (UniqueName: \"kubernetes.io/projected/812401c0-d1ac-4857-b939-217b7b07f8bc-kube-api-access-mxl5r\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:54.161685 master-0 kubenswrapper[4754]: I1203 21:48:54.161617 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:54.162050 master-0 kubenswrapper[4754]: E1203 21:48:54.161801 4754 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:48:54.162050 master-0 kubenswrapper[4754]: E1203 21:48:54.161876 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:48:55.161854203 +0000 UTC m=+98.914951818 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:48:55.170702 master-0 kubenswrapper[4754]: I1203 21:48:55.170564 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:55.171710 master-0 kubenswrapper[4754]: E1203 21:48:55.170807 4754 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:48:55.171710 master-0 kubenswrapper[4754]: E1203 21:48:55.170925 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:48:57.170904166 +0000 UTC m=+100.924001781 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:48:55.686067 master-0 kubenswrapper[4754]: I1203 21:48:55.685939 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:55.686405 master-0 kubenswrapper[4754]: E1203 21:48:55.686166 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:48:56.044680 master-0 kubenswrapper[4754]: I1203 21:48:56.044565 4754 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="2378e0ccca63367b0b1f9fee7b2b6b1c87516db57078ddfc7d8d7140f0467b96" exitCode=0 Dec 03 21:48:56.044680 master-0 kubenswrapper[4754]: I1203 21:48:56.044624 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" event={"ID":"ffad8fc8-4378-44de-8864-dd2f666ade68","Type":"ContainerDied","Data":"2378e0ccca63367b0b1f9fee7b2b6b1c87516db57078ddfc7d8d7140f0467b96"} Dec 03 21:48:57.186850 master-0 kubenswrapper[4754]: I1203 21:48:57.186718 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:57.188124 master-0 kubenswrapper[4754]: E1203 21:48:57.187007 4754 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:48:57.188124 master-0 kubenswrapper[4754]: E1203 21:48:57.187136 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:49:01.187098008 +0000 UTC m=+104.940195663 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:48:57.685760 master-0 kubenswrapper[4754]: I1203 21:48:57.685677 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:57.686035 master-0 kubenswrapper[4754]: E1203 21:48:57.685874 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:48:59.686318 master-0 kubenswrapper[4754]: I1203 21:48:59.686185 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:48:59.689693 master-0 kubenswrapper[4754]: E1203 21:48:59.686365 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:00.417944 master-0 kubenswrapper[4754]: I1203 21:49:00.417827 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:00.418202 master-0 kubenswrapper[4754]: E1203 21:49:00.418002 4754 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:00.418202 master-0 kubenswrapper[4754]: E1203 21:49:00.418077 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:04.41805756 +0000 UTC m=+168.171155175 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:01.224891 master-0 kubenswrapper[4754]: I1203 21:49:01.224827 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:01.225480 master-0 kubenswrapper[4754]: E1203 21:49:01.225025 4754 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:49:01.225480 master-0 kubenswrapper[4754]: E1203 21:49:01.225157 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:49:09.225124149 +0000 UTC m=+112.978221944 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:49:01.685265 master-0 kubenswrapper[4754]: I1203 21:49:01.685185 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:01.685548 master-0 kubenswrapper[4754]: E1203 21:49:01.685324 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:03.686174 master-0 kubenswrapper[4754]: I1203 21:49:03.686101 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:03.686829 master-0 kubenswrapper[4754]: E1203 21:49:03.686260 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:04.836925 master-0 kubenswrapper[4754]: I1203 21:49:04.836245 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w"] Dec 03 21:49:04.836925 master-0 kubenswrapper[4754]: I1203 21:49:04.836583 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:04.839203 master-0 kubenswrapper[4754]: I1203 21:49:04.839154 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 21:49:04.839743 master-0 kubenswrapper[4754]: I1203 21:49:04.839698 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 21:49:04.839984 master-0 kubenswrapper[4754]: I1203 21:49:04.839831 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 21:49:04.840072 master-0 kubenswrapper[4754]: I1203 21:49:04.839852 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 21:49:04.840177 master-0 kubenswrapper[4754]: I1203 21:49:04.840138 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 21:49:04.952068 master-0 kubenswrapper[4754]: I1203 21:49:04.952001 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:04.952068 master-0 kubenswrapper[4754]: I1203 21:49:04.952063 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:04.952241 master-0 
kubenswrapper[4754]: I1203 21:49:04.952107 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:04.952450 master-0 kubenswrapper[4754]: I1203 21:49:04.952336 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4mbz\" (UniqueName: \"kubernetes.io/projected/29ac4a9d-1228-49c7-9051-338e7dc98a38-kube-api-access-p4mbz\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.050751 master-0 kubenswrapper[4754]: I1203 21:49:05.050685 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-td6fl"] Dec 03 21:49:05.051467 master-0 kubenswrapper[4754]: I1203 21:49:05.051433 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.054887 master-0 kubenswrapper[4754]: I1203 21:49:05.053433 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mbz\" (UniqueName: \"kubernetes.io/projected/29ac4a9d-1228-49c7-9051-338e7dc98a38-kube-api-access-p4mbz\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.054887 master-0 kubenswrapper[4754]: I1203 21:49:05.053539 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.054887 master-0 kubenswrapper[4754]: I1203 21:49:05.053582 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.054887 master-0 kubenswrapper[4754]: I1203 21:49:05.053638 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.054887 master-0 kubenswrapper[4754]: I1203 21:49:05.054480 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.054887 master-0 kubenswrapper[4754]: 
I1203 21:49:05.054685 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 21:49:05.054887 master-0 kubenswrapper[4754]: I1203 21:49:05.054810 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.056571 master-0 kubenswrapper[4754]: I1203 21:49:05.056013 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 21:49:05.061652 master-0 kubenswrapper[4754]: I1203 21:49:05.061572 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.068835 master-0 kubenswrapper[4754]: I1203 21:49:05.068732 4754 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="c4e06171c55f7e5fa99ea1abceeee733b284cf3a34d0a86bf2f243d3f655f7a5" exitCode=0 Dec 03 21:49:05.069136 master-0 kubenswrapper[4754]: I1203 21:49:05.068959 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" event={"ID":"ffad8fc8-4378-44de-8864-dd2f666ade68","Type":"ContainerDied","Data":"c4e06171c55f7e5fa99ea1abceeee733b284cf3a34d0a86bf2f243d3f655f7a5"} Dec 03 21:49:05.072568 master-0 kubenswrapper[4754]: I1203 21:49:05.072514 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6jlh8" event={"ID":"b8194009-3743-4da7-baf1-f9bb0afd6187","Type":"ContainerStarted","Data":"76e43e5fbee7ffc24194d2c33eab847023aabeb362bd045487fdd30a3e9fd56c"} Dec 03 21:49:05.092343 master-0 kubenswrapper[4754]: I1203 21:49:05.092219 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mbz\" (UniqueName: \"kubernetes.io/projected/29ac4a9d-1228-49c7-9051-338e7dc98a38-kube-api-access-p4mbz\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.154184 master-0 kubenswrapper[4754]: I1203 21:49:05.154067 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-slash\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154184 master-0 kubenswrapper[4754]: I1203 21:49:05.154147 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-systemd-units\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154184 master-0 kubenswrapper[4754]: I1203 21:49:05.154189 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-ovn-kubernetes\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154635 master-0 kubenswrapper[4754]: I1203 21:49:05.154227 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-etc-openvswitch\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154635 master-0 kubenswrapper[4754]: I1203 21:49:05.154286 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-kubelet\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154635 master-0 kubenswrapper[4754]: I1203 21:49:05.154320 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-ovn\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154635 master-0 kubenswrapper[4754]: I1203 21:49:05.154354 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154635 master-0 kubenswrapper[4754]: I1203 21:49:05.154392 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-script-lib\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154635 master-0 kubenswrapper[4754]: I1203 21:49:05.154426 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-netns\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154635 master-0 kubenswrapper[4754]: I1203 21:49:05.154464 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-var-lib-openvswitch\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154635 master-0 kubenswrapper[4754]: I1203 21:49:05.154496 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-openvswitch\") pod \"ovnkube-node-td6fl\" 
(UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.154635 master-0 kubenswrapper[4754]: I1203 21:49:05.154527 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-config\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.155179 master-0 kubenswrapper[4754]: I1203 21:49:05.154653 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lb89\" (UniqueName: \"kubernetes.io/projected/86c2417b-23bb-4eb8-abc5-4fc5beab2873-kube-api-access-4lb89\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.155179 master-0 kubenswrapper[4754]: I1203 21:49:05.154790 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-systemd\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.155179 master-0 kubenswrapper[4754]: I1203 21:49:05.154908 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-netd\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.155179 master-0 kubenswrapper[4754]: I1203 21:49:05.155012 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-bin\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.155179 master-0 kubenswrapper[4754]: I1203 21:49:05.155076 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-node-log\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.155179 master-0 kubenswrapper[4754]: I1203 21:49:05.155124 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-env-overrides\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.155536 master-0 kubenswrapper[4754]: I1203 21:49:05.155199 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovn-node-metrics-cert\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.155536 master-0 kubenswrapper[4754]: I1203 21:49:05.155258 4754 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-log-socket\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.160922 master-0 kubenswrapper[4754]: I1203 21:49:05.160874 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:05.178153 master-0 kubenswrapper[4754]: W1203 21:49:05.178082 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29ac4a9d_1228_49c7_9051_338e7dc98a38.slice/crio-111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977 WatchSource:0}: Error finding container 111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977: Status 404 returned error can't find the container with id 111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977 Dec 03 21:49:05.256746 master-0 kubenswrapper[4754]: I1203 21:49:05.256597 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-systemd\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.256746 master-0 kubenswrapper[4754]: I1203 21:49:05.256733 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-bin\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257154 master-0 kubenswrapper[4754]: I1203 21:49:05.256982 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-netd\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257154 master-0 kubenswrapper[4754]: I1203 21:49:05.257051 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-node-log\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257154 master-0 kubenswrapper[4754]: I1203 21:49:05.257089 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-env-overrides\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257154 master-0 kubenswrapper[4754]: I1203 21:49:05.257086 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-systemd\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257154 master-0 kubenswrapper[4754]: I1203 21:49:05.257140 4754 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovn-node-metrics-cert\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257181 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-log-socket\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257220 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-slash\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257218 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-bin\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257257 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-systemd-units\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257326 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-systemd-units\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257324 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-netd\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257437 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-ovn-kubernetes\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257407 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-slash\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257385 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-ovn-kubernetes\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257430 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-node-log\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257483 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-log-socket\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257548 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-etc-openvswitch\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257608 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-etc-openvswitch\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257648 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-kubelet\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257703 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-ovn\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257727 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.257764 master-0 kubenswrapper[4754]: I1203 21:49:05.257752 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-script-lib\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.257798 4754 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-var-lib-openvswitch\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.257820 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-openvswitch\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.257839 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-config\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.257843 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.257859 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-netns\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.257862 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-ovn\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.257911 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-netns\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.257937 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-kubelet\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.258013 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lb89\" (UniqueName: \"kubernetes.io/projected/86c2417b-23bb-4eb8-abc5-4fc5beab2873-kube-api-access-4lb89\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 
21:49:05.258272 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-env-overrides\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.258288 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-openvswitch\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.258336 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-var-lib-openvswitch\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.259049 master-0 kubenswrapper[4754]: I1203 21:49:05.258819 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-script-lib\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.260101 master-0 kubenswrapper[4754]: I1203 21:49:05.259319 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-config\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.262319 master-0 kubenswrapper[4754]: I1203 21:49:05.262225 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovn-node-metrics-cert\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.289493 master-0 kubenswrapper[4754]: I1203 21:49:05.289440 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lb89\" (UniqueName: \"kubernetes.io/projected/86c2417b-23bb-4eb8-abc5-4fc5beab2873-kube-api-access-4lb89\") pod \"ovnkube-node-td6fl\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.382937 master-0 kubenswrapper[4754]: I1203 21:49:05.382843 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:05.403183 master-0 kubenswrapper[4754]: W1203 21:49:05.403129 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86c2417b_23bb_4eb8_abc5_4fc5beab2873.slice/crio-160c955e6bc6fbc37a201275978850561279b3beae60311f7da8181d0840a469 WatchSource:0}: Error finding container 160c955e6bc6fbc37a201275978850561279b3beae60311f7da8181d0840a469: Status 404 returned error can't find the container with id 160c955e6bc6fbc37a201275978850561279b3beae60311f7da8181d0840a469 Dec 03 21:49:05.686170 master-0 kubenswrapper[4754]: I1203 21:49:05.685957 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:05.686170 master-0 kubenswrapper[4754]: E1203 21:49:05.686122 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:06.080467 master-0 kubenswrapper[4754]: I1203 21:49:06.080273 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerStarted","Data":"160c955e6bc6fbc37a201275978850561279b3beae60311f7da8181d0840a469"} Dec 03 21:49:06.082476 master-0 kubenswrapper[4754]: I1203 21:49:06.082406 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" event={"ID":"29ac4a9d-1228-49c7-9051-338e7dc98a38","Type":"ContainerStarted","Data":"e8e97bf08c9cd7d6bdd222c55eb1ccd969134fd23984b660a7fd6614604ed59a"} Dec 03 21:49:06.082476 master-0 kubenswrapper[4754]: I1203 21:49:06.082462 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" event={"ID":"29ac4a9d-1228-49c7-9051-338e7dc98a38","Type":"ContainerStarted","Data":"111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977"} Dec 03 21:49:07.087890 master-0 kubenswrapper[4754]: I1203 21:49:07.087821 4754 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="684c7729ba85fd8ac6a78b633cc41d781048493dac0127949a3ebcf247c67f5b" exitCode=0 Dec 03 21:49:07.088428 master-0 kubenswrapper[4754]: I1203 21:49:07.087894 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" event={"ID":"ffad8fc8-4378-44de-8864-dd2f666ade68","Type":"ContainerDied","Data":"684c7729ba85fd8ac6a78b633cc41d781048493dac0127949a3ebcf247c67f5b"} Dec 03 21:49:07.111569 master-0 kubenswrapper[4754]: I1203 21:49:07.111146 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6jlh8" podStartSLOduration=3.27463752 podStartE2EDuration="15.111118541s" podCreationTimestamp="2025-12-03 21:48:52 +0000 UTC" firstStartedPulling="2025-12-03 21:48:52.796105463 +0000 UTC m=+96.549203118" lastFinishedPulling="2025-12-03 21:49:04.632586514 +0000 UTC m=+108.385684139" observedRunningTime="2025-12-03 21:49:05.131976924 +0000 UTC m=+108.885074549" watchObservedRunningTime="2025-12-03 21:49:07.111118541 +0000 UTC 
m=+110.864216156" Dec 03 21:49:07.685275 master-0 kubenswrapper[4754]: I1203 21:49:07.685216 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:07.685501 master-0 kubenswrapper[4754]: E1203 21:49:07.685379 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:08.025820 master-0 kubenswrapper[4754]: I1203 21:49:08.025670 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-78hts"] Dec 03 21:49:08.026191 master-0 kubenswrapper[4754]: I1203 21:49:08.026140 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:08.026245 master-0 kubenswrapper[4754]: E1203 21:49:08.026221 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:08.087914 master-0 kubenswrapper[4754]: I1203 21:49:08.087852 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:08.188902 master-0 kubenswrapper[4754]: I1203 21:49:08.188825 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:08.201296 master-0 kubenswrapper[4754]: E1203 21:49:08.201236 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 21:49:08.201296 master-0 kubenswrapper[4754]: E1203 21:49:08.201282 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 21:49:08.201296 master-0 kubenswrapper[4754]: E1203 21:49:08.201299 4754 projected.go:194] Error preparing data for projected volume kube-api-access-s68fd for pod openshift-network-diagnostics/network-check-target-78hts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:08.201547 master-0 kubenswrapper[4754]: E1203 21:49:08.201376 4754 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd podName:e6d5d61a-c5de-4619-9afb-7fad63ba0525 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:08.701353784 +0000 UTC m=+112.454451409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s68fd" (UniqueName: "kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd") pod "network-check-target-78hts" (UID: "e6d5d61a-c5de-4619-9afb-7fad63ba0525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:08.793963 master-0 kubenswrapper[4754]: I1203 21:49:08.793779 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:08.794260 master-0 kubenswrapper[4754]: E1203 21:49:08.794040 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 21:49:08.794260 master-0 kubenswrapper[4754]: E1203 21:49:08.794096 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 21:49:08.794260 master-0 kubenswrapper[4754]: E1203 21:49:08.794125 4754 projected.go:194] Error preparing data for projected volume kube-api-access-s68fd for pod openshift-network-diagnostics/network-check-target-78hts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:08.794260 master-0 kubenswrapper[4754]: E1203 21:49:08.794236 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd podName:e6d5d61a-c5de-4619-9afb-7fad63ba0525 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:09.794199394 +0000 UTC m=+113.547297059 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s68fd" (UniqueName: "kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd") pod "network-check-target-78hts" (UID: "e6d5d61a-c5de-4619-9afb-7fad63ba0525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:09.097999 master-0 kubenswrapper[4754]: I1203 21:49:09.097849 4754 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="742bc32f3346ea0ae584753ab21aedef0a29aac40cae443f406076aeae30033d" exitCode=0 Dec 03 21:49:09.097999 master-0 kubenswrapper[4754]: I1203 21:49:09.097922 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" event={"ID":"ffad8fc8-4378-44de-8864-dd2f666ade68","Type":"ContainerDied","Data":"742bc32f3346ea0ae584753ab21aedef0a29aac40cae443f406076aeae30033d"} Dec 03 21:49:09.298501 master-0 kubenswrapper[4754]: I1203 21:49:09.298434 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:09.298766 master-0 kubenswrapper[4754]: E1203 21:49:09.298697 4754 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:49:09.298903 master-0 kubenswrapper[4754]: E1203 21:49:09.298840 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:49:25.298811529 +0000 UTC m=+129.051909144 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:49:09.685293 master-0 kubenswrapper[4754]: I1203 21:49:09.685233 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:09.685518 master-0 kubenswrapper[4754]: I1203 21:49:09.685239 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:09.685518 master-0 kubenswrapper[4754]: E1203 21:49:09.685418 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:09.685975 master-0 kubenswrapper[4754]: E1203 21:49:09.685905 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:09.697341 master-0 kubenswrapper[4754]: I1203 21:49:09.697300 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 03 21:49:09.803464 master-0 kubenswrapper[4754]: I1203 21:49:09.803395 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:09.803719 master-0 kubenswrapper[4754]: E1203 21:49:09.803664 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 21:49:09.803719 master-0 kubenswrapper[4754]: E1203 21:49:09.803715 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 21:49:09.803831 master-0 kubenswrapper[4754]: E1203 21:49:09.803729 4754 projected.go:194] Error preparing data for projected volume kube-api-access-s68fd for pod openshift-network-diagnostics/network-check-target-78hts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:09.803831 master-0 kubenswrapper[4754]: E1203 21:49:09.803808 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd podName:e6d5d61a-c5de-4619-9afb-7fad63ba0525 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:11.803789635 +0000 UTC m=+115.556887250 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s68fd" (UniqueName: "kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd") pod "network-check-target-78hts" (UID: "e6d5d61a-c5de-4619-9afb-7fad63ba0525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:10.684180 master-0 kubenswrapper[4754]: I1203 21:49:10.683370 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-r24k4"] Dec 03 21:49:10.684180 master-0 kubenswrapper[4754]: I1203 21:49:10.683841 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.688928 master-0 kubenswrapper[4754]: I1203 21:49:10.688843 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 21:49:10.692030 master-0 kubenswrapper[4754]: I1203 21:49:10.691993 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 21:49:10.692279 master-0 kubenswrapper[4754]: I1203 21:49:10.692252 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 21:49:10.692333 master-0 kubenswrapper[4754]: I1203 21:49:10.692304 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 21:49:10.693721 master-0 kubenswrapper[4754]: I1203 21:49:10.693277 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 21:49:10.707953 master-0 kubenswrapper[4754]: I1203 21:49:10.706929 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=1.70690617 podStartE2EDuration="1.70690617s" podCreationTimestamp="2025-12-03 21:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:49:10.706360123 +0000 UTC m=+114.459457738" watchObservedRunningTime="2025-12-03 21:49:10.70690617 +0000 UTC m=+114.460003785" Dec 03 21:49:10.813571 master-0 kubenswrapper[4754]: I1203 21:49:10.813462 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsr8k\" (UniqueName: \"kubernetes.io/projected/1a0f647a-0260-4737-8ae2-cc90d01d33d1-kube-api-access-lsr8k\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.813571 master-0 kubenswrapper[4754]: I1203 21:49:10.813538 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a0f647a-0260-4737-8ae2-cc90d01d33d1-webhook-cert\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.813571 master-0 kubenswrapper[4754]: I1203 21:49:10.813570 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-ovnkube-identity-cm\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.813990 master-0 kubenswrapper[4754]: I1203 21:49:10.813640 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-env-overrides\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.914620 master-0 kubenswrapper[4754]: I1203 21:49:10.914508 4754 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsr8k\" (UniqueName: \"kubernetes.io/projected/1a0f647a-0260-4737-8ae2-cc90d01d33d1-kube-api-access-lsr8k\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.915154 master-0 kubenswrapper[4754]: I1203 21:49:10.915108 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a0f647a-0260-4737-8ae2-cc90d01d33d1-webhook-cert\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.915154 master-0 kubenswrapper[4754]: I1203 21:49:10.915150 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-ovnkube-identity-cm\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.915471 master-0 kubenswrapper[4754]: I1203 21:49:10.915199 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-env-overrides\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.916156 master-0 kubenswrapper[4754]: I1203 21:49:10.916115 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-env-overrides\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.916410 master-0 kubenswrapper[4754]: I1203 21:49:10.916352 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-ovnkube-identity-cm\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.926028 master-0 kubenswrapper[4754]: I1203 21:49:10.925973 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a0f647a-0260-4737-8ae2-cc90d01d33d1-webhook-cert\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.937457 master-0 kubenswrapper[4754]: I1203 21:49:10.937318 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsr8k\" (UniqueName: \"kubernetes.io/projected/1a0f647a-0260-4737-8ae2-cc90d01d33d1-kube-api-access-lsr8k\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:10.999822 master-0 kubenswrapper[4754]: I1203 21:49:10.999620 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:11.104749 master-0 kubenswrapper[4754]: I1203 21:49:11.104667 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r24k4" event={"ID":"1a0f647a-0260-4737-8ae2-cc90d01d33d1","Type":"ContainerStarted","Data":"dfe68018d93ee1e8ecdb8139e17ee4db21f7dabb8af64d6ea19c76db0b09dab1"} Dec 03 21:49:11.686019 master-0 kubenswrapper[4754]: I1203 21:49:11.685836 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:11.686019 master-0 kubenswrapper[4754]: I1203 21:49:11.685910 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:11.687094 master-0 kubenswrapper[4754]: E1203 21:49:11.686021 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:11.687094 master-0 kubenswrapper[4754]: E1203 21:49:11.686946 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:11.829743 master-0 kubenswrapper[4754]: I1203 21:49:11.829677 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:11.829971 master-0 kubenswrapper[4754]: E1203 21:49:11.829856 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 21:49:11.829971 master-0 kubenswrapper[4754]: E1203 21:49:11.829874 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 21:49:11.829971 master-0 kubenswrapper[4754]: E1203 21:49:11.829887 4754 projected.go:194] Error preparing data for projected volume kube-api-access-s68fd for pod openshift-network-diagnostics/network-check-target-78hts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:11.829971 master-0 kubenswrapper[4754]: E1203 21:49:11.829929 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd podName:e6d5d61a-c5de-4619-9afb-7fad63ba0525 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:15.82991605 +0000 UTC m=+119.583013665 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s68fd" (UniqueName: "kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd") pod "network-check-target-78hts" (UID: "e6d5d61a-c5de-4619-9afb-7fad63ba0525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:13.685997 master-0 kubenswrapper[4754]: I1203 21:49:13.685918 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:13.686495 master-0 kubenswrapper[4754]: I1203 21:49:13.686008 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:13.686495 master-0 kubenswrapper[4754]: E1203 21:49:13.686088 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:13.686495 master-0 kubenswrapper[4754]: E1203 21:49:13.686237 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:15.686100 master-0 kubenswrapper[4754]: I1203 21:49:15.685574 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:15.686100 master-0 kubenswrapper[4754]: I1203 21:49:15.685628 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:15.686100 master-0 kubenswrapper[4754]: E1203 21:49:15.685722 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:15.686100 master-0 kubenswrapper[4754]: E1203 21:49:15.685796 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:15.866397 master-0 kubenswrapper[4754]: I1203 21:49:15.866316 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:15.866686 master-0 kubenswrapper[4754]: E1203 21:49:15.866576 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 21:49:15.866686 master-0 kubenswrapper[4754]: E1203 21:49:15.866606 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 21:49:15.866686 master-0 kubenswrapper[4754]: E1203 21:49:15.866625 4754 projected.go:194] Error preparing data for projected volume kube-api-access-s68fd for pod openshift-network-diagnostics/network-check-target-78hts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:15.866931 master-0 kubenswrapper[4754]: E1203 21:49:15.866695 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd podName:e6d5d61a-c5de-4619-9afb-7fad63ba0525 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:23.866671102 +0000 UTC m=+127.619768747 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s68fd" (UniqueName: "kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd") pod "network-check-target-78hts" (UID: "e6d5d61a-c5de-4619-9afb-7fad63ba0525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:16.577235 master-0 kubenswrapper[4754]: E1203 21:49:16.577172 4754 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 21:49:16.667865 master-0 kubenswrapper[4754]: E1203 21:49:16.667792 4754 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 21:49:17.686055 master-0 kubenswrapper[4754]: I1203 21:49:17.685950 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:17.686928 master-0 kubenswrapper[4754]: I1203 21:49:17.685980 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:17.686928 master-0 kubenswrapper[4754]: E1203 21:49:17.686210 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:17.686928 master-0 kubenswrapper[4754]: E1203 21:49:17.686397 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:19.686150 master-0 kubenswrapper[4754]: I1203 21:49:19.686081 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:19.686820 master-0 kubenswrapper[4754]: I1203 21:49:19.686178 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:19.686820 master-0 kubenswrapper[4754]: E1203 21:49:19.686301 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:19.686820 master-0 kubenswrapper[4754]: E1203 21:49:19.686459 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:21.669368 master-0 kubenswrapper[4754]: E1203 21:49:21.669301 4754 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 21:49:21.685955 master-0 kubenswrapper[4754]: I1203 21:49:21.685888 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:21.686172 master-0 kubenswrapper[4754]: I1203 21:49:21.685888 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:21.686172 master-0 kubenswrapper[4754]: E1203 21:49:21.686056 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:21.686172 master-0 kubenswrapper[4754]: E1203 21:49:21.686113 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:23.154504 master-0 kubenswrapper[4754]: I1203 21:49:23.154061 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" event={"ID":"ffad8fc8-4378-44de-8864-dd2f666ade68","Type":"ContainerStarted","Data":"e085324cd58339aaa2ff69b985958fab513139997fb20fc1324a7c3c052fa89d"} Dec 03 21:49:23.162996 master-0 kubenswrapper[4754]: I1203 21:49:23.162848 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r24k4" event={"ID":"1a0f647a-0260-4737-8ae2-cc90d01d33d1","Type":"ContainerStarted","Data":"9a23336ef4cbe48c00b2e6685616a19fe24f57f37d3cf40c5d7d0d8dc909c159"} Dec 03 21:49:23.162996 master-0 kubenswrapper[4754]: I1203 21:49:23.162915 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r24k4" event={"ID":"1a0f647a-0260-4737-8ae2-cc90d01d33d1","Type":"ContainerStarted","Data":"1e76433f2b57fcfd163818daf2c9960e0847f9b4763dbc897c31721245d6295d"} Dec 03 21:49:23.173537 master-0 kubenswrapper[4754]: I1203 21:49:23.166186 4754 generic.go:334] "Generic (PLEG): container finished" podID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerID="f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d" exitCode=0 Dec 03 21:49:23.173537 master-0 kubenswrapper[4754]: I1203 21:49:23.166291 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d"} Dec 03 21:49:23.173537 master-0 kubenswrapper[4754]: I1203 21:49:23.168591 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" event={"ID":"29ac4a9d-1228-49c7-9051-338e7dc98a38","Type":"ContainerStarted","Data":"efd056e2ae598223549439d740b1aee6a580ddac5417acae41c47f9e3c130bb0"} Dec 03 21:49:23.209385 master-0 kubenswrapper[4754]: I1203 21:49:23.209303 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-r24k4" podStartSLOduration=1.388851952 podStartE2EDuration="13.209274579s" podCreationTimestamp="2025-12-03 21:49:10 +0000 UTC" firstStartedPulling="2025-12-03 21:49:11.017332019 +0000 UTC m=+114.770429634" lastFinishedPulling="2025-12-03 21:49:22.837754646 +0000 UTC m=+126.590852261" observedRunningTime="2025-12-03 21:49:23.208708212 +0000 UTC m=+126.961805847" watchObservedRunningTime="2025-12-03 21:49:23.209274579 +0000 UTC m=+126.962372194" Dec 03 21:49:23.224726 master-0 kubenswrapper[4754]: I1203 21:49:23.224640 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" podStartSLOduration=1.830624091 podStartE2EDuration="19.224610612s" podCreationTimestamp="2025-12-03 21:49:04 +0000 UTC" firstStartedPulling="2025-12-03 21:49:05.428010299 +0000 UTC m=+109.181107914" lastFinishedPulling="2025-12-03 21:49:22.82199681 +0000 UTC m=+126.575094435" observedRunningTime="2025-12-03 21:49:23.224345974 +0000 UTC m=+126.977443589" watchObservedRunningTime="2025-12-03 21:49:23.224610612 +0000 UTC m=+126.977708257" Dec 03 21:49:23.686519 master-0 kubenswrapper[4754]: I1203 21:49:23.685994 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:23.686654 master-0 kubenswrapper[4754]: I1203 21:49:23.686066 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:23.686654 master-0 kubenswrapper[4754]: E1203 21:49:23.686638 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:23.686896 master-0 kubenswrapper[4754]: E1203 21:49:23.686827 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:23.937394 master-0 kubenswrapper[4754]: I1203 21:49:23.937245 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:23.937634 master-0 kubenswrapper[4754]: E1203 21:49:23.937555 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 21:49:23.937634 master-0 kubenswrapper[4754]: E1203 21:49:23.937633 4754 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 21:49:23.937825 master-0 kubenswrapper[4754]: E1203 21:49:23.937664 4754 projected.go:194] Error preparing data for projected volume kube-api-access-s68fd for pod openshift-network-diagnostics/network-check-target-78hts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:23.937825 master-0 kubenswrapper[4754]: E1203 21:49:23.937765 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd podName:e6d5d61a-c5de-4619-9afb-7fad63ba0525 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:39.937731535 +0000 UTC m=+143.690829180 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s68fd" (UniqueName: "kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd") pod "network-check-target-78hts" (UID: "e6d5d61a-c5de-4619-9afb-7fad63ba0525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 21:49:24.178823 master-0 kubenswrapper[4754]: I1203 21:49:24.178685 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerStarted","Data":"b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b"} Dec 03 21:49:24.178823 master-0 kubenswrapper[4754]: I1203 21:49:24.178786 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerStarted","Data":"eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2"} Dec 03 21:49:24.178823 master-0 kubenswrapper[4754]: I1203 21:49:24.178801 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerStarted","Data":"70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524"} Dec 03 21:49:24.178823 master-0 kubenswrapper[4754]: I1203 21:49:24.178814 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerStarted","Data":"75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd"} Dec 03 21:49:24.178823 master-0 kubenswrapper[4754]: I1203 21:49:24.178829 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerStarted","Data":"f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235"} Dec 03 21:49:24.178823 master-0 kubenswrapper[4754]: I1203 21:49:24.178841 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerStarted","Data":"3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb"} Dec 03 21:49:24.183019 master-0 kubenswrapper[4754]: I1203 21:49:24.182936 4754 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="e085324cd58339aaa2ff69b985958fab513139997fb20fc1324a7c3c052fa89d" exitCode=0 Dec 03 21:49:24.183143 master-0 kubenswrapper[4754]: I1203 21:49:24.183045 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" event={"ID":"ffad8fc8-4378-44de-8864-dd2f666ade68","Type":"ContainerDied","Data":"e085324cd58339aaa2ff69b985958fab513139997fb20fc1324a7c3c052fa89d"} Dec 03 21:49:25.192310 master-0 kubenswrapper[4754]: I1203 21:49:25.192238 4754 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="fcf20cc5fbcb1ad49ee0d23a982171e11d3207eebc2515ce8bb2b5502de1eab0" exitCode=0 Dec 03 21:49:25.193306 master-0 kubenswrapper[4754]: I1203 21:49:25.192300 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" 
event={"ID":"ffad8fc8-4378-44de-8864-dd2f666ade68","Type":"ContainerDied","Data":"fcf20cc5fbcb1ad49ee0d23a982171e11d3207eebc2515ce8bb2b5502de1eab0"} Dec 03 21:49:25.350972 master-0 kubenswrapper[4754]: I1203 21:49:25.350923 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:25.351113 master-0 kubenswrapper[4754]: E1203 21:49:25.351088 4754 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:49:25.351160 master-0 kubenswrapper[4754]: E1203 21:49:25.351146 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:49:57.351131502 +0000 UTC m=+161.104229117 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 21:49:25.686031 master-0 kubenswrapper[4754]: I1203 21:49:25.685937 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:25.686275 master-0 kubenswrapper[4754]: I1203 21:49:25.686016 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:25.686275 master-0 kubenswrapper[4754]: E1203 21:49:25.686168 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:25.686486 master-0 kubenswrapper[4754]: E1203 21:49:25.686402 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:26.205029 master-0 kubenswrapper[4754]: I1203 21:49:26.204911 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" event={"ID":"ffad8fc8-4378-44de-8864-dd2f666ade68","Type":"ContainerStarted","Data":"2178fbc45b178ef712830b003837d6836a55f49d86bc96af56687fd1bb5bfbf0"} Dec 03 21:49:26.213407 master-0 kubenswrapper[4754]: I1203 21:49:26.213332 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerStarted","Data":"b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad"} Dec 03 21:49:26.235622 master-0 kubenswrapper[4754]: I1203 21:49:26.235508 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qz5vh" podStartSLOduration=4.530192117 podStartE2EDuration="34.235433092s" podCreationTimestamp="2025-12-03 21:48:52 +0000 UTC" firstStartedPulling="2025-12-03 21:48:52.989604339 +0000 UTC m=+96.742701984" lastFinishedPulling="2025-12-03 21:49:22.694845334 +0000 UTC m=+126.447942959" observedRunningTime="2025-12-03 21:49:26.234580607 +0000 UTC m=+129.987678312" watchObservedRunningTime="2025-12-03 21:49:26.235433092 +0000 UTC m=+129.988530737" Dec 03 21:49:26.670599 master-0 kubenswrapper[4754]: E1203 21:49:26.670457 4754 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 21:49:27.686111 master-0 kubenswrapper[4754]: I1203 21:49:27.685993 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:27.687001 master-0 kubenswrapper[4754]: E1203 21:49:27.686282 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:27.687001 master-0 kubenswrapper[4754]: I1203 21:49:27.686388 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:27.687001 master-0 kubenswrapper[4754]: E1203 21:49:27.686634 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:29.238566 master-0 kubenswrapper[4754]: I1203 21:49:29.237631 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerStarted","Data":"40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a"} Dec 03 21:49:29.238566 master-0 kubenswrapper[4754]: I1203 21:49:29.238174 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:29.238566 master-0 kubenswrapper[4754]: I1203 21:49:29.238254 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:29.238566 master-0 kubenswrapper[4754]: I1203 21:49:29.238410 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:29.277050 master-0 kubenswrapper[4754]: I1203 21:49:29.276992 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:29.278508 master-0 kubenswrapper[4754]: I1203 21:49:29.278454 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:29.315036 master-0 kubenswrapper[4754]: I1203 21:49:29.314911 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podStartSLOduration=6.873527688 podStartE2EDuration="24.314862108s" podCreationTimestamp="2025-12-03 21:49:05 +0000 UTC" firstStartedPulling="2025-12-03 21:49:05.408182303 +0000 UTC m=+109.161279918" lastFinishedPulling="2025-12-03 21:49:22.849516723 +0000 UTC m=+126.602614338" observedRunningTime="2025-12-03 21:49:29.27565092 +0000 UTC m=+133.028748585" watchObservedRunningTime="2025-12-03 21:49:29.314862108 +0000 UTC m=+133.067959783" Dec 03 21:49:29.685423 master-0 kubenswrapper[4754]: I1203 21:49:29.685325 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:29.685423 master-0 kubenswrapper[4754]: I1203 21:49:29.685387 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:29.685846 master-0 kubenswrapper[4754]: E1203 21:49:29.685608 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:29.685960 master-0 kubenswrapper[4754]: E1203 21:49:29.685835 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:31.257436 master-0 kubenswrapper[4754]: I1203 21:49:31.256938 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-78hts"] Dec 03 21:49:31.257436 master-0 kubenswrapper[4754]: I1203 21:49:31.257437 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:31.258564 master-0 kubenswrapper[4754]: E1203 21:49:31.257533 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:31.260972 master-0 kubenswrapper[4754]: I1203 21:49:31.260891 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h6569"] Dec 03 21:49:31.261153 master-0 kubenswrapper[4754]: I1203 21:49:31.261060 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:31.261257 master-0 kubenswrapper[4754]: E1203 21:49:31.261204 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:31.672523 master-0 kubenswrapper[4754]: E1203 21:49:31.672399 4754 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 21:49:32.685832 master-0 kubenswrapper[4754]: I1203 21:49:32.685720 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:32.686807 master-0 kubenswrapper[4754]: I1203 21:49:32.685846 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:32.686807 master-0 kubenswrapper[4754]: E1203 21:49:32.686027 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:32.686807 master-0 kubenswrapper[4754]: E1203 21:49:32.686187 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:33.189529 master-0 kubenswrapper[4754]: I1203 21:49:33.189445 4754 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-td6fl"] Dec 03 21:49:33.190165 master-0 kubenswrapper[4754]: I1203 21:49:33.190054 4754 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovn-controller" containerID="cri-o://3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb" gracePeriod=30 Dec 03 21:49:33.194792 master-0 kubenswrapper[4754]: I1203 21:49:33.194702 4754 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="sbdb" containerID="cri-o://b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad" gracePeriod=30 Dec 03 21:49:33.194943 master-0 kubenswrapper[4754]: I1203 21:49:33.194823 4754 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="nbdb" containerID="cri-o://b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b" gracePeriod=30 Dec 03 21:49:33.194943 master-0 kubenswrapper[4754]: I1203 21:49:33.194918 4754 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="northd" containerID="cri-o://eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2" gracePeriod=30 Dec 03 21:49:33.195025 master-0 kubenswrapper[4754]: I1203 21:49:33.194976 4754 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524" gracePeriod=30 Dec 03 21:49:33.195137 master-0 kubenswrapper[4754]: I1203 21:49:33.195033 4754 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kube-rbac-proxy-node" containerID="cri-o://75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd" gracePeriod=30 Dec 03 21:49:33.195137 master-0 kubenswrapper[4754]: I1203 21:49:33.195089 4754 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovn-acl-logging" containerID="cri-o://f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235" gracePeriod=30 Dec 03 21:49:33.231003 master-0 kubenswrapper[4754]: I1203 21:49:33.230796 4754 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovnkube-controller" containerID="cri-o://40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a" gracePeriod=30 Dec 03 21:49:33.231375 master-0 kubenswrapper[4754]: I1203 21:49:33.231060 4754 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" 
containerName="ovnkube-controller" probeResult="failure" output="" Dec 03 21:49:34.266356 master-0 kubenswrapper[4754]: I1203 21:49:34.266234 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/kube-rbac-proxy-ovn-metrics/0.log" Dec 03 21:49:34.267767 master-0 kubenswrapper[4754]: I1203 21:49:34.267008 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/kube-rbac-proxy-node/0.log" Dec 03 21:49:34.267767 master-0 kubenswrapper[4754]: I1203 21:49:34.267576 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/ovn-acl-logging/0.log" Dec 03 21:49:34.268354 master-0 kubenswrapper[4754]: I1203 21:49:34.268300 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/ovn-controller/0.log" Dec 03 21:49:34.268930 master-0 kubenswrapper[4754]: I1203 21:49:34.268870 4754 generic.go:334] "Generic (PLEG): container finished" podID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerID="b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad" exitCode=0 Dec 03 21:49:34.268930 master-0 kubenswrapper[4754]: I1203 21:49:34.268921 4754 generic.go:334] "Generic (PLEG): container finished" podID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerID="b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b" exitCode=0 Dec 03 21:49:34.269092 master-0 kubenswrapper[4754]: I1203 21:49:34.268946 4754 generic.go:334] "Generic (PLEG): container finished" podID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerID="eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2" exitCode=0 Dec 03 21:49:34.269092 master-0 kubenswrapper[4754]: I1203 21:49:34.268968 4754 generic.go:334] "Generic (PLEG): container finished" podID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerID="70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524" exitCode=143 Dec 03 21:49:34.269092 master-0 kubenswrapper[4754]: I1203 21:49:34.268986 4754 generic.go:334] "Generic (PLEG): container finished" podID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerID="75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd" exitCode=143 Dec 03 21:49:34.269092 master-0 kubenswrapper[4754]: I1203 21:49:34.269005 4754 generic.go:334] "Generic (PLEG): container finished" podID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerID="f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235" exitCode=143 Dec 03 21:49:34.269092 master-0 kubenswrapper[4754]: I1203 21:49:34.269023 4754 generic.go:334] "Generic (PLEG): container finished" podID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerID="3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb" exitCode=143 Dec 03 21:49:34.269092 master-0 kubenswrapper[4754]: I1203 21:49:34.269060 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad"} Dec 03 21:49:34.269363 master-0 kubenswrapper[4754]: I1203 21:49:34.269109 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" 
event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b"} Dec 03 21:49:34.269363 master-0 kubenswrapper[4754]: I1203 21:49:34.269135 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2"} Dec 03 21:49:34.269363 master-0 kubenswrapper[4754]: I1203 21:49:34.269155 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524"} Dec 03 21:49:34.269363 master-0 kubenswrapper[4754]: I1203 21:49:34.269173 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd"} Dec 03 21:49:34.269363 master-0 kubenswrapper[4754]: I1203 21:49:34.269192 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235"} Dec 03 21:49:34.269363 master-0 kubenswrapper[4754]: I1203 21:49:34.269211 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb"} Dec 03 21:49:34.686097 master-0 kubenswrapper[4754]: I1203 21:49:34.685989 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:34.686365 master-0 kubenswrapper[4754]: I1203 21:49:34.685994 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:34.686476 master-0 kubenswrapper[4754]: E1203 21:49:34.686255 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h6569" podUID="812401c0-d1ac-4857-b939-217b7b07f8bc" Dec 03 21:49:34.686545 master-0 kubenswrapper[4754]: E1203 21:49:34.686411 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:35.000911 master-0 kubenswrapper[4754]: I1203 21:49:35.000844 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/ovnkube-controller/0.log" Dec 03 21:49:35.003598 master-0 kubenswrapper[4754]: I1203 21:49:35.003554 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/kube-rbac-proxy-ovn-metrics/0.log" Dec 03 21:49:35.004255 master-0 kubenswrapper[4754]: I1203 21:49:35.004218 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/kube-rbac-proxy-node/0.log" Dec 03 21:49:35.005089 master-0 kubenswrapper[4754]: I1203 21:49:35.005045 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/ovn-acl-logging/0.log" Dec 03 21:49:35.006299 master-0 kubenswrapper[4754]: I1203 21:49:35.006250 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/ovn-controller/0.log" Dec 03 21:49:35.007337 master-0 kubenswrapper[4754]: I1203 21:49:35.007287 4754 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:35.073480 master-0 kubenswrapper[4754]: I1203 21:49:35.073363 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-k2j45"] Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: E1203 21:49:35.073519 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="northd" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073536 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="northd" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: E1203 21:49:35.073544 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="sbdb" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073550 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="sbdb" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: E1203 21:49:35.073559 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kubecfg-setup" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073565 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kubecfg-setup" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: E1203 21:49:35.073572 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kube-rbac-proxy-node" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073578 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kube-rbac-proxy-node" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: E1203 21:49:35.073585 4754 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovn-controller" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073592 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovn-controller" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: E1203 21:49:35.073598 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovn-acl-logging" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073604 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovn-acl-logging" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: E1203 21:49:35.073613 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="nbdb" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073619 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="nbdb" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: E1203 21:49:35.073625 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073632 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: E1203 21:49:35.073638 4754 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovnkube-controller" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073644 4754 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovnkube-controller" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073680 4754 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="sbdb" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073691 4754 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovn-controller" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073698 4754 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="northd" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073706 4754 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="nbdb" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073712 4754 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kube-rbac-proxy-node" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073718 4754 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovn-acl-logging" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073724 4754 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 21:49:35.073886 master-0 kubenswrapper[4754]: I1203 21:49:35.073731 4754 
memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerName="ovnkube-controller" Dec 03 21:49:35.076021 master-0 kubenswrapper[4754]: I1203 21:49:35.074544 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.150677 master-0 kubenswrapper[4754]: I1203 21:49:35.150570 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lb89\" (UniqueName: \"kubernetes.io/projected/86c2417b-23bb-4eb8-abc5-4fc5beab2873-kube-api-access-4lb89\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.150677 master-0 kubenswrapper[4754]: I1203 21:49:35.150633 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-netd\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150735 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-ovn\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150763 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-slash\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150814 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-ovn-kubernetes\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150837 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-openvswitch\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150873 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-env-overrides\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150896 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-systemd-units\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150922 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-var-lib-openvswitch\") 
pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150942 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-etc-openvswitch\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150958 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-log-socket\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.150993 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-kubelet\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.151012 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-var-lib-cni-networks-ovn-kubernetes\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.151041 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-script-lib\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.151062 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-config\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.151081 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-netns\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.151061 master-0 kubenswrapper[4754]: I1203 21:49:35.151105 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovn-node-metrics-cert\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.152252 master-0 kubenswrapper[4754]: I1203 21:49:35.151127 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-systemd\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.152252 master-0 kubenswrapper[4754]: I1203 21:49:35.151184 4754 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-bin\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.152252 master-0 kubenswrapper[4754]: I1203 21:49:35.151205 4754 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-node-log\") pod \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\" (UID: \"86c2417b-23bb-4eb8-abc5-4fc5beab2873\") " Dec 03 21:49:35.152252 master-0 kubenswrapper[4754]: I1203 21:49:35.151817 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-node-log" (OuterVolumeSpecName: "node-log") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.152912 master-0 kubenswrapper[4754]: I1203 21:49:35.152858 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.152912 master-0 kubenswrapper[4754]: I1203 21:49:35.152906 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53713eab-c920-4d5a-ae05-7cdb59ace852-ovn-node-metrics-cert\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153124 master-0 kubenswrapper[4754]: I1203 21:49:35.152938 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153124 master-0 kubenswrapper[4754]: I1203 21:49:35.152964 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-kubelet\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153124 master-0 kubenswrapper[4754]: I1203 21:49:35.152990 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-netns\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153124 master-0 kubenswrapper[4754]: I1203 21:49:35.153014 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-systemd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153124 master-0 kubenswrapper[4754]: I1203 21:49:35.153038 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-etc-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153124 master-0 kubenswrapper[4754]: I1203 21:49:35.153061 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-ovn\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153124 master-0 kubenswrapper[4754]: I1203 21:49:35.153082 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-env-overrides\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153124 master-0 kubenswrapper[4754]: I1203 21:49:35.153109 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-slash\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153124 master-0 kubenswrapper[4754]: I1203 21:49:35.153131 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-config\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153157 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvkz7\" (UniqueName: \"kubernetes.io/projected/53713eab-c920-4d5a-ae05-7cdb59ace852-kube-api-access-nvkz7\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153185 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-var-lib-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153209 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-log-socket\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153231 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-bin\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153251 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-node-log\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153292 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-systemd-units\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153340 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153361 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-netd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153383 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-script-lib\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153430 4754 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-node-log\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153648 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153676 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153700 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153723 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-slash" (OuterVolumeSpecName: "host-slash") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.153754 master-0 kubenswrapper[4754]: I1203 21:49:35.153745 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.155147 master-0 kubenswrapper[4754]: I1203 21:49:35.153961 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.154075 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.154302 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.154386 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-log-socket" (OuterVolumeSpecName: "log-socket") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.154378 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.154444 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.154498 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.156003 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.156387 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.156519 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:49:35.157581 master-0 kubenswrapper[4754]: I1203 21:49:35.157019 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:49:35.160741 master-0 kubenswrapper[4754]: I1203 21:49:35.160020 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c2417b-23bb-4eb8-abc5-4fc5beab2873-kube-api-access-4lb89" (OuterVolumeSpecName: "kube-api-access-4lb89") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "kube-api-access-4lb89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:49:35.160741 master-0 kubenswrapper[4754]: I1203 21:49:35.160620 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:49:35.161612 master-0 kubenswrapper[4754]: I1203 21:49:35.160646 4754 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "86c2417b-23bb-4eb8-abc5-4fc5beab2873" (UID: "86c2417b-23bb-4eb8-abc5-4fc5beab2873"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:49:35.254618 master-0 kubenswrapper[4754]: I1203 21:49:35.254435 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-systemd-units\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.254618 master-0 kubenswrapper[4754]: I1203 21:49:35.254538 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.254618 master-0 kubenswrapper[4754]: I1203 21:49:35.254574 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-netd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.254618 master-0 kubenswrapper[4754]: I1203 21:49:35.254607 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-script-lib\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254658 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 
21:49:35.254698 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53713eab-c920-4d5a-ae05-7cdb59ace852-ovn-node-metrics-cert\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254730 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254742 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-systemd-units\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254864 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-kubelet\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254764 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-kubelet\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254904 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254944 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-systemd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254904 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254985 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 
03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254989 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-netns\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.254953 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-netd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255087 master-0 kubenswrapper[4754]: I1203 21:49:35.255011 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-systemd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255118 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-etc-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255154 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-ovn\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255243 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-env-overrides\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255248 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-netns\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255351 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-etc-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255410 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-ovn\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 
21:49:35.255547 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-slash\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255606 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-config\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255658 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-var-lib-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255696 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-log-socket\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255732 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvkz7\" (UniqueName: \"kubernetes.io/projected/53713eab-c920-4d5a-ae05-7cdb59ace852-kube-api-access-nvkz7\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.255817 master-0 kubenswrapper[4754]: I1203 21:49:35.255817 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-bin\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.255856 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-node-log\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.255911 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-slash\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.255963 4754 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.255972 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-node-log\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.255986 4754 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovnkube-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256023 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-log-socket\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256030 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-env-overrides\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256022 4754 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-netns\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256056 4754 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86c2417b-23bb-4eb8-abc5-4fc5beab2873-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256072 4754 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-systemd\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256082 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-var-lib-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256085 4754 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-bin\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256107 4754 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lb89\" (UniqueName: \"kubernetes.io/projected/86c2417b-23bb-4eb8-abc5-4fc5beab2873-kube-api-access-4lb89\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256119 4754 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-cni-netd\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256132 4754 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-ovn\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256143 4754 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-slash\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256155 4754 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256168 4754 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-run-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256179 4754 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-systemd-units\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256193 4754 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86c2417b-23bb-4eb8-abc5-4fc5beab2873-env-overrides\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256209 4754 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256225 4754 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-etc-openvswitch\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.256483 master-0 kubenswrapper[4754]: I1203 21:49:35.256235 4754 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-log-socket\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.257827 master-0 kubenswrapper[4754]: I1203 21:49:35.256247 4754 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-kubelet\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.257827 master-0 kubenswrapper[4754]: I1203 21:49:35.256258 4754 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86c2417b-23bb-4eb8-abc5-4fc5beab2873-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Dec 03 21:49:35.257827 master-0 kubenswrapper[4754]: I1203 21:49:35.256106 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-bin\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.257827 master-0 kubenswrapper[4754]: I1203 21:49:35.256434 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-script-lib\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.257827 master-0 kubenswrapper[4754]: I1203 21:49:35.257201 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-config\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.258446 master-0 kubenswrapper[4754]: I1203 21:49:35.258396 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53713eab-c920-4d5a-ae05-7cdb59ace852-ovn-node-metrics-cert\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.275886 master-0 kubenswrapper[4754]: I1203 21:49:35.275161 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/ovnkube-controller/0.log" Dec 03 21:49:35.277123 master-0 kubenswrapper[4754]: I1203 21:49:35.277094 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/kube-rbac-proxy-ovn-metrics/0.log" Dec 03 21:49:35.278461 master-0 kubenswrapper[4754]: I1203 21:49:35.278409 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/kube-rbac-proxy-node/0.log" Dec 03 21:49:35.279104 master-0 kubenswrapper[4754]: I1203 21:49:35.279085 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/ovn-acl-logging/0.log" Dec 03 21:49:35.279661 master-0 kubenswrapper[4754]: I1203 21:49:35.279639 4754 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-td6fl_86c2417b-23bb-4eb8-abc5-4fc5beab2873/ovn-controller/0.log" Dec 03 21:49:35.280118 master-0 kubenswrapper[4754]: I1203 21:49:35.280081 4754 generic.go:334] "Generic (PLEG): container finished" podID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" containerID="40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a" exitCode=1 Dec 03 21:49:35.280193 master-0 kubenswrapper[4754]: I1203 21:49:35.280138 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a"} Dec 03 21:49:35.280244 master-0 kubenswrapper[4754]: I1203 21:49:35.280195 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" event={"ID":"86c2417b-23bb-4eb8-abc5-4fc5beab2873","Type":"ContainerDied","Data":"160c955e6bc6fbc37a201275978850561279b3beae60311f7da8181d0840a469"} Dec 03 21:49:35.280244 master-0 kubenswrapper[4754]: I1203 21:49:35.280224 4754 scope.go:117] "RemoveContainer" containerID="40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a" Dec 03 21:49:35.280417 master-0 kubenswrapper[4754]: I1203 21:49:35.280378 4754 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-td6fl" Dec 03 21:49:35.284450 master-0 kubenswrapper[4754]: I1203 21:49:35.284200 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvkz7\" (UniqueName: \"kubernetes.io/projected/53713eab-c920-4d5a-ae05-7cdb59ace852-kube-api-access-nvkz7\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.306475 master-0 kubenswrapper[4754]: I1203 21:49:35.306422 4754 scope.go:117] "RemoveContainer" containerID="b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad" Dec 03 21:49:35.319766 master-0 kubenswrapper[4754]: I1203 21:49:35.319688 4754 scope.go:117] "RemoveContainer" containerID="b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b" Dec 03 21:49:35.331083 master-0 kubenswrapper[4754]: I1203 21:49:35.331033 4754 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-td6fl"] Dec 03 21:49:35.340427 master-0 kubenswrapper[4754]: I1203 21:49:35.340361 4754 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-td6fl"] Dec 03 21:49:35.341188 master-0 kubenswrapper[4754]: I1203 21:49:35.341136 4754 scope.go:117] "RemoveContainer" containerID="eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2" Dec 03 21:49:35.351437 master-0 kubenswrapper[4754]: I1203 21:49:35.351381 4754 scope.go:117] "RemoveContainer" containerID="70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524" Dec 03 21:49:35.365040 master-0 kubenswrapper[4754]: I1203 21:49:35.364986 4754 scope.go:117] "RemoveContainer" containerID="75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd" Dec 03 21:49:35.374909 master-0 kubenswrapper[4754]: I1203 21:49:35.374869 4754 scope.go:117] "RemoveContainer" containerID="f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235" Dec 03 21:49:35.384799 master-0 kubenswrapper[4754]: I1203 21:49:35.384731 4754 scope.go:117] "RemoveContainer" containerID="3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb" Dec 03 21:49:35.391669 master-0 kubenswrapper[4754]: I1203 21:49:35.391614 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:35.405630 master-0 kubenswrapper[4754]: I1203 21:49:35.405590 4754 scope.go:117] "RemoveContainer" containerID="f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d" Dec 03 21:49:35.413389 master-0 kubenswrapper[4754]: W1203 21:49:35.413330 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53713eab_c920_4d5a_ae05_7cdb59ace852.slice/crio-512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9 WatchSource:0}: Error finding container 512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9: Status 404 returned error can't find the container with id 512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9 Dec 03 21:49:35.418013 master-0 kubenswrapper[4754]: I1203 21:49:35.417975 4754 scope.go:117] "RemoveContainer" containerID="40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a" Dec 03 21:49:35.418703 master-0 kubenswrapper[4754]: E1203 21:49:35.418653 4754 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a\": container with ID starting with 40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a not found: ID does not exist" containerID="40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a" Dec 03 21:49:35.418831 master-0 kubenswrapper[4754]: I1203 21:49:35.418712 4754 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a"} err="failed to get container status \"40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a\": rpc error: code = NotFound desc = could not find container \"40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a\": container with ID starting with 40d80a4d39d69a7894e9a8f08b347f0593d992840c2f774e604f256d65907f9a not found: ID does not exist" Dec 03 21:49:35.418890 master-0 kubenswrapper[4754]: I1203 21:49:35.418831 4754 scope.go:117] "RemoveContainer" containerID="b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad" Dec 03 21:49:35.419574 master-0 kubenswrapper[4754]: E1203 21:49:35.419502 4754 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad\": container with ID starting with b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad not found: ID does not exist" containerID="b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad" Dec 03 21:49:35.419658 master-0 kubenswrapper[4754]: I1203 21:49:35.419596 4754 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad"} err="failed to get container status \"b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad\": rpc error: code = NotFound desc = could not find container \"b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad\": container with ID starting with b89ca2c2160e87b394eb9c25dd62014f96cec2c1ad7ca29fd05ba0a52cc2ffad not found: ID does not exist" Dec 03 21:49:35.419703 master-0 kubenswrapper[4754]: I1203 21:49:35.419668 4754 scope.go:117] "RemoveContainer" containerID="b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b" 
Dec 03 21:49:35.420163 master-0 kubenswrapper[4754]: E1203 21:49:35.420120 4754 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b\": container with ID starting with b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b not found: ID does not exist" containerID="b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b" Dec 03 21:49:35.420223 master-0 kubenswrapper[4754]: I1203 21:49:35.420167 4754 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b"} err="failed to get container status \"b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b\": rpc error: code = NotFound desc = could not find container \"b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b\": container with ID starting with b7fb877b9dc398aaf5e3663b06fa37219c71f4c678b6e6b1b1600c57c369295b not found: ID does not exist" Dec 03 21:49:35.420223 master-0 kubenswrapper[4754]: I1203 21:49:35.420198 4754 scope.go:117] "RemoveContainer" containerID="eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2" Dec 03 21:49:35.420632 master-0 kubenswrapper[4754]: E1203 21:49:35.420556 4754 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2\": container with ID starting with eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2 not found: ID does not exist" containerID="eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2" Dec 03 21:49:35.420678 master-0 kubenswrapper[4754]: I1203 21:49:35.420638 4754 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2"} err="failed to get container status \"eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2\": rpc error: code = NotFound desc = could not find container \"eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2\": container with ID starting with eb708f9016d1ba8f60b83765e97eee92e1cb840f8df75e6d3f23319e66d84cc2 not found: ID does not exist" Dec 03 21:49:35.420721 master-0 kubenswrapper[4754]: I1203 21:49:35.420687 4754 scope.go:117] "RemoveContainer" containerID="70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524" Dec 03 21:49:35.421351 master-0 kubenswrapper[4754]: E1203 21:49:35.421267 4754 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524\": container with ID starting with 70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524 not found: ID does not exist" containerID="70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524" Dec 03 21:49:35.421402 master-0 kubenswrapper[4754]: I1203 21:49:35.421364 4754 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524"} err="failed to get container status \"70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524\": rpc error: code = NotFound desc = could not find container \"70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524\": container with ID starting with 
70d7c61bcb5a1ad6c179de9b63e908b6d03efe916ae622b39caa33b176f4f524 not found: ID does not exist" Dec 03 21:49:35.421439 master-0 kubenswrapper[4754]: I1203 21:49:35.421412 4754 scope.go:117] "RemoveContainer" containerID="75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd" Dec 03 21:49:35.423538 master-0 kubenswrapper[4754]: E1203 21:49:35.423486 4754 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd\": container with ID starting with 75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd not found: ID does not exist" containerID="75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd" Dec 03 21:49:35.424479 master-0 kubenswrapper[4754]: I1203 21:49:35.424417 4754 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd"} err="failed to get container status \"75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd\": rpc error: code = NotFound desc = could not find container \"75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd\": container with ID starting with 75c140eaf6a210def049105402f21d0fd36f6f8ffb40d7ea39e6fbbf641247cd not found: ID does not exist" Dec 03 21:49:35.424614 master-0 kubenswrapper[4754]: I1203 21:49:35.424578 4754 scope.go:117] "RemoveContainer" containerID="f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235" Dec 03 21:49:35.425110 master-0 kubenswrapper[4754]: E1203 21:49:35.425081 4754 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235\": container with ID starting with f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235 not found: ID does not exist" containerID="f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235" Dec 03 21:49:35.425227 master-0 kubenswrapper[4754]: I1203 21:49:35.425198 4754 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235"} err="failed to get container status \"f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235\": rpc error: code = NotFound desc = could not find container \"f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235\": container with ID starting with f87c6fad2b7808c40a8d22d64c761805de34723fbaf83ed877472033e00e8235 not found: ID does not exist" Dec 03 21:49:35.425298 master-0 kubenswrapper[4754]: I1203 21:49:35.425287 4754 scope.go:117] "RemoveContainer" containerID="3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb" Dec 03 21:49:35.425903 master-0 kubenswrapper[4754]: E1203 21:49:35.425883 4754 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb\": container with ID starting with 3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb not found: ID does not exist" containerID="3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb" Dec 03 21:49:35.426006 master-0 kubenswrapper[4754]: I1203 21:49:35.425985 4754 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb"} 
err="failed to get container status \"3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb\": rpc error: code = NotFound desc = could not find container \"3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb\": container with ID starting with 3f33183d242c82c968225b751537da09870d3015639fd073f2a18f2c444fa0fb not found: ID does not exist" Dec 03 21:49:35.426090 master-0 kubenswrapper[4754]: I1203 21:49:35.426077 4754 scope.go:117] "RemoveContainer" containerID="f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d" Dec 03 21:49:35.426659 master-0 kubenswrapper[4754]: E1203 21:49:35.426601 4754 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d\": container with ID starting with f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d not found: ID does not exist" containerID="f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d" Dec 03 21:49:35.426720 master-0 kubenswrapper[4754]: I1203 21:49:35.426679 4754 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d"} err="failed to get container status \"f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d\": rpc error: code = NotFound desc = could not find container \"f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d\": container with ID starting with f0ce8acb254ea015dd32837ae9de654985749e13fc2a4adac48ef75a8baa7f6d not found: ID does not exist" Dec 03 21:49:36.286664 master-0 kubenswrapper[4754]: I1203 21:49:36.285348 4754 generic.go:334] "Generic (PLEG): container finished" podID="53713eab-c920-4d5a-ae05-7cdb59ace852" containerID="e2f2192c61ed1d621d2aff90353ee42f69de43d1e563bba5ffc2fd9223a2ba8a" exitCode=0 Dec 03 21:49:36.286664 master-0 kubenswrapper[4754]: I1203 21:49:36.285470 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerDied","Data":"e2f2192c61ed1d621d2aff90353ee42f69de43d1e563bba5ffc2fd9223a2ba8a"} Dec 03 21:49:36.286664 master-0 kubenswrapper[4754]: I1203 21:49:36.285545 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerStarted","Data":"512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9"} Dec 03 21:49:36.686829 master-0 kubenswrapper[4754]: I1203 21:49:36.686747 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:36.687464 master-0 kubenswrapper[4754]: I1203 21:49:36.687107 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:36.689155 master-0 kubenswrapper[4754]: I1203 21:49:36.689098 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 21:49:36.690315 master-0 kubenswrapper[4754]: I1203 21:49:36.690014 4754 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c2417b-23bb-4eb8-abc5-4fc5beab2873" path="/var/lib/kubelet/pods/86c2417b-23bb-4eb8-abc5-4fc5beab2873/volumes" Dec 03 21:49:36.690315 master-0 kubenswrapper[4754]: I1203 21:49:36.690041 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 21:49:36.690315 master-0 kubenswrapper[4754]: I1203 21:49:36.690216 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 21:49:37.296333 master-0 kubenswrapper[4754]: I1203 21:49:37.295947 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerStarted","Data":"79f9b02e40a0b2fdb0fc976f992c94d5f772500b193c01f6b87bc49625471dfa"} Dec 03 21:49:37.296333 master-0 kubenswrapper[4754]: I1203 21:49:37.296318 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerStarted","Data":"e1b0beb217e8e7d5512c8807607843523d3ba95c04cd5f39ae87c249e0aba4eb"} Dec 03 21:49:37.296333 master-0 kubenswrapper[4754]: I1203 21:49:37.296333 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerStarted","Data":"1115a34685de25fc1d04e05a256063a0e2268f63ab2bb423f49fd1e405298bb2"} Dec 03 21:49:37.296333 master-0 kubenswrapper[4754]: I1203 21:49:37.296346 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerStarted","Data":"6056e9b1d6031b79abfb90dd30c3dd6892881b1858040d34c02ab18e822ffbe1"} Dec 03 21:49:37.296333 master-0 kubenswrapper[4754]: I1203 21:49:37.296358 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerStarted","Data":"e4e951251517513c34974a277f46287e64a9141f8631368d73bb58ff6b2bba5a"} Dec 03 21:49:37.297648 master-0 kubenswrapper[4754]: I1203 21:49:37.296371 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerStarted","Data":"7f986d94211e10657c1c09a829b6b4b50f56e235482e2569c34bca84b39a6696"} Dec 03 21:49:40.001303 master-0 kubenswrapper[4754]: I1203 21:49:40.001239 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:40.011651 master-0 kubenswrapper[4754]: I1203 21:49:40.011562 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s68fd\" (UniqueName: 
\"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:40.022140 master-0 kubenswrapper[4754]: I1203 21:49:40.022062 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:40.225092 master-0 kubenswrapper[4754]: E1203 21:49:40.224997 4754 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:49:40.225092 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-target-78hts_openshift-network-diagnostics_e6d5d61a-c5de-4619-9afb-7fad63ba0525_0(d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d): error adding pod openshift-network-diagnostics_network-check-target-78hts to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d" Netns:"/var/run/netns/e33d226e-1c9f-49e9-aa6a-9eb7cf02a2f6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-target-78hts;K8S_POD_INFRA_CONTAINER_ID=d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d;K8S_POD_UID=e6d5d61a-c5de-4619-9afb-7fad63ba0525" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-target-78hts] networking: [openshift-network-diagnostics/network-check-target-78hts/e6d5d61a-c5de-4619-9afb-7fad63ba0525:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:40.225092 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:40.225092 master-0 kubenswrapper[4754]: > Dec 03 21:49:40.225639 master-0 kubenswrapper[4754]: E1203 21:49:40.225140 4754 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:49:40.225639 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-target-78hts_openshift-network-diagnostics_e6d5d61a-c5de-4619-9afb-7fad63ba0525_0(d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d): error adding pod openshift-network-diagnostics_network-check-target-78hts to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d" Netns:"/var/run/netns/e33d226e-1c9f-49e9-aa6a-9eb7cf02a2f6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-target-78hts;K8S_POD_INFRA_CONTAINER_ID=d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d;K8S_POD_UID=e6d5d61a-c5de-4619-9afb-7fad63ba0525" Path:"" ERRORED: error configuring pod 
[openshift-network-diagnostics/network-check-target-78hts] networking: [openshift-network-diagnostics/network-check-target-78hts/e6d5d61a-c5de-4619-9afb-7fad63ba0525:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:40.225639 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:40.225639 master-0 kubenswrapper[4754]: > pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:40.225639 master-0 kubenswrapper[4754]: E1203 21:49:40.225174 4754 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:49:40.225639 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_network-check-target-78hts_openshift-network-diagnostics_e6d5d61a-c5de-4619-9afb-7fad63ba0525_0(d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d): error adding pod openshift-network-diagnostics_network-check-target-78hts to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d" Netns:"/var/run/netns/e33d226e-1c9f-49e9-aa6a-9eb7cf02a2f6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-target-78hts;K8S_POD_INFRA_CONTAINER_ID=d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d;K8S_POD_UID=e6d5d61a-c5de-4619-9afb-7fad63ba0525" Path:"" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-target-78hts] networking: [openshift-network-diagnostics/network-check-target-78hts/e6d5d61a-c5de-4619-9afb-7fad63ba0525:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:40.225639 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:40.225639 master-0 kubenswrapper[4754]: > pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:40.225639 master-0 kubenswrapper[4754]: E1203 21:49:40.225314 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"network-check-target-78hts_openshift-network-diagnostics(e6d5d61a-c5de-4619-9afb-7fad63ba0525)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"network-check-target-78hts_openshift-network-diagnostics(e6d5d61a-c5de-4619-9afb-7fad63ba0525)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_network-check-target-78hts_openshift-network-diagnostics_e6d5d61a-c5de-4619-9afb-7fad63ba0525_0(d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d): error adding pod openshift-network-diagnostics_network-check-target-78hts to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d\\\" Netns:\\\"/var/run/netns/e33d226e-1c9f-49e9-aa6a-9eb7cf02a2f6\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-network-diagnostics;K8S_POD_NAME=network-check-target-78hts;K8S_POD_INFRA_CONTAINER_ID=d57ae083989935b3b792211726f4e73d5bb3550737b110538b06757241b99b5d;K8S_POD_UID=e6d5d61a-c5de-4619-9afb-7fad63ba0525\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-network-diagnostics/network-check-target-78hts] networking: [openshift-network-diagnostics/network-check-target-78hts/e6d5d61a-c5de-4619-9afb-7fad63ba0525:ovn-kubernetes]: error adding container to network \\\"ovn-kubernetes\\\": failed to send CNI request: Post \\\"http://dummy/\\\": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:49:40.314415 master-0 kubenswrapper[4754]: I1203 21:49:40.314236 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerStarted","Data":"8d1c6889c5a7ff895597adbef02ca16b0e738a892b347774d9d1fb1e7647ab52"} Dec 03 21:49:42.331885 master-0 kubenswrapper[4754]: I1203 21:49:42.331793 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" event={"ID":"53713eab-c920-4d5a-ae05-7cdb59ace852","Type":"ContainerStarted","Data":"20690b58de9a3df5d2bd22c35f4b0b70d5938b9b4d3aa5b109fca7995dbe3055"} Dec 03 21:49:42.333334 master-0 kubenswrapper[4754]: I1203 21:49:42.332579 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:42.333334 master-0 kubenswrapper[4754]: I1203 21:49:42.332629 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:42.333334 master-0 kubenswrapper[4754]: I1203 21:49:42.332647 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:42.363022 master-0 kubenswrapper[4754]: I1203 21:49:42.362948 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:42.366930 master-0 kubenswrapper[4754]: I1203 21:49:42.366584 4754 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:43.101667 master-0 
kubenswrapper[4754]: I1203 21:49:43.101516 4754 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Dec 03 21:49:44.324577 master-0 kubenswrapper[4754]: I1203 21:49:44.324446 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" podStartSLOduration=9.324415428 podStartE2EDuration="9.324415428s" podCreationTimestamp="2025-12-03 21:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:49:43.533307706 +0000 UTC m=+147.286405331" watchObservedRunningTime="2025-12-03 21:49:44.324415428 +0000 UTC m=+148.077513083" Dec 03 21:49:44.330718 master-0 kubenswrapper[4754]: I1203 21:49:44.330628 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt"] Dec 03 21:49:44.331271 master-0 kubenswrapper[4754]: I1203 21:49:44.331226 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.333072 master-0 kubenswrapper[4754]: I1203 21:49:44.332484 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm"] Dec 03 21:49:44.333834 master-0 kubenswrapper[4754]: I1203 21:49:44.333748 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b"] Dec 03 21:49:44.334116 master-0 kubenswrapper[4754]: I1203 21:49:44.334067 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 03 21:49:44.334253 master-0 kubenswrapper[4754]: I1203 21:49:44.334172 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj"] Dec 03 21:49:44.334482 master-0 kubenswrapper[4754]: I1203 21:49:44.334427 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.334482 master-0 kubenswrapper[4754]: I1203 21:49:44.334458 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:44.335089 master-0 kubenswrapper[4754]: I1203 21:49:44.334837 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr"] Dec 03 21:49:44.335089 master-0 kubenswrapper[4754]: I1203 21:49:44.334989 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.335409 master-0 kubenswrapper[4754]: I1203 21:49:44.335364 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c"] Dec 03 21:49:44.335956 master-0 kubenswrapper[4754]: I1203 21:49:44.335606 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:44.335956 master-0 kubenswrapper[4754]: I1203 21:49:44.335613 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:44.336198 master-0 kubenswrapper[4754]: I1203 21:49:44.335909 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh"] Dec 03 21:49:44.336664 master-0 kubenswrapper[4754]: I1203 21:49:44.336621 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.340874 master-0 kubenswrapper[4754]: I1203 21:49:44.340243 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 03 21:49:44.343224 master-0 kubenswrapper[4754]: I1203 21:49:44.343134 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.344328 master-0 kubenswrapper[4754]: I1203 21:49:44.344283 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 21:49:44.345616 master-0 kubenswrapper[4754]: I1203 21:49:44.345015 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d"] Dec 03 21:49:44.345616 master-0 kubenswrapper[4754]: I1203 21:49:44.345374 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5"] Dec 03 21:49:44.345920 master-0 kubenswrapper[4754]: I1203 21:49:44.345731 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:44.349239 master-0 kubenswrapper[4754]: I1203 21:49:44.346113 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:44.349239 master-0 kubenswrapper[4754]: I1203 21:49:44.347423 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 21:49:44.349239 master-0 kubenswrapper[4754]: I1203 21:49:44.347891 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.349239 master-0 kubenswrapper[4754]: I1203 21:49:44.348033 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 21:49:44.349239 master-0 kubenswrapper[4754]: I1203 21:49:44.348608 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 21:49:44.349239 master-0 kubenswrapper[4754]: I1203 21:49:44.348737 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 21:49:44.350759 master-0 kubenswrapper[4754]: I1203 21:49:44.349443 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j"] Dec 03 21:49:44.350759 master-0 kubenswrapper[4754]: I1203 21:49:44.349968 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:44.351405 master-0 kubenswrapper[4754]: W1203 21:49:44.351298 4754 reflector.go:561] object-"openshift-marketplace"/"marketplace-trusted-ca": failed to list *v1.ConfigMap: configmaps "marketplace-trusted-ca" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'master-0' and this object Dec 03 21:49:44.351405 master-0 kubenswrapper[4754]: E1203 21:49:44.351356 4754 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"marketplace-trusted-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"marketplace-trusted-ca\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Dec 03 21:49:44.351636 master-0 kubenswrapper[4754]: I1203 21:49:44.351521 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 21:49:44.351636 master-0 kubenswrapper[4754]: I1203 21:49:44.351591 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 21:49:44.352217 master-0 kubenswrapper[4754]: I1203 21:49:44.351881 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: W1203 21:49:44.353212 4754 reflector.go:561] object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-operator-lifecycle-manager": no relationship found between node 
'master-0' and this object Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: E1203 21:49:44.353256 4754 reflector.go:158] "Unhandled Error" err="object-\"openshift-operator-lifecycle-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-operator-lifecycle-manager\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: I1203 21:49:44.353330 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: I1203 21:49:44.354014 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: I1203 21:49:44.355021 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: I1203 21:49:44.355662 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh"] Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: I1203 21:49:44.356187 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: I1203 21:49:44.357213 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm"] Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: I1203 21:49:44.357624 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x"] Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: I1203 21:49:44.358054 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:44.359266 master-0 kubenswrapper[4754]: I1203 21:49:44.358159 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.364344 master-0 kubenswrapper[4754]: I1203 21:49:44.363861 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh"] Dec 03 21:49:44.364669 master-0 kubenswrapper[4754]: I1203 21:49:44.364629 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s"] Dec 03 21:49:44.365712 master-0 kubenswrapper[4754]: I1203 21:49:44.365304 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:44.374667 master-0 kubenswrapper[4754]: I1203 21:49:44.374469 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:44.379619 master-0 kubenswrapper[4754]: I1203 21:49:44.376793 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 21:49:44.379619 master-0 kubenswrapper[4754]: I1203 21:49:44.376997 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 21:49:44.379619 master-0 kubenswrapper[4754]: I1203 21:49:44.377088 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 21:49:44.379619 master-0 kubenswrapper[4754]: I1203 21:49:44.377211 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 21:49:44.379619 master-0 kubenswrapper[4754]: I1203 21:49:44.377647 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 21:49:44.379619 master-0 kubenswrapper[4754]: I1203 21:49:44.377922 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 21:49:44.379619 master-0 kubenswrapper[4754]: I1203 21:49:44.378112 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 21:49:44.379619 master-0 kubenswrapper[4754]: I1203 21:49:44.378174 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db"] Dec 03 21:49:44.379619 master-0 kubenswrapper[4754]: I1203 21:49:44.378733 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 03 21:49:44.380784 master-0 kubenswrapper[4754]: I1203 21:49:44.380484 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 21:49:44.388946 master-0 kubenswrapper[4754]: I1203 21:49:44.388917 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:44.391200 master-0 kubenswrapper[4754]: I1203 21:49:44.391178 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh"] Dec 03 21:49:44.394594 master-0 kubenswrapper[4754]: I1203 21:49:44.394574 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj"] Dec 03 21:49:44.399254 master-0 kubenswrapper[4754]: I1203 21:49:44.399212 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr"] Dec 03 21:49:44.399610 master-0 kubenswrapper[4754]: I1203 21:49:44.399594 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:44.400148 master-0 kubenswrapper[4754]: I1203 21:49:44.391361 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 21:49:44.400418 master-0 kubenswrapper[4754]: I1203 21:49:44.400404 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:44.400837 master-0 kubenswrapper[4754]: I1203 21:49:44.398344 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:44.401426 master-0 kubenswrapper[4754]: I1203 21:49:44.401220 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 21:49:44.401426 master-0 kubenswrapper[4754]: I1203 21:49:44.392486 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.401786 master-0 kubenswrapper[4754]: I1203 21:49:44.392536 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 21:49:44.401846 master-0 kubenswrapper[4754]: I1203 21:49:44.393273 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 21:49:44.401882 master-0 kubenswrapper[4754]: I1203 21:49:44.393540 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 21:49:44.401951 master-0 kubenswrapper[4754]: I1203 21:49:44.393578 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.402005 master-0 kubenswrapper[4754]: I1203 21:49:44.393608 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.402082 master-0 kubenswrapper[4754]: I1203 21:49:44.393639 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.402124 master-0 kubenswrapper[4754]: I1203 21:49:44.393673 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 21:49:44.402124 master-0 kubenswrapper[4754]: I1203 21:49:44.393711 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 21:49:44.402182 master-0 kubenswrapper[4754]: I1203 21:49:44.393743 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 21:49:44.402293 master-0 kubenswrapper[4754]: I1203 21:49:44.393826 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 03 21:49:44.402339 master-0 kubenswrapper[4754]: I1203 21:49:44.393859 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 21:49:44.402373 master-0 kubenswrapper[4754]: I1203 21:49:44.393889 4754 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 21:49:44.402436 master-0 kubenswrapper[4754]: I1203 21:49:44.393918 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 21:49:44.402477 master-0 kubenswrapper[4754]: I1203 21:49:44.393947 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 03 21:49:44.402561 master-0 kubenswrapper[4754]: I1203 21:49:44.393978 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 21:49:44.402605 master-0 kubenswrapper[4754]: I1203 21:49:44.394008 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 21:49:44.402683 master-0 kubenswrapper[4754]: I1203 21:49:44.394067 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 03 21:49:44.413787 master-0 kubenswrapper[4754]: I1203 21:49:44.394095 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 21:49:44.413882 master-0 kubenswrapper[4754]: I1203 21:49:44.394138 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 21:49:44.413977 master-0 kubenswrapper[4754]: I1203 21:49:44.394876 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 03 21:49:44.414059 master-0 kubenswrapper[4754]: I1203 21:49:44.394934 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 21:49:44.414130 master-0 kubenswrapper[4754]: I1203 21:49:44.395208 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 03 21:49:44.414206 master-0 kubenswrapper[4754]: I1203 21:49:44.395421 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 03 21:49:44.414283 master-0 kubenswrapper[4754]: I1203 21:49:44.396869 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 21:49:44.414365 master-0 kubenswrapper[4754]: I1203 21:49:44.396965 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 21:49:44.414454 master-0 kubenswrapper[4754]: I1203 21:49:44.397039 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 21:49:44.414541 master-0 kubenswrapper[4754]: I1203 21:49:44.397151 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.414632 master-0 kubenswrapper[4754]: I1203 21:49:44.397253 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.414738 master-0 kubenswrapper[4754]: I1203 21:49:44.397438 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 21:49:44.414848 master-0 kubenswrapper[4754]: I1203 
21:49:44.397817 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 21:49:44.414920 master-0 kubenswrapper[4754]: I1203 21:49:44.397915 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 21:49:44.415633 master-0 kubenswrapper[4754]: I1203 21:49:44.398059 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 21:49:44.415884 master-0 kubenswrapper[4754]: I1203 21:49:44.398690 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 21:49:44.416090 master-0 kubenswrapper[4754]: I1203 21:49:44.403378 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 21:49:44.416295 master-0 kubenswrapper[4754]: I1203 21:49:44.403417 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 21:49:44.416491 master-0 kubenswrapper[4754]: I1203 21:49:44.403433 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 21:49:44.416685 master-0 kubenswrapper[4754]: I1203 21:49:44.403451 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.416895 master-0 kubenswrapper[4754]: I1203 21:49:44.403619 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 21:49:44.416992 master-0 kubenswrapper[4754]: I1203 21:49:44.404135 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 21:49:44.417183 master-0 kubenswrapper[4754]: I1203 21:49:44.404165 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 03 21:49:44.417364 master-0 kubenswrapper[4754]: I1203 21:49:44.407973 4754 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 21:49:44.417478 master-0 kubenswrapper[4754]: I1203 21:49:44.410570 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 03 21:49:44.440680 master-0 kubenswrapper[4754]: I1203 21:49:44.440626 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-serving-cert\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.440680 master-0 kubenswrapper[4754]: I1203 21:49:44.440678 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:44.440855 master-0 kubenswrapper[4754]: I1203 
21:49:44.440709 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8da5d44-680e-4169-abc6-607bdc37a64d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:44.440855 master-0 kubenswrapper[4754]: I1203 21:49:44.440747 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08432be8-0086-48d2-a93d-7a474e96749d-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:44.440855 master-0 kubenswrapper[4754]: I1203 21:49:44.440780 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6976b503-87da-48fc-b097-d1b315fbee3f-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:44.440855 master-0 kubenswrapper[4754]: I1203 21:49:44.440808 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785612fc-3f78-4f1a-bc83-7afe5d3b8056-serving-cert\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.440855 master-0 kubenswrapper[4754]: I1203 21:49:44.440827 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:44.440855 master-0 kubenswrapper[4754]: I1203 21:49:44.440844 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-etcd-client\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.440855 master-0 kubenswrapper[4754]: I1203 21:49:44.440861 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.441092 master-0 kubenswrapper[4754]: I1203 21:49:44.440882 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50076985-bbaa-4bcf-9d1a-cc25bed016a7-config\") pod 
\"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:44.441092 master-0 kubenswrapper[4754]: I1203 21:49:44.440899 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:44.441092 master-0 kubenswrapper[4754]: I1203 21:49:44.440917 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:44.441092 master-0 kubenswrapper[4754]: I1203 21:49:44.440933 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:44.441092 master-0 kubenswrapper[4754]: I1203 21:49:44.440949 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f68d19-71d4-4129-a575-3ee57fa53493-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.441092 master-0 kubenswrapper[4754]: I1203 21:49:44.441024 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfbaebe-d655-4c1e-a039-08802c5c35c5-config\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:44.441342 master-0 kubenswrapper[4754]: I1203 21:49:44.441172 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfbaebe-d655-4c1e-a039-08802c5c35c5-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:44.441342 master-0 kubenswrapper[4754]: I1203 21:49:44.441232 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvw7p\" (UniqueName: \"kubernetes.io/projected/50076985-bbaa-4bcf-9d1a-cc25bed016a7-kube-api-access-jvw7p\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 
03 21:49:44.441342 master-0 kubenswrapper[4754]: I1203 21:49:44.441324 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.441456 master-0 kubenswrapper[4754]: I1203 21:49:44.441380 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59094ec-47dd-4547-ad41-b15a7933f461-config\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:44.441493 master-0 kubenswrapper[4754]: I1203 21:49:44.441463 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2krlg\" (UniqueName: \"kubernetes.io/projected/a4399d20-f9a6-4ab1-86be-e2845394eaba-kube-api-access-2krlg\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:44.441524 master-0 kubenswrapper[4754]: I1203 21:49:44.441493 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6976b503-87da-48fc-b097-d1b315fbee3f-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:44.441555 master-0 kubenswrapper[4754]: I1203 21:49:44.441544 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fsxc\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-kube-api-access-4fsxc\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.441592 master-0 kubenswrapper[4754]: I1203 21:49:44.441571 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kfg5\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-kube-api-access-2kfg5\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.441626 master-0 kubenswrapper[4754]: I1203 21:49:44.441592 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgddm\" (UniqueName: \"kubernetes.io/projected/04f5fc52-4ec2-48c3-8441-2b15ad632233-kube-api-access-tgddm\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:44.441660 master-0 kubenswrapper[4754]: I1203 21:49:44.441632 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6m8f\" (UniqueName: 
\"kubernetes.io/projected/785612fc-3f78-4f1a-bc83-7afe5d3b8056-kube-api-access-j6m8f\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.441699 master-0 kubenswrapper[4754]: I1203 21:49:44.441684 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.441728 master-0 kubenswrapper[4754]: I1203 21:49:44.441706 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bebd69d2-5b0f-4b66-8722-d6861eba3e12-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:44.441759 master-0 kubenswrapper[4754]: I1203 21:49:44.441728 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzklx\" (UniqueName: \"kubernetes.io/projected/f59094ec-47dd-4547-ad41-b15a7933f461-kube-api-access-mzklx\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:44.441759 master-0 kubenswrapper[4754]: I1203 21:49:44.441750 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/c8da5d44-680e-4169-abc6-607bdc37a64d-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:44.441759 master-0 kubenswrapper[4754]: I1203 21:49:44.441820 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm2l8\" (UniqueName: \"kubernetes.io/projected/c8da5d44-680e-4169-abc6-607bdc37a64d-kube-api-access-pm2l8\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:44.441978 master-0 kubenswrapper[4754]: I1203 21:49:44.441851 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f59094ec-47dd-4547-ad41-b15a7933f461-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:44.441978 master-0 kubenswrapper[4754]: I1203 21:49:44.441875 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08432be8-0086-48d2-a93d-7a474e96749d-config\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 
21:49:44.441978 master-0 kubenswrapper[4754]: I1203 21:49:44.441913 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0869de9b-6f5b-4c31-81ad-02a9c8888193-trusted-ca\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.441978 master-0 kubenswrapper[4754]: I1203 21:49:44.441941 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7qqf\" (UniqueName: \"kubernetes.io/projected/bebd69d2-5b0f-4b66-8722-d6861eba3e12-kube-api-access-n7qqf\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:44.441978 master-0 kubenswrapper[4754]: I1203 21:49:44.441958 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4clxk\" (UniqueName: \"kubernetes.io/projected/82055cfc-b4ce-4a00-a51d-141059947693-kube-api-access-4clxk\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.441978 master-0 kubenswrapper[4754]: I1203 21:49:44.441977 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4c4r\" (UniqueName: \"kubernetes.io/projected/b7f68d19-71d4-4129-a575-3ee57fa53493-kube-api-access-t4c4r\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.442168 master-0 kubenswrapper[4754]: I1203 21:49:44.441996 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50076985-bbaa-4bcf-9d1a-cc25bed016a7-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:44.442168 master-0 kubenswrapper[4754]: I1203 21:49:44.442025 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-config\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.442168 master-0 kubenswrapper[4754]: I1203 21:49:44.442045 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkhtr\" (UniqueName: \"kubernetes.io/projected/add88bf0-c88d-427d-94bb-897e088a1378-kube-api-access-hkhtr\") pod \"csi-snapshot-controller-operator-7b795784b8-l9q2j\" (UID: \"add88bf0-c88d-427d-94bb-897e088a1378\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:44.442168 master-0 kubenswrapper[4754]: I1203 21:49:44.442063 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx6rt\" (UniqueName: 
\"kubernetes.io/projected/6976b503-87da-48fc-b097-d1b315fbee3f-kube-api-access-vx6rt\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:44.442168 master-0 kubenswrapper[4754]: I1203 21:49:44.442083 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-575vn\" (UniqueName: \"kubernetes.io/projected/a9a3f403-a742-4977-901a-cf4a8eb7df5a-kube-api-access-575vn\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:44.442168 master-0 kubenswrapper[4754]: I1203 21:49:44.442101 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:44.442168 master-0 kubenswrapper[4754]: I1203 21:49:44.442117 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08432be8-0086-48d2-a93d-7a474e96749d-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:44.442168 master-0 kubenswrapper[4754]: I1203 21:49:44.442134 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.442168 master-0 kubenswrapper[4754]: I1203 21:49:44.442151 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442182 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442201 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-config\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 
03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442218 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdfbaebe-d655-4c1e-a039-08802c5c35c5-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442236 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f0e973-7864-4842-af8e-47718ab1804c-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442258 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442274 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-service-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442302 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442317 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-bound-sa-token\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442333 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:44.442409 master-0 kubenswrapper[4754]: I1203 21:49:44.442349 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.543630 master-0 kubenswrapper[4754]: I1203 21:49:44.543583 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6rt\" (UniqueName: \"kubernetes.io/projected/6976b503-87da-48fc-b097-d1b315fbee3f-kube-api-access-vx6rt\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:44.543755 master-0 kubenswrapper[4754]: I1203 21:49:44.543638 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f088999-ec66-402e-9634-8c762206d6b4-config\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:44.543755 master-0 kubenswrapper[4754]: I1203 21:49:44.543674 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:44.543755 master-0 kubenswrapper[4754]: I1203 21:49:44.543709 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08432be8-0086-48d2-a93d-7a474e96749d-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:44.543996 master-0 kubenswrapper[4754]: I1203 21:49:44.543737 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575vn\" (UniqueName: \"kubernetes.io/projected/a9a3f403-a742-4977-901a-cf4a8eb7df5a-kube-api-access-575vn\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:44.544948 master-0 kubenswrapper[4754]: E1203 21:49:44.544896 4754 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:49:44.545085 master-0 kubenswrapper[4754]: E1203 21:49:44.545031 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:45.044979614 +0000 UTC m=+148.798077229 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:49:44.545212 master-0 kubenswrapper[4754]: I1203 21:49:44.545172 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.545251 master-0 kubenswrapper[4754]: I1203 21:49:44.545230 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.545287 master-0 kubenswrapper[4754]: I1203 21:49:44.545276 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.545342 master-0 kubenswrapper[4754]: I1203 21:49:44.545317 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdfbaebe-d655-4c1e-a039-08802c5c35c5-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:44.545416 master-0 kubenswrapper[4754]: I1203 21:49:44.545394 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-config\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.545465 master-0 kubenswrapper[4754]: I1203 21:49:44.545429 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f0e973-7864-4842-af8e-47718ab1804c-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.545465 master-0 kubenswrapper[4754]: I1203 21:49:44.545461 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:44.545628 master-0 kubenswrapper[4754]: I1203 
21:49:44.545493 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-service-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.545677 master-0 kubenswrapper[4754]: I1203 21:49:44.545650 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.545845 master-0 kubenswrapper[4754]: I1203 21:49:44.545681 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:44.545845 master-0 kubenswrapper[4754]: I1203 21:49:44.545705 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.545845 master-0 kubenswrapper[4754]: I1203 21:49:44.545786 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-bound-sa-token\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.545845 master-0 kubenswrapper[4754]: I1203 21:49:44.545816 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-serving-cert\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.546023 master-0 kubenswrapper[4754]: I1203 21:49:44.545855 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:44.546023 master-0 kubenswrapper[4754]: I1203 21:49:44.545880 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08432be8-0086-48d2-a93d-7a474e96749d-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:44.546023 
master-0 kubenswrapper[4754]: I1203 21:49:44.545920 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8da5d44-680e-4169-abc6-607bdc37a64d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:44.546023 master-0 kubenswrapper[4754]: I1203 21:49:44.545950 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6976b503-87da-48fc-b097-d1b315fbee3f-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:44.546023 master-0 kubenswrapper[4754]: I1203 21:49:44.545978 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785612fc-3f78-4f1a-bc83-7afe5d3b8056-serving-cert\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.546023 master-0 kubenswrapper[4754]: I1203 21:49:44.545987 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.546023 master-0 kubenswrapper[4754]: I1203 21:49:44.546012 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f088999-ec66-402e-9634-8c762206d6b4-serving-cert\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:44.546259 master-0 kubenswrapper[4754]: I1203 21:49:44.546038 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:44.546259 master-0 kubenswrapper[4754]: I1203 21:49:44.546115 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-etcd-client\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.546259 master-0 kubenswrapper[4754]: I1203 21:49:44.546147 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.546259 master-0 kubenswrapper[4754]: I1203 21:49:44.546179 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50076985-bbaa-4bcf-9d1a-cc25bed016a7-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:44.546259 master-0 kubenswrapper[4754]: I1203 21:49:44.546220 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:44.546259 master-0 kubenswrapper[4754]: I1203 21:49:44.546252 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:44.546520 master-0 kubenswrapper[4754]: I1203 21:49:44.546308 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f68d19-71d4-4129-a575-3ee57fa53493-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.546520 master-0 kubenswrapper[4754]: I1203 21:49:44.546341 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfbaebe-d655-4c1e-a039-08802c5c35c5-config\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:44.546520 master-0 kubenswrapper[4754]: I1203 21:49:44.546374 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:44.546520 master-0 kubenswrapper[4754]: I1203 21:49:44.546406 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:44.546520 master-0 kubenswrapper[4754]: I1203 21:49:44.546427 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfbaebe-d655-4c1e-a039-08802c5c35c5-serving-cert\") pod 
\"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:44.546520 master-0 kubenswrapper[4754]: I1203 21:49:44.546455 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvw7p\" (UniqueName: \"kubernetes.io/projected/50076985-bbaa-4bcf-9d1a-cc25bed016a7-kube-api-access-jvw7p\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:44.546520 master-0 kubenswrapper[4754]: I1203 21:49:44.546479 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.547738 master-0 kubenswrapper[4754]: I1203 21:49:44.546490 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.547842 master-0 kubenswrapper[4754]: I1203 21:49:44.547798 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59094ec-47dd-4547-ad41-b15a7933f461-config\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:44.547929 master-0 kubenswrapper[4754]: I1203 21:49:44.547896 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2krlg\" (UniqueName: \"kubernetes.io/projected/a4399d20-f9a6-4ab1-86be-e2845394eaba-kube-api-access-2krlg\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:44.548393 master-0 kubenswrapper[4754]: I1203 21:49:44.548361 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.549997 master-0 kubenswrapper[4754]: I1203 21:49:44.548690 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfbaebe-d655-4c1e-a039-08802c5c35c5-config\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:44.550088 master-0 kubenswrapper[4754]: I1203 21:49:44.548705 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-config\") pod 
\"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:44.550088 master-0 kubenswrapper[4754]: I1203 21:49:44.548933 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6976b503-87da-48fc-b097-d1b315fbee3f-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:44.550175 master-0 kubenswrapper[4754]: I1203 21:49:44.550100 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fsxc\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-kube-api-access-4fsxc\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.550175 master-0 kubenswrapper[4754]: I1203 21:49:44.550137 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kfg5\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-kube-api-access-2kfg5\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.550175 master-0 kubenswrapper[4754]: I1203 21:49:44.547082 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-service-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.550296 master-0 kubenswrapper[4754]: I1203 21:49:44.547125 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-config\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.550296 master-0 kubenswrapper[4754]: I1203 21:49:44.550215 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgddm\" (UniqueName: \"kubernetes.io/projected/04f5fc52-4ec2-48c3-8441-2b15ad632233-kube-api-access-tgddm\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:44.550372 master-0 kubenswrapper[4754]: I1203 21:49:44.550353 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6m8f\" (UniqueName: \"kubernetes.io/projected/785612fc-3f78-4f1a-bc83-7afe5d3b8056-kube-api-access-j6m8f\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.550428 master-0 kubenswrapper[4754]: I1203 21:49:44.550400 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.550480 master-0 kubenswrapper[4754]: I1203 21:49:44.550442 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bebd69d2-5b0f-4b66-8722-d6861eba3e12-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:44.550524 master-0 kubenswrapper[4754]: I1203 21:49:44.550480 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzklx\" (UniqueName: \"kubernetes.io/projected/f59094ec-47dd-4547-ad41-b15a7933f461-kube-api-access-mzklx\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:44.550524 master-0 kubenswrapper[4754]: I1203 21:49:44.550520 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2l8\" (UniqueName: \"kubernetes.io/projected/c8da5d44-680e-4169-abc6-607bdc37a64d-kube-api-access-pm2l8\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:44.550600 master-0 kubenswrapper[4754]: I1203 21:49:44.550556 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f59094ec-47dd-4547-ad41-b15a7933f461-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:44.550808 master-0 kubenswrapper[4754]: I1203 21:49:44.550749 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08432be8-0086-48d2-a93d-7a474e96749d-config\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:44.550871 master-0 kubenswrapper[4754]: I1203 21:49:44.550824 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/c8da5d44-680e-4169-abc6-607bdc37a64d-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:44.551127 master-0 kubenswrapper[4754]: I1203 21:49:44.551091 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50076985-bbaa-4bcf-9d1a-cc25bed016a7-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:44.551263 master-0 kubenswrapper[4754]: I1203 21:49:44.549425 4754 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59094ec-47dd-4547-ad41-b15a7933f461-config\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:44.551672 master-0 kubenswrapper[4754]: I1203 21:49:44.551638 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0869de9b-6f5b-4c31-81ad-02a9c8888193-trusted-ca\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.551763 master-0 kubenswrapper[4754]: I1203 21:49:44.551735 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94vvg\" (UniqueName: \"kubernetes.io/projected/5f088999-ec66-402e-9634-8c762206d6b4-kube-api-access-94vvg\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:44.551826 master-0 kubenswrapper[4754]: I1203 21:49:44.551805 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkwjb\" (UniqueName: \"kubernetes.io/projected/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-kube-api-access-fkwjb\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:44.552205 master-0 kubenswrapper[4754]: I1203 21:49:44.549926 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6976b503-87da-48fc-b097-d1b315fbee3f-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:44.553545 master-0 kubenswrapper[4754]: E1203 21:49:44.546565 4754 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:44.553623 master-0 kubenswrapper[4754]: I1203 21:49:44.553232 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4clxk\" (UniqueName: \"kubernetes.io/projected/82055cfc-b4ce-4a00-a51d-141059947693-kube-api-access-4clxk\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.553670 master-0 kubenswrapper[4754]: E1203 21:49:44.553621 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:45.05358777 +0000 UTC m=+148.806685385 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:49:44.553719 master-0 kubenswrapper[4754]: E1203 21:49:44.546687 4754 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:49:44.553760 master-0 kubenswrapper[4754]: E1203 21:49:44.547250 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 21:49:44.553843 master-0 kubenswrapper[4754]: I1203 21:49:44.553785 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/c8da5d44-680e-4169-abc6-607bdc37a64d-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: E1203 21:49:44.548623 4754 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: I1203 21:49:44.553997 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: I1203 21:49:44.554148 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7qqf\" (UniqueName: \"kubernetes.io/projected/bebd69d2-5b0f-4b66-8722-d6861eba3e12-kube-api-access-n7qqf\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: E1203 21:49:44.554198 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:49:45.054174849 +0000 UTC m=+148.807272464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: E1203 21:49:44.554233 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:45.05422186 +0000 UTC m=+148.807319475 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "node-tuning-operator-tls" not found Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: E1203 21:49:44.554280 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:45.054259602 +0000 UTC m=+148.807357217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: E1203 21:49:44.551394 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: E1203 21:49:44.553093 4754 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: E1203 21:49:44.553454 4754 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: E1203 21:49:44.554339 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:49:45.054331504 +0000 UTC m=+148.807429119 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: I1203 21:49:44.554264 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4c4r\" (UniqueName: \"kubernetes.io/projected/b7f68d19-71d4-4129-a575-3ee57fa53493-kube-api-access-t4c4r\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: I1203 21:49:44.554448 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f68d19-71d4-4129-a575-3ee57fa53493-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:44.554813 master-0 kubenswrapper[4754]: I1203 21:49:44.554480 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfbaebe-d655-4c1e-a039-08802c5c35c5-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:44.556385 master-0 kubenswrapper[4754]: I1203 21:49:44.554916 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08432be8-0086-48d2-a93d-7a474e96749d-config\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:44.556385 master-0 kubenswrapper[4754]: E1203 21:49:44.554451 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:45.054414457 +0000 UTC m=+148.807512072 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:44.556385 master-0 kubenswrapper[4754]: I1203 21:49:44.555084 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f0e973-7864-4842-af8e-47718ab1804c-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:44.556385 master-0 kubenswrapper[4754]: I1203 21:49:44.555446 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-etcd-client\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.556385 master-0 kubenswrapper[4754]: I1203 21:49:44.555466 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-serving-cert\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.556385 master-0 kubenswrapper[4754]: E1203 21:49:44.555022 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:49:45.055008366 +0000 UTC m=+148.808105981 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:49:44.556385 master-0 kubenswrapper[4754]: I1203 21:49:44.555690 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08432be8-0086-48d2-a93d-7a474e96749d-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:44.556385 master-0 kubenswrapper[4754]: I1203 21:49:44.555843 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bebd69d2-5b0f-4b66-8722-d6861eba3e12-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:44.556385 master-0 kubenswrapper[4754]: I1203 21:49:44.556178 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6976b503-87da-48fc-b097-d1b315fbee3f-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:44.557296 master-0 kubenswrapper[4754]: I1203 21:49:44.557253 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8da5d44-680e-4169-abc6-607bdc37a64d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:44.557370 master-0 kubenswrapper[4754]: I1203 21:49:44.557320 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50076985-bbaa-4bcf-9d1a-cc25bed016a7-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:44.558537 master-0 kubenswrapper[4754]: I1203 21:49:44.558469 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-config\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.558813 master-0 kubenswrapper[4754]: I1203 21:49:44.558761 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785612fc-3f78-4f1a-bc83-7afe5d3b8056-serving-cert\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:44.559186 master-0 kubenswrapper[4754]: I1203 21:49:44.559157 4754 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-config\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:44.559352 master-0 kubenswrapper[4754]: I1203 21:49:44.559323 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0869de9b-6f5b-4c31-81ad-02a9c8888193-trusted-ca\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:44.559682 master-0 kubenswrapper[4754]: I1203 21:49:44.559645 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f59094ec-47dd-4547-ad41-b15a7933f461-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:44.560354 master-0 kubenswrapper[4754]: I1203 21:49:44.560307 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50076985-bbaa-4bcf-9d1a-cc25bed016a7-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:44.560693 master-0 kubenswrapper[4754]: I1203 21:49:44.560633 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhtr\" (UniqueName: \"kubernetes.io/projected/add88bf0-c88d-427d-94bb-897e088a1378-kube-api-access-hkhtr\") pod \"csi-snapshot-controller-operator-7b795784b8-l9q2j\" (UID: \"add88bf0-c88d-427d-94bb-897e088a1378\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:44.661564 master-0 kubenswrapper[4754]: I1203 21:49:44.661484 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f088999-ec66-402e-9634-8c762206d6b4-serving-cert\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:44.661742 master-0 kubenswrapper[4754]: I1203 21:49:44.661626 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:44.662517 master-0 kubenswrapper[4754]: I1203 21:49:44.661936 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94vvg\" (UniqueName: \"kubernetes.io/projected/5f088999-ec66-402e-9634-8c762206d6b4-kube-api-access-94vvg\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:44.662517 master-0 kubenswrapper[4754]: I1203 21:49:44.661987 4754 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fkwjb\" (UniqueName: \"kubernetes.io/projected/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-kube-api-access-fkwjb\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:44.662517 master-0 kubenswrapper[4754]: I1203 21:49:44.662094 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f088999-ec66-402e-9634-8c762206d6b4-config\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:44.663008 master-0 kubenswrapper[4754]: E1203 21:49:44.662863 4754 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:49:44.663008 master-0 kubenswrapper[4754]: E1203 21:49:44.662967 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:45.162941289 +0000 UTC m=+148.916038904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:49:44.663519 master-0 kubenswrapper[4754]: I1203 21:49:44.663479 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f088999-ec66-402e-9634-8c762206d6b4-config\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:44.665589 master-0 kubenswrapper[4754]: I1203 21:49:44.665523 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f088999-ec66-402e-9634-8c762206d6b4-serving-cert\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:45.015370 master-0 kubenswrapper[4754]: I1203 21:49:45.015148 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm"] Dec 03 21:49:45.017211 master-0 kubenswrapper[4754]: I1203 21:49:45.017185 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt"] Dec 03 21:49:45.019499 master-0 kubenswrapper[4754]: I1203 21:49:45.019434 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b"] Dec 03 21:49:45.032239 master-0 kubenswrapper[4754]: I1203 21:49:45.032183 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh"] Dec 03 21:49:45.033246 master-0 kubenswrapper[4754]: I1203 21:49:45.033220 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr"] Dec 03 21:49:45.034052 master-0 kubenswrapper[4754]: I1203 21:49:45.034024 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh"] Dec 03 21:49:45.035068 master-0 kubenswrapper[4754]: I1203 21:49:45.035026 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj"] Dec 03 21:49:45.042827 master-0 kubenswrapper[4754]: I1203 21:49:45.042481 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh"] Dec 03 21:49:45.043077 master-0 kubenswrapper[4754]: I1203 21:49:45.043062 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x"] Dec 03 21:49:45.043925 master-0 kubenswrapper[4754]: I1203 21:49:45.043909 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5"] Dec 03 21:49:45.045379 master-0 kubenswrapper[4754]: I1203 21:49:45.044935 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh"] Dec 03 21:49:45.046243 master-0 kubenswrapper[4754]: I1203 21:49:45.045485 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr"] Dec 03 21:49:45.049910 master-0 kubenswrapper[4754]: I1203 21:49:45.048571 4754 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-clt4v"] Dec 03 21:49:45.049910 master-0 kubenswrapper[4754]: I1203 21:49:45.049098 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s"] Dec 03 21:49:45.049910 master-0 kubenswrapper[4754]: I1203 21:49:45.049120 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d"] Dec 03 21:49:45.049910 master-0 kubenswrapper[4754]: I1203 21:49:45.049232 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.054843 master-0 kubenswrapper[4754]: I1203 21:49:45.054812 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 21:49:45.058284 master-0 kubenswrapper[4754]: I1203 21:49:45.058253 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdfbaebe-d655-4c1e-a039-08802c5c35c5-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:45.061653 master-0 kubenswrapper[4754]: I1203 21:49:45.061550 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db"] Dec 03 21:49:45.062903 master-0 kubenswrapper[4754]: I1203 21:49:45.062878 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j"] Dec 03 21:49:45.064184 master-0 kubenswrapper[4754]: I1203 21:49:45.064144 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm"] Dec 03 21:49:45.067117 master-0 kubenswrapper[4754]: I1203 21:49:45.067066 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:45.067220 master-0 kubenswrapper[4754]: I1203 21:49:45.067139 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:45.067220 master-0 kubenswrapper[4754]: I1203 21:49:45.067174 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:45.067220 master-0 kubenswrapper[4754]: I1203 21:49:45.067206 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:45.067338 master-0 kubenswrapper[4754]: I1203 21:49:45.067264 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:45.067338 master-0 kubenswrapper[4754]: I1203 21:49:45.067300 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:45.067426 master-0 kubenswrapper[4754]: E1203 21:49:45.067338 4754 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:49:45.067473 master-0 kubenswrapper[4754]: E1203 21:49:45.067441 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:46.06741658 +0000 UTC m=+149.820514205 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:49:45.067473 master-0 kubenswrapper[4754]: I1203 21:49:45.067353 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:45.067570 master-0 kubenswrapper[4754]: E1203 21:49:45.067478 4754 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:45.067570 master-0 kubenswrapper[4754]: E1203 21:49:45.067545 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:49:46.067525534 +0000 UTC m=+149.820623339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:49:45.067570 master-0 kubenswrapper[4754]: I1203 21:49:45.067564 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:45.067693 master-0 kubenswrapper[4754]: E1203 21:49:45.067596 4754 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:45.067693 master-0 kubenswrapper[4754]: E1203 21:49:45.067614 4754 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:49:45.067693 master-0 kubenswrapper[4754]: E1203 21:49:45.067657 4754 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:45.067693 master-0 kubenswrapper[4754]: E1203 21:49:45.067678 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 21:49:45.067871 master-0 kubenswrapper[4754]: E1203 21:49:45.067626 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:46.067617727 +0000 UTC m=+149.820715352 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:45.067871 master-0 kubenswrapper[4754]: E1203 21:49:45.067710 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:45.067871 master-0 kubenswrapper[4754]: E1203 21:49:45.067743 4754 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:49:45.067871 master-0 kubenswrapper[4754]: E1203 21:49:45.067743 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:46.0677336 +0000 UTC m=+149.820831415 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:45.068076 master-0 kubenswrapper[4754]: E1203 21:49:45.067902 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:49:46.067892056 +0000 UTC m=+149.820989681 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:49:45.068076 master-0 kubenswrapper[4754]: E1203 21:49:45.067919 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:46.067912696 +0000 UTC m=+149.821010321 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "node-tuning-operator-tls" not found Dec 03 21:49:45.068076 master-0 kubenswrapper[4754]: E1203 21:49:45.067938 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:46.067928807 +0000 UTC m=+149.821026432 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:49:45.068076 master-0 kubenswrapper[4754]: E1203 21:49:45.067956 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:49:46.067946997 +0000 UTC m=+149.821044622 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:49:45.075380 master-0 kubenswrapper[4754]: I1203 21:49:45.075337 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4c4r\" (UniqueName: \"kubernetes.io/projected/b7f68d19-71d4-4129-a575-3ee57fa53493-kube-api-access-t4c4r\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:45.075675 master-0 kubenswrapper[4754]: I1203 21:49:45.075497 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-575vn\" (UniqueName: \"kubernetes.io/projected/a9a3f403-a742-4977-901a-cf4a8eb7df5a-kube-api-access-575vn\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:45.077610 master-0 kubenswrapper[4754]: I1203 21:49:45.077586 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvw7p\" (UniqueName: \"kubernetes.io/projected/50076985-bbaa-4bcf-9d1a-cc25bed016a7-kube-api-access-jvw7p\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:45.078312 master-0 kubenswrapper[4754]: I1203 21:49:45.078232 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj"] Dec 03 21:49:45.078312 master-0 kubenswrapper[4754]: I1203 21:49:45.078303 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c"] Dec 03 21:49:45.079104 master-0 kubenswrapper[4754]: I1203 21:49:45.079061 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qqf\" (UniqueName: \"kubernetes.io/projected/bebd69d2-5b0f-4b66-8722-d6861eba3e12-kube-api-access-n7qqf\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:45.079658 master-0 kubenswrapper[4754]: I1203 21:49:45.079614 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhtr\" (UniqueName: \"kubernetes.io/projected/add88bf0-c88d-427d-94bb-897e088a1378-kube-api-access-hkhtr\") pod \"csi-snapshot-controller-operator-7b795784b8-l9q2j\" (UID: \"add88bf0-c88d-427d-94bb-897e088a1378\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:45.079802 master-0 kubenswrapper[4754]: I1203 21:49:45.079753 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kfg5\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-kube-api-access-2kfg5\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:45.081558 master-0 kubenswrapper[4754]: I1203 21:49:45.081521 
4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08432be8-0086-48d2-a93d-7a474e96749d-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:45.082525 master-0 kubenswrapper[4754]: I1203 21:49:45.082499 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6m8f\" (UniqueName: \"kubernetes.io/projected/785612fc-3f78-4f1a-bc83-7afe5d3b8056-kube-api-access-j6m8f\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:45.083376 master-0 kubenswrapper[4754]: I1203 21:49:45.083254 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-bound-sa-token\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:45.083900 master-0 kubenswrapper[4754]: I1203 21:49:45.083843 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2krlg\" (UniqueName: \"kubernetes.io/projected/a4399d20-f9a6-4ab1-86be-e2845394eaba-kube-api-access-2krlg\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:45.084006 master-0 kubenswrapper[4754]: I1203 21:49:45.083817 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkwjb\" (UniqueName: \"kubernetes.io/projected/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-kube-api-access-fkwjb\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:45.085846 master-0 kubenswrapper[4754]: I1203 21:49:45.085200 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4clxk\" (UniqueName: \"kubernetes.io/projected/82055cfc-b4ce-4a00-a51d-141059947693-kube-api-access-4clxk\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:45.085846 master-0 kubenswrapper[4754]: I1203 21:49:45.085578 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:45.086476 master-0 kubenswrapper[4754]: I1203 21:49:45.086450 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6rt\" (UniqueName: \"kubernetes.io/projected/6976b503-87da-48fc-b097-d1b315fbee3f-kube-api-access-vx6rt\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:45.086670 master-0 kubenswrapper[4754]: I1203 21:49:45.086472 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2l8\" (UniqueName: \"kubernetes.io/projected/c8da5d44-680e-4169-abc6-607bdc37a64d-kube-api-access-pm2l8\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:45.089161 master-0 kubenswrapper[4754]: I1203 21:49:45.088664 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzklx\" (UniqueName: \"kubernetes.io/projected/f59094ec-47dd-4547-ad41-b15a7933f461-kube-api-access-mzklx\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:45.089310 master-0 kubenswrapper[4754]: I1203 21:49:45.089290 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:45.091032 master-0 kubenswrapper[4754]: I1203 21:49:45.091012 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:45.091104 master-0 kubenswrapper[4754]: I1203 21:49:45.091004 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94vvg\" (UniqueName: \"kubernetes.io/projected/5f088999-ec66-402e-9634-8c762206d6b4-kube-api-access-94vvg\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:45.091224 master-0 kubenswrapper[4754]: I1203 21:49:45.091201 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:45.093552 master-0 kubenswrapper[4754]: I1203 21:49:45.093263 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fsxc\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-kube-api-access-4fsxc\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:45.106321 master-0 kubenswrapper[4754]: I1203 21:49:45.105907 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:45.120249 master-0 kubenswrapper[4754]: I1203 21:49:45.120184 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:45.125404 master-0 kubenswrapper[4754]: I1203 21:49:45.125342 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:45.132112 master-0 kubenswrapper[4754]: I1203 21:49:45.132049 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:45.144849 master-0 kubenswrapper[4754]: I1203 21:49:45.144511 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:45.169073 master-0 kubenswrapper[4754]: I1203 21:49:45.169014 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/814c8acf-fb8d-4f57-b8db-21304402c1f1-host-slash\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.169202 master-0 kubenswrapper[4754]: I1203 21:49:45.169133 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/814c8acf-fb8d-4f57-b8db-21304402c1f1-iptables-alerter-script\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.169431 master-0 kubenswrapper[4754]: I1203 21:49:45.169363 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:45.169595 master-0 kubenswrapper[4754]: I1203 21:49:45.169549 4754 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7fsz\" (UniqueName: \"kubernetes.io/projected/814c8acf-fb8d-4f57-b8db-21304402c1f1-kube-api-access-x7fsz\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.169643 master-0 kubenswrapper[4754]: E1203 21:49:45.169588 4754 
secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:49:45.169797 master-0 kubenswrapper[4754]: E1203 21:49:45.169717 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:46.169690142 +0000 UTC m=+149.922787767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:49:45.203441 master-0 kubenswrapper[4754]: I1203 21:49:45.202977 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 21:49:45.211640 master-0 kubenswrapper[4754]: I1203 21:49:45.211525 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:45.268522 master-0 kubenswrapper[4754]: I1203 21:49:45.262802 4754 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 21:49:45.271802 master-0 kubenswrapper[4754]: I1203 21:49:45.270674 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/814c8acf-fb8d-4f57-b8db-21304402c1f1-host-slash\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.271802 master-0 kubenswrapper[4754]: I1203 21:49:45.270746 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/814c8acf-fb8d-4f57-b8db-21304402c1f1-iptables-alerter-script\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.271802 master-0 kubenswrapper[4754]: I1203 21:49:45.270998 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/814c8acf-fb8d-4f57-b8db-21304402c1f1-host-slash\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.272088 master-0 kubenswrapper[4754]: I1203 21:49:45.272029 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/814c8acf-fb8d-4f57-b8db-21304402c1f1-iptables-alerter-script\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.272953 master-0 kubenswrapper[4754]: I1203 21:49:45.272898 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7fsz\" (UniqueName: 
\"kubernetes.io/projected/814c8acf-fb8d-4f57-b8db-21304402c1f1-kube-api-access-x7fsz\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.275398 master-0 kubenswrapper[4754]: I1203 21:49:45.275309 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgddm\" (UniqueName: \"kubernetes.io/projected/04f5fc52-4ec2-48c3-8441-2b15ad632233-kube-api-access-tgddm\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:45.292798 master-0 kubenswrapper[4754]: I1203 21:49:45.291935 4754 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7fsz\" (UniqueName: \"kubernetes.io/projected/814c8acf-fb8d-4f57-b8db-21304402c1f1-kube-api-access-x7fsz\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.315441 master-0 kubenswrapper[4754]: E1203 21:49:45.315366 4754 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:49:45.315441 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_csi-snapshot-controller-operator-7b795784b8-l9q2j_openshift-cluster-storage-operator_add88bf0-c88d-427d-94bb-897e088a1378_0(64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda): error adding pod openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-l9q2j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda" Netns:"/var/run/netns/3fd62fe4-1628-450b-a383-d0a39399fcde" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-storage-operator;K8S_POD_NAME=csi-snapshot-controller-operator-7b795784b8-l9q2j;K8S_POD_INFRA_CONTAINER_ID=64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda;K8S_POD_UID=add88bf0-c88d-427d-94bb-897e088a1378" Path:"" ERRORED: error configuring pod [openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j] networking: [openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j/add88bf0-c88d-427d-94bb-897e088a1378:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.315441 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.315441 master-0 kubenswrapper[4754]: > Dec 03 21:49:45.315614 master-0 kubenswrapper[4754]: E1203 21:49:45.315457 4754 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:49:45.315614 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_csi-snapshot-controller-operator-7b795784b8-l9q2j_openshift-cluster-storage-operator_add88bf0-c88d-427d-94bb-897e088a1378_0(64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda): error adding pod openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-l9q2j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda" Netns:"/var/run/netns/3fd62fe4-1628-450b-a383-d0a39399fcde" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-storage-operator;K8S_POD_NAME=csi-snapshot-controller-operator-7b795784b8-l9q2j;K8S_POD_INFRA_CONTAINER_ID=64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda;K8S_POD_UID=add88bf0-c88d-427d-94bb-897e088a1378" Path:"" ERRORED: error configuring pod [openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j] networking: [openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j/add88bf0-c88d-427d-94bb-897e088a1378:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.315614 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.315614 master-0 kubenswrapper[4754]: > pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:45.315614 master-0 kubenswrapper[4754]: E1203 21:49:45.315484 4754 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:49:45.315614 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_csi-snapshot-controller-operator-7b795784b8-l9q2j_openshift-cluster-storage-operator_add88bf0-c88d-427d-94bb-897e088a1378_0(64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda): error adding pod openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-l9q2j to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda" Netns:"/var/run/netns/3fd62fe4-1628-450b-a383-d0a39399fcde" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-storage-operator;K8S_POD_NAME=csi-snapshot-controller-operator-7b795784b8-l9q2j;K8S_POD_INFRA_CONTAINER_ID=64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda;K8S_POD_UID=add88bf0-c88d-427d-94bb-897e088a1378" Path:"" ERRORED: error configuring pod [openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j] networking: [openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j/add88bf0-c88d-427d-94bb-897e088a1378:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection 
refused Dec 03 21:49:45.315614 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.315614 master-0 kubenswrapper[4754]: > pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:45.315846 master-0 kubenswrapper[4754]: E1203 21:49:45.315742 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-snapshot-controller-operator-7b795784b8-l9q2j_openshift-cluster-storage-operator(add88bf0-c88d-427d-94bb-897e088a1378)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-snapshot-controller-operator-7b795784b8-l9q2j_openshift-cluster-storage-operator(add88bf0-c88d-427d-94bb-897e088a1378)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_csi-snapshot-controller-operator-7b795784b8-l9q2j_openshift-cluster-storage-operator_add88bf0-c88d-427d-94bb-897e088a1378_0(64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda): error adding pod openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-l9q2j to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda\\\" Netns:\\\"/var/run/netns/3fd62fe4-1628-450b-a383-d0a39399fcde\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-storage-operator;K8S_POD_NAME=csi-snapshot-controller-operator-7b795784b8-l9q2j;K8S_POD_INFRA_CONTAINER_ID=64b02fb3149833b5aa5bb1ba82c5b382718321e26f91a444977ec4eb4f6f4cda;K8S_POD_UID=add88bf0-c88d-427d-94bb-897e088a1378\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j] networking: [openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j/add88bf0-c88d-427d-94bb-897e088a1378:ovn-kubernetes]: error adding container to network \\\"ovn-kubernetes\\\": failed to send CNI request: Post \\\"http://dummy/\\\": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" podUID="add88bf0-c88d-427d-94bb-897e088a1378" Dec 03 21:49:45.330190 master-0 kubenswrapper[4754]: I1203 21:49:45.330124 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:45.337433 master-0 kubenswrapper[4754]: I1203 21:49:45.337365 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:45.343880 master-0 kubenswrapper[4754]: I1203 21:49:45.343824 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:45.346847 master-0 kubenswrapper[4754]: I1203 21:49:45.346736 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:45.347150 master-0 kubenswrapper[4754]: I1203 21:49:45.347121 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:45.349002 master-0 kubenswrapper[4754]: E1203 21:49:45.348955 4754 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:49:45.349002 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator_82055cfc-b4ce-4a00-a51d-141059947693_0(7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed): error adding pod openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed" Netns:"/var/run/netns/c0084efc-f47b-4510-8e36-54562b62326a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-etcd-operator;K8S_POD_NAME=etcd-operator-7978bf889c-w8hsm;K8S_POD_INFRA_CONTAINER_ID=7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed;K8S_POD_UID=82055cfc-b4ce-4a00-a51d-141059947693" Path:"" ERRORED: error configuring pod [openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm] networking: [openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm/82055cfc-b4ce-4a00-a51d-141059947693:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.349002 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.349002 master-0 kubenswrapper[4754]: > Dec 03 21:49:45.349189 master-0 kubenswrapper[4754]: E1203 21:49:45.349008 4754 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:49:45.349189 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator_82055cfc-b4ce-4a00-a51d-141059947693_0(7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed): error adding pod openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed" Netns:"/var/run/netns/c0084efc-f47b-4510-8e36-54562b62326a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-etcd-operator;K8S_POD_NAME=etcd-operator-7978bf889c-w8hsm;K8S_POD_INFRA_CONTAINER_ID=7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed;K8S_POD_UID=82055cfc-b4ce-4a00-a51d-141059947693" Path:"" ERRORED: error configuring pod [openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm] networking: [openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm/82055cfc-b4ce-4a00-a51d-141059947693:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.349189 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.349189 master-0 kubenswrapper[4754]: > pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:45.349189 master-0 kubenswrapper[4754]: E1203 21:49:45.349030 4754 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:49:45.349189 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator_82055cfc-b4ce-4a00-a51d-141059947693_0(7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed): error adding pod openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed" Netns:"/var/run/netns/c0084efc-f47b-4510-8e36-54562b62326a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-etcd-operator;K8S_POD_NAME=etcd-operator-7978bf889c-w8hsm;K8S_POD_INFRA_CONTAINER_ID=7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed;K8S_POD_UID=82055cfc-b4ce-4a00-a51d-141059947693" Path:"" ERRORED: error configuring pod [openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm] networking: [openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm/82055cfc-b4ce-4a00-a51d-141059947693:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.349189 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.349189 master-0 kubenswrapper[4754]: > pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:45.349189 master-0 kubenswrapper[4754]: E1203 21:49:45.349086 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" 
for \"etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator(82055cfc-b4ce-4a00-a51d-141059947693)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator(82055cfc-b4ce-4a00-a51d-141059947693)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator_82055cfc-b4ce-4a00-a51d-141059947693_0(7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed): error adding pod openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed\\\" Netns:\\\"/var/run/netns/c0084efc-f47b-4510-8e36-54562b62326a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-etcd-operator;K8S_POD_NAME=etcd-operator-7978bf889c-w8hsm;K8S_POD_INFRA_CONTAINER_ID=7dcdc4e3522ce49d00e4e2b8f45a2c8c0207c176c2a100e7b2108e96e71e92ed;K8S_POD_UID=82055cfc-b4ce-4a00-a51d-141059947693\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm] networking: [openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm/82055cfc-b4ce-4a00-a51d-141059947693:ovn-kubernetes]: error adding container to network \\\"ovn-kubernetes\\\": failed to send CNI request: Post \\\"http://dummy/\\\": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" Dec 03 21:49:45.352005 master-0 kubenswrapper[4754]: I1203 21:49:45.351948 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:45.366315 master-0 kubenswrapper[4754]: E1203 21:49:45.366257 4754 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:49:45.366315 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator_fdfbaebe-d655-4c1e-a039-08802c5c35c5_0(cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2): error adding pod openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2" Netns:"/var/run/netns/c88ce66d-5b48-4f24-bf00-db0e4e271a6c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager-operator;K8S_POD_NAME=kube-controller-manager-operator-b5dddf8f5-llvrh;K8S_POD_INFRA_CONTAINER_ID=cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2;K8S_POD_UID=fdfbaebe-d655-4c1e-a039-08802c5c35c5" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh] networking: [openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh/fdfbaebe-d655-4c1e-a039-08802c5c35c5:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.366315 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.366315 master-0 kubenswrapper[4754]: > Dec 03 21:49:45.366520 master-0 kubenswrapper[4754]: E1203 21:49:45.366338 4754 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:49:45.366520 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator_fdfbaebe-d655-4c1e-a039-08802c5c35c5_0(cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2): error adding pod openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2" Netns:"/var/run/netns/c88ce66d-5b48-4f24-bf00-db0e4e271a6c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager-operator;K8S_POD_NAME=kube-controller-manager-operator-b5dddf8f5-llvrh;K8S_POD_INFRA_CONTAINER_ID=cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2;K8S_POD_UID=fdfbaebe-d655-4c1e-a039-08802c5c35c5" Path:"" ERRORED: error configuring pod 
[openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh] networking: [openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh/fdfbaebe-d655-4c1e-a039-08802c5c35c5:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.366520 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.366520 master-0 kubenswrapper[4754]: > pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:45.366520 master-0 kubenswrapper[4754]: E1203 21:49:45.366372 4754 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:49:45.366520 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator_fdfbaebe-d655-4c1e-a039-08802c5c35c5_0(cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2): error adding pod openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2" Netns:"/var/run/netns/c88ce66d-5b48-4f24-bf00-db0e4e271a6c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager-operator;K8S_POD_NAME=kube-controller-manager-operator-b5dddf8f5-llvrh;K8S_POD_INFRA_CONTAINER_ID=cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2;K8S_POD_UID=fdfbaebe-d655-4c1e-a039-08802c5c35c5" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh] networking: [openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh/fdfbaebe-d655-4c1e-a039-08802c5c35c5:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.366520 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.366520 master-0 kubenswrapper[4754]: > pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:45.366807 master-0 kubenswrapper[4754]: E1203 21:49:45.366437 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator(fdfbaebe-d655-4c1e-a039-08802c5c35c5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator(fdfbaebe-d655-4c1e-a039-08802c5c35c5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator_fdfbaebe-d655-4c1e-a039-08802c5c35c5_0(cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2): error adding pod openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2\\\" Netns:\\\"/var/run/netns/c88ce66d-5b48-4f24-bf00-db0e4e271a6c\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager-operator;K8S_POD_NAME=kube-controller-manager-operator-b5dddf8f5-llvrh;K8S_POD_INFRA_CONTAINER_ID=cc56623696d87e170e28fb9be741c2a91cbcc7e91d56f11aafb4cb9507d37bb2;K8S_POD_UID=fdfbaebe-d655-4c1e-a039-08802c5c35c5\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh] networking: [openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh/fdfbaebe-d655-4c1e-a039-08802c5c35c5:ovn-kubernetes]: error adding container to network \\\"ovn-kubernetes\\\": failed to send CNI request: Post \\\"http://dummy/\\\": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" podUID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" Dec 03 21:49:45.375754 master-0 kubenswrapper[4754]: E1203 21:49:45.375700 4754 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:49:45.375754 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator_c8da5d44-680e-4169-abc6-607bdc37a64d_0(0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f): error adding pod openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f" Netns:"/var/run/netns/e1cbe992-5334-412d-a3a2-381a7ae6c18a" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-olm-operator;K8S_POD_NAME=cluster-olm-operator-589f5cdc9d-25qxh;K8S_POD_INFRA_CONTAINER_ID=0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f;K8S_POD_UID=c8da5d44-680e-4169-abc6-607bdc37a64d" Path:"" ERRORED: error configuring pod [openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh] networking: [openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh/c8da5d44-680e-4169-abc6-607bdc37a64d:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.375754 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.375754 master-0 kubenswrapper[4754]: > Dec 03 21:49:45.376182 master-0 kubenswrapper[4754]: E1203 21:49:45.375789 4754 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:49:45.376182 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator_c8da5d44-680e-4169-abc6-607bdc37a64d_0(0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f): error adding pod openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f" Netns:"/var/run/netns/e1cbe992-5334-412d-a3a2-381a7ae6c18a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-olm-operator;K8S_POD_NAME=cluster-olm-operator-589f5cdc9d-25qxh;K8S_POD_INFRA_CONTAINER_ID=0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f;K8S_POD_UID=c8da5d44-680e-4169-abc6-607bdc37a64d" Path:"" ERRORED: error configuring pod [openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh] networking: [openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh/c8da5d44-680e-4169-abc6-607bdc37a64d:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.376182 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.376182 master-0 kubenswrapper[4754]: > pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:45.376182 master-0 kubenswrapper[4754]: E1203 21:49:45.375814 4754 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:49:45.376182 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod 
network sandbox k8s_cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator_c8da5d44-680e-4169-abc6-607bdc37a64d_0(0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f): error adding pod openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f" Netns:"/var/run/netns/e1cbe992-5334-412d-a3a2-381a7ae6c18a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-olm-operator;K8S_POD_NAME=cluster-olm-operator-589f5cdc9d-25qxh;K8S_POD_INFRA_CONTAINER_ID=0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f;K8S_POD_UID=c8da5d44-680e-4169-abc6-607bdc37a64d" Path:"" ERRORED: error configuring pod [openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh] networking: [openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh/c8da5d44-680e-4169-abc6-607bdc37a64d:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.376182 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.376182 master-0 kubenswrapper[4754]: > pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:45.376536 master-0 kubenswrapper[4754]: E1203 21:49:45.375872 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator(c8da5d44-680e-4169-abc6-607bdc37a64d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator(c8da5d44-680e-4169-abc6-607bdc37a64d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator_c8da5d44-680e-4169-abc6-607bdc37a64d_0(0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f): error adding pod openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f\\\" Netns:\\\"/var/run/netns/e1cbe992-5334-412d-a3a2-381a7ae6c18a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-cluster-olm-operator;K8S_POD_NAME=cluster-olm-operator-589f5cdc9d-25qxh;K8S_POD_INFRA_CONTAINER_ID=0ef8b1b6cbc761f32220c257b60c88379082e147b8f4ff619fe524358c77838f;K8S_POD_UID=c8da5d44-680e-4169-abc6-607bdc37a64d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh] networking: [openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh/c8da5d44-680e-4169-abc6-607bdc37a64d:ovn-kubernetes]: error adding container to network 
\\\"ovn-kubernetes\\\": failed to send CNI request: Post \\\"http://dummy/\\\": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" podUID="c8da5d44-680e-4169-abc6-607bdc37a64d" Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: E1203 21:49:45.388377 4754 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_service-ca-operator-56f5898f45-mjdfr_openshift-service-ca-operator_5f088999-ec66-402e-9634-8c762206d6b4_0(eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb): error adding pod openshift-service-ca-operator_service-ca-operator-56f5898f45-mjdfr to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb" Netns:"/var/run/netns/0423b5c0-7cce-4eb1-93ee-dc087930613a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-service-ca-operator;K8S_POD_NAME=service-ca-operator-56f5898f45-mjdfr;K8S_POD_INFRA_CONTAINER_ID=eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb;K8S_POD_UID=5f088999-ec66-402e-9634-8c762206d6b4" Path:"" ERRORED: error configuring pod [openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr] networking: [openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr/5f088999-ec66-402e-9634-8c762206d6b4:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: > Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: E1203 21:49:45.388481 4754 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_service-ca-operator-56f5898f45-mjdfr_openshift-service-ca-operator_5f088999-ec66-402e-9634-8c762206d6b4_0(eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb): error adding pod openshift-service-ca-operator_service-ca-operator-56f5898f45-mjdfr to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb" Netns:"/var/run/netns/0423b5c0-7cce-4eb1-93ee-dc087930613a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-service-ca-operator;K8S_POD_NAME=service-ca-operator-56f5898f45-mjdfr;K8S_POD_INFRA_CONTAINER_ID=eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb;K8S_POD_UID=5f088999-ec66-402e-9634-8c762206d6b4" Path:"" ERRORED: error configuring pod [openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr] networking: [openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr/5f088999-ec66-402e-9634-8c762206d6b4:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: > pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: E1203 21:49:45.388520 4754 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_service-ca-operator-56f5898f45-mjdfr_openshift-service-ca-operator_5f088999-ec66-402e-9634-8c762206d6b4_0(eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb): error adding pod openshift-service-ca-operator_service-ca-operator-56f5898f45-mjdfr to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb" Netns:"/var/run/netns/0423b5c0-7cce-4eb1-93ee-dc087930613a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-service-ca-operator;K8S_POD_NAME=service-ca-operator-56f5898f45-mjdfr;K8S_POD_INFRA_CONTAINER_ID=eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb;K8S_POD_UID=5f088999-ec66-402e-9634-8c762206d6b4" Path:"" ERRORED: error configuring pod [openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr] networking: [openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr/5f088999-ec66-402e-9634-8c762206d6b4:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.389649 master-0 kubenswrapper[4754]: > pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:45.391034 master-0 
kubenswrapper[4754]: E1203 21:49:45.388610 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"service-ca-operator-56f5898f45-mjdfr_openshift-service-ca-operator(5f088999-ec66-402e-9634-8c762206d6b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"service-ca-operator-56f5898f45-mjdfr_openshift-service-ca-operator(5f088999-ec66-402e-9634-8c762206d6b4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_service-ca-operator-56f5898f45-mjdfr_openshift-service-ca-operator_5f088999-ec66-402e-9634-8c762206d6b4_0(eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb): error adding pod openshift-service-ca-operator_service-ca-operator-56f5898f45-mjdfr to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb\\\" Netns:\\\"/var/run/netns/0423b5c0-7cce-4eb1-93ee-dc087930613a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-service-ca-operator;K8S_POD_NAME=service-ca-operator-56f5898f45-mjdfr;K8S_POD_INFRA_CONTAINER_ID=eb6a0d6fcefaef7afe780dc6c2b65ef59fa4b9e04f82630083688231e499fdbb;K8S_POD_UID=5f088999-ec66-402e-9634-8c762206d6b4\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr] networking: [openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr/5f088999-ec66-402e-9634-8c762206d6b4:ovn-kubernetes]: error adding container to network \\\"ovn-kubernetes\\\": failed to send CNI request: Post \\\"http://dummy/\\\": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" podUID="5f088999-ec66-402e-9634-8c762206d6b4" Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: E1203 21:49:45.411401 4754 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator_6976b503-87da-48fc-b097-d1b315fbee3f_0(33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439): error adding pod openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439" Netns:"/var/run/netns/d3357c9e-a2a5-43d2-9a39-5ff0366ca38b" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager-operator;K8S_POD_NAME=openshift-controller-manager-operator-7c4697b5f5-458zh;K8S_POD_INFRA_CONTAINER_ID=33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439;K8S_POD_UID=6976b503-87da-48fc-b097-d1b315fbee3f" Path:"" ERRORED: error configuring pod [openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh] networking: [openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh/6976b503-87da-48fc-b097-d1b315fbee3f:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: > Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: E1203 21:49:45.411493 4754 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator_6976b503-87da-48fc-b097-d1b315fbee3f_0(33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439): error adding pod openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439" Netns:"/var/run/netns/d3357c9e-a2a5-43d2-9a39-5ff0366ca38b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager-operator;K8S_POD_NAME=openshift-controller-manager-operator-7c4697b5f5-458zh;K8S_POD_INFRA_CONTAINER_ID=33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439;K8S_POD_UID=6976b503-87da-48fc-b097-d1b315fbee3f" Path:"" ERRORED: error configuring pod [openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh] networking: [openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh/6976b503-87da-48fc-b097-d1b315fbee3f:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: > pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:45.413112 master-0 
kubenswrapper[4754]: E1203 21:49:45.411517 4754 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator_6976b503-87da-48fc-b097-d1b315fbee3f_0(33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439): error adding pod openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439" Netns:"/var/run/netns/d3357c9e-a2a5-43d2-9a39-5ff0366ca38b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager-operator;K8S_POD_NAME=openshift-controller-manager-operator-7c4697b5f5-458zh;K8S_POD_INFRA_CONTAINER_ID=33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439;K8S_POD_UID=6976b503-87da-48fc-b097-d1b315fbee3f" Path:"" ERRORED: error configuring pod [openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh] networking: [openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh/6976b503-87da-48fc-b097-d1b315fbee3f:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.413112 master-0 kubenswrapper[4754]: > pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:45.413451 master-0 kubenswrapper[4754]: E1203 21:49:45.411609 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator(6976b503-87da-48fc-b097-d1b315fbee3f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator(6976b503-87da-48fc-b097-d1b315fbee3f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator_6976b503-87da-48fc-b097-d1b315fbee3f_0(33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439): error adding pod openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439\\\" Netns:\\\"/var/run/netns/d3357c9e-a2a5-43d2-9a39-5ff0366ca38b\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager-operator;K8S_POD_NAME=openshift-controller-manager-operator-7c4697b5f5-458zh;K8S_POD_INFRA_CONTAINER_ID=33f40f6982ee793735cbeda615e5932eabd721e861251c8c6aaee47125fd2439;K8S_POD_UID=6976b503-87da-48fc-b097-d1b315fbee3f\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh] networking: [openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh/6976b503-87da-48fc-b097-d1b315fbee3f:ovn-kubernetes]: error adding container to network \\\"ovn-kubernetes\\\": failed to send CNI request: Post \\\"http://dummy/\\\": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" podUID="6976b503-87da-48fc-b097-d1b315fbee3f" Dec 03 21:49:45.418451 master-0 kubenswrapper[4754]: I1203 21:49:45.418400 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: E1203 21:49:45.464516 4754 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openshift-apiserver-operator-667484ff5-st2db_openshift-apiserver-operator_f59094ec-47dd-4547-ad41-b15a7933f461_0(9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3): error adding pod openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-st2db to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3" Netns:"/var/run/netns/a502a843-26b0-44c8-be08-5831271439da" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-apiserver-operator;K8S_POD_NAME=openshift-apiserver-operator-667484ff5-st2db;K8S_POD_INFRA_CONTAINER_ID=9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3;K8S_POD_UID=f59094ec-47dd-4547-ad41-b15a7933f461" Path:"" ERRORED: error configuring pod [openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db] networking: [openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db/f59094ec-47dd-4547-ad41-b15a7933f461:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: > Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: E1203 21:49:45.464931 4754 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openshift-apiserver-operator-667484ff5-st2db_openshift-apiserver-operator_f59094ec-47dd-4547-ad41-b15a7933f461_0(9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3): error adding pod openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-st2db to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3" Netns:"/var/run/netns/a502a843-26b0-44c8-be08-5831271439da" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-apiserver-operator;K8S_POD_NAME=openshift-apiserver-operator-667484ff5-st2db;K8S_POD_INFRA_CONTAINER_ID=9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3;K8S_POD_UID=f59094ec-47dd-4547-ad41-b15a7933f461" Path:"" ERRORED: error configuring pod [openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db] networking: [openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db/f59094ec-47dd-4547-ad41-b15a7933f461:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: > pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: E1203 21:49:45.464955 4754 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openshift-apiserver-operator-667484ff5-st2db_openshift-apiserver-operator_f59094ec-47dd-4547-ad41-b15a7933f461_0(9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3): error adding pod openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-st2db to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3" Netns:"/var/run/netns/a502a843-26b0-44c8-be08-5831271439da" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-apiserver-operator;K8S_POD_NAME=openshift-apiserver-operator-667484ff5-st2db;K8S_POD_INFRA_CONTAINER_ID=9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3;K8S_POD_UID=f59094ec-47dd-4547-ad41-b15a7933f461" Path:"" ERRORED: error configuring pod [openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db] networking: [openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db/f59094ec-47dd-4547-ad41-b15a7933f461:ovn-kubernetes]: error adding container to network "ovn-kubernetes": failed to send CNI request: Post "http://dummy/": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:49:45.464980 master-0 kubenswrapper[4754]: > pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:45.465387 master-0 kubenswrapper[4754]: E1203 21:49:45.465021 4754 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"openshift-apiserver-operator-667484ff5-st2db_openshift-apiserver-operator(f59094ec-47dd-4547-ad41-b15a7933f461)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"openshift-apiserver-operator-667484ff5-st2db_openshift-apiserver-operator(f59094ec-47dd-4547-ad41-b15a7933f461)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openshift-apiserver-operator-667484ff5-st2db_openshift-apiserver-operator_f59094ec-47dd-4547-ad41-b15a7933f461_0(9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3): error adding pod openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-st2db to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3\\\" Netns:\\\"/var/run/netns/a502a843-26b0-44c8-be08-5831271439da\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-apiserver-operator;K8S_POD_NAME=openshift-apiserver-operator-667484ff5-st2db;K8S_POD_INFRA_CONTAINER_ID=9d4901667c030cb1318903ba8349b05db01e19a09dec9a82e535ac0e674c49e3;K8S_POD_UID=f59094ec-47dd-4547-ad41-b15a7933f461\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db] networking: [openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db/f59094ec-47dd-4547-ad41-b15a7933f461:ovn-kubernetes]: error adding container to network \\\"ovn-kubernetes\\\": failed to send CNI request: Post \\\"http://dummy/\\\": dial unix /var/run/ovn-kubernetes/cni//ovn-cni-server.sock: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" podUID="f59094ec-47dd-4547-ad41-b15a7933f461" Dec 03 21:49:45.643261 master-0 kubenswrapper[4754]: I1203 21:49:45.643196 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr"] Dec 03 21:49:45.814481 master-0 kubenswrapper[4754]: I1203 21:49:45.814200 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c"] Dec 03 21:49:45.818949 master-0 kubenswrapper[4754]: I1203 21:49:45.818882 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j"] Dec 03 21:49:45.822659 master-0 kubenswrapper[4754]: I1203 21:49:45.822569 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm"] Dec 03 21:49:45.825650 master-0 kubenswrapper[4754]: I1203 21:49:45.825553 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b"] Dec 03 21:49:45.828873 master-0 kubenswrapper[4754]: W1203 21:49:45.827122 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadd88bf0_c88d_427d_94bb_897e088a1378.slice/crio-8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5 WatchSource:0}: Error finding container 8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5: Status 404 returned error can't find the container with id 8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5 Dec 03 21:49:45.828873 master-0 kubenswrapper[4754]: W1203 21:49:45.828709 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08432be8_0086_48d2_a93d_7a474e96749d.slice/crio-0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688 WatchSource:0}: Error finding container 0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688: Status 404 returned error can't find the container with id 0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688 Dec 03 21:49:45.829588 master-0 kubenswrapper[4754]: W1203 21:49:45.829080 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50076985_bbaa_4bcf_9d1a_cc25bed016a7.slice/crio-2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b WatchSource:0}: Error finding container 2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b: Status 404 returned error can't find the container with id 2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b Dec 03 21:49:46.083926 master-0 kubenswrapper[4754]: I1203 21:49:46.083870 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:46.084021 master-0 kubenswrapper[4754]: I1203 21:49:46.083939 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:46.084021 master-0 kubenswrapper[4754]: I1203 21:49:46.083977 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:46.084021 master-0 kubenswrapper[4754]: I1203 21:49:46.084011 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:46.084153 master-0 kubenswrapper[4754]: I1203 21:49:46.084053 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:46.084153 master-0 kubenswrapper[4754]: I1203 21:49:46.084103 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:46.084153 master-0 kubenswrapper[4754]: I1203 21:49:46.084135 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:46.084263 master-0 kubenswrapper[4754]: I1203 21:49:46.084164 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:46.085744 master-0 
kubenswrapper[4754]: E1203 21:49:46.084409 4754 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.084467 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:48.084446599 +0000 UTC m=+151.837544234 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.084902 4754 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.084944 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:48.084928374 +0000 UTC m=+151.838026009 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.084997 4754 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085025 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:48.085016067 +0000 UTC m=+151.838113692 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085075 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085104 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:48.08509368 +0000 UTC m=+151.838191305 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "node-tuning-operator-tls" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085154 4754 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085182 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:49:48.085172362 +0000 UTC m=+151.838269987 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085227 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085256 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:48.085248785 +0000 UTC m=+151.838346410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085310 4754 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085338 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:49:48.085328707 +0000 UTC m=+151.838426342 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:49:46.085744 master-0 kubenswrapper[4754]: E1203 21:49:46.085393 4754 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:46.087200 master-0 kubenswrapper[4754]: E1203 21:49:46.085417 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:49:48.08540954 +0000 UTC m=+151.838507165 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:49:46.185794 master-0 kubenswrapper[4754]: I1203 21:49:46.185463 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:46.185794 master-0 kubenswrapper[4754]: E1203 21:49:46.185694 4754 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:49:46.185794 master-0 kubenswrapper[4754]: E1203 21:49:46.185807 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:48.185760351 +0000 UTC m=+151.938857966 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:49:46.352260 master-0 kubenswrapper[4754]: I1203 21:49:46.352043 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" event={"ID":"add88bf0-c88d-427d-94bb-897e088a1378","Type":"ContainerStarted","Data":"8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5"} Dec 03 21:49:46.353528 master-0 kubenswrapper[4754]: I1203 21:49:46.353277 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerStarted","Data":"19084c46f9e406b8c5bea90a33fad5659fbc69d12b59150d714d618ccc77cc8a"} Dec 03 21:49:46.355117 master-0 kubenswrapper[4754]: I1203 21:49:46.355064 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerStarted","Data":"2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b"} Dec 03 21:49:46.356120 master-0 kubenswrapper[4754]: I1203 21:49:46.356067 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerStarted","Data":"d4dcd3a3c921a750603eee714d78acc1c3287857fa9a3fa51cbb8e99aa49ea09"} Dec 03 21:49:46.357211 master-0 kubenswrapper[4754]: I1203 21:49:46.357153 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-clt4v" event={"ID":"814c8acf-fb8d-4f57-b8db-21304402c1f1","Type":"ContainerStarted","Data":"0d649d9574e5890c562a660d0fd6b1f641cfe7fa0750549ecdfb69ce6e896311"} Dec 03 21:49:46.359077 master-0 kubenswrapper[4754]: I1203 21:49:46.358978 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:46.359163 master-0 kubenswrapper[4754]: I1203 21:49:46.359068 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:46.359163 master-0 kubenswrapper[4754]: I1203 21:49:46.358980 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" event={"ID":"08432be8-0086-48d2-a93d-7a474e96749d","Type":"ContainerStarted","Data":"ffe4f10866cf1bc36713ebdb04a86f2cd5ff92ba34f253339cffa02ccd5a5e66"} Dec 03 21:49:46.359163 master-0 kubenswrapper[4754]: I1203 21:49:46.359145 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:46.359329 master-0 kubenswrapper[4754]: I1203 21:49:46.359159 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" event={"ID":"08432be8-0086-48d2-a93d-7a474e96749d","Type":"ContainerStarted","Data":"0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688"} Dec 03 21:49:46.359329 master-0 kubenswrapper[4754]: I1203 21:49:46.359186 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:46.359329 master-0 kubenswrapper[4754]: I1203 21:49:46.359225 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:46.359329 master-0 kubenswrapper[4754]: I1203 21:49:46.359253 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:46.359538 master-0 kubenswrapper[4754]: I1203 21:49:46.359430 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:46.359538 master-0 kubenswrapper[4754]: I1203 21:49:46.359436 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:46.359838 master-0 kubenswrapper[4754]: I1203 21:49:46.359797 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:46.360060 master-0 kubenswrapper[4754]: I1203 21:49:46.360016 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:46.360060 master-0 kubenswrapper[4754]: I1203 21:49:46.360038 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:46.361526 master-0 kubenswrapper[4754]: I1203 21:49:46.360139 4754 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:46.381609 master-0 kubenswrapper[4754]: I1203 21:49:46.378716 4754 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" podStartSLOduration=115.378684193 podStartE2EDuration="1m55.378684193s" podCreationTimestamp="2025-12-03 21:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:49:46.377538157 +0000 UTC m=+150.130635782" watchObservedRunningTime="2025-12-03 21:49:46.378684193 +0000 UTC m=+150.131781818" Dec 03 21:49:46.667877 master-0 kubenswrapper[4754]: I1203 21:49:46.666754 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr"] Dec 03 21:49:46.667877 master-0 kubenswrapper[4754]: I1203 21:49:46.667329 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db"] Dec 03 21:49:46.721826 master-0 kubenswrapper[4754]: I1203 21:49:46.721695 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh"] Dec 03 21:49:46.730183 master-0 kubenswrapper[4754]: W1203 21:49:46.730121 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8da5d44_680e_4169_abc6_607bdc37a64d.slice/crio-941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7 WatchSource:0}: Error finding container 941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7: Status 404 returned error can't find the container with id 941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7 Dec 03 21:49:46.741895 master-0 kubenswrapper[4754]: I1203 21:49:46.740780 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh"] Dec 03 21:49:46.741895 master-0 kubenswrapper[4754]: I1203 21:49:46.740830 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh"] Dec 03 21:49:46.741895 master-0 kubenswrapper[4754]: I1203 21:49:46.741150 4754 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm"] Dec 03 21:49:46.749682 master-0 kubenswrapper[4754]: W1203 21:49:46.749650 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdfbaebe_d655_4c1e_a039_08802c5c35c5.slice/crio-d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c WatchSource:0}: Error finding container d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c: Status 404 returned error can't find the container with id d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c Dec 03 21:49:46.752949 master-0 kubenswrapper[4754]: W1203 21:49:46.752922 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6976b503_87da_48fc_b097_d1b315fbee3f.slice/crio-5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8 WatchSource:0}: Error finding container 5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8: Status 404 returned error can't find the 
container with id 5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8 Dec 03 21:49:46.769498 master-0 kubenswrapper[4754]: W1203 21:49:46.769452 4754 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82055cfc_b4ce_4a00_a51d_141059947693.slice/crio-c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0 WatchSource:0}: Error finding container c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0: Status 404 returned error can't find the container with id c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0 Dec 03 21:49:47.364390 master-0 kubenswrapper[4754]: I1203 21:49:47.364329 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerStarted","Data":"941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7"} Dec 03 21:49:47.365677 master-0 kubenswrapper[4754]: I1203 21:49:47.365632 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerStarted","Data":"5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8"} Dec 03 21:49:47.367054 master-0 kubenswrapper[4754]: I1203 21:49:47.367021 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerStarted","Data":"c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0"} Dec 03 21:49:47.368991 master-0 kubenswrapper[4754]: I1203 21:49:47.368913 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerStarted","Data":"d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c"} Dec 03 21:49:47.370619 master-0 kubenswrapper[4754]: I1203 21:49:47.370563 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" event={"ID":"f59094ec-47dd-4547-ad41-b15a7933f461","Type":"ContainerStarted","Data":"ead64d1f04c9817f87eca391b24ce53dab18082c97baff6cee6cb8262295728f"} Dec 03 21:49:47.372850 master-0 kubenswrapper[4754]: I1203 21:49:47.372468 4754 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" event={"ID":"5f088999-ec66-402e-9634-8c762206d6b4","Type":"ContainerStarted","Data":"bff07e162c1a61e585cb91d0f10b7d6f531c3b6d280120c518b63384810a9b36"} Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: I1203 21:49:48.118223 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: I1203 21:49:48.118293 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: I1203 21:49:48.118326 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: I1203 21:49:48.118365 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: I1203 21:49:48.118398 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: E1203 21:49:48.118401 4754 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: I1203 21:49:48.118429 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: E1203 21:49:48.118483 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:52.118461477 +0000 UTC m=+155.871559092 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: I1203 21:49:48.118504 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: E1203 21:49:48.118535 4754 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: I1203 21:49:48.118560 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: E1203 21:49:48.118574 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:49:52.11856174 +0000 UTC m=+155.871659355 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:49:48.118576 master-0 kubenswrapper[4754]: E1203 21:49:48.118565 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118628 4754 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118652 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:52.118643453 +0000 UTC m=+155.871741068 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118628 4754 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118674 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:49:52.118668914 +0000 UTC m=+155.871766529 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118686 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:52.118680484 +0000 UTC m=+155.871778099 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "node-tuning-operator-tls" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118714 4754 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118726 4754 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118673 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118745 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:49:52.118735246 +0000 UTC m=+155.871832861 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118762 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:52.118754726 +0000 UTC m=+155.871852341 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:49:48.123388 master-0 kubenswrapper[4754]: E1203 21:49:48.118801 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:52.118792598 +0000 UTC m=+155.871890223 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:48.219389 master-0 kubenswrapper[4754]: I1203 21:49:48.219204 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:48.220756 master-0 kubenswrapper[4754]: E1203 21:49:48.220655 4754 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:49:48.225234 master-0 kubenswrapper[4754]: E1203 21:49:48.220874 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:52.220810731 +0000 UTC m=+155.973908346 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:49:52.181656 master-0 kubenswrapper[4754]: I1203 21:49:52.181560 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: I1203 21:49:52.181676 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: I1203 21:49:52.181727 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: I1203 21:49:52.181762 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: I1203 21:49:52.181812 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: I1203 21:49:52.181847 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: I1203 21:49:52.181872 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " 
pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: I1203 21:49:52.181895 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: E1203 21:49:52.182049 4754 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: E1203 21:49:52.182107 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.18208893 +0000 UTC m=+163.935186555 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: E1203 21:49:52.182160 4754 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: E1203 21:49:52.182183 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.182175133 +0000 UTC m=+163.935272758 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: E1203 21:49:52.182226 4754 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: E1203 21:49:52.182249 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.182241765 +0000 UTC m=+163.935339390 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: E1203 21:49:52.182294 4754 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:52.182446 master-0 kubenswrapper[4754]: E1203 21:49:52.182316 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.182309337 +0000 UTC m=+163.935406962 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:52.183211 master-0 kubenswrapper[4754]: E1203 21:49:52.182358 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 21:49:52.183211 master-0 kubenswrapper[4754]: E1203 21:49:52.182380 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.182374019 +0000 UTC m=+163.935471644 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "node-tuning-operator-tls" not found Dec 03 21:49:52.183211 master-0 kubenswrapper[4754]: E1203 21:49:52.182423 4754 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:49:52.183211 master-0 kubenswrapper[4754]: E1203 21:49:52.182448 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.182440252 +0000 UTC m=+163.935537877 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:49:52.183211 master-0 kubenswrapper[4754]: E1203 21:49:52.182493 4754 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:52.183211 master-0 kubenswrapper[4754]: E1203 21:49:52.182624 4754 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:49:52.183211 master-0 kubenswrapper[4754]: E1203 21:49:52.182655 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.182645418 +0000 UTC m=+163.935743043 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:49:52.183211 master-0 kubenswrapper[4754]: E1203 21:49:52.183150 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.183138384 +0000 UTC m=+163.936236009 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:52.286871 master-0 kubenswrapper[4754]: I1203 21:49:52.284373 4754 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:52.286871 master-0 kubenswrapper[4754]: E1203 21:49:52.284645 4754 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:49:52.286871 master-0 kubenswrapper[4754]: E1203 21:49:52.284713 4754 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.284692204 +0000 UTC m=+164.037789819 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:49:52.686413 master-0 kubenswrapper[4754]: I1203 21:49:52.685966 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:52.686896 master-0 kubenswrapper[4754]: I1203 21:49:52.686857 4754 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:53.565561 master-0 kubenswrapper[4754]: I1203 21:49:53.565295 4754 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 21:49:53.565739 master-0 systemd[1]: Stopping Kubernetes Kubelet... Dec 03 21:49:53.613945 master-0 systemd[1]: kubelet.service: Deactivated successfully. Dec 03 21:49:53.614284 master-0 systemd[1]: Stopped Kubernetes Kubelet. Dec 03 21:49:53.616455 master-0 systemd[1]: kubelet.service: Consumed 12.383s CPU time. Dec 03 21:49:53.635012 master-0 systemd[1]: Starting Kubernetes Kubelet... Dec 03 21:49:53.764143 master-0 kubenswrapper[9136]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 21:49:53.764143 master-0 kubenswrapper[9136]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 21:49:53.764143 master-0 kubenswrapper[9136]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 21:49:53.764143 master-0 kubenswrapper[9136]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 21:49:53.764143 master-0 kubenswrapper[9136]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 21:49:53.764143 master-0 kubenswrapper[9136]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 21:49:53.765569 master-0 kubenswrapper[9136]: I1203 21:49:53.764130 9136 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 21:49:53.770333 master-0 kubenswrapper[9136]: W1203 21:49:53.769917 9136 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 21:49:53.770333 master-0 kubenswrapper[9136]: W1203 21:49:53.770320 9136 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 21:49:53.770333 master-0 kubenswrapper[9136]: W1203 21:49:53.770327 9136 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 21:49:53.770333 master-0 kubenswrapper[9136]: W1203 21:49:53.770332 9136 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 21:49:53.770333 master-0 kubenswrapper[9136]: W1203 21:49:53.770339 9136 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 21:49:53.770333 master-0 kubenswrapper[9136]: W1203 21:49:53.770343 9136 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770348 9136 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770353 9136 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770358 9136 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770363 9136 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770368 9136 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770372 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770377 9136 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770381 9136 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770386 9136 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770392 9136 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770397 9136 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770401 9136 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770406 9136 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770410 9136 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770415 9136 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770419 9136 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 
21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770424 9136 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770428 9136 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770433 9136 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 21:49:53.770620 master-0 kubenswrapper[9136]: W1203 21:49:53.770437 9136 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770442 9136 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770447 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770451 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770456 9136 feature_gate.go:330] unrecognized feature gate: Example Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770461 9136 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770466 9136 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770471 9136 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770476 9136 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770483 9136 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770491 9136 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770498 9136 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770503 9136 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770508 9136 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770523 9136 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770529 9136 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770535 9136 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770539 9136 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770544 9136 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 21:49:53.771525 master-0 kubenswrapper[9136]: W1203 21:49:53.770549 9136 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770553 9136 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770558 9136 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770562 9136 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770567 9136 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770572 9136 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770578 9136 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770583 9136 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770589 9136 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770593 9136 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770597 9136 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770602 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770607 9136 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770614 9136 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
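Every `feature_gate.go:330` warning in this startup banner is the kubelet reporting a gate name it does not recognize: the rendered kubelet configuration carries OpenShift-level gates (GCPLabelsTags, RouteAdvertisements, UpgradeStatus, and so on) alongside the Kubernetes ones, and, as the log shows, the kubelet only warns about the unfamiliar names and keeps starting. Below is a toy sketch of that classify-and-warn pattern with an invented list of known gates; it illustrates the behaviour visible here, not the actual component-base feature-gate code.

```go
// Toy illustration of "warn on unrecognized gates, apply the known ones".
// The known-gate list and the sample spec are invented for the example.
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// Defaults for the gates this toy "component" understands.
var known = map[string]bool{
	"ValidatingAdmissionPolicy": false,
	"KMSv1":                     false,
	"CloudDualStackNodeIPs":     false,
}

func apply(spec string) map[string]bool {
	gates := map[string]bool{}
	for k, v := range known {
		gates[k] = v
	}
	for _, kv := range strings.Split(spec, ",") {
		name, val, _ := strings.Cut(kv, "=")
		on, err := strconv.ParseBool(val)
		if err != nil {
			fmt.Printf("W invalid value for feature gate %s: %q\n", name, val)
			continue
		}
		if _, ok := known[name]; !ok {
			// Mirrors the log above: unknown names produce a warning, not a failure.
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue
		}
		gates[name] = on
	}
	return gates
}

func main() {
	fmt.Println(apply("GCPLabelsTags=true,KMSv1=true,ValidatingAdmissionPolicy=true"))
}
```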
Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770620 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770626 9136 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770630 9136 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770635 9136 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770640 9136 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 21:49:53.772291 master-0 kubenswrapper[9136]: W1203 21:49:53.770644 9136 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: W1203 21:49:53.770648 9136 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: W1203 21:49:53.770653 9136 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: W1203 21:49:53.770658 9136 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: W1203 21:49:53.770663 9136 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: W1203 21:49:53.770668 9136 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: W1203 21:49:53.770672 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: W1203 21:49:53.770677 9136 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: W1203 21:49:53.770686 9136 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770858 9136 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770875 9136 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770887 9136 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770896 9136 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770904 9136 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770911 9136 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770919 9136 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770928 9136 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770934 9136 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770940 9136 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770947 9136 flags.go:64] 
FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770954 9136 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770960 9136 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 21:49:53.772975 master-0 kubenswrapper[9136]: I1203 21:49:53.770966 9136 flags.go:64] FLAG: --cgroup-root="" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.770972 9136 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.770978 9136 flags.go:64] FLAG: --client-ca-file="" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.770984 9136 flags.go:64] FLAG: --cloud-config="" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.770989 9136 flags.go:64] FLAG: --cloud-provider="" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.770995 9136 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771004 9136 flags.go:64] FLAG: --cluster-domain="" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771009 9136 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771017 9136 flags.go:64] FLAG: --config-dir="" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771022 9136 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771037 9136 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771050 9136 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771056 9136 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771063 9136 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771070 9136 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771076 9136 flags.go:64] FLAG: --contention-profiling="false" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771082 9136 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771089 9136 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771099 9136 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771105 9136 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771113 9136 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771119 9136 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771124 9136 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771130 9136 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771136 9136 flags.go:64] FLAG: --enable-server="true" Dec 03 
21:49:53.773683 master-0 kubenswrapper[9136]: I1203 21:49:53.771141 9136 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771150 9136 flags.go:64] FLAG: --event-burst="100" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771156 9136 flags.go:64] FLAG: --event-qps="50" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771161 9136 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771167 9136 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771173 9136 flags.go:64] FLAG: --eviction-hard="" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771181 9136 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771187 9136 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771195 9136 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771201 9136 flags.go:64] FLAG: --eviction-soft="" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771207 9136 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771213 9136 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771219 9136 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771224 9136 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771230 9136 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771235 9136 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771241 9136 flags.go:64] FLAG: --feature-gates="" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771249 9136 flags.go:64] FLAG: --file-check-frequency="20s" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771256 9136 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771262 9136 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771269 9136 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771276 9136 flags.go:64] FLAG: --healthz-port="10248" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771281 9136 flags.go:64] FLAG: --help="false" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771288 9136 flags.go:64] FLAG: --hostname-override="" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771294 9136 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771303 9136 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 21:49:53.774618 master-0 kubenswrapper[9136]: I1203 21:49:53.771309 9136 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771314 
9136 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771320 9136 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771326 9136 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771331 9136 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771336 9136 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771342 9136 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771348 9136 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771354 9136 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771360 9136 flags.go:64] FLAG: --kube-reserved="" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771366 9136 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771372 9136 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771377 9136 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771383 9136 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771389 9136 flags.go:64] FLAG: --lock-file="" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771396 9136 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771401 9136 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771407 9136 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771418 9136 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771424 9136 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771430 9136 flags.go:64] FLAG: --log-text-split-stream="false" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771435 9136 flags.go:64] FLAG: --logging-format="text" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771441 9136 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771448 9136 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771454 9136 flags.go:64] FLAG: --manifest-url="" Dec 03 21:49:53.775565 master-0 kubenswrapper[9136]: I1203 21:49:53.771459 9136 flags.go:64] FLAG: --manifest-url-header="" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771467 9136 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771474 9136 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771482 9136 flags.go:64] FLAG: 
--max-pods="110" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771488 9136 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771494 9136 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771502 9136 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771507 9136 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771514 9136 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771520 9136 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771526 9136 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771547 9136 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771553 9136 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771559 9136 flags.go:64] FLAG: --oom-score-adj="-999" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771565 9136 flags.go:64] FLAG: --pod-cidr="" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771570 9136 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fff930cf757e23d388d86d05942b76e44d3bda5e387b299c239e4d12545d26dd" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771581 9136 flags.go:64] FLAG: --pod-manifest-path="" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771587 9136 flags.go:64] FLAG: --pod-max-pids="-1" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771594 9136 flags.go:64] FLAG: --pods-per-core="0" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771599 9136 flags.go:64] FLAG: --port="10250" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771606 9136 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771612 9136 flags.go:64] FLAG: --provider-id="" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771618 9136 flags.go:64] FLAG: --qos-reserved="" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771624 9136 flags.go:64] FLAG: --read-only-port="10255" Dec 03 21:49:53.776429 master-0 kubenswrapper[9136]: I1203 21:49:53.771630 9136 flags.go:64] FLAG: --register-node="true" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771635 9136 flags.go:64] FLAG: --register-schedulable="true" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771641 9136 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771653 9136 flags.go:64] FLAG: --registry-burst="10" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771659 9136 flags.go:64] FLAG: --registry-qps="5" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771665 9136 flags.go:64] FLAG: --reserved-cpus="" Dec 03 21:49:53.777228 master-0 
kubenswrapper[9136]: I1203 21:49:53.771672 9136 flags.go:64] FLAG: --reserved-memory="" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771680 9136 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771686 9136 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771692 9136 flags.go:64] FLAG: --rotate-certificates="false" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771698 9136 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771703 9136 flags.go:64] FLAG: --runonce="false" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771709 9136 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771714 9136 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771720 9136 flags.go:64] FLAG: --seccomp-default="false" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771728 9136 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771734 9136 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771740 9136 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771746 9136 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771752 9136 flags.go:64] FLAG: --storage-driver-password="root" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771757 9136 flags.go:64] FLAG: --storage-driver-secure="false" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771762 9136 flags.go:64] FLAG: --storage-driver-table="stats" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771783 9136 flags.go:64] FLAG: --storage-driver-user="root" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771789 9136 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771794 9136 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 03 21:49:53.777228 master-0 kubenswrapper[9136]: I1203 21:49:53.771800 9136 flags.go:64] FLAG: --system-cgroups="" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771806 9136 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771816 9136 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771823 9136 flags.go:64] FLAG: --tls-cert-file="" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771829 9136 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771839 9136 flags.go:64] FLAG: --tls-min-version="" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771845 9136 flags.go:64] FLAG: --tls-private-key-file="" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771851 9136 flags.go:64] FLAG: --topology-manager-policy="none" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 
21:49:53.771856 9136 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771862 9136 flags.go:64] FLAG: --topology-manager-scope="container" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771867 9136 flags.go:64] FLAG: --v="2" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771876 9136 flags.go:64] FLAG: --version="false" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771884 9136 flags.go:64] FLAG: --vmodule="" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771892 9136 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: I1203 21:49:53.771898 9136 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: W1203 21:49:53.772080 9136 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: W1203 21:49:53.772090 9136 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: W1203 21:49:53.772095 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: W1203 21:49:53.772100 9136 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: W1203 21:49:53.772105 9136 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: W1203 21:49:53.772109 9136 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: W1203 21:49:53.772114 9136 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: W1203 21:49:53.772121 9136 feature_gate.go:330] unrecognized feature gate: Example Dec 03 21:49:53.778089 master-0 kubenswrapper[9136]: W1203 21:49:53.772125 9136 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772130 9136 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772135 9136 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772140 9136 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772145 9136 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772150 9136 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772155 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772160 9136 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772164 9136 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772169 9136 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 21:49:53.778830 master-0 
kubenswrapper[9136]: W1203 21:49:53.772174 9136 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772179 9136 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772184 9136 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772188 9136 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772192 9136 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772197 9136 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772201 9136 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772206 9136 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772211 9136 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772215 9136 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 21:49:53.778830 master-0 kubenswrapper[9136]: W1203 21:49:53.772221 9136 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772228 9136 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772234 9136 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
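The long `flags.go:64] FLAG: ...` dump above is the kubelet echoing every command-line flag with its effective value, which is useful for confirming what the rendered systemd unit actually passed (for example `--config=/etc/kubernetes/kubelet.conf`, `--system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi`, and `--register-with-taints=node-role.kubernetes.io/master=:NoSchedule`, the same flags the deprecation notices above say belong in the config file). The small self-contained sketch below pulls those name/value pairs back out of a saved journal excerpt; the input file name is hypothetical.

```go
// flagdump.go: recover the kubelet's effective flag values from the
// `flags.go:64] FLAG: --name="value"` lines in a saved journal excerpt.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

// Matches entries such as: FLAG: --node-ip="192.168.32.10"
var flagRe = regexp.MustCompile(`FLAG: --([A-Za-z0-9-]+)="(.*?)"`)

func main() {
	f, err := os.Open("kubelet-journal.txt") // hypothetical excerpt of the log above
	if err != nil {
		panic(err)
	}
	defer f.Close()

	flags := map[string]string{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines here are very long
	for sc.Scan() {
		for _, m := range flagRe.FindAllStringSubmatch(sc.Text(), -1) {
			flags[m[1]] = m[2]
		}
	}
	if err := sc.Err(); err != nil {
		panic(err)
	}

	names := make([]string, 0, len(flags))
	for n := range flags {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Printf("--%s=%s\n", n, flags[n])
	}
}
```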
Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772240 9136 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772245 9136 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772250 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772255 9136 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772260 9136 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772265 9136 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772271 9136 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772276 9136 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772283 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772288 9136 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772297 9136 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772301 9136 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772306 9136 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772310 9136 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772316 9136 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772320 9136 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772325 9136 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 21:49:53.779462 master-0 kubenswrapper[9136]: W1203 21:49:53.772330 9136 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772335 9136 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772340 9136 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772344 9136 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772351 9136 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
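In the `feature_gate.go:386` entry just below, the kubelet prints the gate set it finally resolved, in Go's `map[...]` string form, with only CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders, KMSv1 and ValidatingAdmissionPolicy set to true. The short sketch below turns that summary line back into a `map[string]bool`; the input string is a shortened copy of that entry.

```go
// Parse a "feature gates: {map[Name:bool ...]}" summary line into a map[string]bool.
package main

import (
	"fmt"
	"strconv"
	"strings"
)

func parseGates(line string) (map[string]bool, error) {
	start := strings.Index(line, "map[")
	end := strings.LastIndex(line, "]")
	if start < 0 || end < start {
		return nil, fmt.Errorf("no map[...] in %q", line)
	}
	gates := map[string]bool{}
	for _, field := range strings.Fields(line[start+len("map[") : end]) {
		name, val, ok := strings.Cut(field, ":")
		if !ok {
			return nil, fmt.Errorf("bad entry %q", field)
		}
		on, err := strconv.ParseBool(val)
		if err != nil {
			return nil, err
		}
		gates[name] = on
	}
	return gates, nil
}

func main() {
	// Shortened copy of the feature_gate.go:386 entry below.
	line := `feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true KMSv1:true ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}`
	gates, err := parseGates(line)
	if err != nil {
		panic(err)
	}
	for name, on := range gates {
		if on {
			fmt.Println("enabled:", name)
		}
	}
}
```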
Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772357 9136 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772361 9136 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772366 9136 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772372 9136 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772378 9136 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772383 9136 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772387 9136 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772391 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772396 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772401 9136 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772405 9136 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772410 9136 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772414 9136 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772418 9136 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 21:49:53.780108 master-0 kubenswrapper[9136]: W1203 21:49:53.772422 9136 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.772427 9136 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.772432 9136 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.772437 9136 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.772446 9136 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: I1203 21:49:53.772455 9136 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 
21:49:53.780726 master-0 kubenswrapper[9136]: I1203 21:49:53.779802 9136 server.go:491] "Kubelet version" kubeletVersion="v1.31.13" Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: I1203 21:49:53.779834 9136 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.779933 9136 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.779940 9136 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.779945 9136 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.779950 9136 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.779956 9136 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.779962 9136 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.779966 9136 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 21:49:53.780726 master-0 kubenswrapper[9136]: W1203 21:49:53.779970 9136 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.779974 9136 feature_gate.go:330] unrecognized feature gate: Example Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.779978 9136 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.779982 9136 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.779986 9136 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.779990 9136 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.779994 9136 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.779997 9136 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780001 9136 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780005 9136 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780009 9136 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780013 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780017 9136 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780021 9136 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780025 9136 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 
21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780029 9136 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780033 9136 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780037 9136 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780041 9136 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780046 9136 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780049 9136 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 21:49:53.781242 master-0 kubenswrapper[9136]: W1203 21:49:53.780054 9136 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780060 9136 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780064 9136 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780070 9136 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780074 9136 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780078 9136 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780082 9136 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780086 9136 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780090 9136 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780094 9136 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780097 9136 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780101 9136 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780105 9136 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780110 9136 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780114 9136 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780118 9136 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780123 9136 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780127 9136 feature_gate.go:330] 
unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780131 9136 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780135 9136 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 21:49:53.781924 master-0 kubenswrapper[9136]: W1203 21:49:53.780139 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780142 9136 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780146 9136 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780151 9136 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780156 9136 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780160 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780164 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780168 9136 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780173 9136 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780177 9136 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780181 9136 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780186 9136 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780193 9136 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780198 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780202 9136 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780207 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780212 9136 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780218 9136 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 21:49:53.782572 master-0 kubenswrapper[9136]: W1203 21:49:53.780222 9136 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780226 9136 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780231 9136 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780234 9136 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780238 9136 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780242 9136 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: I1203 21:49:53.780249 9136 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780428 9136 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780435 9136 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780439 9136 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780443 9136 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780447 9136 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780451 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780455 9136 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780460 9136 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 21:49:53.783268 master-0 kubenswrapper[9136]: W1203 21:49:53.780464 9136 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780469 9136 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780473 9136 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780477 9136 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 
21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780482 9136 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780487 9136 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780490 9136 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780494 9136 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780498 9136 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780503 9136 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780509 9136 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780513 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780517 9136 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780521 9136 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780541 9136 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780548 9136 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780553 9136 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780558 9136 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780562 9136 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 21:49:53.783746 master-0 kubenswrapper[9136]: W1203 21:49:53.780569 9136 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
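
Each parsing pass ends with an I-level feature_gate.go:386 line that prints the effective gate map; the identical map appears three times in this excerpt (at 21:49:53.772455, .780249 and .780762), once per pass. A small sketch, under the same capture-file assumption as above and assuming the map is not split across a wrapped line, for turning that Go-style map[...] dump into a Python dict:

    import re

    # The effective gate set is printed as a Go map, e.g.
    #   feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true ... VolumeAttributesClass:false]}
    MAP_LINE = re.compile(r"feature_gate\.go:386\] feature gates: \{map\[(.*?)\]\}")

    def parse_gate_map(journal_text: str) -> dict:
        """Return the first printed gate map as {name: bool}; {} if none is found."""
        m = MAP_LINE.search(journal_text)
        if m is None:
            return {}
        pairs = (item.rsplit(":", 1) for item in m.group(1).split())
        return {name: value == "true" for name, value in pairs}

    # Usage (same hypothetical capture file as above):
    #   gates = parse_gate_map(open("kubelet.log").read())
    #   gates["ValidatingAdmissionPolicy"]  -> True
    #   gates["NodeSwap"]                   -> False
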
Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780576 9136 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780581 9136 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780587 9136 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780591 9136 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780595 9136 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780600 9136 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780604 9136 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780608 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780612 9136 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780616 9136 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780620 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780624 9136 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780627 9136 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780631 9136 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780635 9136 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780639 9136 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780643 9136 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780647 9136 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780651 9136 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 21:49:53.784362 master-0 kubenswrapper[9136]: W1203 21:49:53.780655 9136 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780659 9136 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780663 9136 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780667 9136 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780671 9136 feature_gate.go:330] unrecognized feature 
gate: ExternalOIDC Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780675 9136 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780679 9136 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780684 9136 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780688 9136 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780692 9136 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780696 9136 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780700 9136 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780704 9136 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780708 9136 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780712 9136 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780716 9136 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780719 9136 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780724 9136 feature_gate.go:330] unrecognized feature gate: Example Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780728 9136 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780733 9136 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 21:49:53.785109 master-0 kubenswrapper[9136]: W1203 21:49:53.780738 9136 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
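
Mixed in with the unrecognized-gate noise are two other notice classes: feature_gate.go:353 for gates that are already GA (ValidatingAdmissionPolicy, CloudDualStackNodeIPs, DisableKubeletCloudCredentialProviders) and feature_gate.go:351 for deprecated gates (KMSv1), both flagged for removal in a future release. A sketch, same assumptions as above, that lists which explicitly set gates fall into those categories:

    import re

    # Per the lines above: feature_gate.go:351 marks deprecated gates, :353 marks GA gates,
    # and both warn that the gate "will be removed in a future release".
    NOTICE = re.compile(
        r"feature_gate\.go:(?:351|353)\] Setting (deprecated|GA) feature gate (\w+)=(?:true|false)"
    )

    def removal_candidates(journal_text: str) -> set:
        """Return {(gate_name, 'GA' | 'deprecated')} for explicitly set gates slated for removal."""
        return {(m.group(2), m.group(1)) for m in NOTICE.finditer(journal_text)}

    # For this excerpt the set contains KMSv1 (deprecated) plus CloudDualStackNodeIPs,
    # DisableKubeletCloudCredentialProviders and ValidatingAdmissionPolicy (GA).
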
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: W1203 21:49:53.780744 9136 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: W1203 21:49:53.780748 9136 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: W1203 21:49:53.780753 9136 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: W1203 21:49:53.780756 9136 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: I1203 21:49:53.780762 9136 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: I1203 21:49:53.780969 9136 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: I1203 21:49:53.782689 9136 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: I1203 21:49:53.782759 9136 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: I1203 21:49:53.782991 9136 server.go:997] "Starting client certificate rotation"
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: I1203 21:49:53.783001 9136 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: I1203 21:49:53.783262 9136 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-04 21:39:22 +0000 UTC, rotation deadline is 2025-12-04 16:24:42.22568615 +0000 UTC
Dec 03 21:49:53.785728 master-0 kubenswrapper[9136]: I1203 21:49:53.783347 9136 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h34m48.442343938s for next certificate rotation
Dec 03 21:49:53.786291 master-0 kubenswrapper[9136]: I1203 21:49:53.783618 9136 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 21:49:53.786291 master-0 kubenswrapper[9136]: I1203 21:49:53.784903 9136 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 21:49:53.790360 master-0 kubenswrapper[9136]: I1203 21:49:53.790328 9136 log.go:25] "Validated CRI v1 runtime API"
Dec 03 21:49:53.792552 master-0 kubenswrapper[9136]: I1203 21:49:53.792500 9136 log.go:25] "Validated CRI v1 image API"
Dec 03 21:49:53.794053 master-0 kubenswrapper[9136]: I1203 21:49:53.793829 9136 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 03 21:49:53.799694 master-0 kubenswrapper[9136]: I1203 21:49:53.799633 9136 fs.go:135] Filesystem UUIDs: map[3c671a63-22b6-47f8-bf0c-b9acbe18afb0:/dev/vda3 7B77-95E7:/dev/vda2
910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Dec 03 21:49:53.800065 master-0 kubenswrapper[9136]: I1203 21:49:53.799676 9136 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688/userdata/shm major:0 minor:335 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0d649d9574e5890c562a660d0fd6b1f641cfe7fa0750549ecdfb69ce6e896311/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0d649d9574e5890c562a660d0fd6b1f641cfe7fa0750549ecdfb69ce6e896311/userdata/shm major:0 minor:324 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977/userdata/shm major:0 minor:165 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/19084c46f9e406b8c5bea90a33fad5659fbc69d12b59150d714d618ccc77cc8a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/19084c46f9e406b8c5bea90a33fad5659fbc69d12b59150d714d618ccc77cc8a/userdata/shm major:0 minor:319 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2d6870e12c7cebea9481ca6d5d2722804205f20e81d572c38201523c901437c2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2d6870e12c7cebea9481ca6d5d2722804205f20e81d572c38201523c901437c2/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b/userdata/shm major:0 minor:322 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4d7935b96cb72a5574553c21dccf400a98363b1a22ca7819f889a07cdcea7c8a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4d7935b96cb72a5574553c21dccf400a98363b1a22ca7819f889a07cdcea7c8a/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9/userdata/shm major:0 minor:173 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8/userdata/shm major:0 minor:359 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/671a5ab15340eb87443f5b304e7164498b0964947204dd22840f55587629291e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/671a5ab15340eb87443f5b304e7164498b0964947204dd22840f55587629291e/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00/userdata/shm major:0 minor:127 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5/userdata/shm major:0 minor:334 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7/userdata/shm major:0 minor:361 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bff07e162c1a61e585cb91d0f10b7d6f531c3b6d280120c518b63384810a9b36/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bff07e162c1a61e585cb91d0f10b7d6f531c3b6d280120c518b63384810a9b36/userdata/shm major:0 minor:353 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0/userdata/shm major:0 minor:355 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d4dcd3a3c921a750603eee714d78acc1c3287857fa9a3fa51cbb8e99aa49ea09/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d4dcd3a3c921a750603eee714d78acc1c3287857fa9a3fa51cbb8e99aa49ea09/userdata/shm major:0 minor:318 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c/userdata/shm major:0 minor:354 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ddfda6c4073bc139d7276df8107e2ee077a6d14642bd3a1ecd7361207cad43d9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ddfda6c4073bc139d7276df8107e2ee077a6d14642bd3a1ecd7361207cad43d9/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e/userdata/shm major:0 minor:147 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dfe68018d93ee1e8ecdb8139e17ee4db21f7dabb8af64d6ea19c76db0b09dab1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dfe68018d93ee1e8ecdb8139e17ee4db21f7dabb8af64d6ea19c76db0b09dab1/userdata/shm major:0 minor:189 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95/userdata/shm major:0 minor:123 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ead64d1f04c9817f87eca391b24ce53dab18082c97baff6cee6cb8262295728f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ead64d1f04c9817f87eca391b24ce53dab18082c97baff6cee6cb8262295728f/userdata/shm major:0 minor:360 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/04f5fc52-4ec2-48c3-8441-2b15ad632233/volumes/kubernetes.io~projected/kube-api-access-tgddm:{mountpoint:/var/lib/kubelet/pods/04f5fc52-4ec2-48c3-8441-2b15ad632233/volumes/kubernetes.io~projected/kube-api-access-tgddm major:0 minor:332 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~projected/kube-api-access major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~secret/serving-cert major:0 minor:287 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/kube-api-access-2kfg5:{mountpoint:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/kube-api-access-2kfg5 major:0 minor:304 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/134c10ef-9f37-4a77-8e8b-4f8326bc8f40/volumes/kubernetes.io~projected/kube-api-access-fkwjb:{mountpoint:/var/lib/kubelet/pods/134c10ef-9f37-4a77-8e8b-4f8326bc8f40/volumes/kubernetes.io~projected/kube-api-access-fkwjb major:0 minor:302 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~projected/kube-api-access-lsr8k:{mountpoint:/var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~projected/kube-api-access-lsr8k major:0 minor:188 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~secret/webhook-cert major:0 minor:187 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~projected/kube-api-access-p4mbz:{mountpoint:/var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~projected/kube-api-access-p4mbz major:0 minor:164 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:163 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9/volumes/kubernetes.io~projected/kube-api-access major:0 minor:67 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:314 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/kube-api-access-4fsxc:{mountpoint:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/kube-api-access-4fsxc major:0 minor:317 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~projected/kube-api-access-jvw7p:{mountpoint:/var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~projected/kube-api-access-jvw7p major:0 minor:301 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~secret/serving-cert major:0 minor:296 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~projected/kube-api-access-nvkz7:{mountpoint:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~projected/kube-api-access-nvkz7 major:0 minor:172 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:171 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~projected/kube-api-access-94vvg:{mountpoint:/var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~projected/kube-api-access-94vvg major:0 minor:316 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~secret/serving-cert major:0 minor:297 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~projected/kube-api-access-vx6rt:{mountpoint:/var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~projected/kube-api-access-vx6rt major:0 minor:311 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~secret/serving-cert major:0 minor:292 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~projected/kube-api-access-j6m8f:{mountpoint:/var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~projected/kube-api-access-j6m8f major:0 minor:308 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~secret/serving-cert major:0 minor:294 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/812401c0-d1ac-4857-b939-217b7b07f8bc/volumes/kubernetes.io~projected/kube-api-access-mxl5r:{mountpoint:/var/lib/kubelet/pods/812401c0-d1ac-4857-b939-217b7b07f8bc/volumes/kubernetes.io~projected/kube-api-access-mxl5r major:0 minor:151 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/814c8acf-fb8d-4f57-b8db-21304402c1f1/volumes/kubernetes.io~projected/kube-api-access-x7fsz:{mountpoint:/var/lib/kubelet/pods/814c8acf-fb8d-4f57-b8db-21304402c1f1/volumes/kubernetes.io~projected/kube-api-access-x7fsz major:0 minor:333 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~projected/kube-api-access-4clxk:{mountpoint:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~projected/kube-api-access-4clxk major:0 minor:306 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/etcd-client major:0 minor:291 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/serving-cert major:0 minor:288 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~projected/kube-api-access-bvmxp:{mountpoint:/var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~projected/kube-api-access-bvmxp major:0 minor:66 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a4399d20-f9a6-4ab1-86be-e2845394eaba/volumes/kubernetes.io~projected/kube-api-access-2krlg:{mountpoint:/var/lib/kubelet/pods/a4399d20-f9a6-4ab1-86be-e2845394eaba/volumes/kubernetes.io~projected/kube-api-access-2krlg major:0 minor:310 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9a3f403-a742-4977-901a-cf4a8eb7df5a/volumes/kubernetes.io~projected/kube-api-access-575vn:{mountpoint:/var/lib/kubelet/pods/a9a3f403-a742-4977-901a-cf4a8eb7df5a/volumes/kubernetes.io~projected/kube-api-access-575vn major:0 minor:300 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/add88bf0-c88d-427d-94bb-897e088a1378/volumes/kubernetes.io~projected/kube-api-access-hkhtr:{mountpoint:/var/lib/kubelet/pods/add88bf0-c88d-427d-94bb-897e088a1378/volumes/kubernetes.io~projected/kube-api-access-hkhtr major:0 minor:305 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~projected/kube-api-access-t4c4r:{mountpoint:/var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~projected/kube-api-access-t4c4r major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b8194009-3743-4da7-baf1-f9bb0afd6187/volumes/kubernetes.io~projected/kube-api-access-rd9vn:{mountpoint:/var/lib/kubelet/pods/b8194009-3743-4da7-baf1-f9bb0afd6187/volumes/kubernetes.io~projected/kube-api-access-rd9vn major:0 minor:122 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bebd69d2-5b0f-4b66-8722-d6861eba3e12/volumes/kubernetes.io~projected/kube-api-access-n7qqf:{mountpoint:/var/lib/kubelet/pods/bebd69d2-5b0f-4b66-8722-d6861eba3e12/volumes/kubernetes.io~projected/kube-api-access-n7qqf major:0 
minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~projected/kube-api-access major:0 minor:315 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~secret/serving-cert major:0 minor:290 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~projected/kube-api-access-pm2l8:{mountpoint:/var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~projected/kube-api-access-pm2l8 major:0 minor:312 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:293 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e6d5d61a-c5de-4619-9afb-7fad63ba0525/volumes/kubernetes.io~projected/kube-api-access-s68fd:{mountpoint:/var/lib/kubelet/pods/e6d5d61a-c5de-4619-9afb-7fad63ba0525/volumes/kubernetes.io~projected/kube-api-access-s68fd major:0 minor:281 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~projected/kube-api-access-mzklx:{mountpoint:/var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~projected/kube-api-access-mzklx major:0 minor:313 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~secret/serving-cert major:0 minor:295 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~projected/kube-api-access major:0 minor:298 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~secret/serving-cert major:0 minor:289 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ffad8fc8-4378-44de-8864-dd2f666ade68/volumes/kubernetes.io~projected/kube-api-access-xcq9j:{mountpoint:/var/lib/kubelet/pods/ffad8fc8-4378-44de-8864-dd2f666ade68/volumes/kubernetes.io~projected/kube-api-access-xcq9j major:0 minor:146 fsType:tmpfs blockSize:0} overlay_0-104:{mountpoint:/var/lib/containers/storage/overlay/0b835f026c057b4812acdde5289b1861c7b6aa73bc7b49ce7812662f18d1fdad/merged major:0 minor:104 fsType:overlay blockSize:0} overlay_0-106:{mountpoint:/var/lib/containers/storage/overlay/2a133686b0389b2b972239064f13e4628899632ca36309c2d1918173899e7102/merged major:0 minor:106 fsType:overlay blockSize:0} overlay_0-120:{mountpoint:/var/lib/containers/storage/overlay/fef38fb1272fc4ca3411f86b4e86a029fd771892d80dce51fa3b9184d5bc115d/merged major:0 minor:120 fsType:overlay blockSize:0} overlay_0-125:{mountpoint:/var/lib/containers/storage/overlay/c78474e73745df18fc4119e551e0a7ee0a0f86e3be2eea8c2995b259c9b2e7c2/merged major:0 minor:125 fsType:overlay blockSize:0} 
overlay_0-129:{mountpoint:/var/lib/containers/storage/overlay/ce5fceb8ef24fb11f6bd52653b19cd0a25ee077f2e896cadc9daefda4a7c7d13/merged major:0 minor:129 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/670172869ec86f15fc22c4d3cc63f01f4f7c54bfd1e6688b7d38bdf8fc04a8f9/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/b7a9fb4b42f21fdc24b3ffe0478d437e3234ccee7d919ba71666c62fb57e50bc/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-144:{mountpoint:/var/lib/containers/storage/overlay/3f2c80f9d36c1214227d6c455044025129b5bdb1db3ccf98b1e430095e0ba503/merged major:0 minor:144 fsType:overlay blockSize:0} overlay_0-149:{mountpoint:/var/lib/containers/storage/overlay/f3b3a86dfaad08c9f6fd6775630eb930f4f331fbab06d418ebca7a7c0f1c88e8/merged major:0 minor:149 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/70e163bd10a7208244060625759610af887d2dc1579c6d63af517e075d054fdc/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/48709c21f3b233abb92173be272ef9a899bb14db874090ff0da5df2752ba1852/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/48931cb0ccc6d2e83f761bc34f71f6c7739f9c81e58294cab3cb582976d50501/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-157:{mountpoint:/var/lib/containers/storage/overlay/f6282a842692f0f551a0715a0573943cfae876484143693c96c1b50dfd7c0e04/merged major:0 minor:157 fsType:overlay blockSize:0} overlay_0-167:{mountpoint:/var/lib/containers/storage/overlay/ae75840c61385c5258e4899df0519b4fc1bd42057cbb230354e66b75d367058f/merged major:0 minor:167 fsType:overlay blockSize:0} overlay_0-169:{mountpoint:/var/lib/containers/storage/overlay/3d2ea3ba6aa2eedd59e0ddd94af4b0be82da6c6f6a993fd08bc02f95d3e2f755/merged major:0 minor:169 fsType:overlay blockSize:0} overlay_0-175:{mountpoint:/var/lib/containers/storage/overlay/9657116123c72154776220102c999d69450f0f27e96206be1cfc89776e64341f/merged major:0 minor:175 fsType:overlay blockSize:0} overlay_0-183:{mountpoint:/var/lib/containers/storage/overlay/04494bcbf20385417053af1b64a08fb3152ffe5dc436ef336583e68a5e70cf78/merged major:0 minor:183 fsType:overlay blockSize:0} overlay_0-185:{mountpoint:/var/lib/containers/storage/overlay/d8c35ed9e289828a058a37888a3f608e2b8a81f948b18171e9fa3bf3081cc96b/merged major:0 minor:185 fsType:overlay blockSize:0} overlay_0-191:{mountpoint:/var/lib/containers/storage/overlay/b55d61767c34d3779531161ebb5ce207629fd0223f94d80090602ec8b2346372/merged major:0 minor:191 fsType:overlay blockSize:0} overlay_0-193:{mountpoint:/var/lib/containers/storage/overlay/9436f93540c593f61efefffeee729587d3c39a874576f9824beef1963400f3a8/merged major:0 minor:193 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/afc78a9499b6b05c49b45a6b9b841a267b8774d2cadcb00b6441f083e3b9a520/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-197:{mountpoint:/var/lib/containers/storage/overlay/2624ff8655e84b1b9d16a3c838e7bcc6fadb6ad147a54812b33a1d091f6d4000/merged major:0 minor:197 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/9075e26dd4918d536c65d7dfb72f454bac15a5820cbc32b196010ed7e78e1c37/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-207:{mountpoint:/var/lib/containers/storage/overlay/cecadc37bfe270f9ea4fbfdebb5738483ba1bdb8d11f7725699a9dd02e1a9717/merged 
major:0 minor:207 fsType:overlay blockSize:0} overlay_0-215:{mountpoint:/var/lib/containers/storage/overlay/f6bbc1c38ac949d8b86ee9c49f228896209d14217f05b7f4df0381546670f512/merged major:0 minor:215 fsType:overlay blockSize:0} overlay_0-216:{mountpoint:/var/lib/containers/storage/overlay/8dbaf00860ded0ccc1c8c68bb619e6eac56b8975a3ecd7e149bc177d9e3babc5/merged major:0 minor:216 fsType:overlay blockSize:0} overlay_0-234:{mountpoint:/var/lib/containers/storage/overlay/a4a92a504395e42a41d06894832717265578006fe34fc39609125eb1c2b88a77/merged major:0 minor:234 fsType:overlay blockSize:0} overlay_0-242:{mountpoint:/var/lib/containers/storage/overlay/ed5929c73cfe6aee4c73efbeb1ee350fe3a9fdc038927b1c2a4f44531981f251/merged major:0 minor:242 fsType:overlay blockSize:0} overlay_0-250:{mountpoint:/var/lib/containers/storage/overlay/c5a1f58678f64ba6c7c48de01046065da1042e4ad5fa3012c036642d7bbcec9f/merged major:0 minor:250 fsType:overlay blockSize:0} overlay_0-258:{mountpoint:/var/lib/containers/storage/overlay/eb34961304626bdb305888ccd53ff1d7fda87f1e17b287f3165305d2e5a99b64/merged major:0 minor:258 fsType:overlay blockSize:0} overlay_0-266:{mountpoint:/var/lib/containers/storage/overlay/b445c92596425989acedddffc5e8d685ca202e2ed201157087e8ea57e771db2c/merged major:0 minor:266 fsType:overlay blockSize:0} overlay_0-268:{mountpoint:/var/lib/containers/storage/overlay/391629bf5309395cf5220ffd2d0431e657f46b8c978e3c564a5c70e7075fcf43/merged major:0 minor:268 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/f9de57d207775bc31331b29eccbafc5435a1b2ade0950320c9c8e0b7a91a8577/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-282:{mountpoint:/var/lib/containers/storage/overlay/4b45a9cf3cd3696113fd5ef2be01012b560625e7a2d4a640ce7d63d86f0828e2/merged major:0 minor:282 fsType:overlay blockSize:0} overlay_0-325:{mountpoint:/var/lib/containers/storage/overlay/491d62c17bbf975642a16104c22aeca1e5f9471f6e20e437a7fff7f26b536843/merged major:0 minor:325 fsType:overlay blockSize:0} overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/5f9efb85b82089ad9e6dc5101304368f65b4828c11598f9e2d6c25077990d94a/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-331:{mountpoint:/var/lib/containers/storage/overlay/d22c0d96ffa4b3a76bd4cde8c8b73eca758897759ab324ca141f6a1da2631a43/merged major:0 minor:331 fsType:overlay blockSize:0} overlay_0-338:{mountpoint:/var/lib/containers/storage/overlay/442ff403c23d374db51dd6e266ecb381a14c3ca42957ce4647c57421ca48ea6a/merged major:0 minor:338 fsType:overlay blockSize:0} overlay_0-340:{mountpoint:/var/lib/containers/storage/overlay/eec39460c398a0ab3c2f75ee9c8b9750a0cb42ee3910f03cca59b647a4c9b2d0/merged major:0 minor:340 fsType:overlay blockSize:0} overlay_0-342:{mountpoint:/var/lib/containers/storage/overlay/74bcbe88e852483b8ddbae485282b7e996612893370f341ead0e70911e6fb8cc/merged major:0 minor:342 fsType:overlay blockSize:0} overlay_0-344:{mountpoint:/var/lib/containers/storage/overlay/07f77ee5885c413c8655244f5024ed8f02a016f470283798088fc6a0075351ce/merged major:0 minor:344 fsType:overlay blockSize:0} overlay_0-365:{mountpoint:/var/lib/containers/storage/overlay/c20d70107a41357862503fe8119a772dc12bfb905128ffc55cd4b0c3fa1fdf13/merged major:0 minor:365 fsType:overlay blockSize:0} overlay_0-367:{mountpoint:/var/lib/containers/storage/overlay/92836f0969e4f5c5a7e9f48ee95c144ab95f2be5a7629fa831315b74d198775f/merged major:0 minor:367 fsType:overlay blockSize:0} 
overlay_0-369:{mountpoint:/var/lib/containers/storage/overlay/131762c92a195cfc40f1b154ec252900e54c22cb4d7393690cf866419282ce6b/merged major:0 minor:369 fsType:overlay blockSize:0} overlay_0-371:{mountpoint:/var/lib/containers/storage/overlay/ced1b2fb93774536673c0a838679b65783435dd49a16eb2120e630bc760e169c/merged major:0 minor:371 fsType:overlay blockSize:0} overlay_0-373:{mountpoint:/var/lib/containers/storage/overlay/bb91aa3247cc91d349c02bb268333b9e911fc85be6b73108352b8547af53f0c4/merged major:0 minor:373 fsType:overlay blockSize:0} overlay_0-375:{mountpoint:/var/lib/containers/storage/overlay/c8d9e3cc6f4825415f76ae06b60e7f449cd2dd00a6d497b0879587a09a535edc/merged major:0 minor:375 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/8139ba9234997e3b1c4ee7b5c2fb95591ae44c9f6e7af10042450918cc458dab/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/4d7c88dd821ee0fd45a3d3a3ff0c66342085549b1c90784945db8ce38d7b489c/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/0045cd701bfa5cc1770b8f7150819ebb90b85a90de2785c0a82debfbbcaec268/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/f2b8b1a80178106765403cff9ebb04424a4436ba2b927b6a4200cef9199337fb/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/156232b9606db3a64c78d8f0c79f0cbd4eab7aca30e01afc2be1714f3307ed7c/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/6a5889e75ea70adf3f7acf9c8e161d969b5764f45e222ab8b0ebd29d18618a49/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/2c822a3ee7fb92cb9518a1f3b56b8b9d71cb8ff6d400d506ee98fba57653a87b/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-68:{mountpoint:/var/lib/containers/storage/overlay/ec4d900cfc11a315625158ca3c288f91d1b62b6aceae3c1e041c71aacb91d756/merged major:0 minor:68 fsType:overlay blockSize:0} overlay_0-70:{mountpoint:/var/lib/containers/storage/overlay/116673f47535c1c3675f0e24b6ba42fa70029da50ad997354f7113cbd7f7ccfb/merged major:0 minor:70 fsType:overlay blockSize:0} overlay_0-72:{mountpoint:/var/lib/containers/storage/overlay/35ee205ab705c3dbe110a3c5307f13e75b16fe0623f49f794718beed8320ad57/merged major:0 minor:72 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/4ffb30a769aba94ab213aadb559904412abd0b791db6e9b6b3ec678c8358af65/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-88:{mountpoint:/var/lib/containers/storage/overlay/b4410773ec93e4f48f6a74cdc649f1a64e700f42f62ce6ef541d30c70b7a2ddc/merged major:0 minor:88 fsType:overlay blockSize:0} overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/58f311d1f3702a8b25145063e62ae3f943ad2058863972a88c5b7e765e92e939/merged major:0 minor:96 fsType:overlay blockSize:0}] Dec 03 21:49:53.826609 master-0 kubenswrapper[9136]: I1203 21:49:53.825730 9136 manager.go:217] Machine: {Timestamp:2025-12-03 21:49:53.82452468 +0000 UTC m=+0.099701092 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2800000 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:4df8bb8220024c58895f94eb9cdab694 
SystemUUID:4df8bb82-2002-4c58-895f-94eb9cdab694 BootID:a203903b-0841-4def-8d3c-caca39fd1aed Filesystems:[{Device:/var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:295 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-129 DeviceMajor:0 DeviceMinor:129 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~projected/kube-api-access-nvkz7 DeviceMajor:0 DeviceMinor:172 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-258 DeviceMajor:0 DeviceMinor:258 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/kube-api-access-2kfg5 DeviceMajor:0 DeviceMinor:304 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~projected/kube-api-access-94vvg DeviceMajor:0 DeviceMinor:316 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-325 DeviceMajor:0 DeviceMinor:325 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-367 DeviceMajor:0 DeviceMinor:367 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-183 DeviceMajor:0 DeviceMinor:183 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/671a5ab15340eb87443f5b304e7164498b0964947204dd22840f55587629291e/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b8194009-3743-4da7-baf1-f9bb0afd6187/volumes/kubernetes.io~projected/kube-api-access-rd9vn DeviceMajor:0 DeviceMinor:122 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:297 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/kubelet/pods/ffad8fc8-4378-44de-8864-dd2f666ade68/volumes/kubernetes.io~projected/kube-api-access-xcq9j DeviceMajor:0 DeviceMinor:146 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:163 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:291 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a4399d20-f9a6-4ab1-86be-e2845394eaba/volumes/kubernetes.io~projected/kube-api-access-2krlg DeviceMajor:0 DeviceMinor:310 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d4dcd3a3c921a750603eee714d78acc1c3287857fa9a3fa51cbb8e99aa49ea09/userdata/shm DeviceMajor:0 DeviceMinor:318 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:298 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a9a3f403-a742-4977-901a-cf4a8eb7df5a/volumes/kubernetes.io~projected/kube-api-access-575vn DeviceMajor:0 DeviceMinor:300 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~projected/kube-api-access-4clxk DeviceMajor:0 DeviceMinor:306 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7/userdata/shm DeviceMajor:0 DeviceMinor:361 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-373 DeviceMajor:0 DeviceMinor:373 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688/userdata/shm DeviceMajor:0 DeviceMinor:335 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:314 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e6d5d61a-c5de-4619-9afb-7fad63ba0525/volumes/kubernetes.io~projected/kube-api-access-s68fd DeviceMajor:0 DeviceMinor:281 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:289 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b/userdata/shm DeviceMajor:0 DeviceMinor:322 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5/userdata/shm DeviceMajor:0 DeviceMinor:334 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-185 DeviceMajor:0 DeviceMinor:185 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-215 DeviceMajor:0 DeviceMinor:215 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:293 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~projected/kube-api-access-vx6rt DeviceMajor:0 DeviceMinor:311 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-157 DeviceMajor:0 DeviceMinor:157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/4d7935b96cb72a5574553c21dccf400a98363b1a22ca7819f889a07cdcea7c8a/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-70 DeviceMajor:0 DeviceMinor:70 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:187 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:287 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:294 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9/userdata/shm DeviceMajor:0 DeviceMinor:173 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/kube-api-access-4fsxc DeviceMajor:0 DeviceMinor:317 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0d649d9574e5890c562a660d0fd6b1f641cfe7fa0750549ecdfb69ce6e896311/userdata/shm DeviceMajor:0 DeviceMinor:324 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~projected/kube-api-access-j6m8f DeviceMajor:0 DeviceMinor:308 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/134c10ef-9f37-4a77-8e8b-4f8326bc8f40/volumes/kubernetes.io~projected/kube-api-access-fkwjb DeviceMajor:0 DeviceMinor:302 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-72 DeviceMajor:0 DeviceMinor:72 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/19084c46f9e406b8c5bea90a33fad5659fbc69d12b59150d714d618ccc77cc8a/userdata/shm DeviceMajor:0 DeviceMinor:319 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-340 DeviceMajor:0 DeviceMinor:340 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-191 DeviceMajor:0 DeviceMinor:191 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~projected/kube-api-access-mzklx DeviceMajor:0 DeviceMinor:313 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0/userdata/shm DeviceMajor:0 DeviceMinor:355 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8/userdata/shm DeviceMajor:0 DeviceMinor:359 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-144 DeviceMajor:0 DeviceMinor:144 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95/userdata/shm DeviceMajor:0 DeviceMinor:123 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-149 DeviceMajor:0 DeviceMinor:149 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977/userdata/shm DeviceMajor:0 DeviceMinor:165 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dfe68018d93ee1e8ecdb8139e17ee4db21f7dabb8af64d6ea19c76db0b09dab1/userdata/shm DeviceMajor:0 DeviceMinor:189 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:290 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-375 DeviceMajor:0 DeviceMinor:375 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-193 DeviceMajor:0 DeviceMinor:193 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~projected/kube-api-access-bvmxp DeviceMajor:0 DeviceMinor:66 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-197 DeviceMajor:0 DeviceMinor:197 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-242 DeviceMajor:0 DeviceMinor:242 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~projected/kube-api-access-t4c4r DeviceMajor:0 DeviceMinor:299 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-371 DeviceMajor:0 DeviceMinor:371 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-68 DeviceMajor:0 DeviceMinor:68 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-169 DeviceMajor:0 DeviceMinor:169 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-234 DeviceMajor:0 DeviceMinor:234 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:296 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-88 DeviceMajor:0 DeviceMinor:88 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:292 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:309 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-175 DeviceMajor:0 DeviceMinor:175 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-106 DeviceMajor:0 DeviceMinor:106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~projected/kube-api-access-p4mbz DeviceMajor:0 DeviceMinor:164 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-266 DeviceMajor:0 DeviceMinor:266 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-282 DeviceMajor:0 DeviceMinor:282 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:288 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-342 DeviceMajor:0 DeviceMinor:342 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bff07e162c1a61e585cb91d0f10b7d6f531c3b6d280120c518b63384810a9b36/userdata/shm DeviceMajor:0 DeviceMinor:353 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2d6870e12c7cebea9481ca6d5d2722804205f20e81d572c38201523c901437c2/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-365 DeviceMajor:0 DeviceMinor:365 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ead64d1f04c9817f87eca391b24ce53dab18082c97baff6cee6cb8262295728f/userdata/shm DeviceMajor:0 DeviceMinor:360 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:307 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~projected/kube-api-access-pm2l8 DeviceMajor:0 DeviceMinor:312 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-268 DeviceMajor:0 DeviceMinor:268 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:315 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/814c8acf-fb8d-4f57-b8db-21304402c1f1/volumes/kubernetes.io~projected/kube-api-access-x7fsz DeviceMajor:0 DeviceMinor:333 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c/userdata/shm DeviceMajor:0 DeviceMinor:354 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-216 DeviceMajor:0 DeviceMinor:216 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e/userdata/shm DeviceMajor:0 DeviceMinor:147 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/812401c0-d1ac-4857-b939-217b7b07f8bc/volumes/kubernetes.io~projected/kube-api-access-mxl5r DeviceMajor:0 DeviceMinor:151 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~projected/kube-api-access-lsr8k DeviceMajor:0 DeviceMinor:188 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/add88bf0-c88d-427d-94bb-897e088a1378/volumes/kubernetes.io~projected/kube-api-access-hkhtr DeviceMajor:0 DeviceMinor:305 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-344 DeviceMajor:0 DeviceMinor:344 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-120 DeviceMajor:0 DeviceMinor:120 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-331 DeviceMajor:0 DeviceMinor:331 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-167 DeviceMajor:0 DeviceMinor:167 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-369 DeviceMajor:0 DeviceMinor:369 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bebd69d2-5b0f-4b66-8722-d6861eba3e12/volumes/kubernetes.io~projected/kube-api-access-n7qqf DeviceMajor:0 DeviceMinor:303 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:67 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~projected/kube-api-access-jvw7p DeviceMajor:0 DeviceMinor:301 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ddfda6c4073bc139d7276df8107e2ee077a6d14642bd3a1ecd7361207cad43d9/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-338 DeviceMajor:0 DeviceMinor:338 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-207 DeviceMajor:0 DeviceMinor:207 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:171 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00/userdata/shm DeviceMajor:0 DeviceMinor:127 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/04f5fc52-4ec2-48c3-8441-2b15ad632233/volumes/kubernetes.io~projected/kube-api-access-tgddm DeviceMajor:0 DeviceMinor:332 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-96 DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-104 DeviceMajor:0 DeviceMinor:104 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-125 DeviceMajor:0 DeviceMinor:125 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-250 DeviceMajor:0 DeviceMinor:250 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0b613104230c7fa MacAddress:0e:aa:70:50:21:17 Speed:10000 Mtu:8900} {Name:19084c46f9e406b MacAddress:c2:b1:72:86:e1:0d Speed:10000 Mtu:8900} {Name:2ec578e3da37e24 MacAddress:52:f4:84:84:77:ae Speed:10000 Mtu:8900} {Name:5d92800ba121e99 MacAddress:0e:8b:ef:d8:55:3b Speed:10000 Mtu:8900} {Name:8d88d9135730dcd MacAddress:0e:ef:cc:f3:02:26 Speed:10000 Mtu:8900} {Name:941d02fc970e3b1 MacAddress:32:78:a4:0f:dd:51 Speed:10000 Mtu:8900} {Name:bff07e162c1a61e MacAddress:d2:18:de:d8:44:9f Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:ae:f3:08:a3:10:cf Speed:0 Mtu:8900} {Name:c04e4f1ef981dcb MacAddress:3a:d4:3a:7c:9f:67 Speed:10000 Mtu:8900} {Name:d4dcd3a3c921a75 MacAddress:9e:5b:ac:fc:3e:70 Speed:10000 Mtu:8900} {Name:d8495441bff4f99 MacAddress:e6:ac:fa:ec:4f:b4 Speed:10000 Mtu:8900} {Name:ead64d1f04c9817 MacAddress:ce:96:d4:f5:00:44 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:95:f0:47 Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:56:de:4c:1d:30:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 
Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 21:49:53.826609 master-0 kubenswrapper[9136]: I1203 21:49:53.826564 9136 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
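
The Machine info entry above lists every overlay mount the node has as `overlay_<major>-<minor>` together with its mount point and filesystem type; cAdvisor derives this from the kernel's mount table. As a minimal illustrative sketch (not kubelet or cAdvisor code — the file name and output format are mine, and the `overlay_<maj>-<min>` labeling is inferred from the log above), the following standalone Go program reads `/proc/self/mountinfo` (field layout per proc(5)) and prints the same mountpoint / device-number / fstype triples for overlay mounts:

```go
// mountinfo_overlays.go — illustrative sketch only: list overlay mounts with
// their major:minor device numbers, roughly mirroring the overlay_<maj>-<min>
// entries cAdvisor reports in the Machine info log entry above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("/proc/self/mountinfo")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	sc := bufio.NewScanner(f)
	for sc.Scan() {
		// Per proc(5), a mountinfo line looks like:
		//   36 35 0:129 / /var/lib/containers/storage/overlay/<id>/merged rw,relatime - overlay overlay rw,...
		// fields[2] is major:minor, fields[4] is the mount point, and the
		// filesystem type is the first field after the "-" separator.
		fields := strings.Fields(sc.Text())
		sep := -1
		for i, fld := range fields {
			if fld == "-" {
				sep = i
				break
			}
		}
		if sep < 6 || sep+1 >= len(fields) {
			continue // malformed or truncated line; skip it
		}
		majMin, mountPoint, fsType := fields[2], fields[4], fields[sep+1]
		if fsType != "overlay" {
			continue
		}
		// Mimic the overlay_<major>-<minor> naming seen in the log above.
		fmt.Printf("overlay_%s mountpoint:%s fsType:%s\n",
			strings.ReplaceAll(majMin, ":", "-"), mountPoint, fsType)
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```

Run on the node itself, this should print lines of the form `overlay_0-129 mountpoint:/var/lib/containers/storage/overlay/<id>/merged fsType:overlay`, matching the keys in the filesystem map that the kubelet logs at startup.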
Dec 03 21:49:53.827172 master-0 kubenswrapper[9136]: I1203 21:49:53.826755 9136 manager.go:233] Version: {KernelVersion:5.14.0-427.97.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511041748-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 03 21:49:53.827333 master-0 kubenswrapper[9136]: I1203 21:49:53.827281 9136 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 03 21:49:53.827516 master-0 kubenswrapper[9136]: I1203 21:49:53.827457 9136 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 03 21:49:53.827730 master-0 kubenswrapper[9136]: I1203 21:49:53.827507 9136 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 03 21:49:53.827809 master-0 kubenswrapper[9136]: I1203 21:49:53.827745 9136 topology_manager.go:138] "Creating topology manager with none policy" Dec 03 21:49:53.827809 master-0 kubenswrapper[9136]: I1203 21:49:53.827755 9136 container_manager_linux.go:303] "Creating device plugin manager" Dec 03 21:49:53.827809 master-0 kubenswrapper[9136]: I1203 21:49:53.827765 9136 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 21:49:53.827809 master-0 kubenswrapper[9136]: I1203 21:49:53.827810 9136 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 21:49:53.828164 master-0 kubenswrapper[9136]: I1203 21:49:53.828138 9136 state_mem.go:36] "Initialized new in-memory state store" Dec 03 21:49:53.828252 master-0 kubenswrapper[9136]: I1203 21:49:53.828236 9136 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 03 21:49:53.828322 master-0 kubenswrapper[9136]: I1203 21:49:53.828307 9136 kubelet.go:418] "Attempting to sync node with API server" Dec 03 
21:49:53.828362 master-0 kubenswrapper[9136]: I1203 21:49:53.828324 9136 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 03 21:49:53.828362 master-0 kubenswrapper[9136]: I1203 21:49:53.828340 9136 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 03 21:49:53.828362 master-0 kubenswrapper[9136]: I1203 21:49:53.828352 9136 kubelet.go:324] "Adding apiserver pod source" Dec 03 21:49:53.828362 master-0 kubenswrapper[9136]: I1203 21:49:53.828364 9136 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 03 21:49:53.830047 master-0 kubenswrapper[9136]: I1203 21:49:53.830008 9136 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 03 21:49:53.830228 master-0 kubenswrapper[9136]: I1203 21:49:53.830200 9136 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 03 21:49:53.830579 master-0 kubenswrapper[9136]: I1203 21:49:53.830549 9136 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 03 21:49:53.830729 master-0 kubenswrapper[9136]: I1203 21:49:53.830701 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 21:49:53.830789 master-0 kubenswrapper[9136]: I1203 21:49:53.830731 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 21:49:53.830789 master-0 kubenswrapper[9136]: I1203 21:49:53.830742 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 21:49:53.830789 master-0 kubenswrapper[9136]: I1203 21:49:53.830752 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 21:49:53.830789 master-0 kubenswrapper[9136]: I1203 21:49:53.830787 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 03 21:49:53.830933 master-0 kubenswrapper[9136]: I1203 21:49:53.830799 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 03 21:49:53.830933 master-0 kubenswrapper[9136]: I1203 21:49:53.830811 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 21:49:53.830933 master-0 kubenswrapper[9136]: I1203 21:49:53.830822 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 21:49:53.830933 master-0 kubenswrapper[9136]: I1203 21:49:53.830833 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 21:49:53.830933 master-0 kubenswrapper[9136]: I1203 21:49:53.830845 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 21:49:53.830933 master-0 kubenswrapper[9136]: I1203 21:49:53.830860 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 21:49:53.830933 master-0 kubenswrapper[9136]: I1203 21:49:53.830877 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 21:49:53.830933 master-0 kubenswrapper[9136]: I1203 21:49:53.830925 9136 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 03 21:49:53.832844 master-0 kubenswrapper[9136]: I1203 21:49:53.832172 9136 server.go:1280] "Started kubelet" Dec 03 21:49:53.833660 master-0 kubenswrapper[9136]: I1203 21:49:53.833307 9136 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 21:49:53.833660 master-0 kubenswrapper[9136]: I1203 21:49:53.833059 9136 ratelimit.go:55] "Setting rate limiting for 
endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 21:49:53.837070 master-0 kubenswrapper[9136]: I1203 21:49:53.834287 9136 server_v1.go:47] "podresources" method="list" useActivePods=true Dec 03 21:49:53.834989 master-0 systemd[1]: Started Kubernetes Kubelet. Dec 03 21:49:53.846622 master-0 kubenswrapper[9136]: I1203 21:49:53.845431 9136 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 21:49:53.846622 master-0 kubenswrapper[9136]: I1203 21:49:53.846473 9136 server.go:449] "Adding debug handlers to kubelet server" Dec 03 21:49:53.849374 master-0 kubenswrapper[9136]: I1203 21:49:53.848429 9136 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 21:49:53.849567 master-0 kubenswrapper[9136]: I1203 21:49:53.849531 9136 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 21:49:53.852029 master-0 kubenswrapper[9136]: I1203 21:49:53.851596 9136 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 21:49:53.852029 master-0 kubenswrapper[9136]: I1203 21:49:53.851730 9136 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 21:49:53.852029 master-0 kubenswrapper[9136]: I1203 21:49:53.851766 9136 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-04 21:39:22 +0000 UTC, rotation deadline is 2025-12-04 15:47:48.351693425 +0000 UTC Dec 03 21:49:53.852029 master-0 kubenswrapper[9136]: I1203 21:49:53.851807 9136 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h57m54.499889423s for next certificate rotation Dec 03 21:49:53.852029 master-0 kubenswrapper[9136]: I1203 21:49:53.851814 9136 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 21:49:53.852029 master-0 kubenswrapper[9136]: I1203 21:49:53.851831 9136 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 21:49:53.852029 master-0 kubenswrapper[9136]: I1203 21:49:53.851939 9136 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 03 21:49:53.855123 master-0 kubenswrapper[9136]: I1203 21:49:53.854164 9136 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 21:49:53.856334 master-0 kubenswrapper[9136]: I1203 21:49:53.856292 9136 factory.go:55] Registering systemd factory Dec 03 21:49:53.856334 master-0 kubenswrapper[9136]: I1203 21:49:53.856326 9136 factory.go:221] Registration of the systemd container factory successfully Dec 03 21:49:53.856864 master-0 kubenswrapper[9136]: I1203 21:49:53.856817 9136 factory.go:153] Registering CRI-O factory Dec 03 21:49:53.856864 master-0 kubenswrapper[9136]: I1203 21:49:53.856863 9136 factory.go:221] Registration of the crio container factory successfully Dec 03 21:49:53.857052 master-0 kubenswrapper[9136]: I1203 21:49:53.857018 9136 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 21:49:53.857140 master-0 kubenswrapper[9136]: I1203 21:49:53.857062 9136 factory.go:103] Registering Raw factory Dec 03 21:49:53.857140 master-0 kubenswrapper[9136]: I1203 21:49:53.857093 9136 manager.go:1196] Started watching for new ooms in manager Dec 03 21:49:53.858057 master-0 kubenswrapper[9136]: I1203 21:49:53.857848 
9136 manager.go:319] Starting recovery of all containers Dec 03 21:49:53.862098 master-0 kubenswrapper[9136]: I1203 21:49:53.861992 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/projected/82055cfc-b4ce-4a00-a51d-141059947693-kube-api-access-4clxk" seLinuxMountContext="" Dec 03 21:49:53.862177 master-0 kubenswrapper[9136]: I1203 21:49:53.862103 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a4399d20-f9a6-4ab1-86be-e2845394eaba" volumeName="kubernetes.io/projected/a4399d20-f9a6-4ab1-86be-e2845394eaba-kube-api-access-2krlg" seLinuxMountContext="" Dec 03 21:49:53.862177 master-0 kubenswrapper[9136]: I1203 21:49:53.862126 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" volumeName="kubernetes.io/configmap/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-config" seLinuxMountContext="" Dec 03 21:49:53.862177 master-0 kubenswrapper[9136]: I1203 21:49:53.862143 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8da5d44-680e-4169-abc6-607bdc37a64d" volumeName="kubernetes.io/projected/c8da5d44-680e-4169-abc6-607bdc37a64d-kube-api-access-pm2l8" seLinuxMountContext="" Dec 03 21:49:53.862177 master-0 kubenswrapper[9136]: I1203 21:49:53.862160 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0869de9b-6f5b-4c31-81ad-02a9c8888193" volumeName="kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-bound-sa-token" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862176 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29ac4a9d-1228-49c7-9051-338e7dc98a38" volumeName="kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-env-overrides" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862203 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" volumeName="kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-service-ca-bundle" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862221 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdfbaebe-d655-4c1e-a039-08802c5c35c5" volumeName="kubernetes.io/projected/fdfbaebe-d655-4c1e-a039-08802c5c35c5-kube-api-access" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862244 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ffad8fc8-4378-44de-8864-dd2f666ade68" volumeName="kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862261 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a0f647a-0260-4737-8ae2-cc90d01d33d1" volumeName="kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862277 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="29ac4a9d-1228-49c7-9051-338e7dc98a38" volumeName="kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovnkube-config" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862293 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-script-lib" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862308 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-env-overrides" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862327 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862346 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" volumeName="kubernetes.io/secret/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.862356 master-0 kubenswrapper[9136]: I1203 21:49:53.862365 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f59094ec-47dd-4547-ad41-b15a7933f461" volumeName="kubernetes.io/configmap/f59094ec-47dd-4547-ad41-b15a7933f461-config" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862383 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a4399d20-f9a6-4ab1-86be-e2845394eaba" volumeName="kubernetes.io/configmap/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862402 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e6d5d61a-c5de-4619-9afb-7fad63ba0525" volumeName="kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862418 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08432be8-0086-48d2-a93d-7a474e96749d" volumeName="kubernetes.io/configmap/08432be8-0086-48d2-a93d-7a474e96749d-config" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862433 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6976b503-87da-48fc-b097-d1b315fbee3f" volumeName="kubernetes.io/projected/6976b503-87da-48fc-b097-d1b315fbee3f-kube-api-access-vx6rt" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862448 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="812401c0-d1ac-4857-b939-217b7b07f8bc" volumeName="kubernetes.io/projected/812401c0-d1ac-4857-b939-217b7b07f8bc-kube-api-access-mxl5r" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862465 9136 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8da5d44-680e-4169-abc6-607bdc37a64d" volumeName="kubernetes.io/empty-dir/c8da5d44-680e-4169-abc6-607bdc37a64d-operand-assets" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862482 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a0f647a-0260-4737-8ae2-cc90d01d33d1" volumeName="kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-env-overrides" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862499 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a0f647a-0260-4737-8ae2-cc90d01d33d1" volumeName="kubernetes.io/secret/1a0f647a-0260-4737-8ae2-cc90d01d33d1-webhook-cert" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862514 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f088999-ec66-402e-9634-8c762206d6b4" volumeName="kubernetes.io/secret/5f088999-ec66-402e-9634-8c762206d6b4-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862526 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="892d5611-debf-402f-abc5-3f99aa080159" volumeName="kubernetes.io/secret/892d5611-debf-402f-abc5-3f99aa080159-metrics-tls" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862543 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" volumeName="kubernetes.io/projected/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-kube-api-access-fkwjb" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862557 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f088999-ec66-402e-9634-8c762206d6b4" volumeName="kubernetes.io/projected/5f088999-ec66-402e-9634-8c762206d6b4-kube-api-access-94vvg" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862569 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="814c8acf-fb8d-4f57-b8db-21304402c1f1" volumeName="kubernetes.io/projected/814c8acf-fb8d-4f57-b8db-21304402c1f1-kube-api-access-x7fsz" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862582 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" volumeName="kubernetes.io/projected/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-kube-api-access" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862597 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f59094ec-47dd-4547-ad41-b15a7933f461" volumeName="kubernetes.io/projected/f59094ec-47dd-4547-ad41-b15a7933f461-kube-api-access-mzklx" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862611 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="892d5611-debf-402f-abc5-3f99aa080159" volumeName="kubernetes.io/projected/892d5611-debf-402f-abc5-3f99aa080159-kube-api-access-bvmxp" 
seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862622 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="04f5fc52-4ec2-48c3-8441-2b15ad632233" volumeName="kubernetes.io/projected/04f5fc52-4ec2-48c3-8441-2b15ad632233-kube-api-access-tgddm" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862635 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f0e973-7864-4842-af8e-47718ab1804c" volumeName="kubernetes.io/configmap/39f0e973-7864-4842-af8e-47718ab1804c-trusted-ca" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862647 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f0e973-7864-4842-af8e-47718ab1804c" volumeName="kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-kube-api-access-4fsxc" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862667 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-config" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862698 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f088999-ec66-402e-9634-8c762206d6b4" volumeName="kubernetes.io/configmap/5f088999-ec66-402e-9634-8c762206d6b4-config" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862713 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6976b503-87da-48fc-b097-d1b315fbee3f" volumeName="kubernetes.io/secret/6976b503-87da-48fc-b097-d1b315fbee3f-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862731 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" volumeName="kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-trusted-ca-bundle" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862756 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7f68d19-71d4-4129-a575-3ee57fa53493" volumeName="kubernetes.io/projected/b7f68d19-71d4-4129-a575-3ee57fa53493-kube-api-access-t4c4r" seLinuxMountContext="" Dec 03 21:49:53.862746 master-0 kubenswrapper[9136]: I1203 21:49:53.862805 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8194009-3743-4da7-baf1-f9bb0afd6187" volumeName="kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-cni-binary-copy" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862827 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdfbaebe-d655-4c1e-a039-08802c5c35c5" volumeName="kubernetes.io/configmap/fdfbaebe-d655-4c1e-a039-08802c5c35c5-config" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862847 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdfbaebe-d655-4c1e-a039-08802c5c35c5" 
volumeName="kubernetes.io/secret/fdfbaebe-d655-4c1e-a039-08802c5c35c5-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862862 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08432be8-0086-48d2-a93d-7a474e96749d" volumeName="kubernetes.io/projected/08432be8-0086-48d2-a93d-7a474e96749d-kube-api-access" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862876 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f0e973-7864-4842-af8e-47718ab1804c" volumeName="kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-bound-sa-token" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862895 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50076985-bbaa-4bcf-9d1a-cc25bed016a7" volumeName="kubernetes.io/projected/50076985-bbaa-4bcf-9d1a-cc25bed016a7-kube-api-access-jvw7p" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862910 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/secret/53713eab-c920-4d5a-ae05-7cdb59ace852-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862924 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-service-ca" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862939 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7f68d19-71d4-4129-a575-3ee57fa53493" volumeName="kubernetes.io/configmap/b7f68d19-71d4-4129-a575-3ee57fa53493-trusted-ca" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862956 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0869de9b-6f5b-4c31-81ad-02a9c8888193" volumeName="kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-kube-api-access-2kfg5" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862971 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a0f647a-0260-4737-8ae2-cc90d01d33d1" volumeName="kubernetes.io/projected/1a0f647a-0260-4737-8ae2-cc90d01d33d1-kube-api-access-lsr8k" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.862986 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/projected/53713eab-c920-4d5a-ae05-7cdb59ace852-kube-api-access-nvkz7" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863009 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8194009-3743-4da7-baf1-f9bb0afd6187" volumeName="kubernetes.io/projected/b8194009-3743-4da7-baf1-f9bb0afd6187-kube-api-access-rd9vn" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863027 9136 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="bebd69d2-5b0f-4b66-8722-d6861eba3e12" volumeName="kubernetes.io/projected/bebd69d2-5b0f-4b66-8722-d6861eba3e12-kube-api-access-n7qqf" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863044 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8da5d44-680e-4169-abc6-607bdc37a64d" volumeName="kubernetes.io/secret/c8da5d44-680e-4169-abc6-607bdc37a64d-cluster-olm-operator-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863062 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="814c8acf-fb8d-4f57-b8db-21304402c1f1" volumeName="kubernetes.io/configmap/814c8acf-fb8d-4f57-b8db-21304402c1f1-iptables-alerter-script" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863078 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-ca" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863094 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f59094ec-47dd-4547-ad41-b15a7933f461" volumeName="kubernetes.io/secret/f59094ec-47dd-4547-ad41-b15a7933f461-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863110 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0869de9b-6f5b-4c31-81ad-02a9c8888193" volumeName="kubernetes.io/configmap/0869de9b-6f5b-4c31-81ad-02a9c8888193-trusted-ca" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863124 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29ac4a9d-1228-49c7-9051-338e7dc98a38" volumeName="kubernetes.io/projected/29ac4a9d-1228-49c7-9051-338e7dc98a38-kube-api-access-p4mbz" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863139 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50076985-bbaa-4bcf-9d1a-cc25bed016a7" volumeName="kubernetes.io/configmap/50076985-bbaa-4bcf-9d1a-cc25bed016a7-config" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863158 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-etcd-client" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863173 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="add88bf0-c88d-427d-94bb-897e088a1378" volumeName="kubernetes.io/projected/add88bf0-c88d-427d-94bb-897e088a1378-kube-api-access-hkhtr" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863188 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8194009-3743-4da7-baf1-f9bb0afd6187" volumeName="kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-daemon-config" seLinuxMountContext="" Dec 03 
21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863206 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bebd69d2-5b0f-4b66-8722-d6861eba3e12" volumeName="kubernetes.io/configmap/bebd69d2-5b0f-4b66-8722-d6861eba3e12-telemetry-config" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863222 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ffad8fc8-4378-44de-8864-dd2f666ade68" volumeName="kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-whereabouts-configmap" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863237 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50076985-bbaa-4bcf-9d1a-cc25bed016a7" volumeName="kubernetes.io/secret/50076985-bbaa-4bcf-9d1a-cc25bed016a7-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863253 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ffad8fc8-4378-44de-8864-dd2f666ade68" volumeName="kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-binary-copy" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863271 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6976b503-87da-48fc-b097-d1b315fbee3f" volumeName="kubernetes.io/configmap/6976b503-87da-48fc-b097-d1b315fbee3f-config" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863289 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ffad8fc8-4378-44de-8864-dd2f666ade68" volumeName="kubernetes.io/projected/ffad8fc8-4378-44de-8864-dd2f666ade68-kube-api-access-xcq9j" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863303 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29ac4a9d-1228-49c7-9051-338e7dc98a38" volumeName="kubernetes.io/secret/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863317 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" volumeName="kubernetes.io/configmap/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-service-ca" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863332 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" volumeName="kubernetes.io/projected/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-kube-api-access" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863347 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" volumeName="kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-config" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863369 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" 
volumeName="kubernetes.io/projected/785612fc-3f78-4f1a-bc83-7afe5d3b8056-kube-api-access-j6m8f" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863382 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9a3f403-a742-4977-901a-cf4a8eb7df5a" volumeName="kubernetes.io/projected/a9a3f403-a742-4977-901a-cf4a8eb7df5a-kube-api-access-575vn" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863397 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08432be8-0086-48d2-a93d-7a474e96749d" volumeName="kubernetes.io/secret/08432be8-0086-48d2-a93d-7a474e96749d-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863413 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" volumeName="kubernetes.io/secret/785612fc-3f78-4f1a-bc83-7afe5d3b8056-serving-cert" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863431 9136 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-config" seLinuxMountContext="" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863445 9136 reconstruct.go:97] "Volume reconstruction finished" Dec 03 21:49:53.863758 master-0 kubenswrapper[9136]: I1203 21:49:53.863458 9136 reconciler.go:26] "Reconciler: start to sync state" Dec 03 21:49:53.866998 master-0 kubenswrapper[9136]: I1203 21:49:53.866971 9136 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 21:49:53.903324 master-0 kubenswrapper[9136]: I1203 21:49:53.903254 9136 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 21:49:53.906390 master-0 kubenswrapper[9136]: I1203 21:49:53.906361 9136 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 03 21:49:53.906599 master-0 kubenswrapper[9136]: I1203 21:49:53.906409 9136 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 21:49:53.906599 master-0 kubenswrapper[9136]: I1203 21:49:53.906438 9136 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 21:49:53.906687 master-0 kubenswrapper[9136]: E1203 21:49:53.906636 9136 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 21:49:53.908292 master-0 kubenswrapper[9136]: I1203 21:49:53.908257 9136 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 21:49:53.920489 master-0 kubenswrapper[9136]: I1203 21:49:53.920376 9136 generic.go:334] "Generic (PLEG): container finished" podID="ffdea166-7cd6-4319-966e-43579d960fc1" containerID="5e573ccc03a6c280986237abcd7396968c1017f2190b689fe94d1f1000629079" exitCode=0 Dec 03 21:49:53.923698 master-0 kubenswrapper[9136]: I1203 21:49:53.923647 9136 generic.go:334] "Generic (PLEG): container finished" podID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerID="a2582bc7c1924ecfbd971e3c3eba302e8128d7abf1b76de440d5c3eb71052837" exitCode=0 Dec 03 21:49:53.926987 master-0 kubenswrapper[9136]: I1203 21:49:53.926936 9136 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="0d4f1a1f19ffb465b7ec7db2a36e7c6f6dc0dc2fd64d69d37c5c1b12205b376b" exitCode=1 Dec 03 21:49:53.938490 master-0 kubenswrapper[9136]: I1203 21:49:53.938446 9136 generic.go:334] "Generic (PLEG): container finished" podID="53713eab-c920-4d5a-ae05-7cdb59ace852" containerID="e2f2192c61ed1d621d2aff90353ee42f69de43d1e563bba5ffc2fd9223a2ba8a" exitCode=0 Dec 03 21:49:53.941481 master-0 kubenswrapper[9136]: I1203 21:49:53.941426 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/2.log" Dec 03 21:49:53.942166 master-0 kubenswrapper[9136]: I1203 21:49:53.942128 9136 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864" exitCode=1 Dec 03 21:49:53.942228 master-0 kubenswrapper[9136]: I1203 21:49:53.942173 9136 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="4fd6e7beed435c2febdb58b61912386340be61628332e2e84d31efc5c9a5d3f4" exitCode=0 Dec 03 21:49:53.968055 master-0 kubenswrapper[9136]: I1203 21:49:53.967966 9136 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="ebd2204fe249e4ff0cd7edb5edf639c7a4971288017457fe2c22236d90e0112a" exitCode=0 Dec 03 21:49:53.973335 master-0 kubenswrapper[9136]: I1203 21:49:53.973290 9136 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="fcf20cc5fbcb1ad49ee0d23a982171e11d3207eebc2515ce8bb2b5502de1eab0" exitCode=0 Dec 03 21:49:53.973335 master-0 kubenswrapper[9136]: I1203 21:49:53.973319 9136 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="e085324cd58339aaa2ff69b985958fab513139997fb20fc1324a7c3c052fa89d" exitCode=0 Dec 03 21:49:53.973335 master-0 kubenswrapper[9136]: I1203 21:49:53.973329 9136 generic.go:334] "Generic (PLEG): container finished" 
podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="742bc32f3346ea0ae584753ab21aedef0a29aac40cae443f406076aeae30033d" exitCode=0 Dec 03 21:49:53.973335 master-0 kubenswrapper[9136]: I1203 21:49:53.973337 9136 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="684c7729ba85fd8ac6a78b633cc41d781048493dac0127949a3ebcf247c67f5b" exitCode=0 Dec 03 21:49:53.973536 master-0 kubenswrapper[9136]: I1203 21:49:53.973344 9136 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="c4e06171c55f7e5fa99ea1abceeee733b284cf3a34d0a86bf2f243d3f655f7a5" exitCode=0 Dec 03 21:49:53.973536 master-0 kubenswrapper[9136]: I1203 21:49:53.973350 9136 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="2378e0ccca63367b0b1f9fee7b2b6b1c87516db57078ddfc7d8d7140f0467b96" exitCode=0 Dec 03 21:49:54.002031 master-0 kubenswrapper[9136]: I1203 21:49:54.001974 9136 manager.go:324] Recovery completed Dec 03 21:49:54.006857 master-0 kubenswrapper[9136]: E1203 21:49:54.006808 9136 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 03 21:49:54.043686 master-0 kubenswrapper[9136]: I1203 21:49:54.043501 9136 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 21:49:54.043686 master-0 kubenswrapper[9136]: I1203 21:49:54.043534 9136 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 21:49:54.043686 master-0 kubenswrapper[9136]: I1203 21:49:54.043566 9136 state_mem.go:36] "Initialized new in-memory state store" Dec 03 21:49:54.044052 master-0 kubenswrapper[9136]: I1203 21:49:54.043788 9136 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 03 21:49:54.044052 master-0 kubenswrapper[9136]: I1203 21:49:54.043803 9136 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 03 21:49:54.044052 master-0 kubenswrapper[9136]: I1203 21:49:54.043827 9136 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Dec 03 21:49:54.044052 master-0 kubenswrapper[9136]: I1203 21:49:54.043836 9136 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Dec 03 21:49:54.044052 master-0 kubenswrapper[9136]: I1203 21:49:54.043846 9136 policy_none.go:49] "None policy: Start" Dec 03 21:49:54.045784 master-0 kubenswrapper[9136]: I1203 21:49:54.045736 9136 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 21:49:54.045846 master-0 kubenswrapper[9136]: I1203 21:49:54.045828 9136 state_mem.go:35] "Initializing new in-memory state store" Dec 03 21:49:54.046120 master-0 kubenswrapper[9136]: I1203 21:49:54.046105 9136 state_mem.go:75] "Updated machine memory state" Dec 03 21:49:54.046187 master-0 kubenswrapper[9136]: I1203 21:49:54.046121 9136 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Dec 03 21:49:54.056722 master-0 kubenswrapper[9136]: I1203 21:49:54.056691 9136 manager.go:334] "Starting Device Plugin manager" Dec 03 21:49:54.056942 master-0 kubenswrapper[9136]: I1203 21:49:54.056902 9136 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 21:49:54.056942 master-0 kubenswrapper[9136]: I1203 21:49:54.056934 9136 server.go:79] "Starting device plugin registration server" Dec 03 21:49:54.057545 master-0 kubenswrapper[9136]: I1203 21:49:54.057513 9136 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 
21:49:54.057603 master-0 kubenswrapper[9136]: I1203 21:49:54.057542 9136 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 21:49:54.057722 master-0 kubenswrapper[9136]: I1203 21:49:54.057698 9136 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 21:49:54.057861 master-0 kubenswrapper[9136]: I1203 21:49:54.057837 9136 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 21:49:54.057861 master-0 kubenswrapper[9136]: I1203 21:49:54.057859 9136 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 21:49:54.158311 master-0 kubenswrapper[9136]: I1203 21:49:54.158248 9136 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 21:49:54.161579 master-0 kubenswrapper[9136]: I1203 21:49:54.161539 9136 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 21:49:54.161702 master-0 kubenswrapper[9136]: I1203 21:49:54.161687 9136 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 21:49:54.161803 master-0 kubenswrapper[9136]: I1203 21:49:54.161790 9136 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 21:49:54.161967 master-0 kubenswrapper[9136]: I1203 21:49:54.161954 9136 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 21:49:54.174074 master-0 kubenswrapper[9136]: I1203 21:49:54.174008 9136 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Dec 03 21:49:54.174219 master-0 kubenswrapper[9136]: I1203 21:49:54.174190 9136 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 03 21:49:54.208463 master-0 kubenswrapper[9136]: I1203 21:49:54.207956 9136 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Dec 03 21:49:54.209313 master-0 kubenswrapper[9136]: I1203 21:49:54.209212 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="160c955e6bc6fbc37a201275978850561279b3beae60311f7da8181d0840a469" Dec 03 21:49:54.209396 master-0 kubenswrapper[9136]: I1203 21:49:54.209312 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dedfd07e611933a48dc94c229db18e3af516a0acc63d3043dc43da21b7e099f" Dec 03 21:49:54.209515 master-0 kubenswrapper[9136]: I1203 21:49:54.209341 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"a5db70d8c04a9fdadcb37e94b566026a87f2170f5e692e5d3ce48ba377b79800"} Dec 03 21:49:54.209681 master-0 kubenswrapper[9136]: I1203 21:49:54.209538 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"ddfda6c4073bc139d7276df8107e2ee077a6d14642bd3a1ecd7361207cad43d9"} Dec 03 21:49:54.209681 master-0 kubenswrapper[9136]: I1203 21:49:54.209628 9136 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="dd41923c7718b750c80c184d862535442462bd815561e9bbfb7bb52e77b97884" Dec 03 21:49:54.209681 master-0 kubenswrapper[9136]: I1203 21:49:54.209657 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133"} Dec 03 21:49:54.209883 master-0 kubenswrapper[9136]: I1203 21:49:54.209687 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"f211b15e8d62153f4deaa1bf7dfc87de0781805cce6c2aabbe56ed6f61fa1aa7"} Dec 03 21:49:54.209944 master-0 kubenswrapper[9136]: I1203 21:49:54.209892 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"0d4f1a1f19ffb465b7ec7db2a36e7c6f6dc0dc2fd64d69d37c5c1b12205b376b"} Dec 03 21:49:54.209944 master-0 kubenswrapper[9136]: I1203 21:49:54.209931 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"4d7935b96cb72a5574553c21dccf400a98363b1a22ca7819f889a07cdcea7c8a"} Dec 03 21:49:54.210070 master-0 kubenswrapper[9136]: I1203 21:49:54.210006 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"a66c1588269465b26d7305b3f03876e778d4729a5541123141a6ed0d5a7e9d38"} Dec 03 21:49:54.210070 master-0 kubenswrapper[9136]: I1203 21:49:54.210035 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864"} Dec 03 21:49:54.210158 master-0 kubenswrapper[9136]: I1203 21:49:54.210110 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"4fd6e7beed435c2febdb58b61912386340be61628332e2e84d31efc5c9a5d3f4"} Dec 03 21:49:54.212719 master-0 kubenswrapper[9136]: I1203 21:49:54.211273 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57"} Dec 03 21:49:54.212920 master-0 kubenswrapper[9136]: I1203 21:49:54.212819 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"6c639bc69c380d96355863b219dedf37b52fea28072ae5913704ecc349c5d8c1"} Dec 03 21:49:54.212920 master-0 kubenswrapper[9136]: I1203 21:49:54.212869 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"70791961c08656ffa1017f6f966c1f2ba603d16610929ed1a69c881343d1bfec"} Dec 03 21:49:54.212920 master-0 
kubenswrapper[9136]: I1203 21:49:54.212918 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerDied","Data":"ebd2204fe249e4ff0cd7edb5edf639c7a4971288017457fe2c22236d90e0112a"} Dec 03 21:49:54.213452 master-0 kubenswrapper[9136]: I1203 21:49:54.212954 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"13238af3704fe583f617f61e755cf4c2","Type":"ContainerStarted","Data":"2d6870e12c7cebea9481ca6d5d2722804205f20e81d572c38201523c901437c2"} Dec 03 21:49:54.213452 master-0 kubenswrapper[9136]: I1203 21:49:54.213021 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"6c365c0b34d496df5874092d2be756d0f4503f99c75f8af416569e515090fd7c"} Dec 03 21:49:54.213452 master-0 kubenswrapper[9136]: I1203 21:49:54.213045 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"d089dbd1d00b21007f2d7f87058c4506fbc7b6ac8e4051768c22497e2ce3a3f4"} Dec 03 21:49:54.214109 master-0 kubenswrapper[9136]: I1203 21:49:54.213068 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"41b95a38663dd6fe34e183818a475977","Type":"ContainerStarted","Data":"671a5ab15340eb87443f5b304e7164498b0964947204dd22840f55587629291e"} Dec 03 21:49:54.224465 master-0 kubenswrapper[9136]: E1203 21:49:54.224398 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.224465 master-0 kubenswrapper[9136]: E1203 21:49:54.224422 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:49:54.224711 master-0 kubenswrapper[9136]: W1203 21:49:54.224514 9136 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Dec 03 21:49:54.224711 master-0 kubenswrapper[9136]: E1203 21:49:54.224593 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:49:54.224711 master-0 kubenswrapper[9136]: E1203 21:49:54.224599 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:49:54.224711 master-0 
kubenswrapper[9136]: E1203 21:49:54.224658 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.269829 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.269873 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.269895 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.269909 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.269926 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.269940 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.269955 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.269970 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 
21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.269991 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.270005 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.270020 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.270040 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.270057 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.270074 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.270089 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.270103 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.270254 master-0 kubenswrapper[9136]: I1203 21:49:54.270120 9136 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.370789 master-0 kubenswrapper[9136]: I1203 21:49:54.370684 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.370794 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.370835 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.370870 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.370904 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.370941 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.370973 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371005 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod 
\"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371037 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371068 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371099 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371129 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371172 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371175 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371202 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371237 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371268 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371278 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371286 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371349 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371320 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371392 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371389 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371446 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371456 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371473 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371484 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371302 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371509 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371558 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.371533 master-0 kubenswrapper[9136]: I1203 21:49:54.371593 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:54.372721 master-0 kubenswrapper[9136]: I1203 21:49:54.371622 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"etcd-master-0-master-0\" (UID: \"41b95a38663dd6fe34e183818a475977\") " pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:49:54.372721 master-0 kubenswrapper[9136]: I1203 21:49:54.371651 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.372721 master-0 kubenswrapper[9136]: I1203 21:49:54.371681 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:54.828963 master-0 kubenswrapper[9136]: I1203 21:49:54.828886 9136 apiserver.go:52] "Watching apiserver" Dec 03 
21:49:54.837388 master-0 kubenswrapper[9136]: I1203 21:49:54.837351 9136 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 21:49:54.838451 master-0 kubenswrapper[9136]: I1203 21:49:54.838393 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db","openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj","openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh","openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr","openshift-multus/multus-additional-cni-plugins-qz5vh","openshift-network-operator/iptables-alerter-clt4v","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt","openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm","openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d","openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr","openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s","openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj","openshift-network-node-identity/network-node-identity-r24k4","openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j","openshift-multus/multus-6jlh8","openshift-network-operator/network-operator-6cbf58c977-zk7jw","openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w","openshift-ovn-kubernetes/ovnkube-node-k2j45","openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x","openshift-etcd/etcd-master-0-master-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b","assisted-installer/assisted-installer-controller-q7jjz","openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh","openshift-multus/network-metrics-daemon-h6569","openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5","openshift-cluster-version/cluster-version-operator-869c786959-2bnjf","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-network-diagnostics/network-check-target-78hts"] Dec 03 21:49:54.839158 master-0 kubenswrapper[9136]: I1203 21:49:54.838730 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:54.839158 master-0 kubenswrapper[9136]: I1203 21:49:54.838731 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 21:49:54.839158 master-0 kubenswrapper[9136]: I1203 21:49:54.838906 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:54.839483 master-0 kubenswrapper[9136]: I1203 21:49:54.839344 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:54.846590 master-0 kubenswrapper[9136]: I1203 21:49:54.844193 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:54.846590 master-0 kubenswrapper[9136]: I1203 21:49:54.845420 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:54.846590 master-0 kubenswrapper[9136]: I1203 21:49:54.846189 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 21:49:54.848763 master-0 kubenswrapper[9136]: I1203 21:49:54.846684 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:54.848763 master-0 kubenswrapper[9136]: I1203 21:49:54.848261 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:54.848763 master-0 kubenswrapper[9136]: I1203 21:49:54.848601 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.848763 master-0 kubenswrapper[9136]: I1203 21:49:54.848693 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 21:49:54.849008 master-0 kubenswrapper[9136]: I1203 21:49:54.848957 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 21:49:54.853912 master-0 kubenswrapper[9136]: I1203 21:49:54.852824 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.855813 master-0 kubenswrapper[9136]: I1203 21:49:54.854059 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 21:49:54.855813 master-0 kubenswrapper[9136]: I1203 21:49:54.854303 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.855813 master-0 kubenswrapper[9136]: I1203 21:49:54.854423 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 21:49:54.855813 master-0 kubenswrapper[9136]: I1203 21:49:54.854529 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 21:49:54.855813 master-0 kubenswrapper[9136]: I1203 21:49:54.855101 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:54.855813 master-0 kubenswrapper[9136]: I1203 21:49:54.855205 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:54.855813 master-0 kubenswrapper[9136]: I1203 21:49:54.855418 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 21:49:54.855813 master-0 kubenswrapper[9136]: I1203 21:49:54.855721 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 21:49:54.856860 master-0 kubenswrapper[9136]: I1203 21:49:54.855829 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 21:49:54.856860 master-0 kubenswrapper[9136]: I1203 21:49:54.855884 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 03 21:49:54.856860 master-0 kubenswrapper[9136]: I1203 21:49:54.855920 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 21:49:54.856860 master-0 kubenswrapper[9136]: I1203 21:49:54.856072 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.856860 master-0 kubenswrapper[9136]: I1203 21:49:54.856112 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:54.856860 master-0 kubenswrapper[9136]: I1203 21:49:54.856164 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 03 21:49:54.856860 master-0 kubenswrapper[9136]: I1203 21:49:54.856276 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 21:49:54.856860 master-0 kubenswrapper[9136]: I1203 21:49:54.856276 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.857872 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.857993 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858106 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858202 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858273 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858328 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858351 9136 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858472 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858493 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858536 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858477 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858664 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858664 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858727 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 21:49:54.858757 master-0 kubenswrapper[9136]: I1203 21:49:54.858819 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 21:49:54.859364 master-0 kubenswrapper[9136]: I1203 21:49:54.858825 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 21:49:54.859364 master-0 kubenswrapper[9136]: I1203 21:49:54.858944 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 21:49:54.859364 master-0 kubenswrapper[9136]: I1203 21:49:54.859063 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 21:49:54.859364 master-0 kubenswrapper[9136]: I1203 21:49:54.859161 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 21:49:54.859364 master-0 kubenswrapper[9136]: I1203 21:49:54.859214 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 21:49:54.859364 master-0 kubenswrapper[9136]: I1203 21:49:54.859351 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 21:49:54.859589 master-0 kubenswrapper[9136]: I1203 21:49:54.859399 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860063 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860092 9136 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860159 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860174 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860248 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860257 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860303 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860428 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860508 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860515 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860559 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860561 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860627 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 21:49:54.861092 master-0 kubenswrapper[9136]: I1203 21:49:54.860832 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 21:49:54.862114 master-0 kubenswrapper[9136]: I1203 21:49:54.862089 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.862211 master-0 kubenswrapper[9136]: I1203 21:49:54.862089 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 21:49:54.862905 master-0 kubenswrapper[9136]: I1203 21:49:54.862835 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 21:49:54.872889 master-0 kubenswrapper[9136]: I1203 21:49:54.872733 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874087 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 
21:49:54.874321 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0869de9b-6f5b-4c31-81ad-02a9c8888193-trusted-ca\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874363 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874396 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874427 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874453 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-whereabouts-configmap\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874530 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874590 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874632 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " 
pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874667 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6976b503-87da-48fc-b097-d1b315fbee3f-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:54.874828 master-0 kubenswrapper[9136]: I1203 21:49:54.874783 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvw7p\" (UniqueName: \"kubernetes.io/projected/50076985-bbaa-4bcf-9d1a-cc25bed016a7-kube-api-access-jvw7p\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:54.875450 master-0 kubenswrapper[9136]: I1203 21:49:54.874989 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f0e973-7864-4842-af8e-47718ab1804c-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:54.875450 master-0 kubenswrapper[9136]: I1203 21:49:54.875029 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.875450 master-0 kubenswrapper[9136]: I1203 21:49:54.875057 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-hostroot\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.875450 master-0 kubenswrapper[9136]: I1203 21:49:54.875087 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08432be8-0086-48d2-a93d-7a474e96749d-config\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:54.875450 master-0 kubenswrapper[9136]: I1203 21:49:54.875116 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdfbaebe-d655-4c1e-a039-08802c5c35c5-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:54.875450 master-0 kubenswrapper[9136]: I1203 21:49:54.875141 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.875450 master-0 kubenswrapper[9136]: I1203 21:49:54.875293 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/814c8acf-fb8d-4f57-b8db-21304402c1f1-host-slash\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:54.875450 master-0 kubenswrapper[9136]: I1203 21:49:54.875371 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-conf-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.875450 master-0 kubenswrapper[9136]: I1203 21:49:54.875440 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785612fc-3f78-4f1a-bc83-7afe5d3b8056-serving-cert\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.876175 master-0 kubenswrapper[9136]: I1203 21:49:54.875481 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-ovn\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.876175 master-0 kubenswrapper[9136]: I1203 21:49:54.875553 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50076985-bbaa-4bcf-9d1a-cc25bed016a7-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:54.876175 master-0 kubenswrapper[9136]: I1203 21:49:54.875620 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-netns\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.876175 master-0 kubenswrapper[9136]: I1203 21:49:54.875687 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxl5r\" (UniqueName: \"kubernetes.io/projected/812401c0-d1ac-4857-b939-217b7b07f8bc-kube-api-access-mxl5r\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:54.876175 master-0 kubenswrapper[9136]: I1203 21:49:54.875723 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-env-overrides\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.876175 master-0 kubenswrapper[9136]: I1203 
21:49:54.875816 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08432be8-0086-48d2-a93d-7a474e96749d-config\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:54.876175 master-0 kubenswrapper[9136]: I1203 21:49:54.875818 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6m8f\" (UniqueName: \"kubernetes.io/projected/785612fc-3f78-4f1a-bc83-7afe5d3b8056-kube-api-access-j6m8f\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.876175 master-0 kubenswrapper[9136]: I1203 21:49:54.875914 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.878261 master-0 kubenswrapper[9136]: I1203 21:49:54.875950 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50076985-bbaa-4bcf-9d1a-cc25bed016a7-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:54.878261 master-0 kubenswrapper[9136]: I1203 21:49:54.875975 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-multus-certs\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.878402 master-0 kubenswrapper[9136]: I1203 21:49:54.878297 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:54.878402 master-0 kubenswrapper[9136]: I1203 21:49:54.876257 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6976b503-87da-48fc-b097-d1b315fbee3f-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:54.878402 master-0 kubenswrapper[9136]: I1203 21:49:54.878336 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575vn\" (UniqueName: \"kubernetes.io/projected/a9a3f403-a742-4977-901a-cf4a8eb7df5a-kube-api-access-575vn\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 
21:49:54.878402 master-0 kubenswrapper[9136]: I1203 21:49:54.878362 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:54.878402 master-0 kubenswrapper[9136]: I1203 21:49:54.878388 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.878402 master-0 kubenswrapper[9136]: I1203 21:49:54.877415 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 21:49:54.878402 master-0 kubenswrapper[9136]: I1203 21:49:54.878415 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:54.878676 master-0 kubenswrapper[9136]: I1203 21:49:54.878445 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-config\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.878676 master-0 kubenswrapper[9136]: I1203 21:49:54.878474 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvmxp\" (UniqueName: \"kubernetes.io/projected/892d5611-debf-402f-abc5-3f99aa080159-kube-api-access-bvmxp\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:49:54.878676 master-0 kubenswrapper[9136]: I1203 21:49:54.878504 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8da5d44-680e-4169-abc6-607bdc37a64d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:54.878676 master-0 kubenswrapper[9136]: I1203 21:49:54.878531 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:54.878676 master-0 kubenswrapper[9136]: I1203 21:49:54.878560 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-env-overrides\") pod \"network-node-identity-r24k4\" 
(UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:54.878676 master-0 kubenswrapper[9136]: I1203 21:49:54.878587 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-etc-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.878676 master-0 kubenswrapper[9136]: I1203 21:49:54.878617 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94vvg\" (UniqueName: \"kubernetes.io/projected/5f088999-ec66-402e-9634-8c762206d6b4-kube-api-access-94vvg\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:54.878933 master-0 kubenswrapper[9136]: I1203 21:49:54.878815 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785612fc-3f78-4f1a-bc83-7afe5d3b8056-serving-cert\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.879108 master-0 kubenswrapper[9136]: I1203 21:49:54.879048 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-config\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.879205 master-0 kubenswrapper[9136]: I1203 21:49:54.879171 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7qqf\" (UniqueName: \"kubernetes.io/projected/bebd69d2-5b0f-4b66-8722-d6861eba3e12-kube-api-access-n7qqf\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:54.879243 master-0 kubenswrapper[9136]: I1203 21:49:54.879216 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08432be8-0086-48d2-a93d-7a474e96749d-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:54.879281 master-0 kubenswrapper[9136]: I1203 21:49:54.879241 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.879281 master-0 kubenswrapper[9136]: I1203 21:49:54.879263 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bebd69d2-5b0f-4b66-8722-d6861eba3e12-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " 
pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:54.879342 master-0 kubenswrapper[9136]: I1203 21:49:54.879284 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:54.879342 master-0 kubenswrapper[9136]: I1203 21:49:54.879305 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.879342 master-0 kubenswrapper[9136]: I1203 21:49:54.879324 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-netns\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.879437 master-0 kubenswrapper[9136]: I1203 21:49:54.879345 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhtr\" (UniqueName: \"kubernetes.io/projected/add88bf0-c88d-427d-94bb-897e088a1378-kube-api-access-hkhtr\") pod \"csi-snapshot-controller-operator-7b795784b8-l9q2j\" (UID: \"add88bf0-c88d-427d-94bb-897e088a1378\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:54.879437 master-0 kubenswrapper[9136]: I1203 21:49:54.879367 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:54.879437 master-0 kubenswrapper[9136]: I1203 21:49:54.879387 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2krlg\" (UniqueName: \"kubernetes.io/projected/a4399d20-f9a6-4ab1-86be-e2845394eaba-kube-api-access-2krlg\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:54.879437 master-0 kubenswrapper[9136]: I1203 21:49:54.879404 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-os-release\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.879437 master-0 kubenswrapper[9136]: I1203 21:49:54.879422 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/892d5611-debf-402f-abc5-3f99aa080159-host-etc-kube\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " 
pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:49:54.879437 master-0 kubenswrapper[9136]: I1203 21:49:54.879439 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-system-cni-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.879633 master-0 kubenswrapper[9136]: I1203 21:49:54.879465 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-config\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.879633 master-0 kubenswrapper[9136]: I1203 21:49:54.879492 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:54.879633 master-0 kubenswrapper[9136]: I1203 21:49:54.879512 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08432be8-0086-48d2-a93d-7a474e96749d-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:54.879633 master-0 kubenswrapper[9136]: I1203 21:49:54.879529 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-kube-api-access\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.879633 master-0 kubenswrapper[9136]: I1203 21:49:54.879568 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvkz7\" (UniqueName: \"kubernetes.io/projected/53713eab-c920-4d5a-ae05-7cdb59ace852-kube-api-access-nvkz7\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.879633 master-0 kubenswrapper[9136]: I1203 21:49:54.879598 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f088999-ec66-402e-9634-8c762206d6b4-serving-cert\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:54.879633 master-0 kubenswrapper[9136]: I1203 21:49:54.879625 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7fsz\" (UniqueName: \"kubernetes.io/projected/814c8acf-fb8d-4f57-b8db-21304402c1f1-kube-api-access-x7fsz\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:54.879839 master-0 
kubenswrapper[9136]: I1203 21:49:54.879651 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-kubelet\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.879839 master-0 kubenswrapper[9136]: I1203 21:49:54.879679 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-node-log\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.879839 master-0 kubenswrapper[9136]: I1203 21:49:54.879710 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.879839 master-0 kubenswrapper[9136]: I1203 21:49:54.879743 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-service-ca\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.879839 master-0 kubenswrapper[9136]: I1203 21:49:54.879805 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2l8\" (UniqueName: \"kubernetes.io/projected/c8da5d44-680e-4169-abc6-607bdc37a64d-kube-api-access-pm2l8\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:54.879839 master-0 kubenswrapper[9136]: I1203 21:49:54.879835 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fsxc\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-kube-api-access-4fsxc\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:54.880060 master-0 kubenswrapper[9136]: I1203 21:49:54.879846 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bebd69d2-5b0f-4b66-8722-d6861eba3e12-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:54.880060 master-0 kubenswrapper[9136]: I1203 21:49:54.879861 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-systemd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.880060 master-0 kubenswrapper[9136]: I1203 21:49:54.879894 9136 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-script-lib\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.880060 master-0 kubenswrapper[9136]: I1203 21:49:54.879923 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:54.880060 master-0 kubenswrapper[9136]: I1203 21:49:54.879953 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-cnibin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.880060 master-0 kubenswrapper[9136]: I1203 21:49:54.879983 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f68d19-71d4-4129-a575-3ee57fa53493-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:54.880060 master-0 kubenswrapper[9136]: I1203 21:49:54.879639 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:54.880293 master-0 kubenswrapper[9136]: I1203 21:49:54.879752 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.880293 master-0 kubenswrapper[9136]: I1203 21:49:54.880132 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-config\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.880293 master-0 kubenswrapper[9136]: I1203 21:49:54.880032 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-var-lib-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.880293 master-0 kubenswrapper[9136]: I1203 21:49:54.880201 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f088999-ec66-402e-9634-8c762206d6b4-serving-cert\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: 
\"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:54.880462 master-0 kubenswrapper[9136]: I1203 21:49:54.880307 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:54.880462 master-0 kubenswrapper[9136]: I1203 21:49:54.880345 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfbaebe-d655-4c1e-a039-08802c5c35c5-config\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:54.880462 master-0 kubenswrapper[9136]: I1203 21:49:54.880368 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.880462 master-0 kubenswrapper[9136]: I1203 21:49:54.880395 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-config\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.880462 master-0 kubenswrapper[9136]: I1203 21:49:54.880425 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53713eab-c920-4d5a-ae05-7cdb59ace852-ovn-node-metrics-cert\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.880462 master-0 kubenswrapper[9136]: I1203 21:49:54.880453 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzklx\" (UniqueName: \"kubernetes.io/projected/f59094ec-47dd-4547-ad41-b15a7933f461-kube-api-access-mzklx\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:54.880684 master-0 kubenswrapper[9136]: I1203 21:49:54.880487 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/c8da5d44-680e-4169-abc6-607bdc37a64d-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:54.880684 master-0 kubenswrapper[9136]: I1203 21:49:54.880519 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfbaebe-d655-4c1e-a039-08802c5c35c5-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: 
\"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:54.880684 master-0 kubenswrapper[9136]: I1203 21:49:54.880548 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:54.880684 master-0 kubenswrapper[9136]: I1203 21:49:54.880580 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-bound-sa-token\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:54.880684 master-0 kubenswrapper[9136]: I1203 21:49:54.880609 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-binary-copy\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.880684 master-0 kubenswrapper[9136]: I1203 21:49:54.880636 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcq9j\" (UniqueName: \"kubernetes.io/projected/ffad8fc8-4378-44de-8864-dd2f666ade68-kube-api-access-xcq9j\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.880684 master-0 kubenswrapper[9136]: I1203 21:49:54.880646 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/c8da5d44-680e-4169-abc6-607bdc37a64d-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:54.880684 master-0 kubenswrapper[9136]: I1203 21:49:54.880663 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-system-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.880684 master-0 kubenswrapper[9136]: I1203 21:49:54.880692 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-socket-dir-parent\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.880964 master-0 kubenswrapper[9136]: I1203 21:49:54.880721 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-bin\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.880964 master-0 kubenswrapper[9136]: I1203 21:49:54.880751 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6976b503-87da-48fc-b097-d1b315fbee3f-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:54.880964 master-0 kubenswrapper[9136]: I1203 21:49:54.880794 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-cnibin\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.880964 master-0 kubenswrapper[9136]: I1203 21:49:54.880849 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.880964 master-0 kubenswrapper[9136]: I1203 21:49:54.880577 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfbaebe-d655-4c1e-a039-08802c5c35c5-config\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:54.880964 master-0 kubenswrapper[9136]: I1203 21:49:54.880858 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-k8s-cni-cncf-io\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.880964 master-0 kubenswrapper[9136]: I1203 21:49:54.880954 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4clxk\" (UniqueName: \"kubernetes.io/projected/82055cfc-b4ce-4a00-a51d-141059947693-kube-api-access-4clxk\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.881170 master-0 kubenswrapper[9136]: I1203 21:49:54.881054 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfbaebe-d655-4c1e-a039-08802c5c35c5-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:54.881170 master-0 kubenswrapper[9136]: I1203 21:49:54.881109 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6976b503-87da-48fc-b097-d1b315fbee3f-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:54.881170 master-0 kubenswrapper[9136]: I1203 21:49:54.881112 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a0f647a-0260-4737-8ae2-cc90d01d33d1-webhook-cert\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:54.881273 master-0 kubenswrapper[9136]: I1203 21:49:54.881174 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:54.881307 master-0 kubenswrapper[9136]: I1203 21:49:54.881278 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.881339 master-0 kubenswrapper[9136]: I1203 21:49:54.881318 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:54.881373 master-0 kubenswrapper[9136]: I1203 21:49:54.881354 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/892d5611-debf-402f-abc5-3f99aa080159-metrics-tls\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:49:54.881460 master-0 kubenswrapper[9136]: I1203 21:49:54.881419 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-netd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.881496 master-0 kubenswrapper[9136]: I1203 21:49:54.881459 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgddm\" (UniqueName: \"kubernetes.io/projected/04f5fc52-4ec2-48c3-8441-2b15ad632233-kube-api-access-tgddm\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:54.881535 master-0 kubenswrapper[9136]: I1203 21:49:54.881493 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsr8k\" (UniqueName: \"kubernetes.io/projected/1a0f647a-0260-4737-8ae2-cc90d01d33d1-kube-api-access-lsr8k\") pod 
\"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:54.881574 master-0 kubenswrapper[9136]: I1203 21:49:54.881561 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-service-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.881626 master-0 kubenswrapper[9136]: I1203 21:49:54.881606 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-log-socket\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.881665 master-0 kubenswrapper[9136]: I1203 21:49:54.881649 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50076985-bbaa-4bcf-9d1a-cc25bed016a7-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:54.881720 master-0 kubenswrapper[9136]: I1203 21:49:54.881676 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-cni-binary-copy\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.881720 master-0 kubenswrapper[9136]: I1203 21:49:54.881700 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6rt\" (UniqueName: \"kubernetes.io/projected/6976b503-87da-48fc-b097-d1b315fbee3f-kube-api-access-vx6rt\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:54.881822 master-0 kubenswrapper[9136]: I1203 21:49:54.881761 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mbz\" (UniqueName: \"kubernetes.io/projected/29ac4a9d-1228-49c7-9051-338e7dc98a38-kube-api-access-p4mbz\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:54.881822 master-0 kubenswrapper[9136]: I1203 21:49:54.881806 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-multus\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.881892 master-0 kubenswrapper[9136]: I1203 21:49:54.881827 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-etc-kubernetes\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") 
" pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.881892 master-0 kubenswrapper[9136]: I1203 21:49:54.881852 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-etcd-client\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.881892 master-0 kubenswrapper[9136]: I1203 21:49:54.881873 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:54.881974 master-0 kubenswrapper[9136]: I1203 21:49:54.881892 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-systemd-units\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.881974 master-0 kubenswrapper[9136]: I1203 21:49:54.881936 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-slash\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.881974 master-0 kubenswrapper[9136]: I1203 21:49:54.881956 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.882053 master-0 kubenswrapper[9136]: I1203 21:49:54.881977 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:54.882053 master-0 kubenswrapper[9136]: I1203 21:49:54.882020 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 21:49:54.882053 master-0 kubenswrapper[9136]: I1203 21:49:54.882032 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f088999-ec66-402e-9634-8c762206d6b4-config\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:54.882128 master-0 kubenswrapper[9136]: I1203 21:49:54.882056 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f59094ec-47dd-4547-ad41-b15a7933f461-serving-cert\") pod 
\"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:54.882128 master-0 kubenswrapper[9136]: I1203 21:49:54.882082 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-os-release\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.882128 master-0 kubenswrapper[9136]: I1203 21:49:54.882101 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-bin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.882128 master-0 kubenswrapper[9136]: I1203 21:49:54.882121 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-kubelet\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.882231 master-0 kubenswrapper[9136]: I1203 21:49:54.882145 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9vn\" (UniqueName: \"kubernetes.io/projected/b8194009-3743-4da7-baf1-f9bb0afd6187-kube-api-access-rd9vn\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.882261 master-0 kubenswrapper[9136]: I1203 21:49:54.882237 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.882290 master-0 kubenswrapper[9136]: I1203 21:49:54.882268 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59094ec-47dd-4547-ad41-b15a7933f461-config\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:54.882322 master-0 kubenswrapper[9136]: I1203 21:49:54.882294 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/814c8acf-fb8d-4f57-b8db-21304402c1f1-iptables-alerter-script\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:54.882322 master-0 kubenswrapper[9136]: I1203 21:49:54.882313 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-etcd-client\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.882378 master-0 kubenswrapper[9136]: I1203 21:49:54.882316 9136 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-serving-cert\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.882408 master-0 kubenswrapper[9136]: I1203 21:49:54.882389 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-ovnkube-identity-cm\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:54.882454 master-0 kubenswrapper[9136]: I1203 21:49:54.882432 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-serving-cert\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.882454 master-0 kubenswrapper[9136]: I1203 21:49:54.882431 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kfg5\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-kube-api-access-2kfg5\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:54.882514 master-0 kubenswrapper[9136]: I1203 21:49:54.882478 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-daemon-config\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.882514 master-0 kubenswrapper[9136]: I1203 21:49:54.882499 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4c4r\" (UniqueName: \"kubernetes.io/projected/b7f68d19-71d4-4129-a575-3ee57fa53493-kube-api-access-t4c4r\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:54.882566 master-0 kubenswrapper[9136]: I1203 21:49:54.882521 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkwjb\" (UniqueName: \"kubernetes.io/projected/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-kube-api-access-fkwjb\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:54.882566 master-0 kubenswrapper[9136]: I1203 21:49:54.882550 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50076985-bbaa-4bcf-9d1a-cc25bed016a7-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:54.882626 master-0 kubenswrapper[9136]: I1203 21:49:54.882592 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08432be8-0086-48d2-a93d-7a474e96749d-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:54.882626 master-0 kubenswrapper[9136]: I1203 21:49:54.882610 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 21:49:54.882850 master-0 kubenswrapper[9136]: I1203 21:49:54.882756 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:54.883335 master-0 kubenswrapper[9136]: I1203 21:49:54.882968 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f088999-ec66-402e-9634-8c762206d6b4-config\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:54.883335 master-0 kubenswrapper[9136]: I1203 21:49:54.883054 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-service-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:54.885630 master-0 kubenswrapper[9136]: I1203 21:49:54.885578 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0869de9b-6f5b-4c31-81ad-02a9c8888193-trusted-ca\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:54.886009 master-0 kubenswrapper[9136]: I1203 21:49:54.885972 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:54.886262 master-0 kubenswrapper[9136]: I1203 21:49:54.886229 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 21:49:54.886386 master-0 kubenswrapper[9136]: I1203 21:49:54.886359 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 21:49:54.886926 master-0 kubenswrapper[9136]: I1203 21:49:54.886898 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 21:49:54.887669 master-0 kubenswrapper[9136]: I1203 21:49:54.887640 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 21:49:54.888040 master-0 kubenswrapper[9136]: I1203 21:49:54.887837 9136 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.889630 master-0 kubenswrapper[9136]: I1203 21:49:54.888053 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 03 21:49:54.890151 master-0 kubenswrapper[9136]: I1203 21:49:54.889930 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 21:49:54.890151 master-0 kubenswrapper[9136]: I1203 21:49:54.889951 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 21:49:54.890151 master-0 kubenswrapper[9136]: I1203 21:49:54.889954 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 21:49:54.890151 master-0 kubenswrapper[9136]: I1203 21:49:54.890042 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 21:49:54.890309 master-0 kubenswrapper[9136]: I1203 21:49:54.890053 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 21:49:54.890309 master-0 kubenswrapper[9136]: I1203 21:49:54.890204 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 21:49:54.890309 master-0 kubenswrapper[9136]: I1203 21:49:54.889945 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.890703 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.891040 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f68d19-71d4-4129-a575-3ee57fa53493-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.891110 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53713eab-c920-4d5a-ae05-7cdb59ace852-ovn-node-metrics-cert\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.891366 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.891610 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.891954 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.892103 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a0f647a-0260-4737-8ae2-cc90d01d33d1-webhook-cert\") pod \"network-node-identity-r24k4\" (UID: 
\"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.892491 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/892d5611-debf-402f-abc5-3f99aa080159-metrics-tls\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.892808 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.893105 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f59094ec-47dd-4547-ad41-b15a7933f461-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.893206 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/814c8acf-fb8d-4f57-b8db-21304402c1f1-iptables-alerter-script\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.893513 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 21:49:54.894973 master-0 kubenswrapper[9136]: I1203 21:49:54.893536 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-config\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.895579 master-0 kubenswrapper[9136]: I1203 21:49:54.895208 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.895579 master-0 kubenswrapper[9136]: I1203 21:49:54.895414 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 21:49:54.895579 master-0 kubenswrapper[9136]: I1203 21:49:54.895433 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 03 21:49:54.895579 master-0 kubenswrapper[9136]: I1203 21:49:54.895472 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 21:49:54.895579 master-0 kubenswrapper[9136]: I1203 21:49:54.895477 9136 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 21:49:54.895730 master-0 kubenswrapper[9136]: I1203 21:49:54.895624 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 03 21:49:54.895783 master-0 kubenswrapper[9136]: I1203 21:49:54.895739 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:54.896165 master-0 kubenswrapper[9136]: I1203 21:49:54.896131 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 21:49:54.896318 master-0 kubenswrapper[9136]: I1203 21:49:54.896290 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 03 21:49:54.896357 master-0 kubenswrapper[9136]: I1203 21:49:54.896316 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f0e973-7864-4842-af8e-47718ab1804c-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:54.896357 master-0 kubenswrapper[9136]: I1203 21:49:54.896339 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 21:49:54.896443 master-0 kubenswrapper[9136]: I1203 21:49:54.896400 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-env-overrides\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.896752 master-0 kubenswrapper[9136]: I1203 21:49:54.896726 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 21:49:54.896849 master-0 kubenswrapper[9136]: I1203 21:49:54.895840 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:54.896886 master-0 kubenswrapper[9136]: I1203 21:49:54.896846 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-whereabouts-configmap\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.896886 master-0 kubenswrapper[9136]: I1203 21:49:54.896872 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 21:49:54.897652 master-0 kubenswrapper[9136]: I1203 21:49:54.897619 9136 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.897737 master-0 kubenswrapper[9136]: I1203 21:49:54.897700 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.898306 master-0 kubenswrapper[9136]: I1203 21:49:54.898267 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 21:49:54.899571 master-0 kubenswrapper[9136]: I1203 21:49:54.899269 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8da5d44-680e-4169-abc6-607bdc37a64d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:54.899885 master-0 kubenswrapper[9136]: I1203 21:49:54.899845 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 21:49:54.900584 master-0 kubenswrapper[9136]: I1203 21:49:54.900076 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 21:49:54.901712 master-0 kubenswrapper[9136]: I1203 21:49:54.901405 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-script-lib\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.902575 master-0 kubenswrapper[9136]: I1203 21:49:54.902477 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 21:49:54.902923 master-0 kubenswrapper[9136]: I1203 21:49:54.902897 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 21:49:54.903221 master-0 kubenswrapper[9136]: I1203 21:49:54.903191 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-ovnkube-identity-cm\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:54.903344 master-0 kubenswrapper[9136]: I1203 21:49:54.903204 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 21:49:54.904173 master-0 kubenswrapper[9136]: I1203 21:49:54.904145 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 21:49:54.904173 master-0 kubenswrapper[9136]: I1203 21:49:54.904158 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 21:49:54.904334 master-0 kubenswrapper[9136]: I1203 21:49:54.904308 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-service-ca\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.904486 master-0 kubenswrapper[9136]: I1203 21:49:54.904465 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 21:49:54.910098 master-0 kubenswrapper[9136]: I1203 21:49:54.910010 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-env-overrides\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:54.911526 master-0 kubenswrapper[9136]: I1203 21:49:54.911493 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-binary-copy\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.913014 master-0 kubenswrapper[9136]: I1203 21:49:54.912976 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-cni-binary-copy\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.913596 master-0 kubenswrapper[9136]: I1203 21:49:54.913545 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-daemon-config\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.914159 master-0 kubenswrapper[9136]: I1203 21:49:54.914096 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59094ec-47dd-4547-ad41-b15a7933f461-config\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:54.919925 master-0 kubenswrapper[9136]: I1203 21:49:54.919328 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvw7p\" (UniqueName: \"kubernetes.io/projected/50076985-bbaa-4bcf-9d1a-cc25bed016a7-kube-api-access-jvw7p\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 21:49:54.923480 master-0 kubenswrapper[9136]: I1203 21:49:54.923443 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdfbaebe-d655-4c1e-a039-08802c5c35c5-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 21:49:54.943018 master-0 kubenswrapper[9136]: I1203 21:49:54.942956 9136 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j6m8f\" (UniqueName: \"kubernetes.io/projected/785612fc-3f78-4f1a-bc83-7afe5d3b8056-kube-api-access-j6m8f\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:49:54.952950 master-0 kubenswrapper[9136]: I1203 21:49:54.952904 9136 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 03 21:49:54.968979 master-0 kubenswrapper[9136]: I1203 21:49:54.968894 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxl5r\" (UniqueName: \"kubernetes.io/projected/812401c0-d1ac-4857-b939-217b7b07f8bc-kube-api-access-mxl5r\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:54.982890 master-0 kubenswrapper[9136]: I1203 21:49:54.982844 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-os-release\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.982890 master-0 kubenswrapper[9136]: I1203 21:49:54.982890 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:54.983119 master-0 kubenswrapper[9136]: I1203 21:49:54.982926 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/892d5611-debf-402f-abc5-3f99aa080159-host-etc-kube\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:49:54.983119 master-0 kubenswrapper[9136]: I1203 21:49:54.982947 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-system-cni-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.983119 master-0 kubenswrapper[9136]: I1203 21:49:54.982981 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-kubelet\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.983119 master-0 kubenswrapper[9136]: I1203 21:49:54.983016 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-os-release\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.983119 master-0 kubenswrapper[9136]: I1203 21:49:54.983043 9136 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-node-log\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.983119 master-0 kubenswrapper[9136]: I1203 21:49:54.983067 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.983119 master-0 kubenswrapper[9136]: I1203 21:49:54.983074 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/892d5611-debf-402f-abc5-3f99aa080159-host-etc-kube\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:49:54.983119 master-0 kubenswrapper[9136]: I1203 21:49:54.983106 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:54.983328 master-0 kubenswrapper[9136]: I1203 21:49:54.983143 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-systemd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.983328 master-0 kubenswrapper[9136]: I1203 21:49:54.983167 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-cnibin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.983328 master-0 kubenswrapper[9136]: E1203 21:49:54.983168 9136 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 21:49:54.983328 master-0 kubenswrapper[9136]: I1203 21:49:54.983187 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-var-lib-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.983328 master-0 kubenswrapper[9136]: I1203 21:49:54.983208 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:54.983328 master-0 kubenswrapper[9136]: E1203 21:49:54.983244 9136 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.483223842 +0000 UTC m=+1.758400224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "node-tuning-operator-tls" not found Dec 03 21:49:54.983328 master-0 kubenswrapper[9136]: E1203 21:49:54.983282 9136 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:49:54.983328 master-0 kubenswrapper[9136]: E1203 21:49:54.983315 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.483303355 +0000 UTC m=+1.758479737 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:49:54.983328 master-0 kubenswrapper[9136]: I1203 21:49:54.983281 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:54.983574 master-0 kubenswrapper[9136]: I1203 21:49:54.983350 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-cnibin\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.983574 master-0 kubenswrapper[9136]: I1203 21:49:54.983384 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-system-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.983574 master-0 kubenswrapper[9136]: I1203 21:49:54.983408 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-socket-dir-parent\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.983574 master-0 kubenswrapper[9136]: I1203 21:49:54.983429 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-bin\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.983574 master-0 
kubenswrapper[9136]: I1203 21:49:54.983459 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:54.983574 master-0 kubenswrapper[9136]: I1203 21:49:54.983485 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-k8s-cni-cncf-io\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.983574 master-0 kubenswrapper[9136]: I1203 21:49:54.983523 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.983574 master-0 kubenswrapper[9136]: I1203 21:49:54.983553 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-netd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.983920 master-0 kubenswrapper[9136]: I1203 21:49:54.983884 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-var-lib-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.984034 master-0 kubenswrapper[9136]: I1203 21:49:54.983897 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-cnibin\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.984034 master-0 kubenswrapper[9136]: I1203 21:49:54.983924 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.984034 master-0 kubenswrapper[9136]: E1203 21:49:54.983323 9136 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:49:54.984034 master-0 kubenswrapper[9136]: I1203 21:49:54.983938 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-socket-dir-parent\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.984034 master-0 
kubenswrapper[9136]: I1203 21:49:54.983987 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-system-cni-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.984034 master-0 kubenswrapper[9136]: E1203 21:49:54.984006 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.483990857 +0000 UTC m=+1.759167259 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:49:54.984034 master-0 kubenswrapper[9136]: I1203 21:49:54.984024 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-kubelet\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.984034 master-0 kubenswrapper[9136]: I1203 21:49:54.984030 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-system-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: I1203 21:49:54.984053 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-node-log\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: E1203 21:49:54.984078 9136 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: I1203 21:49:54.984092 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-cnibin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: E1203 21:49:54.984108 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.484098691 +0000 UTC m=+1.759275083 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: E1203 21:49:54.984115 9136 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: I1203 21:49:54.984128 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-k8s-cni-cncf-io\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: I1203 21:49:54.984142 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-systemd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: E1203 21:49:54.984160 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.484134931 +0000 UTC m=+1.759311313 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: I1203 21:49:54.984172 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.984269 master-0 kubenswrapper[9136]: I1203 21:49:54.984186 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-bin\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.984611 master-0 kubenswrapper[9136]: I1203 21:49:54.984303 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-netd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.985517 master-0 kubenswrapper[9136]: I1203 21:49:54.985487 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-log-socket\") pod \"ovnkube-node-k2j45\" (UID: 
\"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.985595 master-0 kubenswrapper[9136]: I1203 21:49:54.985541 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-multus\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.985595 master-0 kubenswrapper[9136]: I1203 21:49:54.985566 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-etc-kubernetes\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.985595 master-0 kubenswrapper[9136]: I1203 21:49:54.985582 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-log-socket\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.985723 master-0 kubenswrapper[9136]: I1203 21:49:54.985597 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-multus\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.985723 master-0 kubenswrapper[9136]: I1203 21:49:54.985600 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:54.985723 master-0 kubenswrapper[9136]: E1203 21:49:54.985656 9136 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:54.985723 master-0 kubenswrapper[9136]: I1203 21:49:54.985690 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-systemd-units\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.985723 master-0 kubenswrapper[9136]: E1203 21:49:54.985698 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.485686721 +0000 UTC m=+1.760863153 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:54.985723 master-0 kubenswrapper[9136]: I1203 21:49:54.985716 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-systemd-units\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.985723 master-0 kubenswrapper[9136]: I1203 21:49:54.985718 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-slash\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.986000 master-0 kubenswrapper[9136]: I1203 21:49:54.985736 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-etc-kubernetes\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986000 master-0 kubenswrapper[9136]: I1203 21:49:54.985786 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-slash\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.986000 master-0 kubenswrapper[9136]: I1203 21:49:54.985794 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.986000 master-0 kubenswrapper[9136]: I1203 21:49:54.985751 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-ssl-certs\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.986000 master-0 kubenswrapper[9136]: I1203 21:49:54.985908 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-os-release\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986000 master-0 kubenswrapper[9136]: I1203 21:49:54.985941 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-bin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986000 master-0 kubenswrapper[9136]: I1203 21:49:54.985962 9136 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-kubelet\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986000 master-0 kubenswrapper[9136]: I1203 21:49:54.986003 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.986000 master-0 kubenswrapper[9136]: I1203 21:49:54.986007 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-os-release\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986010 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-bin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986039 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-kubelet\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986096 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986125 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986125 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986148 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986156 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: E1203 21:49:54.986190 9136 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: E1203 21:49:54.986204 9136 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986210 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-hostroot\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: E1203 21:49:54.986237 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.486221658 +0000 UTC m=+1.761398030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: E1203 21:49:54.986248 9136 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986257 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/814c8acf-fb8d-4f57-b8db-21304402c1f1-host-slash\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: E1203 21:49:54.986262 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.486250249 +0000 UTC m=+1.761426831 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986287 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-conf-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986292 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/814c8acf-fb8d-4f57-b8db-21304402c1f1-host-slash\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: E1203 21:49:54.986300 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.48628469 +0000 UTC m=+1.761461252 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986252 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-hostroot\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986316 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-conf-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986323 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-ovn\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986352 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-ovn\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986384 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-netns\") pod \"multus-6jlh8\" (UID: 
\"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986355 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-netns\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986427 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-multus-certs\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.986407 master-0 kubenswrapper[9136]: I1203 21:49:54.986447 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986473 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986479 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-multus-certs\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986507 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986539 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986590 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-etc-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986649 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986686 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986701 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-etc-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: E1203 21:49:54.986722 9136 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986737 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: E1203 21:49:54.986827 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.486803877 +0000 UTC m=+1.761980319 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986839 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986869 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-netns\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: E1203 21:49:54.986901 9136 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: I1203 21:49:54.986948 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-netns\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:54.987312 master-0 kubenswrapper[9136]: E1203 21:49:54.986987 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:49:55.486945251 +0000 UTC m=+1.762121823 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : secret "metrics-daemon-secret" not found Dec 03 21:49:55.015270 master-0 kubenswrapper[9136]: I1203 21:49:55.015138 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvmxp\" (UniqueName: \"kubernetes.io/projected/892d5611-debf-402f-abc5-3f99aa080159-kube-api-access-bvmxp\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 21:49:55.096443 master-0 kubenswrapper[9136]: I1203 21:49:55.096394 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:55.096810 master-0 kubenswrapper[9136]: I1203 21:49:55.096686 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-575vn\" (UniqueName: \"kubernetes.io/projected/a9a3f403-a742-4977-901a-cf4a8eb7df5a-kube-api-access-575vn\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:55.099022 master-0 kubenswrapper[9136]: I1203 21:49:55.098998 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94vvg\" (UniqueName: \"kubernetes.io/projected/5f088999-ec66-402e-9634-8c762206d6b4-kube-api-access-94vvg\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 21:49:55.099809 master-0 kubenswrapper[9136]: I1203 21:49:55.099719 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qqf\" (UniqueName: \"kubernetes.io/projected/bebd69d2-5b0f-4b66-8722-d6861eba3e12-kube-api-access-n7qqf\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:55.123591 master-0 kubenswrapper[9136]: I1203 21:49:55.118153 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhtr\" (UniqueName: \"kubernetes.io/projected/add88bf0-c88d-427d-94bb-897e088a1378-kube-api-access-hkhtr\") pod \"csi-snapshot-controller-operator-7b795784b8-l9q2j\" (UID: \"add88bf0-c88d-427d-94bb-897e088a1378\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 21:49:55.130385 master-0 kubenswrapper[9136]: I1203 21:49:55.130237 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-kube-api-access\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:55.143169 master-0 kubenswrapper[9136]: I1203 21:49:55.143112 9136 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider 
Dec 03 21:49:55.154097 master-0 kubenswrapper[9136]: I1203 21:49:55.154032 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2krlg\" (UniqueName: \"kubernetes.io/projected/a4399d20-f9a6-4ab1-86be-e2845394eaba-kube-api-access-2krlg\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:55.167576 master-0 kubenswrapper[9136]: I1203 21:49:55.167463 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:55.296650 master-0 kubenswrapper[9136]: I1203 21:49:55.296571 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fsxc\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-kube-api-access-4fsxc\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:55.297605 master-0 kubenswrapper[9136]: I1203 21:49:55.297560 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2l8\" (UniqueName: \"kubernetes.io/projected/c8da5d44-680e-4169-abc6-607bdc37a64d-kube-api-access-pm2l8\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 21:49:55.298758 master-0 kubenswrapper[9136]: I1203 21:49:55.298672 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzklx\" (UniqueName: \"kubernetes.io/projected/f59094ec-47dd-4547-ad41-b15a7933f461-kube-api-access-mzklx\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 21:49:55.299272 master-0 kubenswrapper[9136]: I1203 21:49:55.299218 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-bound-sa-token\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:55.299905 master-0 kubenswrapper[9136]: I1203 21:49:55.299858 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvkz7\" (UniqueName: \"kubernetes.io/projected/53713eab-c920-4d5a-ae05-7cdb59ace852-kube-api-access-nvkz7\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:55.311322 master-0 kubenswrapper[9136]: I1203 21:49:55.311282 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7fsz\" (UniqueName: \"kubernetes.io/projected/814c8acf-fb8d-4f57-b8db-21304402c1f1-kube-api-access-x7fsz\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 21:49:55.316279 master-0 kubenswrapper[9136]: I1203 21:49:55.316218 9136 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4clxk\" (UniqueName: \"kubernetes.io/projected/82055cfc-b4ce-4a00-a51d-141059947693-kube-api-access-4clxk\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:49:55.325674 master-0 kubenswrapper[9136]: I1203 21:49:55.325613 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcq9j\" (UniqueName: \"kubernetes.io/projected/ffad8fc8-4378-44de-8864-dd2f666ade68-kube-api-access-xcq9j\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 21:49:55.343004 master-0 kubenswrapper[9136]: I1203 21:49:55.342934 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgddm\" (UniqueName: \"kubernetes.io/projected/04f5fc52-4ec2-48c3-8441-2b15ad632233-kube-api-access-tgddm\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:55.365726 master-0 kubenswrapper[9136]: I1203 21:49:55.365680 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9vn\" (UniqueName: \"kubernetes.io/projected/b8194009-3743-4da7-baf1-f9bb0afd6187-kube-api-access-rd9vn\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 21:49:55.387579 master-0 kubenswrapper[9136]: I1203 21:49:55.387293 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mbz\" (UniqueName: \"kubernetes.io/projected/29ac4a9d-1228-49c7-9051-338e7dc98a38-kube-api-access-p4mbz\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 21:49:55.405466 master-0 kubenswrapper[9136]: I1203 21:49:55.405407 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6rt\" (UniqueName: \"kubernetes.io/projected/6976b503-87da-48fc-b097-d1b315fbee3f-kube-api-access-vx6rt\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 21:49:55.426738 master-0 kubenswrapper[9136]: I1203 21:49:55.426703 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kfg5\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-kube-api-access-2kfg5\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:55.467042 master-0 kubenswrapper[9136]: I1203 21:49:55.448483 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsr8k\" (UniqueName: \"kubernetes.io/projected/1a0f647a-0260-4737-8ae2-cc90d01d33d1-kube-api-access-lsr8k\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 21:49:55.467042 master-0 kubenswrapper[9136]: I1203 21:49:55.457847 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:49:55.492257 master-0 kubenswrapper[9136]: I1203 21:49:55.482953 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:55.492257 master-0 kubenswrapper[9136]: I1203 21:49:55.484270 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08432be8-0086-48d2-a93d-7a474e96749d-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 21:49:55.492257 master-0 kubenswrapper[9136]: I1203 21:49:55.488310 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkwjb\" (UniqueName: \"kubernetes.io/projected/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-kube-api-access-fkwjb\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501219 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501293 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501325 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501346 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501364 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:55.501669 
master-0 kubenswrapper[9136]: I1203 21:49:55.501393 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501418 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501444 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501464 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501480 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: I1203 21:49:55.501498 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: E1203 21:49:55.501632 9136 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:49:55.501669 master-0 kubenswrapper[9136]: E1203 21:49:55.501693 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.50167709 +0000 UTC m=+2.776853472 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503637 9136 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503755 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.503737646 +0000 UTC m=+2.778914028 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503823 9136 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503849 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.503841339 +0000 UTC m=+2.779017721 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503882 9136 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503901 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.503894571 +0000 UTC m=+2.779070953 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503933 9136 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503951 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.503945662 +0000 UTC m=+2.779122044 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503980 9136 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.503995 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.503990354 +0000 UTC m=+2.779166736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504025 9136 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504047 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.504040576 +0000 UTC m=+2.779217068 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504086 9136 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504107 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.504099157 +0000 UTC m=+2.779275639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504149 9136 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504169 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.504163739 +0000 UTC m=+2.779340121 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : secret "metrics-daemon-secret" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504199 9136 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504215 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.504209801 +0000 UTC m=+2.779386183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "node-tuning-operator-tls" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504258 9136 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:49:55.505564 master-0 kubenswrapper[9136]: E1203 21:49:55.504282 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:56.504274633 +0000 UTC m=+2.779451015 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:49:55.508194 master-0 kubenswrapper[9136]: I1203 21:49:55.507339 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4c4r\" (UniqueName: \"kubernetes.io/projected/b7f68d19-71d4-4129-a575-3ee57fa53493-kube-api-access-t4c4r\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:55.537688 master-0 kubenswrapper[9136]: I1203 21:49:55.537633 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-78hts"] Dec 03 21:49:55.541304 master-0 kubenswrapper[9136]: E1203 21:49:55.541119 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 21:49:55.545621 master-0 kubenswrapper[9136]: W1203 21:49:55.545583 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 WatchSource:0}: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 21:49:55.560383 master-0 kubenswrapper[9136]: E1203 21:49:55.560302 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 21:49:55.578932 master-0 kubenswrapper[9136]: W1203 21:49:55.578886 9136 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Dec 03 21:49:55.579064 master-0 kubenswrapper[9136]: E1203 21:49:55.579000 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:49:55.597414 master-0 kubenswrapper[9136]: E1203 21:49:55.597359 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:55.619784 master-0 kubenswrapper[9136]: E1203 21:49:55.619707 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods 
\"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:55.991263 master-0 kubenswrapper[9136]: I1203 21:49:55.990602 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerStarted","Data":"bfa48bdaae3ed0aa3b0f3696051d4e5dea6449cb42ff81d3ed5c9026e7ac1908"} Dec 03 21:49:56.008661 master-0 kubenswrapper[9136]: I1203 21:49:56.008554 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" event={"ID":"5f088999-ec66-402e-9634-8c762206d6b4","Type":"ContainerStarted","Data":"7abe60a608c2243a0171688d4d7d094c30c03b0afcc0cdcafa8f846a9926432b"} Dec 03 21:49:56.012340 master-0 kubenswrapper[9136]: I1203 21:49:56.011471 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerStarted","Data":"ad5b230b5b0a6050c3e00f60f378eb9862c13b3605b44eb150bd273e89f0bd98"} Dec 03 21:49:56.019135 master-0 kubenswrapper[9136]: I1203 21:49:56.014317 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" event={"ID":"f59094ec-47dd-4547-ad41-b15a7933f461","Type":"ContainerStarted","Data":"f8f51cc4e951397e53befcebae88ea2052970977c20019cfe472b3c9dcc46778"} Dec 03 21:49:56.019135 master-0 kubenswrapper[9136]: I1203 21:49:56.016184 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" event={"ID":"add88bf0-c88d-427d-94bb-897e088a1378","Type":"ContainerStarted","Data":"50ea9b3d8d8a684066d6791cddb4680be7db2c4667be5e468a6d5e22cffb259f"} Dec 03 21:49:56.019135 master-0 kubenswrapper[9136]: I1203 21:49:56.018675 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerStarted","Data":"f1f81a88c3e9df6920f7399941c169fb03094cedcb1ae2463c0147ff10995db1"} Dec 03 21:49:56.023225 master-0 kubenswrapper[9136]: I1203 21:49:56.022579 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerStarted","Data":"930ace1c47675e8f2e46f1361fdd688d5b7098c0fe077d099c56f05d2371a225"} Dec 03 21:49:56.045951 master-0 kubenswrapper[9136]: I1203 21:49:56.044379 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerStarted","Data":"2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce"} Dec 03 21:49:56.048142 master-0 kubenswrapper[9136]: I1203 21:49:56.048067 9136 generic.go:334] "Generic (PLEG): container finished" podID="c8da5d44-680e-4169-abc6-607bdc37a64d" containerID="fa08d3f519d75f25a074a139ff466f2e3733733b2aac82151c4b8dd0843b2a0d" exitCode=0 Dec 03 21:49:56.049107 master-0 kubenswrapper[9136]: I1203 21:49:56.049070 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerDied","Data":"fa08d3f519d75f25a074a139ff466f2e3733733b2aac82151c4b8dd0843b2a0d"} Dec 03 21:49:56.450845 master-0 kubenswrapper[9136]: I1203 21:49:56.450450 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:56.517581 master-0 kubenswrapper[9136]: I1203 21:49:56.517523 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:56.517581 master-0 kubenswrapper[9136]: I1203 21:49:56.517577 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:56.517581 master-0 kubenswrapper[9136]: I1203 21:49:56.517604 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:56.518003 master-0 kubenswrapper[9136]: I1203 21:49:56.517633 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:56.518003 master-0 kubenswrapper[9136]: I1203 21:49:56.517660 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:56.518003 master-0 kubenswrapper[9136]: I1203 21:49:56.517689 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:56.518003 master-0 kubenswrapper[9136]: I1203 21:49:56.517715 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:56.518003 master-0 
kubenswrapper[9136]: I1203 21:49:56.517748 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:56.518003 master-0 kubenswrapper[9136]: E1203 21:49:56.517746 9136 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:56.518003 master-0 kubenswrapper[9136]: E1203 21:49:56.517883 9136 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:56.518003 master-0 kubenswrapper[9136]: E1203 21:49:56.517914 9136 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 21:49:56.518003 master-0 kubenswrapper[9136]: E1203 21:49:56.517976 9136 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518011 9136 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.517886 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.517856702 +0000 UTC m=+4.793033084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518077 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.518060999 +0000 UTC m=+4.793237381 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518091 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.51808407 +0000 UTC m=+4.793260452 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518092 9136 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518113 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.51809801 +0000 UTC m=+4.793274392 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "node-tuning-operator-tls" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518130 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.518122071 +0000 UTC m=+4.793298453 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: I1203 21:49:56.517790 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518147 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.518137212 +0000 UTC m=+4.793313594 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : secret "metrics-daemon-secret" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518172 9136 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518214 9136 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: I1203 21:49:56.518209 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:56.518247 master-0 kubenswrapper[9136]: E1203 21:49:56.518230 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.518201564 +0000 UTC m=+4.793377946 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:49:56.519702 master-0 kubenswrapper[9136]: E1203 21:49:56.518043 9136 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:56.519702 master-0 kubenswrapper[9136]: E1203 21:49:56.518284 9136 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:49:56.519702 master-0 kubenswrapper[9136]: E1203 21:49:56.518287 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.518274996 +0000 UTC m=+4.793451368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:49:56.519702 master-0 kubenswrapper[9136]: E1203 21:49:56.518348 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.518322677 +0000 UTC m=+4.793499059 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:49:56.519702 master-0 kubenswrapper[9136]: E1203 21:49:56.518385 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.518364218 +0000 UTC m=+4.793540610 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:56.519702 master-0 kubenswrapper[9136]: E1203 21:49:56.518397 9136 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:56.519702 master-0 kubenswrapper[9136]: E1203 21:49:56.518435 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:58.51842727 +0000 UTC m=+4.793603642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:56.519702 master-0 kubenswrapper[9136]: I1203 21:49:56.518343 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:56.719515 master-0 kubenswrapper[9136]: I1203 21:49:56.719312 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:56.735587 master-0 kubenswrapper[9136]: I1203 21:49:56.735524 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:56.858952 master-0 kubenswrapper[9136]: I1203 21:49:56.856509 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp"] Dec 03 21:49:56.858952 master-0 kubenswrapper[9136]: E1203 21:49:56.856685 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerName="assisted-installer-controller" Dec 03 21:49:56.858952 master-0 kubenswrapper[9136]: I1203 21:49:56.856703 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerName="assisted-installer-controller" Dec 03 21:49:56.858952 master-0 kubenswrapper[9136]: E1203 21:49:56.856714 9136 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffdea166-7cd6-4319-966e-43579d960fc1" containerName="prober" Dec 03 21:49:56.858952 master-0 kubenswrapper[9136]: I1203 21:49:56.856721 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffdea166-7cd6-4319-966e-43579d960fc1" containerName="prober" Dec 03 21:49:56.858952 master-0 kubenswrapper[9136]: I1203 21:49:56.856837 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffdea166-7cd6-4319-966e-43579d960fc1" containerName="prober" Dec 03 21:49:56.858952 master-0 kubenswrapper[9136]: I1203 21:49:56.856849 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerName="assisted-installer-controller" Dec 03 21:49:56.858952 master-0 kubenswrapper[9136]: I1203 21:49:56.857143 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" Dec 03 21:49:56.886844 master-0 kubenswrapper[9136]: I1203 21:49:56.883434 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp"] Dec 03 21:49:56.924509 master-0 kubenswrapper[9136]: I1203 21:49:56.924464 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89p9d\" (UniqueName: \"kubernetes.io/projected/28c42112-a09e-4b7a-b23b-c06bef69cbfb-kube-api-access-89p9d\") pod \"csi-snapshot-controller-86897dd478-g4ldp\" (UID: \"28c42112-a09e-4b7a-b23b-c06bef69cbfb\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" Dec 03 21:49:57.026888 master-0 kubenswrapper[9136]: I1203 21:49:57.023147 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc"] Dec 03 21:49:57.026888 master-0 kubenswrapper[9136]: I1203 21:49:57.023834 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" Dec 03 21:49:57.026888 master-0 kubenswrapper[9136]: I1203 21:49:57.025792 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89p9d\" (UniqueName: \"kubernetes.io/projected/28c42112-a09e-4b7a-b23b-c06bef69cbfb-kube-api-access-89p9d\") pod \"csi-snapshot-controller-86897dd478-g4ldp\" (UID: \"28c42112-a09e-4b7a-b23b-c06bef69cbfb\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" Dec 03 21:49:57.036306 master-0 kubenswrapper[9136]: I1203 21:49:57.035758 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 21:49:57.037668 master-0 kubenswrapper[9136]: I1203 21:49:57.037600 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc"] Dec 03 21:49:57.056673 master-0 kubenswrapper[9136]: I1203 21:49:57.052153 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 21:49:57.061525 master-0 kubenswrapper[9136]: I1203 21:49:57.061475 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerStarted","Data":"7abc4d8635b4469a4776710f14e691f06a0b7b60d5e937f6ea27f069d519024a"} Dec 03 21:49:57.069658 master-0 kubenswrapper[9136]: I1203 21:49:57.069443 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 21:49:57.089141 master-0 kubenswrapper[9136]: I1203 21:49:57.089071 9136 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 21:49:57.127635 master-0 kubenswrapper[9136]: I1203 21:49:57.127563 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntg2z\" (UniqueName: \"kubernetes.io/projected/2b014bee-5931-4856-b9e8-e38a134a1b6b-kube-api-access-ntg2z\") pod \"migrator-5bcf58cf9c-qc9zc\" (UID: \"2b014bee-5931-4856-b9e8-e38a134a1b6b\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" Dec 03 21:49:57.149353 master-0 kubenswrapper[9136]: I1203 21:49:57.146997 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89p9d\" (UniqueName: \"kubernetes.io/projected/28c42112-a09e-4b7a-b23b-c06bef69cbfb-kube-api-access-89p9d\") pod \"csi-snapshot-controller-86897dd478-g4ldp\" (UID: \"28c42112-a09e-4b7a-b23b-c06bef69cbfb\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" Dec 03 21:49:57.231316 master-0 kubenswrapper[9136]: I1203 21:49:57.231240 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntg2z\" (UniqueName: \"kubernetes.io/projected/2b014bee-5931-4856-b9e8-e38a134a1b6b-kube-api-access-ntg2z\") pod \"migrator-5bcf58cf9c-qc9zc\" (UID: \"2b014bee-5931-4856-b9e8-e38a134a1b6b\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" Dec 03 21:49:57.239050 master-0 kubenswrapper[9136]: I1203 21:49:57.238938 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" Dec 03 21:49:57.276367 master-0 kubenswrapper[9136]: I1203 21:49:57.275951 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntg2z\" (UniqueName: \"kubernetes.io/projected/2b014bee-5931-4856-b9e8-e38a134a1b6b-kube-api-access-ntg2z\") pod \"migrator-5bcf58cf9c-qc9zc\" (UID: \"2b014bee-5931-4856-b9e8-e38a134a1b6b\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" Dec 03 21:49:57.362510 master-0 kubenswrapper[9136]: I1203 21:49:57.361868 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" Dec 03 21:49:57.435830 master-0 kubenswrapper[9136]: I1203 21:49:57.435117 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp"] Dec 03 21:49:57.603152 master-0 kubenswrapper[9136]: I1203 21:49:57.603052 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc"] Dec 03 21:49:58.079797 master-0 kubenswrapper[9136]: I1203 21:49:58.079662 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerStarted","Data":"33442f434ca7c4c52187f1b4d1183975995d4960e8685cc35c4e0acc9b058c70"} Dec 03 21:49:58.082192 master-0 kubenswrapper[9136]: I1203 21:49:58.082150 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" event={"ID":"2b014bee-5931-4856-b9e8-e38a134a1b6b","Type":"ContainerStarted","Data":"788f405f215c107f0aaae844e0357bd2af8b176524b9b27b9d45876c6c07c516"} Dec 03 21:49:58.301670 master-0 kubenswrapper[9136]: I1203 21:49:58.301602 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56fb5cd58b-spm87"] Dec 03 21:49:58.307056 master-0 kubenswrapper[9136]: I1203 21:49:58.302576 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.307056 master-0 kubenswrapper[9136]: I1203 21:49:58.305161 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 21:49:58.307056 master-0 kubenswrapper[9136]: I1203 21:49:58.305386 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 21:49:58.307056 master-0 kubenswrapper[9136]: I1203 21:49:58.306221 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56fb5cd58b-spm87"] Dec 03 21:49:58.307056 master-0 kubenswrapper[9136]: I1203 21:49:58.306668 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 21:49:58.307056 master-0 kubenswrapper[9136]: I1203 21:49:58.306763 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 21:49:58.307056 master-0 kubenswrapper[9136]: I1203 21:49:58.306993 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 21:49:58.309407 master-0 kubenswrapper[9136]: I1203 21:49:58.308132 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 21:49:58.325875 master-0 kubenswrapper[9136]: I1203 21:49:58.323159 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:58.330373 master-0 kubenswrapper[9136]: I1203 21:49:58.330258 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:58.394435 master-0 kubenswrapper[9136]: I1203 21:49:58.394365 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:58.404535 master-0 kubenswrapper[9136]: I1203 21:49:58.404500 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:49:58.449490 master-0 kubenswrapper[9136]: I1203 21:49:58.449425 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.449490 master-0 kubenswrapper[9136]: I1203 21:49:58.449484 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.449746 master-0 kubenswrapper[9136]: I1203 21:49:58.449536 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " 
pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.449746 master-0 kubenswrapper[9136]: I1203 21:49:58.449560 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kw57\" (UniqueName: \"kubernetes.io/projected/b57d454a-53e5-4c37-b1db-16ecf763dde1-kube-api-access-5kw57\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.449746 master-0 kubenswrapper[9136]: I1203 21:49:58.449625 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.550887 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.550941 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.550974 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.551016 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.551043 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.551077 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert\") pod 
\"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.551101 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.551121 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551119 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.551149 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551154 9136 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.551177 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: I1203 21:49:58.551205 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551204 9136 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551259 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551264 9136 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551204 9136 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551296 9136 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551204 9136 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551154 9136 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:49:58.551313 master-0 kubenswrapper[9136]: E1203 21:49:58.551215 9136 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551236 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:59.051210908 +0000 UTC m=+5.326387290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : configmap "openshift-global-ca" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551405 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.551379683 +0000 UTC m=+8.826556065 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "performance-addon-operator-webhook-cert" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: I1203 21:49:58.551429 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kw57\" (UniqueName: \"kubernetes.io/projected/b57d454a-53e5-4c37-b1db-16ecf763dde1-kube-api-access-5kw57\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: I1203 21:49:58.551461 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551505 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert podName:2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.551483266 +0000 UTC m=+8.826659848 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert") pod "cluster-version-operator-869c786959-2bnjf" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9") : secret "cluster-version-operator-serving-cert" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551527 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. No retries permitted until 2025-12-03 21:49:59.051518098 +0000 UTC m=+5.326694720 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : configmap "client-ca" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551530 9136 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551544 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.551536948 +0000 UTC m=+8.826713560 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : secret "metrics-daemon-secret" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551566 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.551553279 +0000 UTC m=+8.826729901 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551584 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls podName:b7f68d19-71d4-4129-a575-3ee57fa53493 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.55157677 +0000 UTC m=+8.826753392 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls") pod "cluster-node-tuning-operator-bbd9b9dff-96glt" (UID: "b7f68d19-71d4-4129-a575-3ee57fa53493") : secret "node-tuning-operator-tls" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551598 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. 
No retries permitted until 2025-12-03 21:49:59.05159223 +0000 UTC m=+5.326768852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : secret "serving-cert" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551612 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.55160562 +0000 UTC m=+8.826782252 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551626 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.551619611 +0000 UTC m=+8.826796243 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: I1203 21:49:58.551646 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: I1203 21:49:58.551681 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: I1203 21:49:58.551712 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551736 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.551723034 +0000 UTC m=+8.826899416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551806 9136 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551832 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.551825888 +0000 UTC m=+8.827002270 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551834 9136 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551864 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.551855539 +0000 UTC m=+8.827032141 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551866 9136 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551881 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551890 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.551881159 +0000 UTC m=+8.827057541 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:49:58.553959 master-0 kubenswrapper[9136]: E1203 21:49:58.551941 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. 
No retries permitted until 2025-12-03 21:49:59.051933041 +0000 UTC m=+5.327109423 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : configmap "config" not found Dec 03 21:49:58.577962 master-0 kubenswrapper[9136]: I1203 21:49:58.577900 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kw57\" (UniqueName: \"kubernetes.io/projected/b57d454a-53e5-4c37-b1db-16ecf763dde1-kube-api-access-5kw57\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:58.843076 master-0 kubenswrapper[9136]: I1203 21:49:58.842998 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:58.875005 master-0 kubenswrapper[9136]: I1203 21:49:58.874946 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:49:59.059465 master-0 kubenswrapper[9136]: I1203 21:49:59.059375 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:59.059823 master-0 kubenswrapper[9136]: I1203 21:49:59.059477 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:59.059823 master-0 kubenswrapper[9136]: I1203 21:49:59.059566 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:59.059823 master-0 kubenswrapper[9136]: E1203 21:49:59.059587 9136 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:49:59.059823 master-0 kubenswrapper[9136]: I1203 21:49:59.059676 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:49:59.059994 master-0 kubenswrapper[9136]: E1203 21:49:59.059862 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.059750269 +0000 UTC m=+6.334926671 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : secret "serving-cert" not found Dec 03 21:49:59.060049 master-0 kubenswrapper[9136]: E1203 21:49:59.059990 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:49:59.060090 master-0 kubenswrapper[9136]: E1203 21:49:59.060069 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.060050529 +0000 UTC m=+6.335226931 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : configmap "client-ca" not found Dec 03 21:49:59.060200 master-0 kubenswrapper[9136]: E1203 21:49:59.060170 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Dec 03 21:49:59.060251 master-0 kubenswrapper[9136]: E1203 21:49:59.060221 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.060206784 +0000 UTC m=+6.335383186 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : configmap "config" not found Dec 03 21:49:59.060299 master-0 kubenswrapper[9136]: E1203 21:49:59.060286 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Dec 03 21:49:59.060349 master-0 kubenswrapper[9136]: E1203 21:49:59.060334 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.060321206 +0000 UTC m=+6.335497608 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : configmap "openshift-global-ca" not found Dec 03 21:49:59.084087 master-0 kubenswrapper[9136]: I1203 21:49:59.083977 9136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 21:49:59.084087 master-0 kubenswrapper[9136]: I1203 21:49:59.084014 9136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 21:49:59.388248 master-0 kubenswrapper[9136]: I1203 21:49:59.387762 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56fb5cd58b-spm87"] Dec 03 21:49:59.389651 master-0 kubenswrapper[9136]: E1203 21:49:59.388544 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" podUID="b57d454a-53e5-4c37-b1db-16ecf763dde1" Dec 03 21:49:59.414488 master-0 kubenswrapper[9136]: I1203 21:49:59.414225 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6"] Dec 03 21:49:59.418793 master-0 kubenswrapper[9136]: I1203 21:49:59.414843 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.418950 master-0 kubenswrapper[9136]: I1203 21:49:59.418898 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 21:49:59.422789 master-0 kubenswrapper[9136]: I1203 21:49:59.419596 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 21:49:59.422789 master-0 kubenswrapper[9136]: I1203 21:49:59.422014 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 21:49:59.422789 master-0 kubenswrapper[9136]: I1203 21:49:59.422292 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 21:49:59.422789 master-0 kubenswrapper[9136]: I1203 21:49:59.422433 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 21:49:59.469827 master-0 kubenswrapper[9136]: I1203 21:49:59.468438 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-config\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.469827 master-0 kubenswrapper[9136]: I1203 21:49:59.468532 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4h8v\" (UniqueName: \"kubernetes.io/projected/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-kube-api-access-j4h8v\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " 
pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.469827 master-0 kubenswrapper[9136]: I1203 21:49:59.468652 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.469827 master-0 kubenswrapper[9136]: I1203 21:49:59.468682 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.485515 master-0 kubenswrapper[9136]: I1203 21:49:59.485445 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6"] Dec 03 21:49:59.497091 master-0 kubenswrapper[9136]: I1203 21:49:59.495086 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-6b8bb995f7-69t6v"] Dec 03 21:49:59.497091 master-0 kubenswrapper[9136]: I1203 21:49:59.495612 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.501843 master-0 kubenswrapper[9136]: I1203 21:49:59.498717 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 21:49:59.501843 master-0 kubenswrapper[9136]: I1203 21:49:59.498786 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 21:49:59.501843 master-0 kubenswrapper[9136]: I1203 21:49:59.499405 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 21:49:59.501843 master-0 kubenswrapper[9136]: I1203 21:49:59.499586 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 21:49:59.505818 master-0 kubenswrapper[9136]: I1203 21:49:59.503806 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-6b8bb995f7-69t6v"] Dec 03 21:49:59.569548 master-0 kubenswrapper[9136]: I1203 21:49:59.569475 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-config\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.569739 master-0 kubenswrapper[9136]: I1203 21:49:59.569559 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-cabundle\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.569739 master-0 kubenswrapper[9136]: I1203 21:49:59.569648 9136 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-j4h8v\" (UniqueName: \"kubernetes.io/projected/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-kube-api-access-j4h8v\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.569739 master-0 kubenswrapper[9136]: I1203 21:49:59.569685 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx9sj\" (UniqueName: \"kubernetes.io/projected/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-kube-api-access-dx9sj\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.570332 master-0 kubenswrapper[9136]: I1203 21:49:59.570275 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-key\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.570493 master-0 kubenswrapper[9136]: I1203 21:49:59.570456 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.570539 master-0 kubenswrapper[9136]: I1203 21:49:59.570509 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-config\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.570539 master-0 kubenswrapper[9136]: I1203 21:49:59.570525 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.570640 master-0 kubenswrapper[9136]: E1203 21:49:59.570603 9136 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:49:59.570721 master-0 kubenswrapper[9136]: E1203 21:49:59.570701 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.070680245 +0000 UTC m=+6.345856627 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : configmap "client-ca" not found Dec 03 21:49:59.570762 master-0 kubenswrapper[9136]: E1203 21:49:59.570702 9136 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:49:59.570855 master-0 kubenswrapper[9136]: E1203 21:49:59.570836 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:00.07081542 +0000 UTC m=+6.345991802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : secret "serving-cert" not found Dec 03 21:49:59.588700 master-0 kubenswrapper[9136]: I1203 21:49:59.588654 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4h8v\" (UniqueName: \"kubernetes.io/projected/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-kube-api-access-j4h8v\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:49:59.671603 master-0 kubenswrapper[9136]: I1203 21:49:59.671540 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-key\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.671866 master-0 kubenswrapper[9136]: I1203 21:49:59.671828 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-cabundle\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.672014 master-0 kubenswrapper[9136]: I1203 21:49:59.671978 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx9sj\" (UniqueName: \"kubernetes.io/projected/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-kube-api-access-dx9sj\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.672968 master-0 kubenswrapper[9136]: I1203 21:49:59.672921 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-cabundle\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.675913 master-0 kubenswrapper[9136]: I1203 21:49:59.675863 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-key\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: 
\"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.690882 master-0 kubenswrapper[9136]: I1203 21:49:59.690821 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx9sj\" (UniqueName: \"kubernetes.io/projected/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-kube-api-access-dx9sj\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:49:59.966037 master-0 kubenswrapper[9136]: I1203 21:49:59.965958 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 21:50:00.077801 master-0 kubenswrapper[9136]: I1203 21:50:00.077286 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:50:00.077801 master-0 kubenswrapper[9136]: I1203 21:50:00.077397 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:50:00.077801 master-0 kubenswrapper[9136]: I1203 21:50:00.077472 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:00.077801 master-0 kubenswrapper[9136]: I1203 21:50:00.077497 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:50:00.077801 master-0 kubenswrapper[9136]: I1203 21:50:00.077554 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:00.077801 master-0 kubenswrapper[9136]: I1203 21:50:00.077586 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:50:00.077801 master-0 kubenswrapper[9136]: E1203 21:50:00.077713 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:00.077801 master-0 
kubenswrapper[9136]: E1203 21:50:00.077792 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.077755179 +0000 UTC m=+8.352931581 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : configmap "client-ca" not found Dec 03 21:50:00.081788 master-0 kubenswrapper[9136]: I1203 21:50:00.079142 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:50:00.081788 master-0 kubenswrapper[9136]: E1203 21:50:00.080371 9136 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:50:00.081788 master-0 kubenswrapper[9136]: E1203 21:50:00.080507 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:01.080479216 +0000 UTC m=+7.355655588 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : secret "serving-cert" not found Dec 03 21:50:00.081788 master-0 kubenswrapper[9136]: E1203 21:50:00.080571 9136 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:00.081788 master-0 kubenswrapper[9136]: E1203 21:50:00.080594 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:01.08058639 +0000 UTC m=+7.355762772 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : configmap "client-ca" not found Dec 03 21:50:00.081788 master-0 kubenswrapper[9136]: I1203 21:50:00.080614 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles\") pod \"controller-manager-56fb5cd58b-spm87\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:50:00.087903 master-0 kubenswrapper[9136]: E1203 21:50:00.084256 9136 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:50:00.087903 master-0 kubenswrapper[9136]: E1203 21:50:00.084354 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert podName:b57d454a-53e5-4c37-b1db-16ecf763dde1 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:02.08432825 +0000 UTC m=+8.359504642 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert") pod "controller-manager-56fb5cd58b-spm87" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1") : secret "serving-cert" not found Dec 03 21:50:00.094085 master-0 kubenswrapper[9136]: I1203 21:50:00.092534 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" event={"ID":"2b014bee-5931-4856-b9e8-e38a134a1b6b","Type":"ContainerStarted","Data":"551bab6592719c817f7e318bdc22ce8f83b492f14d4a35c24897cd6e4987f595"} Dec 03 21:50:00.094085 master-0 kubenswrapper[9136]: I1203 21:50:00.092581 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" event={"ID":"2b014bee-5931-4856-b9e8-e38a134a1b6b","Type":"ContainerStarted","Data":"294d0e4c5ef3027436be452e7066ada493981600193ac9cfc00637ef1bf208d6"} Dec 03 21:50:00.103205 master-0 kubenswrapper[9136]: I1203 21:50:00.103152 9136 generic.go:334] "Generic (PLEG): container finished" podID="c8da5d44-680e-4169-abc6-607bdc37a64d" containerID="cdd730b5e809fc70ef1f6320b094ee33892a49867b7b0591e6a94822a48a578e" exitCode=0 Dec 03 21:50:00.103399 master-0 kubenswrapper[9136]: I1203 21:50:00.103325 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerDied","Data":"cdd730b5e809fc70ef1f6320b094ee33892a49867b7b0591e6a94822a48a578e"} Dec 03 21:50:00.105620 master-0 kubenswrapper[9136]: I1203 21:50:00.105578 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:50:00.106133 master-0 kubenswrapper[9136]: I1203 21:50:00.106095 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-clt4v" event={"ID":"814c8acf-fb8d-4f57-b8db-21304402c1f1","Type":"ContainerStarted","Data":"cfd4001e116df20bc2813c87a3311a921f10719f00a9b9b08e4bccfdba9d07c0"} Dec 03 21:50:00.118993 master-0 kubenswrapper[9136]: I1203 21:50:00.118895 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" podStartSLOduration=1.488708688 podStartE2EDuration="3.118868504s" podCreationTimestamp="2025-12-03 21:49:57 +0000 UTC" firstStartedPulling="2025-12-03 21:49:57.612480664 +0000 UTC m=+3.887657046" lastFinishedPulling="2025-12-03 21:49:59.24264048 +0000 UTC m=+5.517816862" observedRunningTime="2025-12-03 21:50:00.11556766 +0000 UTC m=+6.390744082" watchObservedRunningTime="2025-12-03 21:50:00.118868504 +0000 UTC m=+6.394044896" Dec 03 21:50:00.122843 master-0 kubenswrapper[9136]: I1203 21:50:00.121922 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:50:00.178405 master-0 kubenswrapper[9136]: I1203 21:50:00.178354 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kw57\" (UniqueName: \"kubernetes.io/projected/b57d454a-53e5-4c37-b1db-16ecf763dde1-kube-api-access-5kw57\") pod \"b57d454a-53e5-4c37-b1db-16ecf763dde1\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " Dec 03 21:50:00.178405 master-0 kubenswrapper[9136]: I1203 21:50:00.178403 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config\") pod \"b57d454a-53e5-4c37-b1db-16ecf763dde1\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " Dec 03 21:50:00.179401 master-0 kubenswrapper[9136]: I1203 21:50:00.178432 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles\") pod \"b57d454a-53e5-4c37-b1db-16ecf763dde1\" (UID: \"b57d454a-53e5-4c37-b1db-16ecf763dde1\") " Dec 03 21:50:00.179878 master-0 kubenswrapper[9136]: I1203 21:50:00.179851 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config" (OuterVolumeSpecName: "config") pod "b57d454a-53e5-4c37-b1db-16ecf763dde1" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:00.180450 master-0 kubenswrapper[9136]: I1203 21:50:00.180418 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b57d454a-53e5-4c37-b1db-16ecf763dde1" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:00.191132 master-0 kubenswrapper[9136]: I1203 21:50:00.191088 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57d454a-53e5-4c37-b1db-16ecf763dde1-kube-api-access-5kw57" (OuterVolumeSpecName: "kube-api-access-5kw57") pod "b57d454a-53e5-4c37-b1db-16ecf763dde1" (UID: "b57d454a-53e5-4c37-b1db-16ecf763dde1"). InnerVolumeSpecName "kube-api-access-5kw57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:50:00.280065 master-0 kubenswrapper[9136]: I1203 21:50:00.279918 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kw57\" (UniqueName: \"kubernetes.io/projected/b57d454a-53e5-4c37-b1db-16ecf763dde1-kube-api-access-5kw57\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:00.280065 master-0 kubenswrapper[9136]: I1203 21:50:00.279954 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:00.280065 master-0 kubenswrapper[9136]: I1203 21:50:00.279964 9136 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:00.483041 master-0 kubenswrapper[9136]: I1203 21:50:00.482416 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-6b8bb995f7-69t6v"] Dec 03 21:50:00.491927 master-0 kubenswrapper[9136]: W1203 21:50:00.491885 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b47f2ef_9923_411f_9f2f_ddaea8bc7053.slice/crio-e3f5f9ddf501f4d5ccacebd325174974a148fba4c7c203f9e2de430d8e8e4795 WatchSource:0}: Error finding container e3f5f9ddf501f4d5ccacebd325174974a148fba4c7c203f9e2de430d8e8e4795: Status 404 returned error can't find the container with id e3f5f9ddf501f4d5ccacebd325174974a148fba4c7c203f9e2de430d8e8e4795 Dec 03 21:50:01.087820 master-0 kubenswrapper[9136]: I1203 21:50:01.087701 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:01.087820 master-0 kubenswrapper[9136]: I1203 21:50:01.087788 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:01.088871 master-0 kubenswrapper[9136]: E1203 21:50:01.087994 9136 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:01.088871 master-0 kubenswrapper[9136]: E1203 21:50:01.088082 9136 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:50:01.088871 master-0 kubenswrapper[9136]: E1203 21:50:01.088130 9136 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:03.088102915 +0000 UTC m=+9.363279487 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : configmap "client-ca" not found Dec 03 21:50:01.088871 master-0 kubenswrapper[9136]: E1203 21:50:01.088188 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:03.088167077 +0000 UTC m=+9.363343449 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : secret "serving-cert" not found Dec 03 21:50:01.109127 master-0 kubenswrapper[9136]: I1203 21:50:01.109057 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:50:01.112382 master-0 kubenswrapper[9136]: I1203 21:50:01.112334 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerStarted","Data":"12901db2ae1fd13e3c0aef0f6572f021784e96590f0cc7bfafbbf5cbabb162c5"} Dec 03 21:50:01.115441 master-0 kubenswrapper[9136]: I1203 21:50:01.114742 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" event={"ID":"1b47f2ef-9923-411f-9f2f-ddaea8bc7053","Type":"ContainerStarted","Data":"3d0456f7cf5468e13600eb8bcd1915c3323422019a6601c2df115d88e550552b"} Dec 03 21:50:01.115441 master-0 kubenswrapper[9136]: I1203 21:50:01.114814 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" event={"ID":"1b47f2ef-9923-411f-9f2f-ddaea8bc7053","Type":"ContainerStarted","Data":"e3f5f9ddf501f4d5ccacebd325174974a148fba4c7c203f9e2de430d8e8e4795"} Dec 03 21:50:01.115532 master-0 kubenswrapper[9136]: I1203 21:50:01.114902 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56fb5cd58b-spm87" Dec 03 21:50:01.115692 master-0 kubenswrapper[9136]: I1203 21:50:01.115635 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:50:01.125353 master-0 kubenswrapper[9136]: I1203 21:50:01.125275 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podStartSLOduration=2.338622857 podStartE2EDuration="5.125255764s" podCreationTimestamp="2025-12-03 21:49:56 +0000 UTC" firstStartedPulling="2025-12-03 21:49:57.529678255 +0000 UTC m=+3.804854637" lastFinishedPulling="2025-12-03 21:50:00.316311152 +0000 UTC m=+6.591487544" observedRunningTime="2025-12-03 21:50:01.122738863 +0000 UTC m=+7.397915255" watchObservedRunningTime="2025-12-03 21:50:01.125255764 +0000 UTC m=+7.400432146" Dec 03 21:50:01.171464 master-0 kubenswrapper[9136]: I1203 21:50:01.171361 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" podStartSLOduration=2.171331707 podStartE2EDuration="2.171331707s" podCreationTimestamp="2025-12-03 21:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:01.169752687 +0000 UTC m=+7.444929099" watchObservedRunningTime="2025-12-03 21:50:01.171331707 +0000 UTC m=+7.446508089" Dec 03 21:50:01.622078 master-0 kubenswrapper[9136]: I1203 21:50:01.621528 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56fb5cd58b-spm87"] Dec 03 21:50:01.629405 master-0 kubenswrapper[9136]: I1203 21:50:01.629350 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56fb5cd58b-spm87"] Dec 03 21:50:01.695950 master-0 kubenswrapper[9136]: I1203 21:50:01.695893 9136 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b57d454a-53e5-4c37-b1db-16ecf763dde1-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:01.695950 master-0 kubenswrapper[9136]: I1203 21:50:01.695947 9136 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b57d454a-53e5-4c37-b1db-16ecf763dde1-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:01.922961 master-0 kubenswrapper[9136]: I1203 21:50:01.921243 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57d454a-53e5-4c37-b1db-16ecf763dde1" path="/var/lib/kubelet/pods/b57d454a-53e5-4c37-b1db-16ecf763dde1/volumes" Dec 03 21:50:02.131108 master-0 kubenswrapper[9136]: I1203 21:50:02.131045 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:50:02.523572 master-0 kubenswrapper[9136]: I1203 21:50:02.523523 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6db9498b78-xvrlr"] Dec 03 21:50:02.524628 master-0 kubenswrapper[9136]: I1203 21:50:02.524610 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.530430 master-0 kubenswrapper[9136]: I1203 21:50:02.530388 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 21:50:02.530919 master-0 kubenswrapper[9136]: I1203 21:50:02.530905 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 21:50:02.531217 master-0 kubenswrapper[9136]: I1203 21:50:02.531204 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 21:50:02.531431 master-0 kubenswrapper[9136]: I1203 21:50:02.531418 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 21:50:02.532958 master-0 kubenswrapper[9136]: I1203 21:50:02.532941 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 21:50:02.540797 master-0 kubenswrapper[9136]: I1203 21:50:02.537761 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 21:50:02.554510 master-0 kubenswrapper[9136]: I1203 21:50:02.554466 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6db9498b78-xvrlr"] Dec 03 21:50:02.611873 master-0 kubenswrapper[9136]: I1203 21:50:02.611668 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:50:02.611873 master-0 kubenswrapper[9136]: I1203 21:50:02.611733 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:50:02.611873 master-0 kubenswrapper[9136]: I1203 21:50:02.611799 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:50:02.611873 master-0 kubenswrapper[9136]: I1203 21:50:02.611838 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x6sn\" (UniqueName: \"kubernetes.io/projected/0778622f-e8ed-4eb0-9317-b4e95c135a48-kube-api-access-5x6sn\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.611873 master-0 kubenswrapper[9136]: I1203 21:50:02.611897 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod 
\"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.611922 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-config\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.611945 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.611977 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.612010 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.612033 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.612061 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-proxy-ca-bundles\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.612087 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: E1203 21:50:02.612094 9136 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret 
"multus-admission-controller-secret" not found Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.612140 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.612166 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: E1203 21:50:02.612210 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.612179907 +0000 UTC m=+16.887356289 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:50:02.612279 master-0 kubenswrapper[9136]: I1203 21:50:02.612255 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:50:02.612723 master-0 kubenswrapper[9136]: I1203 21:50:02.612326 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:50:02.612723 master-0 kubenswrapper[9136]: E1203 21:50:02.612348 9136 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:50:02.612723 master-0 kubenswrapper[9136]: E1203 21:50:02.612411 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.612389124 +0000 UTC m=+16.887565716 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:50:02.612723 master-0 kubenswrapper[9136]: E1203 21:50:02.612459 9136 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:50:02.612723 master-0 kubenswrapper[9136]: E1203 21:50:02.612480 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.612474337 +0000 UTC m=+16.887650719 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:50:02.613233 master-0 kubenswrapper[9136]: E1203 21:50:02.613191 9136 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:50:02.613298 master-0 kubenswrapper[9136]: E1203 21:50:02.613241 9136 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:50:02.613401 master-0 kubenswrapper[9136]: E1203 21:50:02.613373 9136 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 03 21:50:02.613447 master-0 kubenswrapper[9136]: E1203 21:50:02.613379 9136 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Dec 03 21:50:02.613485 master-0 kubenswrapper[9136]: E1203 21:50:02.613266 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls podName:a9a3f403-a742-4977-901a-cf4a8eb7df5a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.613248411 +0000 UTC m=+16.888424993 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls") pod "dns-operator-6b7bcd6566-qcg9x" (UID: "a9a3f403-a742-4977-901a-cf4a8eb7df5a") : secret "metrics-tls" not found Dec 03 21:50:02.613485 master-0 kubenswrapper[9136]: E1203 21:50:02.613480 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.613466458 +0000 UTC m=+16.888643050 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:50:02.613574 master-0 kubenswrapper[9136]: E1203 21:50:02.613497 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.613488659 +0000 UTC m=+16.888665261 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : secret "metrics-daemon-secret" not found Dec 03 21:50:02.613574 master-0 kubenswrapper[9136]: E1203 21:50:02.613514 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls podName:39f0e973-7864-4842-af8e-47718ab1804c nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.613505439 +0000 UTC m=+16.888682061 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls") pod "cluster-image-registry-operator-65dc4bcb88-2vvsj" (UID: "39f0e973-7864-4842-af8e-47718ab1804c") : secret "image-registry-operator-tls" not found Dec 03 21:50:02.613669 master-0 kubenswrapper[9136]: E1203 21:50:02.613595 9136 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Dec 03 21:50:02.613669 master-0 kubenswrapper[9136]: E1203 21:50:02.613627 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls podName:0869de9b-6f5b-4c31-81ad-02a9c8888193 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.613618293 +0000 UTC m=+16.888794885 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls") pod "ingress-operator-85dbd94574-2hxlh" (UID: "0869de9b-6f5b-4c31-81ad-02a9c8888193") : secret "metrics-tls" not found Dec 03 21:50:02.619739 master-0 kubenswrapper[9136]: I1203 21:50:02.619681 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"cluster-version-operator-869c786959-2bnjf\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:50:02.630846 master-0 kubenswrapper[9136]: I1203 21:50:02.628672 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:50:02.638338 master-0 kubenswrapper[9136]: I1203 21:50:02.637251 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:50:02.641684 master-0 kubenswrapper[9136]: I1203 21:50:02.641615 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 21:50:02.658398 master-0 kubenswrapper[9136]: I1203 21:50:02.658330 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:50:02.713890 master-0 kubenswrapper[9136]: I1203 21:50:02.713829 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.714179 master-0 kubenswrapper[9136]: I1203 21:50:02.713942 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x6sn\" (UniqueName: \"kubernetes.io/projected/0778622f-e8ed-4eb0-9317-b4e95c135a48-kube-api-access-5x6sn\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.714179 master-0 kubenswrapper[9136]: I1203 21:50:02.713982 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-config\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.714179 master-0 kubenswrapper[9136]: I1203 21:50:02.714006 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.714179 master-0 kubenswrapper[9136]: I1203 21:50:02.714051 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-proxy-ca-bundles\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.714345 master-0 kubenswrapper[9136]: E1203 21:50:02.713978 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:02.714413 master-0 kubenswrapper[9136]: E1203 21:50:02.714394 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca podName:0778622f-e8ed-4eb0-9317-b4e95c135a48 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:03.214370066 +0000 UTC m=+9.489546448 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca") pod "controller-manager-6db9498b78-xvrlr" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48") : configmap "client-ca" not found Dec 03 21:50:02.715066 master-0 kubenswrapper[9136]: E1203 21:50:02.714989 9136 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:50:02.715136 master-0 kubenswrapper[9136]: E1203 21:50:02.715117 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert podName:0778622f-e8ed-4eb0-9317-b4e95c135a48 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:03.21509049 +0000 UTC m=+9.490266872 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert") pod "controller-manager-6db9498b78-xvrlr" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48") : secret "serving-cert" not found Dec 03 21:50:02.715281 master-0 kubenswrapper[9136]: I1203 21:50:02.715252 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-proxy-ca-bundles\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.715536 master-0 kubenswrapper[9136]: I1203 21:50:02.715510 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-config\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.733143 master-0 kubenswrapper[9136]: I1203 21:50:02.730417 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x6sn\" (UniqueName: \"kubernetes.io/projected/0778622f-e8ed-4eb0-9317-b4e95c135a48-kube-api-access-5x6sn\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:02.907569 master-0 kubenswrapper[9136]: I1203 21:50:02.907515 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt"] Dec 03 21:50:02.919864 master-0 kubenswrapper[9136]: W1203 21:50:02.919748 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7f68d19_71d4_4129_a575_3ee57fa53493.slice/crio-39570e307c085c431159acfdbb22b12b2a8620889b6ed8e3fd6e335c0088192a WatchSource:0}: Error finding container 39570e307c085c431159acfdbb22b12b2a8620889b6ed8e3fd6e335c0088192a: Status 404 returned error can't find the container with id 39570e307c085c431159acfdbb22b12b2a8620889b6ed8e3fd6e335c0088192a Dec 03 21:50:03.118492 master-0 kubenswrapper[9136]: I1203 21:50:03.117998 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " 
pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:03.118492 master-0 kubenswrapper[9136]: I1203 21:50:03.118440 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:03.118862 master-0 kubenswrapper[9136]: E1203 21:50:03.118560 9136 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:50:03.118862 master-0 kubenswrapper[9136]: E1203 21:50:03.118610 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:07.11859504 +0000 UTC m=+13.393771422 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : secret "serving-cert" not found Dec 03 21:50:03.118862 master-0 kubenswrapper[9136]: E1203 21:50:03.118718 9136 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:03.119289 master-0 kubenswrapper[9136]: E1203 21:50:03.118881 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:07.118843328 +0000 UTC m=+13.394019940 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : configmap "client-ca" not found Dec 03 21:50:03.123558 master-0 kubenswrapper[9136]: I1203 21:50:03.123509 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" event={"ID":"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9","Type":"ContainerStarted","Data":"fad8816e9bad40e33a61a6c3879fbe4c9632a314c3eede6ca1d36dcf479c7e09"} Dec 03 21:50:03.126110 master-0 kubenswrapper[9136]: I1203 21:50:03.125841 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerStarted","Data":"129bb34723f1a3b3bc6376ce7e4a5163dea17767bc888149557663566136439c"} Dec 03 21:50:03.127178 master-0 kubenswrapper[9136]: I1203 21:50:03.127071 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" event={"ID":"b7f68d19-71d4-4129-a575-3ee57fa53493","Type":"ContainerStarted","Data":"39570e307c085c431159acfdbb22b12b2a8620889b6ed8e3fd6e335c0088192a"} Dec 03 21:50:03.220795 master-0 kubenswrapper[9136]: I1203 21:50:03.219806 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:03.220795 master-0 kubenswrapper[9136]: I1203 21:50:03.219955 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:03.220795 master-0 kubenswrapper[9136]: E1203 21:50:03.220164 9136 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:50:03.220795 master-0 kubenswrapper[9136]: E1203 21:50:03.220244 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert podName:0778622f-e8ed-4eb0-9317-b4e95c135a48 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:04.220220111 +0000 UTC m=+10.495396493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert") pod "controller-manager-6db9498b78-xvrlr" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48") : secret "serving-cert" not found Dec 03 21:50:03.227790 master-0 kubenswrapper[9136]: E1203 21:50:03.222321 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:03.227790 master-0 kubenswrapper[9136]: E1203 21:50:03.222368 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca podName:0778622f-e8ed-4eb0-9317-b4e95c135a48 nodeName:}" failed. 
No retries permitted until 2025-12-03 21:50:04.22235575 +0000 UTC m=+10.497532132 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca") pod "controller-manager-6db9498b78-xvrlr" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48") : configmap "client-ca" not found Dec 03 21:50:04.248595 master-0 kubenswrapper[9136]: I1203 21:50:04.238870 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:04.248595 master-0 kubenswrapper[9136]: I1203 21:50:04.239034 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:04.248595 master-0 kubenswrapper[9136]: E1203 21:50:04.239390 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:04.248595 master-0 kubenswrapper[9136]: E1203 21:50:04.239461 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca podName:0778622f-e8ed-4eb0-9317-b4e95c135a48 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:06.239438131 +0000 UTC m=+12.514614533 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca") pod "controller-manager-6db9498b78-xvrlr" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48") : configmap "client-ca" not found Dec 03 21:50:04.257025 master-0 kubenswrapper[9136]: I1203 21:50:04.254222 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:04.898176 master-0 kubenswrapper[9136]: I1203 21:50:04.897682 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:50:04.898417 master-0 kubenswrapper[9136]: I1203 21:50:04.898340 9136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 21:50:04.898417 master-0 kubenswrapper[9136]: I1203 21:50:04.898359 9136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 21:50:04.928825 master-0 kubenswrapper[9136]: I1203 21:50:04.928724 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:50:05.134196 master-0 kubenswrapper[9136]: I1203 21:50:05.134135 9136 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 21:50:05.402276 master-0 kubenswrapper[9136]: I1203 21:50:05.402198 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:50:05.429932 master-0 kubenswrapper[9136]: I1203 21:50:05.429676 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 21:50:06.143851 master-0 kubenswrapper[9136]: I1203 21:50:06.140192 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" event={"ID":"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9","Type":"ContainerStarted","Data":"f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3"} Dec 03 21:50:06.268904 master-0 kubenswrapper[9136]: I1203 21:50:06.268807 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:06.269538 master-0 kubenswrapper[9136]: E1203 21:50:06.269491 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:06.269654 master-0 kubenswrapper[9136]: E1203 21:50:06.269565 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca podName:0778622f-e8ed-4eb0-9317-b4e95c135a48 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.269542812 +0000 UTC m=+16.544719194 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca") pod "controller-manager-6db9498b78-xvrlr" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48") : configmap "client-ca" not found Dec 03 21:50:07.188586 master-0 kubenswrapper[9136]: I1203 21:50:07.188457 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:07.189271 master-0 kubenswrapper[9136]: E1203 21:50:07.188855 9136 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:07.189271 master-0 kubenswrapper[9136]: E1203 21:50:07.189041 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:15.188928758 +0000 UTC m=+21.464105230 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : configmap "client-ca" not found Dec 03 21:50:07.189271 master-0 kubenswrapper[9136]: E1203 21:50:07.189120 9136 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Dec 03 21:50:07.189271 master-0 kubenswrapper[9136]: E1203 21:50:07.189263 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:15.189232388 +0000 UTC m=+21.464408770 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : secret "serving-cert" not found Dec 03 21:50:07.189550 master-0 kubenswrapper[9136]: I1203 21:50:07.188533 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:08.600716 master-0 kubenswrapper[9136]: I1203 21:50:08.600224 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-5b4c744c84-xgxft"] Dec 03 21:50:08.601640 master-0 kubenswrapper[9136]: I1203 21:50:08.601603 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.605297 master-0 kubenswrapper[9136]: I1203 21:50:08.605250 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 21:50:08.606519 master-0 kubenswrapper[9136]: I1203 21:50:08.606320 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 21:50:08.610803 master-0 kubenswrapper[9136]: I1203 21:50:08.610717 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Dec 03 21:50:08.612187 master-0 kubenswrapper[9136]: I1203 21:50:08.611187 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 21:50:08.612187 master-0 kubenswrapper[9136]: I1203 21:50:08.611360 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 21:50:08.612187 master-0 kubenswrapper[9136]: I1203 21:50:08.611573 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 21:50:08.612187 master-0 kubenswrapper[9136]: I1203 21:50:08.611822 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 21:50:08.612187 master-0 kubenswrapper[9136]: I1203 21:50:08.611993 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 21:50:08.612879 master-0 kubenswrapper[9136]: I1203 21:50:08.612661 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Dec 03 21:50:08.624128 master-0 kubenswrapper[9136]: I1203 21:50:08.623958 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 21:50:08.635632 master-0 kubenswrapper[9136]: I1203 21:50:08.635567 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-5b4c744c84-xgxft"] Dec 03 21:50:08.713425 master-0 kubenswrapper[9136]: I1203 21:50:08.713325 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-node-pullsecrets\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713425 master-0 kubenswrapper[9136]: I1203 21:50:08.713418 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m6w9\" (UniqueName: \"kubernetes.io/projected/553d3958-b25d-4855-a2da-1d94c0196f4a-kube-api-access-9m6w9\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713690 master-0 kubenswrapper[9136]: I1203 21:50:08.713447 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-image-import-ca\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713690 master-0 kubenswrapper[9136]: I1203 21:50:08.713505 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-serving-ca\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713690 master-0 kubenswrapper[9136]: I1203 21:50:08.713540 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-encryption-config\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713690 master-0 kubenswrapper[9136]: I1203 21:50:08.713563 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713690 master-0 kubenswrapper[9136]: I1203 21:50:08.713586 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-client\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713690 master-0 kubenswrapper[9136]: I1203 21:50:08.713615 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-audit-dir\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713690 master-0 kubenswrapper[9136]: I1203 21:50:08.713653 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-trusted-ca-bundle\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713690 master-0 kubenswrapper[9136]: I1203 21:50:08.713688 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-config\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.713954 master-0 kubenswrapper[9136]: I1203 21:50:08.713746 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.816023 master-0 kubenswrapper[9136]: I1203 21:50:08.815320 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m6w9\" (UniqueName: \"kubernetes.io/projected/553d3958-b25d-4855-a2da-1d94c0196f4a-kube-api-access-9m6w9\") pod \"apiserver-5b4c744c84-xgxft\" (UID: 
\"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.816401 master-0 kubenswrapper[9136]: I1203 21:50:08.816109 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-image-import-ca\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.816401 master-0 kubenswrapper[9136]: I1203 21:50:08.816180 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-serving-ca\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.816401 master-0 kubenswrapper[9136]: I1203 21:50:08.816261 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-encryption-config\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.816401 master-0 kubenswrapper[9136]: I1203 21:50:08.816307 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-client\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.816401 master-0 kubenswrapper[9136]: I1203 21:50:08.816356 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.816401 master-0 kubenswrapper[9136]: I1203 21:50:08.816397 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-audit-dir\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: I1203 21:50:08.816530 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-trusted-ca-bundle\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: I1203 21:50:08.816628 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-config\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: I1203 21:50:08.816754 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-audit-dir\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: E1203 21:50:08.816834 9136 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: I1203 21:50:08.816842 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: E1203 21:50:08.816950 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert podName:553d3958-b25d-4855-a2da-1d94c0196f4a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:09.316914244 +0000 UTC m=+15.592090666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert") pod "apiserver-5b4c744c84-xgxft" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a") : secret "serving-cert" not found Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: E1203 21:50:08.817005 9136 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: I1203 21:50:08.817066 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-node-pullsecrets\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: I1203 21:50:08.817012 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-node-pullsecrets\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: E1203 21:50:08.817103 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit podName:553d3958-b25d-4855-a2da-1d94c0196f4a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:09.317072629 +0000 UTC m=+15.592249101 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit") pod "apiserver-5b4c744c84-xgxft" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a") : configmap "audit-0" not found Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: I1203 21:50:08.817384 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-image-import-ca\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.818028 master-0 kubenswrapper[9136]: I1203 21:50:08.817424 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-serving-ca\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.819556 master-0 kubenswrapper[9136]: I1203 21:50:08.818448 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-config\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.819556 master-0 kubenswrapper[9136]: I1203 21:50:08.819202 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-trusted-ca-bundle\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.822660 master-0 kubenswrapper[9136]: I1203 21:50:08.822595 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-client\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.823381 master-0 kubenswrapper[9136]: I1203 21:50:08.823350 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-encryption-config\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:08.906270 master-0 kubenswrapper[9136]: I1203 21:50:08.906068 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m6w9\" (UniqueName: \"kubernetes.io/projected/553d3958-b25d-4855-a2da-1d94c0196f4a-kube-api-access-9m6w9\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:09.166286 master-0 kubenswrapper[9136]: I1203 21:50:09.166117 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" event={"ID":"b7f68d19-71d4-4129-a575-3ee57fa53493","Type":"ContainerStarted","Data":"ddc4b2406902641b1427a12cb2394dec3bff8dffab1d17cd293d7a2306efb279"} Dec 03 21:50:09.203258 master-0 kubenswrapper[9136]: I1203 21:50:09.203200 9136 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-node-tuning-operator/tuned-fvghq"] Dec 03 21:50:09.204318 master-0 kubenswrapper[9136]: I1203 21:50:09.204299 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327403 master-0 kubenswrapper[9136]: I1203 21:50:09.327330 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-lib-modules\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327403 master-0 kubenswrapper[9136]: I1203 21:50:09.327386 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-tuned\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327403 master-0 kubenswrapper[9136]: I1203 21:50:09.327410 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-conf\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327750 master-0 kubenswrapper[9136]: I1203 21:50:09.327493 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-var-lib-kubelet\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327750 master-0 kubenswrapper[9136]: I1203 21:50:09.327528 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-kubernetes\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327750 master-0 kubenswrapper[9136]: I1203 21:50:09.327553 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327750 master-0 kubenswrapper[9136]: I1203 21:50:09.327570 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-host\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327750 master-0 kubenswrapper[9136]: I1203 21:50:09.327605 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 
21:50:09.327750 master-0 kubenswrapper[9136]: I1203 21:50:09.327644 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-systemd\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327750 master-0 kubenswrapper[9136]: I1203 21:50:09.327668 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-run\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327750 master-0 kubenswrapper[9136]: I1203 21:50:09.327710 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-modprobe-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.327750 master-0 kubenswrapper[9136]: I1203 21:50:09.327748 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-sys\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.328125 master-0 kubenswrapper[9136]: I1203 21:50:09.327782 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-tmp\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.328125 master-0 kubenswrapper[9136]: I1203 21:50:09.327813 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysconfig\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.328125 master-0 kubenswrapper[9136]: I1203 21:50:09.327837 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:09.328125 master-0 kubenswrapper[9136]: I1203 21:50:09.327870 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df8nl\" (UniqueName: \"kubernetes.io/projected/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-kube-api-access-df8nl\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.328125 master-0 kubenswrapper[9136]: E1203 21:50:09.328066 9136 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 21:50:09.328125 master-0 kubenswrapper[9136]: E1203 21:50:09.328114 9136 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert podName:553d3958-b25d-4855-a2da-1d94c0196f4a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.32809788 +0000 UTC m=+16.603274272 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert") pod "apiserver-5b4c744c84-xgxft" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a") : secret "serving-cert" not found Dec 03 21:50:09.328624 master-0 kubenswrapper[9136]: E1203 21:50:09.328589 9136 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 03 21:50:09.328673 master-0 kubenswrapper[9136]: E1203 21:50:09.328632 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit podName:553d3958-b25d-4855-a2da-1d94c0196f4a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:10.328620427 +0000 UTC m=+16.603796819 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit") pod "apiserver-5b4c744c84-xgxft" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a") : configmap "audit-0" not found Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.429628 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-modprobe-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.429720 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-sys\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.429743 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-tmp\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.429851 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysconfig\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.429898 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df8nl\" (UniqueName: \"kubernetes.io/projected/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-kube-api-access-df8nl\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.429953 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-tuned\") pod \"tuned-fvghq\" (UID: 
\"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.429975 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-lib-modules\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430019 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-conf\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430069 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-var-lib-kubelet\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430130 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-kubernetes\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430174 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430220 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-host\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430293 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-systemd\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430354 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-run\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430664 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-run\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " 
pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430819 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-modprobe-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.430892 master-0 kubenswrapper[9136]: I1203 21:50:09.430886 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-sys\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.432647 master-0 kubenswrapper[9136]: I1203 21:50:09.431996 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysconfig\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.435529 master-0 kubenswrapper[9136]: I1203 21:50:09.432975 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-kubernetes\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.435529 master-0 kubenswrapper[9136]: I1203 21:50:09.433226 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-lib-modules\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.435529 master-0 kubenswrapper[9136]: I1203 21:50:09.433506 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-conf\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.435529 master-0 kubenswrapper[9136]: I1203 21:50:09.433604 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-var-lib-kubelet\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.435529 master-0 kubenswrapper[9136]: I1203 21:50:09.433694 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-host\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.435529 master-0 kubenswrapper[9136]: I1203 21:50:09.433813 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.435529 master-0 kubenswrapper[9136]: 
I1203 21:50:09.433925 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-systemd\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.436682 master-0 kubenswrapper[9136]: I1203 21:50:09.436204 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-tmp\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.437263 master-0 kubenswrapper[9136]: I1203 21:50:09.437185 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-tuned\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.482292 master-0 kubenswrapper[9136]: I1203 21:50:09.482236 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df8nl\" (UniqueName: \"kubernetes.io/projected/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-kube-api-access-df8nl\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.549304 master-0 kubenswrapper[9136]: I1203 21:50:09.549212 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 21:50:09.585863 master-0 kubenswrapper[9136]: W1203 21:50:09.580687 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e6d3a1_b3a7_4c7f_b76c_84ca2c957b23.slice/crio-68861d9742eba09a9447ea986d7ff70da6fe58e04de7e81f2257923ca0223ec1 WatchSource:0}: Error finding container 68861d9742eba09a9447ea986d7ff70da6fe58e04de7e81f2257923ca0223ec1: Status 404 returned error can't find the container with id 68861d9742eba09a9447ea986d7ff70da6fe58e04de7e81f2257923ca0223ec1 Dec 03 21:50:10.173983 master-0 kubenswrapper[9136]: I1203 21:50:10.173413 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fvghq" event={"ID":"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23","Type":"ContainerStarted","Data":"0c3af861af08bccdc2663168f257c71394ed2fffa4985ad159441de38267d6b6"} Dec 03 21:50:10.175327 master-0 kubenswrapper[9136]: I1203 21:50:10.174002 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fvghq" event={"ID":"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23","Type":"ContainerStarted","Data":"68861d9742eba09a9447ea986d7ff70da6fe58e04de7e81f2257923ca0223ec1"} Dec 03 21:50:10.340269 master-0 kubenswrapper[9136]: I1203 21:50:10.340142 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:10.340269 master-0 kubenswrapper[9136]: I1203 21:50:10.340196 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:10.340564 master-0 kubenswrapper[9136]: I1203 21:50:10.340282 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:10.340564 master-0 kubenswrapper[9136]: E1203 21:50:10.340401 9136 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 03 21:50:10.340564 master-0 kubenswrapper[9136]: E1203 21:50:10.340471 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit podName:553d3958-b25d-4855-a2da-1d94c0196f4a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:12.340450049 +0000 UTC m=+18.615626431 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit") pod "apiserver-5b4c744c84-xgxft" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a") : configmap "audit-0" not found Dec 03 21:50:10.341005 master-0 kubenswrapper[9136]: E1203 21:50:10.340975 9136 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 21:50:10.341054 master-0 kubenswrapper[9136]: E1203 21:50:10.341016 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert podName:553d3958-b25d-4855-a2da-1d94c0196f4a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:12.341007197 +0000 UTC m=+18.616183569 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert") pod "apiserver-5b4c744c84-xgxft" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a") : secret "serving-cert" not found Dec 03 21:50:10.341054 master-0 kubenswrapper[9136]: E1203 21:50:10.341053 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:10.341119 master-0 kubenswrapper[9136]: E1203 21:50:10.341074 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca podName:0778622f-e8ed-4eb0-9317-b4e95c135a48 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:18.341068279 +0000 UTC m=+24.616244661 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca") pod "controller-manager-6db9498b78-xvrlr" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48") : configmap "client-ca" not found Dec 03 21:50:10.645514 master-0 kubenswrapper[9136]: I1203 21:50:10.645438 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:50:10.645514 master-0 kubenswrapper[9136]: I1203 21:50:10.645519 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:50:10.645928 master-0 kubenswrapper[9136]: I1203 21:50:10.645547 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:50:10.645928 master-0 kubenswrapper[9136]: I1203 21:50:10.645585 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:50:10.645928 master-0 kubenswrapper[9136]: I1203 21:50:10.645618 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:50:10.645928 master-0 kubenswrapper[9136]: I1203 21:50:10.645664 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:50:10.645928 master-0 kubenswrapper[9136]: I1203 21:50:10.645698 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:50:10.645928 master-0 kubenswrapper[9136]: I1203 21:50:10.645717 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:50:10.646752 master-0 kubenswrapper[9136]: E1203 21:50:10.646711 9136 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Dec 03 21:50:10.646855 master-0 kubenswrapper[9136]: E1203 21:50:10.646831 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs podName:134c10ef-9f37-4a77-8e8b-4f8326bc8f40 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:26.646805261 +0000 UTC m=+32.921981673 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs") pod "multus-admission-controller-78ddcf56f9-6b8qj" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40") : secret "multus-admission-controller-secret" not found Dec 03 21:50:10.646933 master-0 kubenswrapper[9136]: E1203 21:50:10.646898 9136 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Dec 03 21:50:10.647000 master-0 kubenswrapper[9136]: E1203 21:50:10.646979 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert podName:04f5fc52-4ec2-48c3-8441-2b15ad632233 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:26.646958706 +0000 UTC m=+32.922135198 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert") pod "package-server-manager-75b4d49d4c-psjj5" (UID: "04f5fc52-4ec2-48c3-8441-2b15ad632233") : secret "package-server-manager-serving-cert" not found Dec 03 21:50:10.647063 master-0 kubenswrapper[9136]: E1203 21:50:10.647036 9136 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Dec 03 21:50:10.647063 master-0 kubenswrapper[9136]: E1203 21:50:10.647064 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics podName:a4399d20-f9a6-4ab1-86be-e2845394eaba nodeName:}" failed. No retries permitted until 2025-12-03 21:50:26.64705669 +0000 UTC m=+32.922233192 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics") pod "marketplace-operator-7d67745bb7-4jd6d" (UID: "a4399d20-f9a6-4ab1-86be-e2845394eaba") : secret "marketplace-operator-metrics" not found Dec 03 21:50:10.647149 master-0 kubenswrapper[9136]: E1203 21:50:10.647104 9136 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Dec 03 21:50:10.647149 master-0 kubenswrapper[9136]: E1203 21:50:10.647128 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls podName:bebd69d2-5b0f-4b66-8722-d6861eba3e12 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:26.647120392 +0000 UTC m=+32.922296914 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-69cc794c58-vns7s" (UID: "bebd69d2-5b0f-4b66-8722-d6861eba3e12") : secret "cluster-monitoring-operator-tls" not found Dec 03 21:50:10.647470 master-0 kubenswrapper[9136]: E1203 21:50:10.647439 9136 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Dec 03 21:50:10.647532 master-0 kubenswrapper[9136]: E1203 21:50:10.647512 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs podName:812401c0-d1ac-4857-b939-217b7b07f8bc nodeName:}" failed. No retries permitted until 2025-12-03 21:50:26.647492204 +0000 UTC m=+32.922668626 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs") pod "network-metrics-daemon-h6569" (UID: "812401c0-d1ac-4857-b939-217b7b07f8bc") : secret "metrics-daemon-secret" not found Dec 03 21:50:10.653086 master-0 kubenswrapper[9136]: I1203 21:50:10.652142 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:50:10.653086 master-0 kubenswrapper[9136]: I1203 21:50:10.652254 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:50:10.653086 master-0 kubenswrapper[9136]: I1203 21:50:10.652314 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:50:10.745382 master-0 kubenswrapper[9136]: I1203 21:50:10.745311 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 21:50:10.745659 master-0 kubenswrapper[9136]: I1203 21:50:10.745311 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 21:50:10.752087 master-0 kubenswrapper[9136]: I1203 21:50:10.751940 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 21:50:11.456819 master-0 kubenswrapper[9136]: I1203 21:50:11.456679 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fvghq" podStartSLOduration=2.456636052 podStartE2EDuration="2.456636052s" podCreationTimestamp="2025-12-03 21:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:10.289483858 +0000 UTC m=+16.564660250" watchObservedRunningTime="2025-12-03 21:50:11.456636052 +0000 UTC m=+17.731812474" Dec 03 21:50:11.458451 master-0 kubenswrapper[9136]: I1203 21:50:11.457396 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj"] Dec 03 21:50:11.466855 master-0 kubenswrapper[9136]: I1203 21:50:11.466706 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x"] Dec 03 21:50:11.653810 master-0 kubenswrapper[9136]: I1203 21:50:11.653738 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh"] Dec 03 21:50:12.186274 master-0 kubenswrapper[9136]: I1203 21:50:12.186226 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" event={"ID":"39f0e973-7864-4842-af8e-47718ab1804c","Type":"ContainerStarted","Data":"bc976a2c91b624ecb3225ef5481f2ee770ab314872bf257e672f9145ca896171"} Dec 03 21:50:12.186984 master-0 kubenswrapper[9136]: I1203 21:50:12.186962 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerStarted","Data":"2ae62ce3b48419372763c717979fe81f4210258cf21652aeea2900df29d7ef00"} Dec 03 21:50:12.188009 master-0 kubenswrapper[9136]: I1203 21:50:12.187986 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" event={"ID":"a9a3f403-a742-4977-901a-cf4a8eb7df5a","Type":"ContainerStarted","Data":"3d9073cce7422ee3531e5bcb005037bb8c3536325d2ddba91234eef2010050ed"} Dec 03 21:50:12.379539 master-0 kubenswrapper[9136]: I1203 21:50:12.379484 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:12.381013 master-0 kubenswrapper[9136]: E1203 21:50:12.379609 9136 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Dec 03 21:50:12.381013 master-0 kubenswrapper[9136]: I1203 21:50:12.379620 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert\") pod \"apiserver-5b4c744c84-xgxft\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:12.381013 master-0 kubenswrapper[9136]: E1203 21:50:12.379706 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit podName:553d3958-b25d-4855-a2da-1d94c0196f4a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:16.379683784 +0000 UTC m=+22.654860166 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit") pod "apiserver-5b4c744c84-xgxft" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a") : configmap "audit-0" not found Dec 03 21:50:12.381013 master-0 kubenswrapper[9136]: E1203 21:50:12.379711 9136 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 21:50:12.381013 master-0 kubenswrapper[9136]: E1203 21:50:12.379742 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert podName:553d3958-b25d-4855-a2da-1d94c0196f4a nodeName:}" failed. No retries permitted until 2025-12-03 21:50:16.379733456 +0000 UTC m=+22.654909838 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert") pod "apiserver-5b4c744c84-xgxft" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a") : secret "serving-cert" not found Dec 03 21:50:12.494485 master-0 kubenswrapper[9136]: I1203 21:50:12.494349 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5b4c744c84-xgxft"] Dec 03 21:50:12.495456 master-0 kubenswrapper[9136]: E1203 21:50:12.494581 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" podUID="553d3958-b25d-4855-a2da-1d94c0196f4a" Dec 03 21:50:13.204037 master-0 kubenswrapper[9136]: I1203 21:50:13.203952 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:13.230830 master-0 kubenswrapper[9136]: I1203 21:50:13.230686 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:13.409811 master-0 kubenswrapper[9136]: I1203 21:50:13.409752 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m6w9\" (UniqueName: \"kubernetes.io/projected/553d3958-b25d-4855-a2da-1d94c0196f4a-kube-api-access-9m6w9\") pod \"553d3958-b25d-4855-a2da-1d94c0196f4a\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " Dec 03 21:50:13.410121 master-0 kubenswrapper[9136]: I1203 21:50:13.409832 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-audit-dir\") pod \"553d3958-b25d-4855-a2da-1d94c0196f4a\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " Dec 03 21:50:13.410121 master-0 kubenswrapper[9136]: I1203 21:50:13.409854 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-trusted-ca-bundle\") pod \"553d3958-b25d-4855-a2da-1d94c0196f4a\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " Dec 03 21:50:13.410121 master-0 kubenswrapper[9136]: I1203 21:50:13.409884 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-image-import-ca\") pod \"553d3958-b25d-4855-a2da-1d94c0196f4a\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " Dec 03 21:50:13.410121 master-0 kubenswrapper[9136]: I1203 21:50:13.409978 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "553d3958-b25d-4855-a2da-1d94c0196f4a" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:50:13.410974 master-0 kubenswrapper[9136]: I1203 21:50:13.410944 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "553d3958-b25d-4855-a2da-1d94c0196f4a" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:13.411112 master-0 kubenswrapper[9136]: I1203 21:50:13.410965 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "553d3958-b25d-4855-a2da-1d94c0196f4a" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:13.411112 master-0 kubenswrapper[9136]: I1203 21:50:13.411022 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-node-pullsecrets\") pod \"553d3958-b25d-4855-a2da-1d94c0196f4a\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " Dec 03 21:50:13.411112 master-0 kubenswrapper[9136]: I1203 21:50:13.411046 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "553d3958-b25d-4855-a2da-1d94c0196f4a" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:50:13.411112 master-0 kubenswrapper[9136]: I1203 21:50:13.411071 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-config\") pod \"553d3958-b25d-4855-a2da-1d94c0196f4a\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " Dec 03 21:50:13.411112 master-0 kubenswrapper[9136]: I1203 21:50:13.411100 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-client\") pod \"553d3958-b25d-4855-a2da-1d94c0196f4a\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " Dec 03 21:50:13.411552 master-0 kubenswrapper[9136]: I1203 21:50:13.411501 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-config" (OuterVolumeSpecName: "config") pod "553d3958-b25d-4855-a2da-1d94c0196f4a" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:13.411673 master-0 kubenswrapper[9136]: I1203 21:50:13.411651 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-encryption-config\") pod \"553d3958-b25d-4855-a2da-1d94c0196f4a\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " Dec 03 21:50:13.411725 master-0 kubenswrapper[9136]: I1203 21:50:13.411702 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-serving-ca\") pod \"553d3958-b25d-4855-a2da-1d94c0196f4a\" (UID: \"553d3958-b25d-4855-a2da-1d94c0196f4a\") " Dec 03 21:50:13.412062 master-0 kubenswrapper[9136]: I1203 21:50:13.411994 9136 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:13.412062 master-0 kubenswrapper[9136]: I1203 21:50:13.412017 9136 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:13.412062 master-0 kubenswrapper[9136]: I1203 21:50:13.412030 9136 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-image-import-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:13.412062 master-0 kubenswrapper[9136]: I1203 21:50:13.412039 9136 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/553d3958-b25d-4855-a2da-1d94c0196f4a-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:13.412062 master-0 kubenswrapper[9136]: I1203 21:50:13.412050 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:13.412380 master-0 kubenswrapper[9136]: I1203 21:50:13.412303 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "553d3958-b25d-4855-a2da-1d94c0196f4a" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:13.415985 master-0 kubenswrapper[9136]: I1203 21:50:13.415947 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/553d3958-b25d-4855-a2da-1d94c0196f4a-kube-api-access-9m6w9" (OuterVolumeSpecName: "kube-api-access-9m6w9") pod "553d3958-b25d-4855-a2da-1d94c0196f4a" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a"). InnerVolumeSpecName "kube-api-access-9m6w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:50:13.416880 master-0 kubenswrapper[9136]: I1203 21:50:13.416842 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "553d3958-b25d-4855-a2da-1d94c0196f4a" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:50:13.417831 master-0 kubenswrapper[9136]: I1203 21:50:13.417750 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "553d3958-b25d-4855-a2da-1d94c0196f4a" (UID: "553d3958-b25d-4855-a2da-1d94c0196f4a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:50:13.513150 master-0 kubenswrapper[9136]: I1203 21:50:13.513031 9136 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-client\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:13.513150 master-0 kubenswrapper[9136]: I1203 21:50:13.513090 9136 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-encryption-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:13.513150 master-0 kubenswrapper[9136]: I1203 21:50:13.513103 9136 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:13.513150 master-0 kubenswrapper[9136]: I1203 21:50:13.513117 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m6w9\" (UniqueName: \"kubernetes.io/projected/553d3958-b25d-4855-a2da-1d94c0196f4a-kube-api-access-9m6w9\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:14.206997 master-0 kubenswrapper[9136]: I1203 21:50:14.206888 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-5b4c744c84-xgxft" Dec 03 21:50:14.534892 master-0 kubenswrapper[9136]: I1203 21:50:14.525527 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw"] Dec 03 21:50:14.534892 master-0 kubenswrapper[9136]: I1203 21:50:14.526040 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.537609 master-0 kubenswrapper[9136]: I1203 21:50:14.537568 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 03 21:50:14.537741 master-0 kubenswrapper[9136]: I1203 21:50:14.537644 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 03 21:50:14.537877 master-0 kubenswrapper[9136]: I1203 21:50:14.537859 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 03 21:50:14.541673 master-0 kubenswrapper[9136]: I1203 21:50:14.541619 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 03 21:50:14.557060 master-0 kubenswrapper[9136]: I1203 21:50:14.554123 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw"] Dec 03 21:50:14.581529 master-0 kubenswrapper[9136]: I1203 21:50:14.581475 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk"] Dec 03 21:50:14.585678 master-0 kubenswrapper[9136]: I1203 21:50:14.585611 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-64554dd846-6vfz6"] Dec 03 21:50:14.586011 master-0 kubenswrapper[9136]: I1203 21:50:14.585951 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.586500 master-0 kubenswrapper[9136]: I1203 21:50:14.586467 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.587130 master-0 kubenswrapper[9136]: I1203 21:50:14.587084 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-5b4c744c84-xgxft"] Dec 03 21:50:14.588809 master-0 kubenswrapper[9136]: I1203 21:50:14.588741 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 21:50:14.589032 master-0 kubenswrapper[9136]: I1203 21:50:14.589008 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 21:50:14.589234 master-0 kubenswrapper[9136]: I1203 21:50:14.589212 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 21:50:14.589444 master-0 kubenswrapper[9136]: I1203 21:50:14.589401 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 21:50:14.590166 master-0 kubenswrapper[9136]: I1203 21:50:14.590128 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 21:50:14.590410 master-0 kubenswrapper[9136]: I1203 21:50:14.590380 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 21:50:14.591574 master-0 kubenswrapper[9136]: I1203 21:50:14.591544 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 03 21:50:14.591761 master-0 kubenswrapper[9136]: I1203 21:50:14.591737 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 03 21:50:14.592026 master-0 kubenswrapper[9136]: I1203 21:50:14.592001 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 21:50:14.592197 master-0 kubenswrapper[9136]: I1203 21:50:14.592175 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 21:50:14.592351 master-0 kubenswrapper[9136]: I1203 21:50:14.592326 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 21:50:14.598361 master-0 kubenswrapper[9136]: I1203 21:50:14.598329 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 21:50:14.600061 master-0 kubenswrapper[9136]: I1203 21:50:14.600023 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 03 21:50:14.626629 master-0 kubenswrapper[9136]: I1203 21:50:14.626547 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-docker\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.626629 master-0 kubenswrapper[9136]: I1203 21:50:14.626617 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z9vv\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-kube-api-access-4z9vv\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: 
\"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.626902 master-0 kubenswrapper[9136]: I1203 21:50:14.626671 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-containers\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.626902 master-0 kubenswrapper[9136]: I1203 21:50:14.626729 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-cache\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.626902 master-0 kubenswrapper[9136]: I1203 21:50:14.626788 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-ca-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.626902 master-0 kubenswrapper[9136]: I1203 21:50:14.626822 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.665620 master-0 kubenswrapper[9136]: I1203 21:50:14.665570 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk"] Dec 03 21:50:14.679007 master-0 kubenswrapper[9136]: I1203 21:50:14.678913 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-64554dd846-6vfz6"] Dec 03 21:50:14.679007 master-0 kubenswrapper[9136]: I1203 21:50:14.678990 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-5b4c744c84-xgxft"] Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729160 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-image-import-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729202 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729230 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: 
\"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-docker\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729249 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z9vv\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-kube-api-access-4z9vv\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729277 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729292 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729308 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-encryption-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729325 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtxdk\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-kube-api-access-dtxdk\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729351 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-containers\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729375 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-node-pullsecrets\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729407 9136 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729425 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-serving-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729443 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-trusted-ca-bundle\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729463 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-audit\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729480 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-cache\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729694 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/da949cf7-ab12-43ff-8e45-da1c2fd46e20-cache\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729722 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-ca-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729746 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729760 9136 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-audit-dir\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729804 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729822 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-client\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.729840 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7zsw\" (UniqueName: \"kubernetes.io/projected/b522af85-394e-4965-9bf4-83f48fb8ad94-kube-api-access-b7zsw\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.730036 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-docker\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.730302 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-containers\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: I1203 21:50:14.730373 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-cache\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: E1203 21:50:14.730393 9136 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Dec 03 21:50:14.731808 master-0 kubenswrapper[9136]: E1203 21:50:14.730437 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs podName:9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:15.230420165 +0000 UTC m=+21.505596547 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs") pod "catalogd-controller-manager-754cfd84-bnstw" (UID: "9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6") : secret "catalogserver-cert" not found Dec 03 21:50:14.752813 master-0 kubenswrapper[9136]: I1203 21:50:14.745672 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-ca-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.800903 master-0 kubenswrapper[9136]: I1203 21:50:14.791943 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z9vv\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-kube-api-access-4z9vv\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:14.830659 master-0 kubenswrapper[9136]: I1203 21:50:14.830568 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-audit-dir\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.830659 master-0 kubenswrapper[9136]: I1203 21:50:14.830640 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.830659 master-0 kubenswrapper[9136]: I1203 21:50:14.830658 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-client\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.830659 master-0 kubenswrapper[9136]: I1203 21:50:14.830675 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zsw\" (UniqueName: \"kubernetes.io/projected/b522af85-394e-4965-9bf4-83f48fb8ad94-kube-api-access-b7zsw\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830697 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-image-import-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830717 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " 
pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830743 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830761 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830795 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-encryption-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830822 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtxdk\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-kube-api-access-dtxdk\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830851 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-node-pullsecrets\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830874 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830891 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-serving-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830908 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-trusted-ca-bundle\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " 
pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830929 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-audit\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830954 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/da949cf7-ab12-43ff-8e45-da1c2fd46e20-cache\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.830988 9136 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/553d3958-b25d-4855-a2da-1d94c0196f4a-audit\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:14.831056 master-0 kubenswrapper[9136]: I1203 21:50:14.831001 9136 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/553d3958-b25d-4855-a2da-1d94c0196f4a-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:14.831474 master-0 kubenswrapper[9136]: I1203 21:50:14.831438 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/da949cf7-ab12-43ff-8e45-da1c2fd46e20-cache\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.831515 master-0 kubenswrapper[9136]: I1203 21:50:14.831494 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-audit-dir\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.832097 master-0 kubenswrapper[9136]: I1203 21:50:14.832064 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.834622 master-0 kubenswrapper[9136]: E1203 21:50:14.834488 9136 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 21:50:14.834622 master-0 kubenswrapper[9136]: E1203 21:50:14.834607 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert podName:b522af85-394e-4965-9bf4-83f48fb8ad94 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:15.334588558 +0000 UTC m=+21.609764940 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert") pod "apiserver-64554dd846-6vfz6" (UID: "b522af85-394e-4965-9bf4-83f48fb8ad94") : secret "serving-cert" not found Dec 03 21:50:14.834952 master-0 kubenswrapper[9136]: I1203 21:50:14.834916 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-node-pullsecrets\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.835038 master-0 kubenswrapper[9136]: I1203 21:50:14.835006 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.835591 master-0 kubenswrapper[9136]: I1203 21:50:14.835562 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-serving-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.835745 master-0 kubenswrapper[9136]: I1203 21:50:14.835699 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-audit\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.835799 master-0 kubenswrapper[9136]: I1203 21:50:14.835761 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.836001 master-0 kubenswrapper[9136]: I1203 21:50:14.835972 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-client\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.836998 master-0 kubenswrapper[9136]: I1203 21:50:14.836963 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-trusted-ca-bundle\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.839631 master-0 kubenswrapper[9136]: I1203 21:50:14.839594 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.840483 master-0 kubenswrapper[9136]: I1203 21:50:14.840452 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-encryption-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.849223 master-0 kubenswrapper[9136]: I1203 21:50:14.849174 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-image-import-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.853200 master-0 kubenswrapper[9136]: I1203 21:50:14.851220 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7zsw\" (UniqueName: \"kubernetes.io/projected/b522af85-394e-4965-9bf4-83f48fb8ad94-kube-api-access-b7zsw\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:14.859510 master-0 kubenswrapper[9136]: I1203 21:50:14.859456 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtxdk\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-kube-api-access-dtxdk\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:14.914756 master-0 kubenswrapper[9136]: I1203 21:50:14.914706 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:15.236805 master-0 kubenswrapper[9136]: I1203 21:50:15.236684 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:15.236805 master-0 kubenswrapper[9136]: I1203 21:50:15.236787 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:15.237121 master-0 kubenswrapper[9136]: I1203 21:50:15.236884 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:15.237167 master-0 kubenswrapper[9136]: E1203 21:50:15.237127 9136 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Dec 03 21:50:15.237530 master-0 kubenswrapper[9136]: E1203 21:50:15.237225 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs podName:9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:16.237200499 +0000 UTC m=+22.512376881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs") pod "catalogd-controller-manager-754cfd84-bnstw" (UID: "9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6") : secret "catalogserver-cert" not found Dec 03 21:50:15.237530 master-0 kubenswrapper[9136]: E1203 21:50:15.237382 9136 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:15.237530 master-0 kubenswrapper[9136]: E1203 21:50:15.237493 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca podName:0c19c101-43a4-4df5-a5b5-4a1d251b9d56 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:31.237464017 +0000 UTC m=+37.512640409 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca") pod "route-controller-manager-7896f4c46b-xmld6" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56") : configmap "client-ca" not found Dec 03 21:50:15.242803 master-0 kubenswrapper[9136]: I1203 21:50:15.242729 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") pod \"route-controller-manager-7896f4c46b-xmld6\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:15.338805 master-0 kubenswrapper[9136]: I1203 21:50:15.338706 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:15.339142 master-0 kubenswrapper[9136]: E1203 21:50:15.338991 9136 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Dec 03 21:50:15.339142 master-0 kubenswrapper[9136]: E1203 21:50:15.339125 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert podName:b522af85-394e-4965-9bf4-83f48fb8ad94 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:16.339090239 +0000 UTC m=+22.614266641 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert") pod "apiserver-64554dd846-6vfz6" (UID: "b522af85-394e-4965-9bf4-83f48fb8ad94") : secret "serving-cert" not found Dec 03 21:50:15.930602 master-0 kubenswrapper[9136]: I1203 21:50:15.930140 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="553d3958-b25d-4855-a2da-1d94c0196f4a" path="/var/lib/kubelet/pods/553d3958-b25d-4855-a2da-1d94c0196f4a/volumes" Dec 03 21:50:16.253028 master-0 kubenswrapper[9136]: I1203 21:50:16.252964 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:16.253408 master-0 kubenswrapper[9136]: E1203 21:50:16.253237 9136 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Dec 03 21:50:16.253408 master-0 kubenswrapper[9136]: E1203 21:50:16.253373 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs podName:9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:18.25334595 +0000 UTC m=+24.528522332 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs") pod "catalogd-controller-manager-754cfd84-bnstw" (UID: "9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6") : secret "catalogserver-cert" not found Dec 03 21:50:16.354385 master-0 kubenswrapper[9136]: I1203 21:50:16.354314 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:16.359275 master-0 kubenswrapper[9136]: I1203 21:50:16.357866 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:16.440071 master-0 kubenswrapper[9136]: I1203 21:50:16.438457 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:17.228087 master-0 kubenswrapper[9136]: I1203 21:50:17.227889 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" event={"ID":"a9a3f403-a742-4977-901a-cf4a8eb7df5a","Type":"ContainerStarted","Data":"4d33b8996a9907eee2420c9ffc1f976f04def00afc79e4c88fc91436ffd375dc"} Dec 03 21:50:17.228087 master-0 kubenswrapper[9136]: I1203 21:50:17.227963 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" event={"ID":"a9a3f403-a742-4977-901a-cf4a8eb7df5a","Type":"ContainerStarted","Data":"b9bd6bca32a6b0ee5f1c8266b23d6d336fd3b62e13e72de0eb5f2d1ab3279dd1"} Dec 03 21:50:17.231635 master-0 kubenswrapper[9136]: I1203 21:50:17.231546 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerStarted","Data":"6c63c3c05a429e1c2a2724aa6046ef16e3d07844020b76ab9c6e4dd4aedf14d1"} Dec 03 21:50:17.231834 master-0 kubenswrapper[9136]: I1203 21:50:17.231694 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerStarted","Data":"f19ff35f2b24d88afcc00f79fc42996d874d61c061936d271b6ff663d701d5fa"} Dec 03 21:50:17.234231 master-0 kubenswrapper[9136]: I1203 21:50:17.234166 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" event={"ID":"39f0e973-7864-4842-af8e-47718ab1804c","Type":"ContainerStarted","Data":"0eed9d981ff70eac9619ddc620b8f9e1ce7420952f2e8fb539ac72d9a0cb037f"} Dec 03 21:50:18.279382 master-0 kubenswrapper[9136]: I1203 21:50:18.279269 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:18.280344 master-0 kubenswrapper[9136]: E1203 21:50:18.279507 
9136 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Dec 03 21:50:18.280344 master-0 kubenswrapper[9136]: E1203 21:50:18.279633 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs podName:9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:22.279604239 +0000 UTC m=+28.554780651 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs") pod "catalogd-controller-manager-754cfd84-bnstw" (UID: "9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6") : secret "catalogserver-cert" not found Dec 03 21:50:18.381710 master-0 kubenswrapper[9136]: E1203 21:50:18.381625 9136 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Dec 03 21:50:18.381710 master-0 kubenswrapper[9136]: E1203 21:50:18.381706 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca podName:0778622f-e8ed-4eb0-9317-b4e95c135a48 nodeName:}" failed. No retries permitted until 2025-12-03 21:50:34.381687726 +0000 UTC m=+40.656864198 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca") pod "controller-manager-6db9498b78-xvrlr" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48") : configmap "client-ca" not found Dec 03 21:50:18.382120 master-0 kubenswrapper[9136]: I1203 21:50:18.381722 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca\") pod \"controller-manager-6db9498b78-xvrlr\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:20.570958 master-0 kubenswrapper[9136]: I1203 21:50:20.570868 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 21:50:20.571894 master-0 kubenswrapper[9136]: I1203 21:50:20.571859 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:20.572729 master-0 kubenswrapper[9136]: I1203 21:50:20.572671 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-64554dd846-6vfz6"] Dec 03 21:50:20.581360 master-0 kubenswrapper[9136]: I1203 21:50:20.581302 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 03 21:50:20.588719 master-0 kubenswrapper[9136]: I1203 21:50:20.588473 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 03 21:50:20.590197 master-0 kubenswrapper[9136]: I1203 21:50:20.589172 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:20.591176 master-0 kubenswrapper[9136]: W1203 21:50:20.591141 9136 reflector.go:561] object-"openshift-etcd"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:master-0" cannot list resource "configmaps" in API group "" in the namespace "openshift-etcd": no relationship found between node 'master-0' and this object Dec 03 21:50:20.591372 master-0 kubenswrapper[9136]: E1203 21:50:20.591194 9136 reflector.go:158] "Unhandled Error" err="object-\"openshift-etcd\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:master-0\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-etcd\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Dec 03 21:50:20.599377 master-0 kubenswrapper[9136]: I1203 21:50:20.599213 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk"] Dec 03 21:50:20.600921 master-0 kubenswrapper[9136]: W1203 21:50:20.600026 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb522af85_394e_4965_9bf4_83f48fb8ad94.slice/crio-82b2e653c93ba4a69b0223c8268dab1e869db2d0513ffbaf110dba2876358bfa WatchSource:0}: Error finding container 82b2e653c93ba4a69b0223c8268dab1e869db2d0513ffbaf110dba2876358bfa: Status 404 returned error can't find the container with id 82b2e653c93ba4a69b0223c8268dab1e869db2d0513ffbaf110dba2876358bfa Dec 03 21:50:20.726853 master-0 kubenswrapper[9136]: I1203 21:50:20.722872 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:20.726853 master-0 kubenswrapper[9136]: I1203 21:50:20.722951 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:20.726853 master-0 kubenswrapper[9136]: I1203 21:50:20.723007 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-var-lock\") pod \"installer-1-master-0\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:20.726853 master-0 kubenswrapper[9136]: I1203 21:50:20.723069 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kube-api-access\") pod \"installer-1-master-0\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:20.726853 master-0 kubenswrapper[9136]: I1203 21:50:20.723097 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-var-lock\") pod \"installer-1-master-0\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:20.726853 master-0 kubenswrapper[9136]: I1203 21:50:20.723161 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3497f5dd-4c6f-4108-a948-481cef475ba9-kube-api-access\") pod \"installer-1-master-0\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:20.825214 master-0 kubenswrapper[9136]: I1203 21:50:20.825041 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3497f5dd-4c6f-4108-a948-481cef475ba9-kube-api-access\") pod \"installer-1-master-0\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:20.825214 master-0 kubenswrapper[9136]: I1203 21:50:20.825170 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:20.825214 master-0 kubenswrapper[9136]: I1203 21:50:20.825193 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:20.825525 master-0 kubenswrapper[9136]: I1203 21:50:20.825235 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-var-lock\") pod \"installer-1-master-0\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:20.825525 master-0 kubenswrapper[9136]: I1203 21:50:20.825331 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kube-api-access\") pod \"installer-1-master-0\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:20.825525 master-0 kubenswrapper[9136]: I1203 21:50:20.825346 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:20.825525 master-0 kubenswrapper[9136]: I1203 21:50:20.825346 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:20.825525 master-0 kubenswrapper[9136]: I1203 21:50:20.825354 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-var-lock\") pod \"installer-1-master-0\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:20.825525 master-0 kubenswrapper[9136]: I1203 21:50:20.825472 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-var-lock\") pod \"installer-1-master-0\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:20.825858 master-0 kubenswrapper[9136]: I1203 21:50:20.825426 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-var-lock\") pod \"installer-1-master-0\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:21.262696 master-0 kubenswrapper[9136]: I1203 21:50:21.262589 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" event={"ID":"b522af85-394e-4965-9bf4-83f48fb8ad94","Type":"ContainerStarted","Data":"82b2e653c93ba4a69b0223c8268dab1e869db2d0513ffbaf110dba2876358bfa"} Dec 03 21:50:21.264213 master-0 kubenswrapper[9136]: I1203 21:50:21.264163 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" event={"ID":"da949cf7-ab12-43ff-8e45-da1c2fd46e20","Type":"ContainerStarted","Data":"c899fa45611ca8adf80b1957f02ef9824a27d5c29a283564fa77a53c34bc01e4"} Dec 03 21:50:21.425833 master-0 kubenswrapper[9136]: I1203 21:50:21.419300 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 21:50:21.955649 master-0 kubenswrapper[9136]: I1203 21:50:21.955587 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Dec 03 21:50:22.272217 master-0 kubenswrapper[9136]: I1203 21:50:22.272036 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" event={"ID":"da949cf7-ab12-43ff-8e45-da1c2fd46e20","Type":"ContainerStarted","Data":"5ae5503bf1205bc663cc1204fe09557a6737e824df9d6d84a86d54a184a45a47"} Dec 03 21:50:22.349353 master-0 kubenswrapper[9136]: I1203 21:50:22.349272 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:22.354657 master-0 kubenswrapper[9136]: I1203 21:50:22.354579 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:22.368933 master-0 kubenswrapper[9136]: I1203 21:50:22.368251 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:23.946835 master-0 kubenswrapper[9136]: I1203 21:50:23.936938 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 03 21:50:23.995809 master-0 kubenswrapper[9136]: I1203 21:50:23.995516 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kube-api-access\") pod \"installer-1-master-0\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:24.020803 master-0 kubenswrapper[9136]: I1203 21:50:24.020277 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw"] Dec 03 21:50:24.032058 master-0 kubenswrapper[9136]: I1203 21:50:24.031386 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3497f5dd-4c6f-4108-a948-481cef475ba9-kube-api-access\") pod \"installer-1-master-0\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:24.123728 master-0 kubenswrapper[9136]: I1203 21:50:24.123667 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-869c786959-2bnjf"] Dec 03 21:50:24.128482 master-0 kubenswrapper[9136]: I1203 21:50:24.124379 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" podUID="2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" containerName="cluster-version-operator" containerID="cri-o://f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3" gracePeriod=130 Dec 03 21:50:24.160879 master-0 kubenswrapper[9136]: I1203 21:50:24.154443 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-67d47fb995-88vr2"] Dec 03 21:50:24.160879 master-0 kubenswrapper[9136]: I1203 21:50:24.155112 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.171214 master-0 kubenswrapper[9136]: I1203 21:50:24.161672 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 21:50:24.171214 master-0 kubenswrapper[9136]: I1203 21:50:24.162579 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 21:50:24.171214 master-0 kubenswrapper[9136]: I1203 21:50:24.162898 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 21:50:24.171214 master-0 kubenswrapper[9136]: I1203 21:50:24.163171 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 21:50:24.171214 master-0 kubenswrapper[9136]: I1203 21:50:24.168347 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 21:50:24.171214 master-0 kubenswrapper[9136]: I1203 21:50:24.168394 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 21:50:24.171214 master-0 kubenswrapper[9136]: I1203 21:50:24.168344 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 21:50:24.171214 master-0 kubenswrapper[9136]: I1203 21:50:24.168629 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 21:50:24.205203 master-0 kubenswrapper[9136]: I1203 21:50:24.205119 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:50:24.231158 master-0 kubenswrapper[9136]: I1203 21:50:24.231019 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-67d47fb995-88vr2"] Dec 03 21:50:24.234807 master-0 kubenswrapper[9136]: I1203 21:50:24.234059 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 03 21:50:24.291818 master-0 kubenswrapper[9136]: I1203 21:50:24.288252 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-trusted-ca-bundle\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.291818 master-0 kubenswrapper[9136]: I1203 21:50:24.288309 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-dir\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.291818 master-0 kubenswrapper[9136]: I1203 21:50:24.288353 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-policies\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.291818 master-0 kubenswrapper[9136]: I1203 21:50:24.288384 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-encryption-config\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.291818 master-0 kubenswrapper[9136]: I1203 21:50:24.288449 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-client\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.291818 master-0 kubenswrapper[9136]: I1203 21:50:24.288469 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h2wx\" (UniqueName: \"kubernetes.io/projected/246b7846-0dfd-43a8-bcfa-81e7435060dc-kube-api-access-5h2wx\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.291818 master-0 kubenswrapper[9136]: I1203 21:50:24.288494 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-serving-ca\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.291818 master-0 kubenswrapper[9136]: I1203 21:50:24.288530 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-serving-cert\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.312823 master-0 
kubenswrapper[9136]: I1203 21:50:24.310923 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" event={"ID":"da949cf7-ab12-43ff-8e45-da1c2fd46e20","Type":"ContainerStarted","Data":"9ec46f78631dc5d3136c1bddb2b12b556c8cb1a57963c1f9c92e95b445e53c3f"} Dec 03 21:50:24.312823 master-0 kubenswrapper[9136]: I1203 21:50:24.311537 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:24.356552 master-0 kubenswrapper[9136]: I1203 21:50:24.356451 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" event={"ID":"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6","Type":"ContainerStarted","Data":"6cee457a91036b49aea08a36be55afac46ecdcc7958eeb7e1aefc04a6a5aeae8"} Dec 03 21:50:24.389931 master-0 kubenswrapper[9136]: I1203 21:50:24.389855 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-client\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.389931 master-0 kubenswrapper[9136]: I1203 21:50:24.389907 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h2wx\" (UniqueName: \"kubernetes.io/projected/246b7846-0dfd-43a8-bcfa-81e7435060dc-kube-api-access-5h2wx\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.389931 master-0 kubenswrapper[9136]: I1203 21:50:24.389929 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-serving-ca\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.390177 master-0 kubenswrapper[9136]: I1203 21:50:24.389958 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-serving-cert\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.390177 master-0 kubenswrapper[9136]: I1203 21:50:24.390016 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-trusted-ca-bundle\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.390177 master-0 kubenswrapper[9136]: I1203 21:50:24.390034 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-dir\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.390177 master-0 kubenswrapper[9136]: I1203 21:50:24.390059 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-policies\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.390177 master-0 kubenswrapper[9136]: I1203 21:50:24.390080 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-encryption-config\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.402807 master-0 kubenswrapper[9136]: I1203 21:50:24.396871 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-serving-ca\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.402807 master-0 kubenswrapper[9136]: I1203 21:50:24.399930 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-serving-cert\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.402807 master-0 kubenswrapper[9136]: I1203 21:50:24.400218 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-client\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.402807 master-0 kubenswrapper[9136]: I1203 21:50:24.401116 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-trusted-ca-bundle\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.402807 master-0 kubenswrapper[9136]: I1203 21:50:24.401423 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-dir\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.402807 master-0 kubenswrapper[9136]: I1203 21:50:24.401826 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-policies\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.418924 master-0 kubenswrapper[9136]: I1203 21:50:24.417115 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-encryption-config\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.436382 master-0 kubenswrapper[9136]: 
I1203 21:50:24.436298 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" podStartSLOduration=10.436279988999999 podStartE2EDuration="10.436279989s" podCreationTimestamp="2025-12-03 21:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:24.435318189 +0000 UTC m=+30.710494581" watchObservedRunningTime="2025-12-03 21:50:24.436279989 +0000 UTC m=+30.711456371" Dec 03 21:50:24.439566 master-0 kubenswrapper[9136]: I1203 21:50:24.439513 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h2wx\" (UniqueName: \"kubernetes.io/projected/246b7846-0dfd-43a8-bcfa-81e7435060dc-kube-api-access-5h2wx\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.496071 master-0 kubenswrapper[9136]: I1203 21:50:24.496025 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:24.501171 master-0 kubenswrapper[9136]: I1203 21:50:24.499514 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6db9498b78-xvrlr"] Dec 03 21:50:24.501171 master-0 kubenswrapper[9136]: E1203 21:50:24.499725 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" podUID="0778622f-e8ed-4eb0-9317-b4e95c135a48" Dec 03 21:50:24.511278 master-0 kubenswrapper[9136]: I1203 21:50:24.508938 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6"] Dec 03 21:50:24.511530 master-0 kubenswrapper[9136]: E1203 21:50:24.511491 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" podUID="0c19c101-43a4-4df5-a5b5-4a1d251b9d56" Dec 03 21:50:24.558388 master-0 kubenswrapper[9136]: I1203 21:50:24.558350 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9skcn"] Dec 03 21:50:24.559191 master-0 kubenswrapper[9136]: I1203 21:50:24.559152 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:24.561232 master-0 kubenswrapper[9136]: I1203 21:50:24.561203 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 21:50:24.561660 master-0 kubenswrapper[9136]: I1203 21:50:24.561627 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 21:50:24.561799 master-0 kubenswrapper[9136]: I1203 21:50:24.561754 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 21:50:24.562120 master-0 kubenswrapper[9136]: I1203 21:50:24.562091 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 21:50:24.572007 master-0 kubenswrapper[9136]: I1203 21:50:24.571621 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9skcn"] Dec 03 21:50:24.625604 master-0 kubenswrapper[9136]: I1203 21:50:24.625550 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 21:50:24.648027 master-0 kubenswrapper[9136]: W1203 21:50:24.647980 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podca8a36ae_e8fb_4a96_b42d_0e39b51fdc75.slice/crio-d73dffef2d54911dd7ba3e993e7fcb953c22745a8d407d8bc378e0d66df9f829 WatchSource:0}: Error finding container d73dffef2d54911dd7ba3e993e7fcb953c22745a8d407d8bc378e0d66df9f829: Status 404 returned error can't find the container with id d73dffef2d54911dd7ba3e993e7fcb953c22745a8d407d8bc378e0d66df9f829 Dec 03 21:50:24.650205 master-0 kubenswrapper[9136]: I1203 21:50:24.650061 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Dec 03 21:50:24.656044 master-0 kubenswrapper[9136]: W1203 21:50:24.655479 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3497f5dd_4c6f_4108_a948_481cef475ba9.slice/crio-1d372df0702f388fb4c029dbd78740e3df1431045d11f281419ebde9b5e3ba62 WatchSource:0}: Error finding container 1d372df0702f388fb4c029dbd78740e3df1431045d11f281419ebde9b5e3ba62: Status 404 returned error can't find the container with id 1d372df0702f388fb4c029dbd78740e3df1431045d11f281419ebde9b5e3ba62 Dec 03 21:50:24.695820 master-0 kubenswrapper[9136]: I1203 21:50:24.695615 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl7tn\" (UniqueName: \"kubernetes.io/projected/54767c36-ca29-4c91-9a8a-9699ecfa4afb-kube-api-access-bl7tn\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:24.695926 master-0 kubenswrapper[9136]: I1203 21:50:24.695839 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54767c36-ca29-4c91-9a8a-9699ecfa4afb-metrics-tls\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:24.696143 master-0 kubenswrapper[9136]: I1203 21:50:24.696028 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54767c36-ca29-4c91-9a8a-9699ecfa4afb-config-volume\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " 
pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:24.797304 master-0 kubenswrapper[9136]: I1203 21:50:24.797256 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl7tn\" (UniqueName: \"kubernetes.io/projected/54767c36-ca29-4c91-9a8a-9699ecfa4afb-kube-api-access-bl7tn\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:24.797379 master-0 kubenswrapper[9136]: I1203 21:50:24.797311 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54767c36-ca29-4c91-9a8a-9699ecfa4afb-metrics-tls\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:24.797379 master-0 kubenswrapper[9136]: I1203 21:50:24.797373 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54767c36-ca29-4c91-9a8a-9699ecfa4afb-config-volume\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:24.798097 master-0 kubenswrapper[9136]: I1203 21:50:24.798076 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54767c36-ca29-4c91-9a8a-9699ecfa4afb-config-volume\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:24.798436 master-0 kubenswrapper[9136]: E1203 21:50:24.798395 9136 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Dec 03 21:50:24.798436 master-0 kubenswrapper[9136]: E1203 21:50:24.798434 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54767c36-ca29-4c91-9a8a-9699ecfa4afb-metrics-tls podName:54767c36-ca29-4c91-9a8a-9699ecfa4afb nodeName:}" failed. No retries permitted until 2025-12-03 21:50:25.298423996 +0000 UTC m=+31.573600378 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/54767c36-ca29-4c91-9a8a-9699ecfa4afb-metrics-tls") pod "dns-default-9skcn" (UID: "54767c36-ca29-4c91-9a8a-9699ecfa4afb") : secret "dns-default-metrics-tls" not found Dec 03 21:50:24.821617 master-0 kubenswrapper[9136]: I1203 21:50:24.821573 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl7tn\" (UniqueName: \"kubernetes.io/projected/54767c36-ca29-4c91-9a8a-9699ecfa4afb-kube-api-access-bl7tn\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:24.832618 master-0 kubenswrapper[9136]: I1203 21:50:24.832544 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-67d47fb995-88vr2"] Dec 03 21:50:24.873897 master-0 kubenswrapper[9136]: W1203 21:50:24.873839 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod246b7846_0dfd_43a8_bcfa_81e7435060dc.slice/crio-e10b9a0a37f16109552ec2b60e216265154523259956cbf4150a13cac8e5cf2e WatchSource:0}: Error finding container e10b9a0a37f16109552ec2b60e216265154523259956cbf4150a13cac8e5cf2e: Status 404 returned error can't find the container with id e10b9a0a37f16109552ec2b60e216265154523259956cbf4150a13cac8e5cf2e Dec 03 21:50:24.890789 master-0 kubenswrapper[9136]: I1203 21:50:24.890735 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:50:25.000030 master-0 kubenswrapper[9136]: I1203 21:50:24.999976 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") pod \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " Dec 03 21:50:25.000588 master-0 kubenswrapper[9136]: I1203 21:50:25.000047 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-kube-api-access\") pod \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " Dec 03 21:50:25.000588 master-0 kubenswrapper[9136]: I1203 21:50:25.000100 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-cvo-updatepayloads\") pod \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " Dec 03 21:50:25.000588 master-0 kubenswrapper[9136]: I1203 21:50:25.000162 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-ssl-certs\") pod \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " Dec 03 21:50:25.000588 master-0 kubenswrapper[9136]: I1203 21:50:25.000189 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-service-ca\") pod \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\" (UID: \"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9\") " Dec 03 21:50:25.000588 master-0 kubenswrapper[9136]: I1203 21:50:25.000400 9136 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:50:25.000588 master-0 kubenswrapper[9136]: I1203 21:50:25.000507 9136 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.000588 master-0 kubenswrapper[9136]: I1203 21:50:25.000711 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:50:25.000588 master-0 kubenswrapper[9136]: I1203 21:50:25.001197 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-service-ca" (OuterVolumeSpecName: "service-ca") pod "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:25.003133 master-0 kubenswrapper[9136]: I1203 21:50:25.003085 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4dx8h"] Dec 03 21:50:25.003346 master-0 kubenswrapper[9136]: E1203 21:50:25.003319 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" containerName="cluster-version-operator" Dec 03 21:50:25.003346 master-0 kubenswrapper[9136]: I1203 21:50:25.003341 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" containerName="cluster-version-operator" Dec 03 21:50:25.003465 master-0 kubenswrapper[9136]: I1203 21:50:25.003418 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" containerName="cluster-version-operator" Dec 03 21:50:25.003961 master-0 kubenswrapper[9136]: I1203 21:50:25.003922 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4dx8h" Dec 03 21:50:25.006194 master-0 kubenswrapper[9136]: I1203 21:50:25.006166 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:50:25.006452 master-0 kubenswrapper[9136]: I1203 21:50:25.006399 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" (UID: "2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:50:25.103175 master-0 kubenswrapper[9136]: I1203 21:50:25.101498 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66aa2598-f4b6-4d3a-9623-aeb707e4912b-hosts-file\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 21:50:25.103175 master-0 kubenswrapper[9136]: I1203 21:50:25.101578 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw7l6\" (UniqueName: \"kubernetes.io/projected/66aa2598-f4b6-4d3a-9623-aeb707e4912b-kube-api-access-mw7l6\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 21:50:25.103175 master-0 kubenswrapper[9136]: I1203 21:50:25.101652 9136 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.103175 master-0 kubenswrapper[9136]: I1203 21:50:25.101663 9136 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.103175 master-0 kubenswrapper[9136]: I1203 21:50:25.101674 9136 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.103175 master-0 kubenswrapper[9136]: I1203 21:50:25.101684 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.203003 master-0 kubenswrapper[9136]: I1203 21:50:25.202895 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw7l6\" (UniqueName: \"kubernetes.io/projected/66aa2598-f4b6-4d3a-9623-aeb707e4912b-kube-api-access-mw7l6\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 21:50:25.203174 master-0 kubenswrapper[9136]: I1203 21:50:25.203086 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66aa2598-f4b6-4d3a-9623-aeb707e4912b-hosts-file\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 21:50:25.203247 master-0 kubenswrapper[9136]: I1203 21:50:25.203221 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66aa2598-f4b6-4d3a-9623-aeb707e4912b-hosts-file\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 21:50:25.221927 master-0 kubenswrapper[9136]: I1203 21:50:25.220390 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw7l6\" (UniqueName: \"kubernetes.io/projected/66aa2598-f4b6-4d3a-9623-aeb707e4912b-kube-api-access-mw7l6\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 
21:50:25.304520 master-0 kubenswrapper[9136]: I1203 21:50:25.304448 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54767c36-ca29-4c91-9a8a-9699ecfa4afb-metrics-tls\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:25.309207 master-0 kubenswrapper[9136]: I1203 21:50:25.309168 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54767c36-ca29-4c91-9a8a-9699ecfa4afb-metrics-tls\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:25.366615 master-0 kubenswrapper[9136]: I1203 21:50:25.366549 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" event={"ID":"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6","Type":"ContainerStarted","Data":"0f6a85e4a73afc226173f2b4c67fa571e667d3c81985c4b5d23669be9018152c"} Dec 03 21:50:25.366615 master-0 kubenswrapper[9136]: I1203 21:50:25.366619 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" event={"ID":"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6","Type":"ContainerStarted","Data":"dbcecffadbcf997695109b41d5b20a8abbbcd583e677531a3b9382eb0b2b3b66"} Dec 03 21:50:25.367215 master-0 kubenswrapper[9136]: I1203 21:50:25.367176 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:25.371883 master-0 kubenswrapper[9136]: I1203 21:50:25.371827 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"3497f5dd-4c6f-4108-a948-481cef475ba9","Type":"ContainerStarted","Data":"e15ea152ef1a26d4cb376f11826bf4ceac9e8245aad4cfc2e16ac02f57a9e91c"} Dec 03 21:50:25.371940 master-0 kubenswrapper[9136]: I1203 21:50:25.371900 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"3497f5dd-4c6f-4108-a948-481cef475ba9","Type":"ContainerStarted","Data":"1d372df0702f388fb4c029dbd78740e3df1431045d11f281419ebde9b5e3ba62"} Dec 03 21:50:25.375338 master-0 kubenswrapper[9136]: I1203 21:50:25.375298 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75","Type":"ContainerStarted","Data":"d1b7ddaa40a55179e8a8084845290bc3ea8a9cdf63eaebcf45106950ba9b8650"} Dec 03 21:50:25.375416 master-0 kubenswrapper[9136]: I1203 21:50:25.375340 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75","Type":"ContainerStarted","Data":"d73dffef2d54911dd7ba3e993e7fcb953c22745a8d407d8bc378e0d66df9f829"} Dec 03 21:50:25.377678 master-0 kubenswrapper[9136]: I1203 21:50:25.377639 9136 generic.go:334] "Generic (PLEG): container finished" podID="2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" containerID="f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3" exitCode=0 Dec 03 21:50:25.377753 master-0 kubenswrapper[9136]: I1203 21:50:25.377704 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" 
event={"ID":"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9","Type":"ContainerDied","Data":"f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3"} Dec 03 21:50:25.377753 master-0 kubenswrapper[9136]: I1203 21:50:25.377728 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" event={"ID":"2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9","Type":"ContainerDied","Data":"fad8816e9bad40e33a61a6c3879fbe4c9632a314c3eede6ca1d36dcf479c7e09"} Dec 03 21:50:25.377858 master-0 kubenswrapper[9136]: I1203 21:50:25.377792 9136 scope.go:117] "RemoveContainer" containerID="f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3" Dec 03 21:50:25.378094 master-0 kubenswrapper[9136]: I1203 21:50:25.378069 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-869c786959-2bnjf" Dec 03 21:50:25.382752 master-0 kubenswrapper[9136]: I1203 21:50:25.382708 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:25.383079 master-0 kubenswrapper[9136]: I1203 21:50:25.382880 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" event={"ID":"246b7846-0dfd-43a8-bcfa-81e7435060dc","Type":"ContainerStarted","Data":"e10b9a0a37f16109552ec2b60e216265154523259956cbf4150a13cac8e5cf2e"} Dec 03 21:50:25.383221 master-0 kubenswrapper[9136]: I1203 21:50:25.383185 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:25.398960 master-0 kubenswrapper[9136]: I1203 21:50:25.398887 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" podStartSLOduration=11.398868446 podStartE2EDuration="11.398868446s" podCreationTimestamp="2025-12-03 21:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:25.3870968 +0000 UTC m=+31.662273202" watchObservedRunningTime="2025-12-03 21:50:25.398868446 +0000 UTC m=+31.674044858" Dec 03 21:50:25.399169 master-0 kubenswrapper[9136]: I1203 21:50:25.399118 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:25.404830 master-0 kubenswrapper[9136]: I1203 21:50:25.404729 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4dx8h" Dec 03 21:50:25.419400 master-0 kubenswrapper[9136]: I1203 21:50:25.419364 9136 scope.go:117] "RemoveContainer" containerID="f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3" Dec 03 21:50:25.419967 master-0 kubenswrapper[9136]: I1203 21:50:25.419943 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:25.420171 master-0 kubenswrapper[9136]: E1203 21:50:25.420127 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3\": container with ID starting with f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3 not found: ID does not exist" containerID="f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3" Dec 03 21:50:25.420346 master-0 kubenswrapper[9136]: I1203 21:50:25.420176 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3"} err="failed to get container status \"f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3\": rpc error: code = NotFound desc = could not find container \"f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3\": container with ID starting with f728ac382b016272b700c7636a7947d7a0690ed40b449e8f5f1abee819097ab3 not found: ID does not exist" Dec 03 21:50:25.426516 master-0 kubenswrapper[9136]: I1203 21:50:25.426460 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=6.426441199 podStartE2EDuration="6.426441199s" podCreationTimestamp="2025-12-03 21:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:25.413490374 +0000 UTC m=+31.688666776" watchObservedRunningTime="2025-12-03 21:50:25.426441199 +0000 UTC m=+31.701617601" Dec 03 21:50:25.427024 master-0 kubenswrapper[9136]: I1203 21:50:25.426974 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=6.426967335 podStartE2EDuration="6.426967335s" podCreationTimestamp="2025-12-03 21:50:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:25.425142337 +0000 UTC m=+31.700318739" watchObservedRunningTime="2025-12-03 21:50:25.426967335 +0000 UTC m=+31.702143727" Dec 03 21:50:25.446516 master-0 kubenswrapper[9136]: I1203 21:50:25.446447 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-869c786959-2bnjf"] Dec 03 21:50:25.449927 master-0 kubenswrapper[9136]: I1203 21:50:25.449885 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-869c786959-2bnjf"] Dec 03 21:50:25.490127 master-0 kubenswrapper[9136]: I1203 21:50:25.489901 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g"] Dec 03 21:50:25.490740 master-0 kubenswrapper[9136]: I1203 21:50:25.490698 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.493152 master-0 kubenswrapper[9136]: I1203 21:50:25.493110 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 21:50:25.493331 master-0 kubenswrapper[9136]: I1203 21:50:25.493302 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 21:50:25.493464 master-0 kubenswrapper[9136]: I1203 21:50:25.493436 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 21:50:25.506855 master-0 kubenswrapper[9136]: I1203 21:50:25.506814 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4h8v\" (UniqueName: \"kubernetes.io/projected/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-kube-api-access-j4h8v\") pod \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " Dec 03 21:50:25.506952 master-0 kubenswrapper[9136]: I1203 21:50:25.506865 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-config\") pod \"0778622f-e8ed-4eb0-9317-b4e95c135a48\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " Dec 03 21:50:25.506952 master-0 kubenswrapper[9136]: I1203 21:50:25.506898 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert\") pod \"0778622f-e8ed-4eb0-9317-b4e95c135a48\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " Dec 03 21:50:25.507016 master-0 kubenswrapper[9136]: I1203 21:50:25.506958 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x6sn\" (UniqueName: \"kubernetes.io/projected/0778622f-e8ed-4eb0-9317-b4e95c135a48-kube-api-access-5x6sn\") pod \"0778622f-e8ed-4eb0-9317-b4e95c135a48\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " Dec 03 21:50:25.507016 master-0 kubenswrapper[9136]: I1203 21:50:25.506984 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-config\") pod \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " Dec 03 21:50:25.507075 master-0 kubenswrapper[9136]: I1203 21:50:25.507033 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") pod \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\" (UID: \"0c19c101-43a4-4df5-a5b5-4a1d251b9d56\") " Dec 03 21:50:25.507075 master-0 kubenswrapper[9136]: I1203 21:50:25.507063 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-proxy-ca-bundles\") pod \"0778622f-e8ed-4eb0-9317-b4e95c135a48\" (UID: \"0778622f-e8ed-4eb0-9317-b4e95c135a48\") " Dec 03 21:50:25.510173 master-0 kubenswrapper[9136]: I1203 21:50:25.510093 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:25.510281 master-0 kubenswrapper[9136]: I1203 21:50:25.510241 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0778622f-e8ed-4eb0-9317-b4e95c135a48" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:25.510759 master-0 kubenswrapper[9136]: I1203 21:50:25.510721 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-config" (OuterVolumeSpecName: "config") pod "0778622f-e8ed-4eb0-9317-b4e95c135a48" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:25.511297 master-0 kubenswrapper[9136]: I1203 21:50:25.511227 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-config" (OuterVolumeSpecName: "config") pod "0c19c101-43a4-4df5-a5b5-4a1d251b9d56" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:25.512058 master-0 kubenswrapper[9136]: I1203 21:50:25.512018 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-kube-api-access-j4h8v" (OuterVolumeSpecName: "kube-api-access-j4h8v") pod "0c19c101-43a4-4df5-a5b5-4a1d251b9d56" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56"). InnerVolumeSpecName "kube-api-access-j4h8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:50:25.514889 master-0 kubenswrapper[9136]: I1203 21:50:25.514844 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0778622f-e8ed-4eb0-9317-b4e95c135a48" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:50:25.519133 master-0 kubenswrapper[9136]: I1203 21:50:25.515914 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0778622f-e8ed-4eb0-9317-b4e95c135a48-kube-api-access-5x6sn" (OuterVolumeSpecName: "kube-api-access-5x6sn") pod "0778622f-e8ed-4eb0-9317-b4e95c135a48" (UID: "0778622f-e8ed-4eb0-9317-b4e95c135a48"). InnerVolumeSpecName "kube-api-access-5x6sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:50:25.519133 master-0 kubenswrapper[9136]: I1203 21:50:25.517999 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0c19c101-43a4-4df5-a5b5-4a1d251b9d56" (UID: "0c19c101-43a4-4df5-a5b5-4a1d251b9d56"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:50:25.611010 master-0 kubenswrapper[9136]: I1203 21:50:25.610963 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.611108 master-0 kubenswrapper[9136]: I1203 21:50:25.611053 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.611167 master-0 kubenswrapper[9136]: I1203 21:50:25.611123 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.611167 master-0 kubenswrapper[9136]: I1203 21:50:25.611154 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.611257 master-0 kubenswrapper[9136]: I1203 21:50:25.611187 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.611257 master-0 kubenswrapper[9136]: I1203 21:50:25.611234 9136 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.611257 master-0 kubenswrapper[9136]: I1203 21:50:25.611250 9136 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.611478 master-0 kubenswrapper[9136]: I1203 21:50:25.611264 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4h8v\" (UniqueName: \"kubernetes.io/projected/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-kube-api-access-j4h8v\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.611478 master-0 kubenswrapper[9136]: I1203 21:50:25.611282 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.611478 master-0 kubenswrapper[9136]: 
I1203 21:50:25.611295 9136 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0778622f-e8ed-4eb0-9317-b4e95c135a48-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.611478 master-0 kubenswrapper[9136]: I1203 21:50:25.611308 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x6sn\" (UniqueName: \"kubernetes.io/projected/0778622f-e8ed-4eb0-9317-b4e95c135a48-kube-api-access-5x6sn\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.611478 master-0 kubenswrapper[9136]: I1203 21:50:25.611321 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:25.712704 master-0 kubenswrapper[9136]: I1203 21:50:25.712541 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.712704 master-0 kubenswrapper[9136]: I1203 21:50:25.712626 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.712704 master-0 kubenswrapper[9136]: I1203 21:50:25.712695 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.712704 master-0 kubenswrapper[9136]: I1203 21:50:25.712715 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.713318 master-0 kubenswrapper[9136]: I1203 21:50:25.712756 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.713318 master-0 kubenswrapper[9136]: I1203 21:50:25.712887 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.713318 master-0 kubenswrapper[9136]: I1203 21:50:25.712905 9136 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.713760 master-0 kubenswrapper[9136]: I1203 21:50:25.713732 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.718423 master-0 kubenswrapper[9136]: I1203 21:50:25.718386 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.731964 master-0 kubenswrapper[9136]: I1203 21:50:25.731847 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.807649 master-0 kubenswrapper[9136]: I1203 21:50:25.807584 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 21:50:25.827671 master-0 kubenswrapper[9136]: W1203 21:50:25.827598 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9427b8_d62c_45f7_97d0_1f7667ff27aa.slice/crio-8a7e6b82c1580d9d7cf8afc8bf7e406625525f7c96b5b72cc9bd2e8e78541ba8 WatchSource:0}: Error finding container 8a7e6b82c1580d9d7cf8afc8bf7e406625525f7c96b5b72cc9bd2e8e78541ba8: Status 404 returned error can't find the container with id 8a7e6b82c1580d9d7cf8afc8bf7e406625525f7c96b5b72cc9bd2e8e78541ba8 Dec 03 21:50:25.916716 master-0 kubenswrapper[9136]: I1203 21:50:25.915511 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9" path="/var/lib/kubelet/pods/2c3fb5d5-2cd2-448c-b749-4bb70e2b55f9/volumes" Dec 03 21:50:25.946748 master-0 kubenswrapper[9136]: I1203 21:50:25.946671 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9skcn"] Dec 03 21:50:25.970436 master-0 kubenswrapper[9136]: W1203 21:50:25.970296 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54767c36_ca29_4c91_9a8a_9699ecfa4afb.slice/crio-a5901bdc085b43cef6161fde550ec83afa0dc23a3c44d303a53b3a2d9164d54f WatchSource:0}: Error finding container a5901bdc085b43cef6161fde550ec83afa0dc23a3c44d303a53b3a2d9164d54f: Status 404 returned error can't find the container with id a5901bdc085b43cef6161fde550ec83afa0dc23a3c44d303a53b3a2d9164d54f Dec 03 21:50:26.395898 master-0 kubenswrapper[9136]: I1203 21:50:26.395801 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-4dx8h" event={"ID":"66aa2598-f4b6-4d3a-9623-aeb707e4912b","Type":"ContainerStarted","Data":"b176c079e107047ad2379d3320e3574f611fd347e852e67772f0ee674fa16544"} Dec 03 21:50:26.395898 master-0 kubenswrapper[9136]: I1203 21:50:26.395881 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4dx8h" event={"ID":"66aa2598-f4b6-4d3a-9623-aeb707e4912b","Type":"ContainerStarted","Data":"f8227287562fdcce3f6bc21dbb0cca3acb31c35d8825110980ae93ba96b9894f"} Dec 03 21:50:26.397226 master-0 kubenswrapper[9136]: I1203 21:50:26.397014 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9skcn" event={"ID":"54767c36-ca29-4c91-9a8a-9699ecfa4afb","Type":"ContainerStarted","Data":"a5901bdc085b43cef6161fde550ec83afa0dc23a3c44d303a53b3a2d9164d54f"} Dec 03 21:50:26.403030 master-0 kubenswrapper[9136]: I1203 21:50:26.402915 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6" Dec 03 21:50:26.403583 master-0 kubenswrapper[9136]: I1203 21:50:26.403558 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" event={"ID":"0e9427b8-d62c-45f7-97d0-1f7667ff27aa","Type":"ContainerStarted","Data":"3ad0afa8f21e830ac9c2172cd6306aca03770c2953877a91f4130533970ae228"} Dec 03 21:50:26.403660 master-0 kubenswrapper[9136]: I1203 21:50:26.403588 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" event={"ID":"0e9427b8-d62c-45f7-97d0-1f7667ff27aa","Type":"ContainerStarted","Data":"8a7e6b82c1580d9d7cf8afc8bf7e406625525f7c96b5b72cc9bd2e8e78541ba8"} Dec 03 21:50:26.404810 master-0 kubenswrapper[9136]: I1203 21:50:26.404046 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6db9498b78-xvrlr" Dec 03 21:50:26.410106 master-0 kubenswrapper[9136]: I1203 21:50:26.410057 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:50:26.413154 master-0 kubenswrapper[9136]: I1203 21:50:26.413017 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4dx8h" podStartSLOduration=2.413004733 podStartE2EDuration="2.413004733s" podCreationTimestamp="2025-12-03 21:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:26.411997821 +0000 UTC m=+32.687174203" watchObservedRunningTime="2025-12-03 21:50:26.413004733 +0000 UTC m=+32.688181115" Dec 03 21:50:26.443915 master-0 kubenswrapper[9136]: I1203 21:50:26.435538 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" podStartSLOduration=1.435506323 podStartE2EDuration="1.435506323s" podCreationTimestamp="2025-12-03 21:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:26.432328611 +0000 UTC m=+32.707505003" watchObservedRunningTime="2025-12-03 21:50:26.435506323 +0000 UTC m=+32.710682705" Dec 03 21:50:26.502106 master-0 kubenswrapper[9136]: I1203 21:50:26.502029 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6"] Dec 03 21:50:26.506111 master-0 kubenswrapper[9136]: I1203 21:50:26.506060 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7896f4c46b-xmld6"] Dec 03 21:50:26.543806 master-0 kubenswrapper[9136]: I1203 21:50:26.543168 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2"] Dec 03 21:50:26.544095 master-0 kubenswrapper[9136]: I1203 21:50:26.543941 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.559569 master-0 kubenswrapper[9136]: I1203 21:50:26.547674 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 21:50:26.559569 master-0 kubenswrapper[9136]: I1203 21:50:26.547753 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6db9498b78-xvrlr"] Dec 03 21:50:26.559569 master-0 kubenswrapper[9136]: I1203 21:50:26.558070 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 21:50:26.559569 master-0 kubenswrapper[9136]: I1203 21:50:26.558313 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 21:50:26.559569 master-0 kubenswrapper[9136]: I1203 21:50:26.558422 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 21:50:26.559569 master-0 kubenswrapper[9136]: I1203 21:50:26.558550 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 21:50:26.576819 master-0 kubenswrapper[9136]: I1203 21:50:26.560473 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 21:50:26.584818 master-0 kubenswrapper[9136]: I1203 21:50:26.580885 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2"] Dec 03 21:50:26.584818 master-0 kubenswrapper[9136]: I1203 21:50:26.581747 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6db9498b78-xvrlr"] Dec 03 21:50:26.636811 master-0 kubenswrapper[9136]: I1203 21:50:26.634661 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e45de77-c158-42d3-84be-8f3398e2b2d8-serving-cert\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.636811 master-0 kubenswrapper[9136]: I1203 21:50:26.634733 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-client-ca\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.636811 master-0 kubenswrapper[9136]: I1203 21:50:26.634835 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx6bp\" (UniqueName: \"kubernetes.io/projected/9e45de77-c158-42d3-84be-8f3398e2b2d8-kube-api-access-jx6bp\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.636811 master-0 kubenswrapper[9136]: I1203 21:50:26.634882 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-config\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.636811 master-0 kubenswrapper[9136]: I1203 21:50:26.634956 9136 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0c19c101-43a4-4df5-a5b5-4a1d251b9d56-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:26.738301 master-0 kubenswrapper[9136]: I1203 21:50:26.738128 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-config\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.738301 master-0 kubenswrapper[9136]: I1203 21:50:26.738200 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:50:26.738301 master-0 kubenswrapper[9136]: I1203 21:50:26.738246 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:50:26.738301 master-0 kubenswrapper[9136]: I1203 21:50:26.738272 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e45de77-c158-42d3-84be-8f3398e2b2d8-serving-cert\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.738301 master-0 kubenswrapper[9136]: I1203 21:50:26.738298 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-client-ca\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.738893 master-0 kubenswrapper[9136]: I1203 21:50:26.738318 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:50:26.738893 master-0 kubenswrapper[9136]: I1203 21:50:26.738346 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod 
\"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:50:26.738893 master-0 kubenswrapper[9136]: I1203 21:50:26.738377 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx6bp\" (UniqueName: \"kubernetes.io/projected/9e45de77-c158-42d3-84be-8f3398e2b2d8-kube-api-access-jx6bp\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.738893 master-0 kubenswrapper[9136]: I1203 21:50:26.738397 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:50:26.738893 master-0 kubenswrapper[9136]: I1203 21:50:26.738443 9136 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0778622f-e8ed-4eb0-9317-b4e95c135a48-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:26.739518 master-0 kubenswrapper[9136]: I1203 21:50:26.739464 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-config\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.740087 master-0 kubenswrapper[9136]: I1203 21:50:26.740039 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-client-ca\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.743309 master-0 kubenswrapper[9136]: I1203 21:50:26.743245 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:50:26.746069 master-0 kubenswrapper[9136]: I1203 21:50:26.746033 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:50:26.747038 master-0 kubenswrapper[9136]: I1203 21:50:26.746979 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e45de77-c158-42d3-84be-8f3398e2b2d8-serving-cert\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.747977 master-0 kubenswrapper[9136]: I1203 
21:50:26.747943 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"multus-admission-controller-78ddcf56f9-6b8qj\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:50:26.749109 master-0 kubenswrapper[9136]: I1203 21:50:26.749059 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:50:26.755382 master-0 kubenswrapper[9136]: I1203 21:50:26.755348 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx6bp\" (UniqueName: \"kubernetes.io/projected/9e45de77-c158-42d3-84be-8f3398e2b2d8-kube-api-access-jx6bp\") pod \"route-controller-manager-57689d598d-cj8m2\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.755830 master-0 kubenswrapper[9136]: I1203 21:50:26.755750 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:50:26.931455 master-0 kubenswrapper[9136]: I1203 21:50:26.931398 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:26.948647 master-0 kubenswrapper[9136]: I1203 21:50:26.947354 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 21:50:26.948647 master-0 kubenswrapper[9136]: I1203 21:50:26.948242 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:50:26.953173 master-0 kubenswrapper[9136]: I1203 21:50:26.952601 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:50:26.958394 master-0 kubenswrapper[9136]: I1203 21:50:26.958360 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 21:50:26.958603 master-0 kubenswrapper[9136]: I1203 21:50:26.958498 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 21:50:27.408550 master-0 kubenswrapper[9136]: I1203 21:50:27.408485 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75" containerName="installer" containerID="cri-o://d1b7ddaa40a55179e8a8084845290bc3ea8a9cdf63eaebcf45106950ba9b8650" gracePeriod=30 Dec 03 21:50:27.918933 master-0 kubenswrapper[9136]: I1203 21:50:27.918858 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0778622f-e8ed-4eb0-9317-b4e95c135a48" path="/var/lib/kubelet/pods/0778622f-e8ed-4eb0-9317-b4e95c135a48/volumes" Dec 03 21:50:27.919368 master-0 kubenswrapper[9136]: I1203 21:50:27.919336 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c19c101-43a4-4df5-a5b5-4a1d251b9d56" path="/var/lib/kubelet/pods/0c19c101-43a4-4df5-a5b5-4a1d251b9d56/volumes" Dec 03 21:50:28.541052 master-0 kubenswrapper[9136]: I1203 21:50:28.540760 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774b988f79-vt89b"] Dec 03 21:50:28.541835 master-0 kubenswrapper[9136]: I1203 21:50:28.541751 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.546520 master-0 kubenswrapper[9136]: I1203 21:50:28.546480 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 21:50:28.549600 master-0 kubenswrapper[9136]: I1203 21:50:28.547066 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 21:50:28.549600 master-0 kubenswrapper[9136]: I1203 21:50:28.547105 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 21:50:28.549600 master-0 kubenswrapper[9136]: I1203 21:50:28.549035 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 21:50:28.550522 master-0 kubenswrapper[9136]: I1203 21:50:28.550490 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774b988f79-vt89b"] Dec 03 21:50:28.551004 master-0 kubenswrapper[9136]: I1203 21:50:28.550952 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 21:50:28.552471 master-0 kubenswrapper[9136]: I1203 21:50:28.552285 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 21:50:28.669802 master-0 kubenswrapper[9136]: I1203 21:50:28.669503 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb9fb\" (UniqueName: \"kubernetes.io/projected/3a5df13c-bb21-42f1-acb0-06c888081325-kube-api-access-hb9fb\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.669802 master-0 kubenswrapper[9136]: I1203 21:50:28.669558 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-config\") pod \"controller-manager-774b988f79-vt89b\" (UID: 
\"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.669802 master-0 kubenswrapper[9136]: I1203 21:50:28.669579 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5df13c-bb21-42f1-acb0-06c888081325-serving-cert\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.669802 master-0 kubenswrapper[9136]: I1203 21:50:28.669646 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-client-ca\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.669802 master-0 kubenswrapper[9136]: I1203 21:50:28.669697 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-proxy-ca-bundles\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.771470 master-0 kubenswrapper[9136]: I1203 21:50:28.771385 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb9fb\" (UniqueName: \"kubernetes.io/projected/3a5df13c-bb21-42f1-acb0-06c888081325-kube-api-access-hb9fb\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.771470 master-0 kubenswrapper[9136]: I1203 21:50:28.771470 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-config\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.771756 master-0 kubenswrapper[9136]: I1203 21:50:28.771502 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5df13c-bb21-42f1-acb0-06c888081325-serving-cert\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.771756 master-0 kubenswrapper[9136]: I1203 21:50:28.771542 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-client-ca\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.771756 master-0 kubenswrapper[9136]: I1203 21:50:28.771591 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-proxy-ca-bundles\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " 
pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.773125 master-0 kubenswrapper[9136]: I1203 21:50:28.773076 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-config\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.774143 master-0 kubenswrapper[9136]: I1203 21:50:28.773595 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-client-ca\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.779609 master-0 kubenswrapper[9136]: I1203 21:50:28.779391 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-proxy-ca-bundles\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.779609 master-0 kubenswrapper[9136]: I1203 21:50:28.779536 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5df13c-bb21-42f1-acb0-06c888081325-serving-cert\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.790108 master-0 kubenswrapper[9136]: I1203 21:50:28.790061 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb9fb\" (UniqueName: \"kubernetes.io/projected/3a5df13c-bb21-42f1-acb0-06c888081325-kube-api-access-hb9fb\") pod \"controller-manager-774b988f79-vt89b\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:28.870695 master-0 kubenswrapper[9136]: I1203 21:50:28.870627 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:29.331513 master-0 kubenswrapper[9136]: I1203 21:50:29.331403 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 21:50:29.331977 master-0 kubenswrapper[9136]: I1203 21:50:29.331951 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.342134 master-0 kubenswrapper[9136]: I1203 21:50:29.342073 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 21:50:29.478903 master-0 kubenswrapper[9136]: I1203 21:50:29.478852 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b657e787-8104-4a1d-8c2f-1e75d41858b3-kube-api-access\") pod \"installer-2-master-0\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.478903 master-0 kubenswrapper[9136]: I1203 21:50:29.478898 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.479040 master-0 kubenswrapper[9136]: I1203 21:50:29.478963 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-var-lock\") pod \"installer-2-master-0\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.580366 master-0 kubenswrapper[9136]: I1203 21:50:29.580311 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-var-lock\") pod \"installer-2-master-0\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.580891 master-0 kubenswrapper[9136]: I1203 21:50:29.580379 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b657e787-8104-4a1d-8c2f-1e75d41858b3-kube-api-access\") pod \"installer-2-master-0\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.580891 master-0 kubenswrapper[9136]: I1203 21:50:29.580405 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.580891 master-0 kubenswrapper[9136]: I1203 21:50:29.580512 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.580891 master-0 kubenswrapper[9136]: I1203 21:50:29.580496 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-var-lock\") pod \"installer-2-master-0\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.626521 master-0 kubenswrapper[9136]: I1203 21:50:29.619182 9136 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b657e787-8104-4a1d-8c2f-1e75d41858b3-kube-api-access\") pod \"installer-2-master-0\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.694741 master-0 kubenswrapper[9136]: I1203 21:50:29.686494 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:29.742892 master-0 kubenswrapper[9136]: I1203 21:50:29.742073 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d"] Dec 03 21:50:29.742892 master-0 kubenswrapper[9136]: W1203 21:50:29.742757 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4399d20_f9a6_4ab1_86be_e2845394eaba.slice/crio-54de27477181cce7acb3d618802b3284f638cd785e8880f9dfe2503277e5076e WatchSource:0}: Error finding container 54de27477181cce7acb3d618802b3284f638cd785e8880f9dfe2503277e5076e: Status 404 returned error can't find the container with id 54de27477181cce7acb3d618802b3284f638cd785e8880f9dfe2503277e5076e Dec 03 21:50:29.986162 master-0 kubenswrapper[9136]: I1203 21:50:29.984600 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h6569"] Dec 03 21:50:29.986162 master-0 kubenswrapper[9136]: I1203 21:50:29.986083 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj"] Dec 03 21:50:30.026821 master-0 kubenswrapper[9136]: I1203 21:50:30.025708 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 21:50:30.052218 master-0 kubenswrapper[9136]: W1203 21:50:30.052135 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb657e787_8104_4a1d_8c2f_1e75d41858b3.slice/crio-6127cb16baea8f72776fda0c949a85566c1cd225cc68e314233a55938a271ab7 WatchSource:0}: Error finding container 6127cb16baea8f72776fda0c949a85566c1cd225cc68e314233a55938a271ab7: Status 404 returned error can't find the container with id 6127cb16baea8f72776fda0c949a85566c1cd225cc68e314233a55938a271ab7 Dec 03 21:50:30.062367 master-0 kubenswrapper[9136]: I1203 21:50:30.062291 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s"] Dec 03 21:50:30.101661 master-0 kubenswrapper[9136]: W1203 21:50:30.100962 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbebd69d2_5b0f_4b66_8722_d6861eba3e12.slice/crio-5db32eafb49c5ea2131a9ee6fac6a97f01a676cfb60265016c5dae78b38833fb WatchSource:0}: Error finding container 5db32eafb49c5ea2131a9ee6fac6a97f01a676cfb60265016c5dae78b38833fb: Status 404 returned error can't find the container with id 5db32eafb49c5ea2131a9ee6fac6a97f01a676cfb60265016c5dae78b38833fb Dec 03 21:50:30.109497 master-0 kubenswrapper[9136]: I1203 21:50:30.109448 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2"] Dec 03 21:50:30.122195 master-0 kubenswrapper[9136]: I1203 21:50:30.119337 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774b988f79-vt89b"] Dec 03 21:50:30.122195 master-0 
kubenswrapper[9136]: I1203 21:50:30.120782 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5"] Dec 03 21:50:30.138074 master-0 kubenswrapper[9136]: W1203 21:50:30.138027 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04f5fc52_4ec2_48c3_8441_2b15ad632233.slice/crio-1775a3bb94266b5fadb3db6f198c4d7c599f8a235c13e9243ec7e800b61882bd WatchSource:0}: Error finding container 1775a3bb94266b5fadb3db6f198c4d7c599f8a235c13e9243ec7e800b61882bd: Status 404 returned error can't find the container with id 1775a3bb94266b5fadb3db6f198c4d7c599f8a235c13e9243ec7e800b61882bd Dec 03 21:50:30.430154 master-0 kubenswrapper[9136]: I1203 21:50:30.430011 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"b657e787-8104-4a1d-8c2f-1e75d41858b3","Type":"ContainerStarted","Data":"6d943366548026c48336b1c340f964986628615dcb1f1fd5e02b9eac957b2d2a"} Dec 03 21:50:30.430154 master-0 kubenswrapper[9136]: I1203 21:50:30.430070 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"b657e787-8104-4a1d-8c2f-1e75d41858b3","Type":"ContainerStarted","Data":"6127cb16baea8f72776fda0c949a85566c1cd225cc68e314233a55938a271ab7"} Dec 03 21:50:30.431646 master-0 kubenswrapper[9136]: I1203 21:50:30.431120 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" event={"ID":"134c10ef-9f37-4a77-8e8b-4f8326bc8f40","Type":"ContainerStarted","Data":"f6c57485135ac402ba25cbe0ce41f1bfe52dbe6ef45a2663904b0cf72595d893"} Dec 03 21:50:30.435513 master-0 kubenswrapper[9136]: I1203 21:50:30.435453 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9skcn" event={"ID":"54767c36-ca29-4c91-9a8a-9699ecfa4afb","Type":"ContainerStarted","Data":"da58bfd35073e0233de9b201244e7e3a90052a7b61f43204e4447af4bcbd1079"} Dec 03 21:50:30.435603 master-0 kubenswrapper[9136]: I1203 21:50:30.435559 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9skcn" event={"ID":"54767c36-ca29-4c91-9a8a-9699ecfa4afb","Type":"ContainerStarted","Data":"5dc4c767fae3acf9d2853b52528a9755f7616f198183325ca9deaeb6b351462d"} Dec 03 21:50:30.435707 master-0 kubenswrapper[9136]: I1203 21:50:30.435645 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:30.437580 master-0 kubenswrapper[9136]: I1203 21:50:30.437533 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" event={"ID":"04f5fc52-4ec2-48c3-8441-2b15ad632233","Type":"ContainerStarted","Data":"5e0d8e26202b194ed1eeaba5969402b49d6682bab995c03c8f5c12dfbbd065f6"} Dec 03 21:50:30.437644 master-0 kubenswrapper[9136]: I1203 21:50:30.437593 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" event={"ID":"04f5fc52-4ec2-48c3-8441-2b15ad632233","Type":"ContainerStarted","Data":"1775a3bb94266b5fadb3db6f198c4d7c599f8a235c13e9243ec7e800b61882bd"} Dec 03 21:50:30.438864 master-0 kubenswrapper[9136]: I1203 21:50:30.438834 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" 
event={"ID":"9e45de77-c158-42d3-84be-8f3398e2b2d8","Type":"ContainerStarted","Data":"bb44266497b6711c0e2a223d5a9c3f7bd9be4631cc3dc0fcf1c247769e3d3d58"} Dec 03 21:50:30.440903 master-0 kubenswrapper[9136]: I1203 21:50:30.440598 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" event={"ID":"bebd69d2-5b0f-4b66-8722-d6861eba3e12","Type":"ContainerStarted","Data":"5db32eafb49c5ea2131a9ee6fac6a97f01a676cfb60265016c5dae78b38833fb"} Dec 03 21:50:30.444258 master-0 kubenswrapper[9136]: I1203 21:50:30.443820 9136 generic.go:334] "Generic (PLEG): container finished" podID="b522af85-394e-4965-9bf4-83f48fb8ad94" containerID="a2b603babc41baf7369a41e8e1bd12d0ecf8378163120db7862d41d41fe4536e" exitCode=0 Dec 03 21:50:30.444258 master-0 kubenswrapper[9136]: I1203 21:50:30.443890 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" event={"ID":"b522af85-394e-4965-9bf4-83f48fb8ad94","Type":"ContainerDied","Data":"a2b603babc41baf7369a41e8e1bd12d0ecf8378163120db7862d41d41fe4536e"} Dec 03 21:50:30.446965 master-0 kubenswrapper[9136]: I1203 21:50:30.446905 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" event={"ID":"3a5df13c-bb21-42f1-acb0-06c888081325","Type":"ContainerStarted","Data":"6567a8d6c13e79e7e931577c9b49f0047b20d4afd508c69fedf219ce2d715a62"} Dec 03 21:50:30.450056 master-0 kubenswrapper[9136]: I1203 21:50:30.450004 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h6569" event={"ID":"812401c0-d1ac-4857-b939-217b7b07f8bc","Type":"ContainerStarted","Data":"b0c53893b7d584ad782050fd24c3bd810082c269ba863ea58500a5c74b322c5a"} Dec 03 21:50:30.451575 master-0 kubenswrapper[9136]: I1203 21:50:30.451535 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" event={"ID":"a4399d20-f9a6-4ab1-86be-e2845394eaba","Type":"ContainerStarted","Data":"54de27477181cce7acb3d618802b3284f638cd785e8880f9dfe2503277e5076e"} Dec 03 21:50:30.453987 master-0 kubenswrapper[9136]: I1203 21:50:30.453433 9136 generic.go:334] "Generic (PLEG): container finished" podID="246b7846-0dfd-43a8-bcfa-81e7435060dc" containerID="13495d8d6fbe117fe450e36fc69d037193ff935ef6fdc0f5ba92833ef4c5c160" exitCode=0 Dec 03 21:50:30.453987 master-0 kubenswrapper[9136]: I1203 21:50:30.453514 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" event={"ID":"246b7846-0dfd-43a8-bcfa-81e7435060dc","Type":"ContainerDied","Data":"13495d8d6fbe117fe450e36fc69d037193ff935ef6fdc0f5ba92833ef4c5c160"} Dec 03 21:50:30.455647 master-0 kubenswrapper[9136]: I1203 21:50:30.455590 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=1.455556433 podStartE2EDuration="1.455556433s" podCreationTimestamp="2025-12-03 21:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:30.445571014 +0000 UTC m=+36.720747446" watchObservedRunningTime="2025-12-03 21:50:30.455556433 +0000 UTC m=+36.730732805" Dec 03 21:50:30.492393 master-0 kubenswrapper[9136]: I1203 21:50:30.492302 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9skcn" podStartSLOduration=2.943470075 
podStartE2EDuration="6.492275078s" podCreationTimestamp="2025-12-03 21:50:24 +0000 UTC" firstStartedPulling="2025-12-03 21:50:25.973186801 +0000 UTC m=+32.248363183" lastFinishedPulling="2025-12-03 21:50:29.521991804 +0000 UTC m=+35.797168186" observedRunningTime="2025-12-03 21:50:30.491220214 +0000 UTC m=+36.766396606" watchObservedRunningTime="2025-12-03 21:50:30.492275078 +0000 UTC m=+36.767451460" Dec 03 21:50:31.484823 master-0 kubenswrapper[9136]: I1203 21:50:31.482485 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" event={"ID":"b522af85-394e-4965-9bf4-83f48fb8ad94","Type":"ContainerStarted","Data":"4b1166e8f67f1b8e3c3da79703b39f5d3eeb828dd6b409384af319ea114c3a44"} Dec 03 21:50:31.484823 master-0 kubenswrapper[9136]: I1203 21:50:31.482564 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" event={"ID":"b522af85-394e-4965-9bf4-83f48fb8ad94","Type":"ContainerStarted","Data":"1bdcc3a3124ad94770a182c67aca62fe88b461475325268a6a43918fe7e0c48c"} Dec 03 21:50:31.495194 master-0 kubenswrapper[9136]: I1203 21:50:31.494032 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" event={"ID":"246b7846-0dfd-43a8-bcfa-81e7435060dc","Type":"ContainerStarted","Data":"11715522c97cfb44d9c92607c2beb2b153e402e489af1cd9ee03cdc3958e11c7"} Dec 03 21:50:31.508529 master-0 kubenswrapper[9136]: I1203 21:50:31.508430 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" podStartSLOduration=10.588284885 podStartE2EDuration="19.508409879s" podCreationTimestamp="2025-12-03 21:50:12 +0000 UTC" firstStartedPulling="2025-12-03 21:50:20.601926532 +0000 UTC m=+26.877102914" lastFinishedPulling="2025-12-03 21:50:29.522051526 +0000 UTC m=+35.797227908" observedRunningTime="2025-12-03 21:50:31.506533089 +0000 UTC m=+37.781709481" watchObservedRunningTime="2025-12-03 21:50:31.508409879 +0000 UTC m=+37.783586261" Dec 03 21:50:31.530839 master-0 kubenswrapper[9136]: I1203 21:50:31.528122 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" podStartSLOduration=2.890024215 podStartE2EDuration="7.528099729s" podCreationTimestamp="2025-12-03 21:50:24 +0000 UTC" firstStartedPulling="2025-12-03 21:50:24.881112541 +0000 UTC m=+31.156288923" lastFinishedPulling="2025-12-03 21:50:29.519188055 +0000 UTC m=+35.794364437" observedRunningTime="2025-12-03 21:50:31.527699056 +0000 UTC m=+37.802875458" watchObservedRunningTime="2025-12-03 21:50:31.528099729 +0000 UTC m=+37.803276131" Dec 03 21:50:32.372611 master-0 kubenswrapper[9136]: I1203 21:50:32.372552 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:50:33.782309 master-0 kubenswrapper[9136]: I1203 21:50:33.780342 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Dec 03 21:50:33.782309 master-0 kubenswrapper[9136]: I1203 21:50:33.781273 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:33.793247 master-0 kubenswrapper[9136]: I1203 21:50:33.785150 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 21:50:33.793247 master-0 kubenswrapper[9136]: I1203 21:50:33.787834 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Dec 03 21:50:33.863204 master-0 kubenswrapper[9136]: I1203 21:50:33.863120 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:33.863570 master-0 kubenswrapper[9136]: I1203 21:50:33.863245 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c2364d3-47b2-4784-9c42-76bf2547b797-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:33.863570 master-0 kubenswrapper[9136]: I1203 21:50:33.863291 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-var-lock\") pod \"installer-1-master-0\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:33.965949 master-0 kubenswrapper[9136]: I1203 21:50:33.965737 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c2364d3-47b2-4784-9c42-76bf2547b797-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:33.965949 master-0 kubenswrapper[9136]: I1203 21:50:33.965899 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-var-lock\") pod \"installer-1-master-0\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:33.966323 master-0 kubenswrapper[9136]: I1203 21:50:33.966046 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-var-lock\") pod \"installer-1-master-0\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:33.966323 master-0 kubenswrapper[9136]: I1203 21:50:33.966134 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:33.966323 master-0 kubenswrapper[9136]: I1203 21:50:33.966014 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-kubelet-dir\") pod \"installer-1-master-0\" (UID: 
\"4c2364d3-47b2-4784-9c42-76bf2547b797\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:33.986737 master-0 kubenswrapper[9136]: I1203 21:50:33.986685 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c2364d3-47b2-4784-9c42-76bf2547b797-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:34.108342 master-0 kubenswrapper[9136]: I1203 21:50:34.108187 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:50:34.496765 master-0 kubenswrapper[9136]: I1203 21:50:34.496707 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:34.497079 master-0 kubenswrapper[9136]: I1203 21:50:34.497061 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:34.506029 master-0 kubenswrapper[9136]: I1203 21:50:34.505992 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:34.519488 master-0 kubenswrapper[9136]: I1203 21:50:34.519442 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 21:50:36.439801 master-0 kubenswrapper[9136]: I1203 21:50:36.439719 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:36.440345 master-0 kubenswrapper[9136]: I1203 21:50:36.439830 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:36.448392 master-0 kubenswrapper[9136]: I1203 21:50:36.448349 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:36.530339 master-0 kubenswrapper[9136]: I1203 21:50:36.529869 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 21:50:36.532024 master-0 kubenswrapper[9136]: I1203 21:50:36.531937 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 21:50:36.532347 master-0 kubenswrapper[9136]: I1203 21:50:36.532271 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="b657e787-8104-4a1d-8c2f-1e75d41858b3" containerName="installer" containerID="cri-o://6d943366548026c48336b1c340f964986628615dcb1f1fd5e02b9eac957b2d2a" gracePeriod=30 Dec 03 21:50:37.492638 master-0 kubenswrapper[9136]: I1203 21:50:37.492582 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774b988f79-vt89b"] Dec 03 21:50:37.514443 master-0 kubenswrapper[9136]: I1203 21:50:37.513147 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2"] Dec 03 21:50:37.536355 master-0 kubenswrapper[9136]: I1203 21:50:37.536318 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_b657e787-8104-4a1d-8c2f-1e75d41858b3/installer/0.log" Dec 03 
21:50:37.536589 master-0 kubenswrapper[9136]: I1203 21:50:37.536363 9136 generic.go:334] "Generic (PLEG): container finished" podID="b657e787-8104-4a1d-8c2f-1e75d41858b3" containerID="6d943366548026c48336b1c340f964986628615dcb1f1fd5e02b9eac957b2d2a" exitCode=1 Dec 03 21:50:37.537185 master-0 kubenswrapper[9136]: I1203 21:50:37.537161 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"b657e787-8104-4a1d-8c2f-1e75d41858b3","Type":"ContainerDied","Data":"6d943366548026c48336b1c340f964986628615dcb1f1fd5e02b9eac957b2d2a"} Dec 03 21:50:37.727207 master-0 kubenswrapper[9136]: I1203 21:50:37.727131 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49"] Dec 03 21:50:37.728046 master-0 kubenswrapper[9136]: I1203 21:50:37.728013 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 21:50:37.730996 master-0 kubenswrapper[9136]: I1203 21:50:37.730950 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 21:50:37.731370 master-0 kubenswrapper[9136]: I1203 21:50:37.731328 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 21:50:37.731653 master-0 kubenswrapper[9136]: I1203 21:50:37.731618 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 21:50:37.733948 master-0 kubenswrapper[9136]: I1203 21:50:37.733139 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49"] Dec 03 21:50:37.837986 master-0 kubenswrapper[9136]: I1203 21:50:37.837651 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 21:50:37.837986 master-0 kubenswrapper[9136]: I1203 21:50:37.837729 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4gds\" (UniqueName: \"kubernetes.io/projected/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-kube-api-access-n4gds\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 21:50:37.939693 master-0 kubenswrapper[9136]: I1203 21:50:37.939629 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 21:50:37.939902 master-0 kubenswrapper[9136]: I1203 21:50:37.939723 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4gds\" (UniqueName: 
\"kubernetes.io/projected/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-kube-api-access-n4gds\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 21:50:37.943451 master-0 kubenswrapper[9136]: I1203 21:50:37.943362 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 21:50:37.961992 master-0 kubenswrapper[9136]: I1203 21:50:37.960585 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4gds\" (UniqueName: \"kubernetes.io/projected/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-kube-api-access-n4gds\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 21:50:38.045853 master-0 kubenswrapper[9136]: I1203 21:50:38.045715 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 21:50:38.941789 master-0 kubenswrapper[9136]: I1203 21:50:38.940913 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 03 21:50:38.942802 master-0 kubenswrapper[9136]: I1203 21:50:38.942667 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.008474 master-0 kubenswrapper[9136]: I1203 21:50:39.005852 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 03 21:50:39.020890 master-0 kubenswrapper[9136]: I1203 21:50:39.020514 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 03 21:50:39.021243 master-0 kubenswrapper[9136]: I1203 21:50:39.021094 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.031019 master-0 kubenswrapper[9136]: I1203 21:50:39.025960 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 21:50:39.041059 master-0 kubenswrapper[9136]: I1203 21:50:39.041006 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 03 21:50:39.054491 master-0 kubenswrapper[9136]: I1203 21:50:39.054243 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-var-lock\") pod \"installer-3-master-0\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.054491 master-0 kubenswrapper[9136]: I1203 21:50:39.054308 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.054491 master-0 kubenswrapper[9136]: I1203 21:50:39.054361 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kube-api-access\") pod \"installer-3-master-0\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.156070 master-0 kubenswrapper[9136]: I1203 21:50:39.155986 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.156070 master-0 kubenswrapper[9136]: I1203 21:50:39.156076 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-var-lock\") pod \"installer-3-master-0\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.156413 master-0 kubenswrapper[9136]: I1203 21:50:39.156103 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.156413 master-0 kubenswrapper[9136]: I1203 21:50:39.156146 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-var-lock\") pod \"installer-1-master-0\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.156413 master-0 kubenswrapper[9136]: I1203 21:50:39.156183 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kube-api-access\") pod \"installer-3-master-0\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.156413 master-0 kubenswrapper[9136]: I1203 21:50:39.156228 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.156413 master-0 kubenswrapper[9136]: I1203 21:50:39.156334 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-var-lock\") pod \"installer-3-master-0\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.156413 master-0 kubenswrapper[9136]: I1203 21:50:39.156381 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.186926 master-0 kubenswrapper[9136]: I1203 21:50:39.186865 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kube-api-access\") pod \"installer-3-master-0\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.256969 master-0 kubenswrapper[9136]: I1203 21:50:39.256800 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-var-lock\") pod \"installer-1-master-0\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.256969 master-0 kubenswrapper[9136]: I1203 21:50:39.256891 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.256969 master-0 kubenswrapper[9136]: I1203 21:50:39.256926 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.257315 master-0 kubenswrapper[9136]: I1203 21:50:39.256996 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.257315 master-0 kubenswrapper[9136]: I1203 21:50:39.257044 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-var-lock\") pod \"installer-1-master-0\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.265322 master-0 kubenswrapper[9136]: I1203 21:50:39.265271 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:39.278966 master-0 kubenswrapper[9136]: I1203 21:50:39.278914 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kube-api-access\") pod \"installer-1-master-0\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.344604 master-0 kubenswrapper[9136]: I1203 21:50:39.344548 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:50:39.928827 master-0 kubenswrapper[9136]: I1203 21:50:39.928751 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_b657e787-8104-4a1d-8c2f-1e75d41858b3/installer/0.log" Dec 03 21:50:39.929002 master-0 kubenswrapper[9136]: I1203 21:50:39.928855 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:39.944614 master-0 kubenswrapper[9136]: I1203 21:50:39.944523 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Dec 03 21:50:39.955798 master-0 kubenswrapper[9136]: W1203 21:50:39.952253 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4c2364d3_47b2_4784_9c42_76bf2547b797.slice/crio-2e5c32836cb7467d081521f4c6877fa57e211ffc219e0b13d86a9ec21ed11c0c WatchSource:0}: Error finding container 2e5c32836cb7467d081521f4c6877fa57e211ffc219e0b13d86a9ec21ed11c0c: Status 404 returned error can't find the container with id 2e5c32836cb7467d081521f4c6877fa57e211ffc219e0b13d86a9ec21ed11c0c Dec 03 21:50:40.070813 master-0 kubenswrapper[9136]: I1203 21:50:40.068394 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-kubelet-dir\") pod \"b657e787-8104-4a1d-8c2f-1e75d41858b3\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " Dec 03 21:50:40.070813 master-0 kubenswrapper[9136]: I1203 21:50:40.068483 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b657e787-8104-4a1d-8c2f-1e75d41858b3-kube-api-access\") pod \"b657e787-8104-4a1d-8c2f-1e75d41858b3\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " Dec 03 21:50:40.070813 master-0 kubenswrapper[9136]: I1203 21:50:40.068568 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b657e787-8104-4a1d-8c2f-1e75d41858b3" (UID: "b657e787-8104-4a1d-8c2f-1e75d41858b3"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:50:40.070813 master-0 kubenswrapper[9136]: I1203 21:50:40.068743 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-var-lock" (OuterVolumeSpecName: "var-lock") pod "b657e787-8104-4a1d-8c2f-1e75d41858b3" (UID: "b657e787-8104-4a1d-8c2f-1e75d41858b3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:50:40.070813 master-0 kubenswrapper[9136]: I1203 21:50:40.069112 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-var-lock\") pod \"b657e787-8104-4a1d-8c2f-1e75d41858b3\" (UID: \"b657e787-8104-4a1d-8c2f-1e75d41858b3\") " Dec 03 21:50:40.070813 master-0 kubenswrapper[9136]: I1203 21:50:40.069436 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:40.070813 master-0 kubenswrapper[9136]: I1203 21:50:40.069449 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b657e787-8104-4a1d-8c2f-1e75d41858b3-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:40.085792 master-0 kubenswrapper[9136]: I1203 21:50:40.082812 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b657e787-8104-4a1d-8c2f-1e75d41858b3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b657e787-8104-4a1d-8c2f-1e75d41858b3" (UID: "b657e787-8104-4a1d-8c2f-1e75d41858b3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:50:40.179801 master-0 kubenswrapper[9136]: I1203 21:50:40.170091 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b657e787-8104-4a1d-8c2f-1e75d41858b3-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:40.220469 master-0 kubenswrapper[9136]: I1203 21:50:40.213837 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49"] Dec 03 21:50:40.220469 master-0 kubenswrapper[9136]: I1203 21:50:40.214148 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 03 21:50:40.252805 master-0 kubenswrapper[9136]: I1203 21:50:40.247011 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 03 21:50:40.293532 master-0 kubenswrapper[9136]: W1203 21:50:40.290021 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4f66f491_f82d_4e8c_8929_4675f99aa5b7.slice/crio-53fd5d5cbc26d76aa9e3df18e6da9dd666b3449013285f841eaf2b67b16c171a WatchSource:0}: Error finding container 53fd5d5cbc26d76aa9e3df18e6da9dd666b3449013285f841eaf2b67b16c171a: Status 404 returned error can't find the container with id 53fd5d5cbc26d76aa9e3df18e6da9dd666b3449013285f841eaf2b67b16c171a Dec 03 21:50:40.312367 master-0 kubenswrapper[9136]: W1203 21:50:40.311671 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba624ed0_32cc_4c87_81a5_708a8a8a7f88.slice/crio-d8be7a677acefc6a4ee45682e61c271ff822b15dc97b6ca66fdfa8ca84749787 WatchSource:0}: Error finding container d8be7a677acefc6a4ee45682e61c271ff822b15dc97b6ca66fdfa8ca84749787: Status 404 returned error can't find the container with id d8be7a677acefc6a4ee45682e61c271ff822b15dc97b6ca66fdfa8ca84749787 Dec 03 21:50:40.513670 master-0 kubenswrapper[9136]: I1203 21:50:40.513614 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9skcn" Dec 03 21:50:40.566978 master-0 kubenswrapper[9136]: I1203 21:50:40.566333 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"e5aa1b71-5054-4695-a4e2-817ce5274ba9","Type":"ContainerStarted","Data":"9090a7aa6d01c7ba20e48285d09098429c32c2728e0577602eacfc9483748bc2"} Dec 03 21:50:40.569434 master-0 kubenswrapper[9136]: I1203 21:50:40.569331 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" event={"ID":"134c10ef-9f37-4a77-8e8b-4f8326bc8f40","Type":"ContainerStarted","Data":"064594605d70a71f5b8c396507bbd62023c90012ba2c394b075d83b5e3f0a671"} Dec 03 21:50:40.569434 master-0 kubenswrapper[9136]: I1203 21:50:40.569382 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" event={"ID":"134c10ef-9f37-4a77-8e8b-4f8326bc8f40","Type":"ContainerStarted","Data":"b43199be9a4f7a0f84f14243db251c1ad19549fc366171d1c2e7a7eabf31f5ce"} Dec 03 21:50:40.571384 master-0 kubenswrapper[9136]: I1203 21:50:40.571357 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4c2364d3-47b2-4784-9c42-76bf2547b797","Type":"ContainerStarted","Data":"5850f56760e2a61c527c84ff5f17d82ffc3198cbe6b6606b3d9f5f38cfe114d6"} Dec 
03 21:50:40.571384 master-0 kubenswrapper[9136]: I1203 21:50:40.571386 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4c2364d3-47b2-4784-9c42-76bf2547b797","Type":"ContainerStarted","Data":"2e5c32836cb7467d081521f4c6877fa57e211ffc219e0b13d86a9ec21ed11c0c"} Dec 03 21:50:40.574078 master-0 kubenswrapper[9136]: I1203 21:50:40.574033 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" event={"ID":"bebd69d2-5b0f-4b66-8722-d6861eba3e12","Type":"ContainerStarted","Data":"6bdd0a687dd855d3bbf357b436827e9890c3c340620b3413817f006a0436fe45"} Dec 03 21:50:40.576182 master-0 kubenswrapper[9136]: I1203 21:50:40.576135 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_b657e787-8104-4a1d-8c2f-1e75d41858b3/installer/0.log" Dec 03 21:50:40.576318 master-0 kubenswrapper[9136]: I1203 21:50:40.576221 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"b657e787-8104-4a1d-8c2f-1e75d41858b3","Type":"ContainerDied","Data":"6127cb16baea8f72776fda0c949a85566c1cd225cc68e314233a55938a271ab7"} Dec 03 21:50:40.576373 master-0 kubenswrapper[9136]: I1203 21:50:40.576325 9136 scope.go:117] "RemoveContainer" containerID="6d943366548026c48336b1c340f964986628615dcb1f1fd5e02b9eac957b2d2a" Dec 03 21:50:40.576738 master-0 kubenswrapper[9136]: I1203 21:50:40.576708 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Dec 03 21:50:40.577737 master-0 kubenswrapper[9136]: I1203 21:50:40.577702 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"4f66f491-f82d-4e8c-8929-4675f99aa5b7","Type":"ContainerStarted","Data":"53fd5d5cbc26d76aa9e3df18e6da9dd666b3449013285f841eaf2b67b16c171a"} Dec 03 21:50:40.579636 master-0 kubenswrapper[9136]: I1203 21:50:40.579587 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h6569" event={"ID":"812401c0-d1ac-4857-b939-217b7b07f8bc","Type":"ContainerStarted","Data":"175db6c084e2a0e95fc5a7e076937ab317342de6224ebf40e03cac264b6fd700"} Dec 03 21:50:40.581196 master-0 kubenswrapper[9136]: I1203 21:50:40.581152 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" event={"ID":"ba624ed0-32cc-4c87-81a5-708a8a8a7f88","Type":"ContainerStarted","Data":"d8be7a677acefc6a4ee45682e61c271ff822b15dc97b6ca66fdfa8ca84749787"} Dec 03 21:50:40.585914 master-0 kubenswrapper[9136]: I1203 21:50:40.582532 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" event={"ID":"a4399d20-f9a6-4ab1-86be-e2845394eaba","Type":"ContainerStarted","Data":"82722f9afea36ecc1e8162d281159bb72f29817202c15f48a80157dddaa3525e"} Dec 03 21:50:40.585914 master-0 kubenswrapper[9136]: I1203 21:50:40.583308 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:50:40.598093 master-0 kubenswrapper[9136]: I1203 21:50:40.597760 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" podUID="3a5df13c-bb21-42f1-acb0-06c888081325" containerName="controller-manager" 
containerID="cri-o://a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842" gracePeriod=30 Dec 03 21:50:40.599003 master-0 kubenswrapper[9136]: I1203 21:50:40.598210 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" event={"ID":"3a5df13c-bb21-42f1-acb0-06c888081325","Type":"ContainerStarted","Data":"a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842"} Dec 03 21:50:40.599146 master-0 kubenswrapper[9136]: I1203 21:50:40.599100 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:40.601969 master-0 kubenswrapper[9136]: I1203 21:50:40.601656 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:50:40.603306 master-0 kubenswrapper[9136]: I1203 21:50:40.603264 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" event={"ID":"9e45de77-c158-42d3-84be-8f3398e2b2d8","Type":"ContainerStarted","Data":"52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1"} Dec 03 21:50:40.603446 master-0 kubenswrapper[9136]: I1203 21:50:40.603413 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" podUID="9e45de77-c158-42d3-84be-8f3398e2b2d8" containerName="route-controller-manager" containerID="cri-o://52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1" gracePeriod=30 Dec 03 21:50:40.603840 master-0 kubenswrapper[9136]: I1203 21:50:40.603818 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:40.611312 master-0 kubenswrapper[9136]: I1203 21:50:40.611266 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" event={"ID":"04f5fc52-4ec2-48c3-8441-2b15ad632233","Type":"ContainerStarted","Data":"957fee7d4699a08d2cc2951a06021265bc26437f145867b0dd5f82dd49642db5"} Dec 03 21:50:40.611399 master-0 kubenswrapper[9136]: I1203 21:50:40.611348 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:40.611449 master-0 kubenswrapper[9136]: I1203 21:50:40.611426 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:50:40.617012 master-0 kubenswrapper[9136]: I1203 21:50:40.616985 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:40.680664 master-0 kubenswrapper[9136]: I1203 21:50:40.679734 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" podStartSLOduration=7.235969012 podStartE2EDuration="16.67971583s" podCreationTimestamp="2025-12-03 21:50:24 +0000 UTC" firstStartedPulling="2025-12-03 21:50:30.124988676 +0000 UTC m=+36.400165058" lastFinishedPulling="2025-12-03 21:50:39.568735494 +0000 UTC m=+45.843911876" observedRunningTime="2025-12-03 21:50:40.677477777 +0000 UTC m=+46.952654169" watchObservedRunningTime="2025-12-03 
21:50:40.67971583 +0000 UTC m=+46.954892212" Dec 03 21:50:40.728275 master-0 kubenswrapper[9136]: I1203 21:50:40.726687 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" podStartSLOduration=7.307550128 podStartE2EDuration="16.72666205s" podCreationTimestamp="2025-12-03 21:50:24 +0000 UTC" firstStartedPulling="2025-12-03 21:50:30.132473697 +0000 UTC m=+36.407650079" lastFinishedPulling="2025-12-03 21:50:39.551585619 +0000 UTC m=+45.826762001" observedRunningTime="2025-12-03 21:50:40.725452922 +0000 UTC m=+47.000629324" watchObservedRunningTime="2025-12-03 21:50:40.72666205 +0000 UTC m=+47.001838432" Dec 03 21:50:40.760844 master-0 kubenswrapper[9136]: I1203 21:50:40.757537 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=7.757513189 podStartE2EDuration="7.757513189s" podCreationTimestamp="2025-12-03 21:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:40.757090957 +0000 UTC m=+47.032267359" watchObservedRunningTime="2025-12-03 21:50:40.757513189 +0000 UTC m=+47.032689571" Dec 03 21:50:41.005146 master-0 kubenswrapper[9136]: I1203 21:50:41.003108 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 21:50:41.012859 master-0 kubenswrapper[9136]: I1203 21:50:41.012826 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:41.088390 master-0 kubenswrapper[9136]: I1203 21:50:41.088273 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Dec 03 21:50:41.150011 master-0 kubenswrapper[9136]: I1203 21:50:41.149964 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:41.185642 master-0 kubenswrapper[9136]: I1203 21:50:41.185580 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-proxy-ca-bundles\") pod \"3a5df13c-bb21-42f1-acb0-06c888081325\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " Dec 03 21:50:41.185642 master-0 kubenswrapper[9136]: I1203 21:50:41.185648 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-config\") pod \"3a5df13c-bb21-42f1-acb0-06c888081325\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " Dec 03 21:50:41.185899 master-0 kubenswrapper[9136]: I1203 21:50:41.185676 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5df13c-bb21-42f1-acb0-06c888081325-serving-cert\") pod \"3a5df13c-bb21-42f1-acb0-06c888081325\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " Dec 03 21:50:41.185899 master-0 kubenswrapper[9136]: I1203 21:50:41.185719 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-client-ca\") pod \"3a5df13c-bb21-42f1-acb0-06c888081325\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " Dec 03 21:50:41.185899 master-0 kubenswrapper[9136]: I1203 21:50:41.185829 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb9fb\" (UniqueName: \"kubernetes.io/projected/3a5df13c-bb21-42f1-acb0-06c888081325-kube-api-access-hb9fb\") pod \"3a5df13c-bb21-42f1-acb0-06c888081325\" (UID: \"3a5df13c-bb21-42f1-acb0-06c888081325\") " Dec 03 21:50:41.187609 master-0 kubenswrapper[9136]: I1203 21:50:41.187433 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-config" (OuterVolumeSpecName: "config") pod "3a5df13c-bb21-42f1-acb0-06c888081325" (UID: "3a5df13c-bb21-42f1-acb0-06c888081325"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:41.187745 master-0 kubenswrapper[9136]: I1203 21:50:41.187714 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-client-ca" (OuterVolumeSpecName: "client-ca") pod "3a5df13c-bb21-42f1-acb0-06c888081325" (UID: "3a5df13c-bb21-42f1-acb0-06c888081325"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:41.187933 master-0 kubenswrapper[9136]: I1203 21:50:41.187898 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3a5df13c-bb21-42f1-acb0-06c888081325" (UID: "3a5df13c-bb21-42f1-acb0-06c888081325"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:41.190539 master-0 kubenswrapper[9136]: I1203 21:50:41.190496 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5df13c-bb21-42f1-acb0-06c888081325-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3a5df13c-bb21-42f1-acb0-06c888081325" (UID: "3a5df13c-bb21-42f1-acb0-06c888081325"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:50:41.190878 master-0 kubenswrapper[9136]: I1203 21:50:41.190846 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5df13c-bb21-42f1-acb0-06c888081325-kube-api-access-hb9fb" (OuterVolumeSpecName: "kube-api-access-hb9fb") pod "3a5df13c-bb21-42f1-acb0-06c888081325" (UID: "3a5df13c-bb21-42f1-acb0-06c888081325"). InnerVolumeSpecName "kube-api-access-hb9fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:50:41.287481 master-0 kubenswrapper[9136]: I1203 21:50:41.287295 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-config\") pod \"9e45de77-c158-42d3-84be-8f3398e2b2d8\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " Dec 03 21:50:41.288937 master-0 kubenswrapper[9136]: I1203 21:50:41.288897 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-config" (OuterVolumeSpecName: "config") pod "9e45de77-c158-42d3-84be-8f3398e2b2d8" (UID: "9e45de77-c158-42d3-84be-8f3398e2b2d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:41.289054 master-0 kubenswrapper[9136]: I1203 21:50:41.288971 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-client-ca\") pod \"9e45de77-c158-42d3-84be-8f3398e2b2d8\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " Dec 03 21:50:41.289416 master-0 kubenswrapper[9136]: I1203 21:50:41.289397 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e45de77-c158-42d3-84be-8f3398e2b2d8-serving-cert\") pod \"9e45de77-c158-42d3-84be-8f3398e2b2d8\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " Dec 03 21:50:41.289537 master-0 kubenswrapper[9136]: I1203 21:50:41.289504 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-client-ca" (OuterVolumeSpecName: "client-ca") pod "9e45de77-c158-42d3-84be-8f3398e2b2d8" (UID: "9e45de77-c158-42d3-84be-8f3398e2b2d8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:50:41.289660 master-0 kubenswrapper[9136]: I1203 21:50:41.289640 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx6bp\" (UniqueName: \"kubernetes.io/projected/9e45de77-c158-42d3-84be-8f3398e2b2d8-kube-api-access-jx6bp\") pod \"9e45de77-c158-42d3-84be-8f3398e2b2d8\" (UID: \"9e45de77-c158-42d3-84be-8f3398e2b2d8\") " Dec 03 21:50:41.290105 master-0 kubenswrapper[9136]: I1203 21:50:41.290084 9136 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:41.290207 master-0 kubenswrapper[9136]: I1203 21:50:41.290193 9136 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:41.290288 master-0 kubenswrapper[9136]: I1203 21:50:41.290275 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:41.290370 master-0 kubenswrapper[9136]: I1203 21:50:41.290358 9136 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a5df13c-bb21-42f1-acb0-06c888081325-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:41.290451 master-0 kubenswrapper[9136]: I1203 21:50:41.290438 9136 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5df13c-bb21-42f1-acb0-06c888081325-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:41.290545 master-0 kubenswrapper[9136]: I1203 21:50:41.290531 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb9fb\" (UniqueName: \"kubernetes.io/projected/3a5df13c-bb21-42f1-acb0-06c888081325-kube-api-access-hb9fb\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:41.290626 master-0 kubenswrapper[9136]: I1203 21:50:41.290614 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e45de77-c158-42d3-84be-8f3398e2b2d8-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:41.292383 master-0 kubenswrapper[9136]: I1203 21:50:41.292346 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e45de77-c158-42d3-84be-8f3398e2b2d8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9e45de77-c158-42d3-84be-8f3398e2b2d8" (UID: "9e45de77-c158-42d3-84be-8f3398e2b2d8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:50:41.303605 master-0 kubenswrapper[9136]: I1203 21:50:41.303520 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e45de77-c158-42d3-84be-8f3398e2b2d8-kube-api-access-jx6bp" (OuterVolumeSpecName: "kube-api-access-jx6bp") pod "9e45de77-c158-42d3-84be-8f3398e2b2d8" (UID: "9e45de77-c158-42d3-84be-8f3398e2b2d8"). InnerVolumeSpecName "kube-api-access-jx6bp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:50:41.393949 master-0 kubenswrapper[9136]: I1203 21:50:41.393352 9136 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e45de77-c158-42d3-84be-8f3398e2b2d8-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:41.393949 master-0 kubenswrapper[9136]: I1203 21:50:41.393389 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx6bp\" (UniqueName: \"kubernetes.io/projected/9e45de77-c158-42d3-84be-8f3398e2b2d8-kube-api-access-jx6bp\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:41.545305 master-0 kubenswrapper[9136]: I1203 21:50:41.545242 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs"] Dec 03 21:50:41.545530 master-0 kubenswrapper[9136]: E1203 21:50:41.545445 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b657e787-8104-4a1d-8c2f-1e75d41858b3" containerName="installer" Dec 03 21:50:41.545530 master-0 kubenswrapper[9136]: I1203 21:50:41.545463 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="b657e787-8104-4a1d-8c2f-1e75d41858b3" containerName="installer" Dec 03 21:50:41.545530 master-0 kubenswrapper[9136]: E1203 21:50:41.545480 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e45de77-c158-42d3-84be-8f3398e2b2d8" containerName="route-controller-manager" Dec 03 21:50:41.545530 master-0 kubenswrapper[9136]: I1203 21:50:41.545487 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e45de77-c158-42d3-84be-8f3398e2b2d8" containerName="route-controller-manager" Dec 03 21:50:41.545530 master-0 kubenswrapper[9136]: E1203 21:50:41.545496 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5df13c-bb21-42f1-acb0-06c888081325" containerName="controller-manager" Dec 03 21:50:41.545530 master-0 kubenswrapper[9136]: I1203 21:50:41.545503 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5df13c-bb21-42f1-acb0-06c888081325" containerName="controller-manager" Dec 03 21:50:41.545829 master-0 kubenswrapper[9136]: I1203 21:50:41.545593 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e45de77-c158-42d3-84be-8f3398e2b2d8" containerName="route-controller-manager" Dec 03 21:50:41.545829 master-0 kubenswrapper[9136]: I1203 21:50:41.545612 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="b657e787-8104-4a1d-8c2f-1e75d41858b3" containerName="installer" Dec 03 21:50:41.545829 master-0 kubenswrapper[9136]: I1203 21:50:41.545622 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5df13c-bb21-42f1-acb0-06c888081325" containerName="controller-manager" Dec 03 21:50:41.546191 master-0 kubenswrapper[9136]: I1203 21:50:41.546163 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.550847 master-0 kubenswrapper[9136]: I1203 21:50:41.550757 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 21:50:41.550964 master-0 kubenswrapper[9136]: I1203 21:50:41.550824 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 21:50:41.551017 master-0 kubenswrapper[9136]: I1203 21:50:41.550904 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 21:50:41.554194 master-0 kubenswrapper[9136]: I1203 21:50:41.551067 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 21:50:41.554281 master-0 kubenswrapper[9136]: I1203 21:50:41.551260 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 21:50:41.618709 master-0 kubenswrapper[9136]: I1203 21:50:41.618611 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"e5aa1b71-5054-4695-a4e2-817ce5274ba9","Type":"ContainerStarted","Data":"7db16df4a826110868fed85df42888a8aa70e542af4ac113bfb0d52af03cbab5"} Dec 03 21:50:41.622968 master-0 kubenswrapper[9136]: I1203 21:50:41.622891 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"4f66f491-f82d-4e8c-8929-4675f99aa5b7","Type":"ContainerStarted","Data":"f69b2fadd7aa21980afa8fbd2988f6315c3c2d241badb575affba3e4afbe34e8"} Dec 03 21:50:41.626188 master-0 kubenswrapper[9136]: I1203 21:50:41.626157 9136 generic.go:334] "Generic (PLEG): container finished" podID="3a5df13c-bb21-42f1-acb0-06c888081325" containerID="a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842" exitCode=0 Dec 03 21:50:41.626309 master-0 kubenswrapper[9136]: I1203 21:50:41.626240 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" event={"ID":"3a5df13c-bb21-42f1-acb0-06c888081325","Type":"ContainerDied","Data":"a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842"} Dec 03 21:50:41.626363 master-0 kubenswrapper[9136]: I1203 21:50:41.626207 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" Dec 03 21:50:41.626421 master-0 kubenswrapper[9136]: I1203 21:50:41.626376 9136 scope.go:117] "RemoveContainer" containerID="a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842" Dec 03 21:50:41.626507 master-0 kubenswrapper[9136]: I1203 21:50:41.626327 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774b988f79-vt89b" event={"ID":"3a5df13c-bb21-42f1-acb0-06c888081325","Type":"ContainerDied","Data":"6567a8d6c13e79e7e931577c9b49f0047b20d4afd508c69fedf219ce2d715a62"} Dec 03 21:50:41.630178 master-0 kubenswrapper[9136]: I1203 21:50:41.630150 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h6569" event={"ID":"812401c0-d1ac-4857-b939-217b7b07f8bc","Type":"ContainerStarted","Data":"e939b86a66d071a681e09ddc3e27632e44e46c2f0fd6a90f23694a1fa24aaf68"} Dec 03 21:50:41.632217 master-0 kubenswrapper[9136]: I1203 21:50:41.632195 9136 generic.go:334] "Generic (PLEG): container finished" podID="9e45de77-c158-42d3-84be-8f3398e2b2d8" containerID="52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1" exitCode=0 Dec 03 21:50:41.632325 master-0 kubenswrapper[9136]: I1203 21:50:41.632299 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" Dec 03 21:50:41.632413 master-0 kubenswrapper[9136]: I1203 21:50:41.632261 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" event={"ID":"9e45de77-c158-42d3-84be-8f3398e2b2d8","Type":"ContainerDied","Data":"52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1"} Dec 03 21:50:41.632480 master-0 kubenswrapper[9136]: I1203 21:50:41.632442 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2" event={"ID":"9e45de77-c158-42d3-84be-8f3398e2b2d8","Type":"ContainerDied","Data":"bb44266497b6711c0e2a223d5a9c3f7bd9be4631cc3dc0fcf1c247769e3d3d58"} Dec 03 21:50:41.652110 master-0 kubenswrapper[9136]: I1203 21:50:41.651845 9136 scope.go:117] "RemoveContainer" containerID="a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842" Dec 03 21:50:41.652451 master-0 kubenswrapper[9136]: E1203 21:50:41.652410 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842\": container with ID starting with a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842 not found: ID does not exist" containerID="a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842" Dec 03 21:50:41.652513 master-0 kubenswrapper[9136]: I1203 21:50:41.652447 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842"} err="failed to get container status \"a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842\": rpc error: code = NotFound desc = could not find container \"a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842\": container with ID starting with a677cbdda3eff1e1b267fd6e5a0d2220319ed6d5add62fcb7cba309ddf6dc842 not found: ID does not exist" Dec 03 21:50:41.652513 master-0 kubenswrapper[9136]: I1203 21:50:41.652478 
9136 scope.go:117] "RemoveContainer" containerID="52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1" Dec 03 21:50:41.675063 master-0 kubenswrapper[9136]: I1203 21:50:41.675010 9136 scope.go:117] "RemoveContainer" containerID="52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1" Dec 03 21:50:41.675567 master-0 kubenswrapper[9136]: E1203 21:50:41.675533 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1\": container with ID starting with 52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1 not found: ID does not exist" containerID="52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1" Dec 03 21:50:41.675628 master-0 kubenswrapper[9136]: I1203 21:50:41.675585 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1"} err="failed to get container status \"52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1\": rpc error: code = NotFound desc = could not find container \"52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1\": container with ID starting with 52830615fff896325f76f9c996f54951b2d6e304cf3544d1a091e373049ac6d1 not found: ID does not exist" Dec 03 21:50:41.696869 master-0 kubenswrapper[9136]: I1203 21:50:41.696786 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-auth-proxy-config\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.696869 master-0 kubenswrapper[9136]: I1203 21:50:41.696861 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5a5ef85a-d878-4253-ba90-9306c9490e3c-machine-approver-tls\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.697203 master-0 kubenswrapper[9136]: I1203 21:50:41.696890 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-config\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.697203 master-0 kubenswrapper[9136]: I1203 21:50:41.696924 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q57t2\" (UniqueName: \"kubernetes.io/projected/5a5ef85a-d878-4253-ba90-9306c9490e3c-kube-api-access-q57t2\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.742042 master-0 kubenswrapper[9136]: I1203 21:50:41.741958 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=3.741938707 podStartE2EDuration="3.741938707s" podCreationTimestamp="2025-12-03 21:50:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:41.741627046 +0000 UTC m=+48.016803448" watchObservedRunningTime="2025-12-03 21:50:41.741938707 +0000 UTC m=+48.017115089" Dec 03 21:50:41.798844 master-0 kubenswrapper[9136]: I1203 21:50:41.798750 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5a5ef85a-d878-4253-ba90-9306c9490e3c-machine-approver-tls\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.800844 master-0 kubenswrapper[9136]: I1203 21:50:41.798892 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-config\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.800844 master-0 kubenswrapper[9136]: I1203 21:50:41.799061 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q57t2\" (UniqueName: \"kubernetes.io/projected/5a5ef85a-d878-4253-ba90-9306c9490e3c-kube-api-access-q57t2\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.800844 master-0 kubenswrapper[9136]: I1203 21:50:41.799196 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-auth-proxy-config\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.800844 master-0 kubenswrapper[9136]: I1203 21:50:41.799965 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-config\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.801001 master-0 kubenswrapper[9136]: I1203 21:50:41.800902 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-auth-proxy-config\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.802172 master-0 kubenswrapper[9136]: I1203 21:50:41.802143 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5a5ef85a-d878-4253-ba90-9306c9490e3c-machine-approver-tls\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:41.918048 master-0 kubenswrapper[9136]: I1203 21:50:41.917806 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b657e787-8104-4a1d-8c2f-1e75d41858b3" 
path="/var/lib/kubelet/pods/b657e787-8104-4a1d-8c2f-1e75d41858b3/volumes" Dec 03 21:50:42.073810 master-0 kubenswrapper[9136]: I1203 21:50:42.073176 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q57t2\" (UniqueName: \"kubernetes.io/projected/5a5ef85a-d878-4253-ba90-9306c9490e3c-kube-api-access-q57t2\") pod \"machine-approver-5775bfbf6d-f54vs\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:42.161252 master-0 kubenswrapper[9136]: I1203 21:50:42.161175 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:50:42.266903 master-0 kubenswrapper[9136]: I1203 21:50:42.266430 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=3.266306631 podStartE2EDuration="3.266306631s" podCreationTimestamp="2025-12-03 21:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:42.206456783 +0000 UTC m=+48.481633175" watchObservedRunningTime="2025-12-03 21:50:42.266306631 +0000 UTC m=+48.541483023" Dec 03 21:50:42.271856 master-0 kubenswrapper[9136]: I1203 21:50:42.269748 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774b988f79-vt89b"] Dec 03 21:50:42.278369 master-0 kubenswrapper[9136]: I1203 21:50:42.278264 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-774b988f79-vt89b"] Dec 03 21:50:42.376357 master-0 kubenswrapper[9136]: I1203 21:50:42.376300 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2"] Dec 03 21:50:42.393263 master-0 kubenswrapper[9136]: I1203 21:50:42.392399 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57689d598d-cj8m2"] Dec 03 21:50:42.553859 master-0 kubenswrapper[9136]: I1203 21:50:42.553224 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87"] Dec 03 21:50:42.553859 master-0 kubenswrapper[9136]: I1203 21:50:42.553742 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.561994 master-0 kubenswrapper[9136]: I1203 21:50:42.557843 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 21:50:42.561994 master-0 kubenswrapper[9136]: I1203 21:50:42.557895 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 21:50:42.561994 master-0 kubenswrapper[9136]: I1203 21:50:42.558041 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 21:50:42.561994 master-0 kubenswrapper[9136]: I1203 21:50:42.558411 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 21:50:42.561994 master-0 kubenswrapper[9136]: I1203 21:50:42.558584 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 21:50:42.564140 master-0 kubenswrapper[9136]: I1203 21:50:42.564105 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj"] Dec 03 21:50:42.565325 master-0 kubenswrapper[9136]: I1203 21:50:42.565295 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.567850 master-0 kubenswrapper[9136]: I1203 21:50:42.567806 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 21:50:42.568274 master-0 kubenswrapper[9136]: I1203 21:50:42.568245 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 21:50:42.568308 master-0 kubenswrapper[9136]: I1203 21:50:42.568295 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 21:50:42.568445 master-0 kubenswrapper[9136]: I1203 21:50:42.568416 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 21:50:42.572596 master-0 kubenswrapper[9136]: I1203 21:50:42.571290 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 21:50:42.575087 master-0 kubenswrapper[9136]: I1203 21:50:42.575052 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 21:50:42.616852 master-0 kubenswrapper[9136]: I1203 21:50:42.616796 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87"] Dec 03 21:50:42.618108 master-0 kubenswrapper[9136]: I1203 21:50:42.618065 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj"] Dec 03 21:50:42.715746 master-0 kubenswrapper[9136]: I1203 21:50:42.715673 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-config\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.715746 
master-0 kubenswrapper[9136]: I1203 21:50:42.715729 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-client-ca\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.715746 master-0 kubenswrapper[9136]: I1203 21:50:42.715777 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-serving-cert\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.716069 master-0 kubenswrapper[9136]: I1203 21:50:42.715795 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-proxy-ca-bundles\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.716069 master-0 kubenswrapper[9136]: I1203 21:50:42.715863 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-serving-cert\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.716069 master-0 kubenswrapper[9136]: I1203 21:50:42.715881 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-config\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.716069 master-0 kubenswrapper[9136]: I1203 21:50:42.716011 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz545\" (UniqueName: \"kubernetes.io/projected/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-kube-api-access-dz545\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.716181 master-0 kubenswrapper[9136]: I1203 21:50:42.716075 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-client-ca\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.716181 master-0 kubenswrapper[9136]: I1203 21:50:42.716152 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8mw4\" (UniqueName: \"kubernetes.io/projected/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-kube-api-access-h8mw4\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: 
\"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.817141 master-0 kubenswrapper[9136]: I1203 21:50:42.817086 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-config\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.817141 master-0 kubenswrapper[9136]: I1203 21:50:42.817144 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-client-ca\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.817362 master-0 kubenswrapper[9136]: I1203 21:50:42.817187 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-serving-cert\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.817362 master-0 kubenswrapper[9136]: I1203 21:50:42.817205 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-proxy-ca-bundles\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.817362 master-0 kubenswrapper[9136]: I1203 21:50:42.817243 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-serving-cert\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.817362 master-0 kubenswrapper[9136]: I1203 21:50:42.817261 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-config\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.817362 master-0 kubenswrapper[9136]: I1203 21:50:42.817334 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz545\" (UniqueName: \"kubernetes.io/projected/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-kube-api-access-dz545\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.817362 master-0 kubenswrapper[9136]: I1203 21:50:42.817360 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-client-ca\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " 
pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.817747 master-0 kubenswrapper[9136]: I1203 21:50:42.817394 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8mw4\" (UniqueName: \"kubernetes.io/projected/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-kube-api-access-h8mw4\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.818666 master-0 kubenswrapper[9136]: I1203 21:50:42.818601 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-client-ca\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.818734 master-0 kubenswrapper[9136]: I1203 21:50:42.818656 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-proxy-ca-bundles\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.818996 master-0 kubenswrapper[9136]: I1203 21:50:42.818964 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-config\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.819040 master-0 kubenswrapper[9136]: I1203 21:50:42.819009 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-client-ca\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.819225 master-0 kubenswrapper[9136]: I1203 21:50:42.819171 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-config\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.822141 master-0 kubenswrapper[9136]: I1203 21:50:42.821947 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-serving-cert\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.823382 master-0 kubenswrapper[9136]: I1203 21:50:42.823333 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-serving-cert\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.836346 master-0 kubenswrapper[9136]: I1203 
21:50:42.835396 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz545\" (UniqueName: \"kubernetes.io/projected/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-kube-api-access-dz545\") pod \"route-controller-manager-75678b97b8-qqn87\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.845481 master-0 kubenswrapper[9136]: I1203 21:50:42.845437 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8mw4\" (UniqueName: \"kubernetes.io/projected/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-kube-api-access-h8mw4\") pod \"controller-manager-6b48b87d7b-m7hgj\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:42.883412 master-0 kubenswrapper[9136]: I1203 21:50:42.883309 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:42.892224 master-0 kubenswrapper[9136]: I1203 21:50:42.892173 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:43.494336 master-0 kubenswrapper[9136]: I1203 21:50:43.494252 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87"] Dec 03 21:50:43.495710 master-0 kubenswrapper[9136]: I1203 21:50:43.495664 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj"] Dec 03 21:50:43.656470 master-0 kubenswrapper[9136]: I1203 21:50:43.656415 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" event={"ID":"5a5ef85a-d878-4253-ba90-9306c9490e3c","Type":"ContainerStarted","Data":"9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86"} Dec 03 21:50:43.656627 master-0 kubenswrapper[9136]: I1203 21:50:43.656484 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" event={"ID":"5a5ef85a-d878-4253-ba90-9306c9490e3c","Type":"ContainerStarted","Data":"c99595cdd7fb8a317ad7460b612a48324cc29b45b511573083700dbd1f3293bf"} Dec 03 21:50:43.658248 master-0 kubenswrapper[9136]: I1203 21:50:43.658195 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" event={"ID":"ba624ed0-32cc-4c87-81a5-708a8a8a7f88","Type":"ContainerStarted","Data":"6f91acf82d50566dd54057e9b1be20f0d89f3f616993406ed044d52651283dc2"} Dec 03 21:50:43.659627 master-0 kubenswrapper[9136]: I1203 21:50:43.659589 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" event={"ID":"89b8bdab-89ba-4a7d-a464-71a2f240e3c7","Type":"ContainerStarted","Data":"d1f5e826f7ae6ce38586c70ff633062c9e9f839723da64e9fa179afd4ce387ea"} Dec 03 21:50:43.662327 master-0 kubenswrapper[9136]: I1203 21:50:43.662284 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" event={"ID":"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb","Type":"ContainerStarted","Data":"587880584f977d63d726e554f2ecb6f2fb8174c49c96366cee3cd4ee073e231a"} Dec 03 21:50:43.681366 master-0 
kubenswrapper[9136]: I1203 21:50:43.681235 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" podStartSLOduration=4.24407399 podStartE2EDuration="6.681207912s" podCreationTimestamp="2025-12-03 21:50:37 +0000 UTC" firstStartedPulling="2025-12-03 21:50:40.329233187 +0000 UTC m=+46.604409569" lastFinishedPulling="2025-12-03 21:50:42.766367109 +0000 UTC m=+49.041543491" observedRunningTime="2025-12-03 21:50:43.679351922 +0000 UTC m=+49.954528314" watchObservedRunningTime="2025-12-03 21:50:43.681207912 +0000 UTC m=+49.956384304" Dec 03 21:50:43.938058 master-0 kubenswrapper[9136]: I1203 21:50:43.938006 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a5df13c-bb21-42f1-acb0-06c888081325" path="/var/lib/kubelet/pods/3a5df13c-bb21-42f1-acb0-06c888081325/volumes" Dec 03 21:50:43.939226 master-0 kubenswrapper[9136]: I1203 21:50:43.939199 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e45de77-c158-42d3-84be-8f3398e2b2d8" path="/var/lib/kubelet/pods/9e45de77-c158-42d3-84be-8f3398e2b2d8/volumes" Dec 03 21:50:44.537420 master-0 kubenswrapper[9136]: I1203 21:50:44.537361 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n"] Dec 03 21:50:44.538797 master-0 kubenswrapper[9136]: I1203 21:50:44.538271 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.548945 master-0 kubenswrapper[9136]: I1203 21:50:44.548020 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 03 21:50:44.548945 master-0 kubenswrapper[9136]: I1203 21:50:44.548295 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 03 21:50:44.550800 master-0 kubenswrapper[9136]: I1203 21:50:44.550616 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 03 21:50:44.551191 master-0 kubenswrapper[9136]: I1203 21:50:44.551049 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 03 21:50:44.562041 master-0 kubenswrapper[9136]: I1203 21:50:44.561997 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n"] Dec 03 21:50:44.662990 master-0 kubenswrapper[9136]: I1203 21:50:44.658899 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kvxd\" (UniqueName: \"kubernetes.io/projected/3a7e0eea-3da8-43de-87bc-d10231e7c239-kube-api-access-6kvxd\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.662990 master-0 kubenswrapper[9136]: I1203 21:50:44.658993 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a7e0eea-3da8-43de-87bc-d10231e7c239-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: 
\"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.662990 master-0 kubenswrapper[9136]: I1203 21:50:44.659022 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a7e0eea-3da8-43de-87bc-d10231e7c239-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.719888 master-0 kubenswrapper[9136]: I1203 21:50:44.719810 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" event={"ID":"89b8bdab-89ba-4a7d-a464-71a2f240e3c7","Type":"ContainerStarted","Data":"3bfa8deaf08a2a1e0e9189e6bd019c4a1fa728ac1cad25a37ff4255d9c02c3f0"} Dec 03 21:50:44.721114 master-0 kubenswrapper[9136]: I1203 21:50:44.721014 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:44.726935 master-0 kubenswrapper[9136]: I1203 21:50:44.726698 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" event={"ID":"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb","Type":"ContainerStarted","Data":"2d9a8a18ee922fc6f32018ce01fd96ead1e3d6f161acb32757967568167037fa"} Dec 03 21:50:44.726935 master-0 kubenswrapper[9136]: I1203 21:50:44.726791 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:44.727396 master-0 kubenswrapper[9136]: I1203 21:50:44.727281 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:50:44.732863 master-0 kubenswrapper[9136]: I1203 21:50:44.732758 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:50:44.754934 master-0 kubenswrapper[9136]: I1203 21:50:44.754864 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podStartSLOduration=7.754844248 podStartE2EDuration="7.754844248s" podCreationTimestamp="2025-12-03 21:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:44.753612208 +0000 UTC m=+51.028788600" watchObservedRunningTime="2025-12-03 21:50:44.754844248 +0000 UTC m=+51.030020630" Dec 03 21:50:44.762813 master-0 kubenswrapper[9136]: I1203 21:50:44.761388 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a7e0eea-3da8-43de-87bc-d10231e7c239-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.762813 master-0 kubenswrapper[9136]: I1203 21:50:44.761463 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/3a7e0eea-3da8-43de-87bc-d10231e7c239-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.762813 master-0 kubenswrapper[9136]: I1203 21:50:44.761505 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kvxd\" (UniqueName: \"kubernetes.io/projected/3a7e0eea-3da8-43de-87bc-d10231e7c239-kube-api-access-6kvxd\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.764651 master-0 kubenswrapper[9136]: I1203 21:50:44.764625 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a7e0eea-3da8-43de-87bc-d10231e7c239-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.769531 master-0 kubenswrapper[9136]: I1203 21:50:44.768172 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a7e0eea-3da8-43de-87bc-d10231e7c239-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.792130 master-0 kubenswrapper[9136]: I1203 21:50:44.792044 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kvxd\" (UniqueName: \"kubernetes.io/projected/3a7e0eea-3da8-43de-87bc-d10231e7c239-kube-api-access-6kvxd\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:44.823808 master-0 kubenswrapper[9136]: I1203 21:50:44.821974 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" podStartSLOduration=7.821940772 podStartE2EDuration="7.821940772s" podCreationTimestamp="2025-12-03 21:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:44.808552697 +0000 UTC m=+51.083729089" watchObservedRunningTime="2025-12-03 21:50:44.821940772 +0000 UTC m=+51.097117174" Dec 03 21:50:44.943911 master-0 kubenswrapper[9136]: I1203 21:50:44.943868 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 21:50:45.263958 master-0 kubenswrapper[9136]: I1203 21:50:45.263906 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl"] Dec 03 21:50:45.264899 master-0 kubenswrapper[9136]: I1203 21:50:45.264873 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 21:50:45.268526 master-0 kubenswrapper[9136]: I1203 21:50:45.268481 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 21:50:45.268741 master-0 kubenswrapper[9136]: I1203 21:50:45.268715 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 21:50:45.269079 master-0 kubenswrapper[9136]: I1203 21:50:45.269053 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-lm87f" Dec 03 21:50:45.269185 master-0 kubenswrapper[9136]: I1203 21:50:45.269149 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 21:50:45.294120 master-0 kubenswrapper[9136]: I1203 21:50:45.293934 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl"] Dec 03 21:50:45.370078 master-0 kubenswrapper[9136]: I1203 21:50:45.370003 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 21:50:45.370306 master-0 kubenswrapper[9136]: I1203 21:50:45.370112 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgwxt\" (UniqueName: \"kubernetes.io/projected/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-kube-api-access-wgwxt\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 21:50:45.475460 master-0 kubenswrapper[9136]: I1203 21:50:45.474460 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgwxt\" (UniqueName: \"kubernetes.io/projected/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-kube-api-access-wgwxt\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 21:50:45.475460 master-0 kubenswrapper[9136]: I1203 21:50:45.474545 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 21:50:45.478107 master-0 kubenswrapper[9136]: I1203 21:50:45.478032 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 21:50:45.512916 master-0 kubenswrapper[9136]: 
I1203 21:50:45.512852 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgwxt\" (UniqueName: \"kubernetes.io/projected/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-kube-api-access-wgwxt\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 21:50:45.588802 master-0 kubenswrapper[9136]: I1203 21:50:45.588646 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 21:50:46.216322 master-0 kubenswrapper[9136]: I1203 21:50:46.216257 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl"] Dec 03 21:50:46.295026 master-0 kubenswrapper[9136]: I1203 21:50:46.294974 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n"] Dec 03 21:50:46.742759 master-0 kubenswrapper[9136]: I1203 21:50:46.742525 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" event={"ID":"5a5ef85a-d878-4253-ba90-9306c9490e3c","Type":"ContainerStarted","Data":"f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036"} Dec 03 21:50:46.744219 master-0 kubenswrapper[9136]: I1203 21:50:46.744176 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" event={"ID":"3a7e0eea-3da8-43de-87bc-d10231e7c239","Type":"ContainerStarted","Data":"251348a86327ed6b6f83ac90ec2739d9a0f62a3c63fbdc4adf789f2d9b17f194"} Dec 03 21:50:46.744219 master-0 kubenswrapper[9136]: I1203 21:50:46.744206 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" event={"ID":"3a7e0eea-3da8-43de-87bc-d10231e7c239","Type":"ContainerStarted","Data":"a7ee3d19bf23284e7bcdb3c58da723e7cde6ebcface7f54320c36b317f73830b"} Dec 03 21:50:46.745569 master-0 kubenswrapper[9136]: I1203 21:50:46.745467 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" event={"ID":"62b43fe1-63f5-4d29-90a2-f36cb9e880ff","Type":"ContainerStarted","Data":"09da87e2dab0559242037544c26bbf09555449a73117b70108fdb02f60b3cce2"} Dec 03 21:50:47.573555 master-0 kubenswrapper[9136]: I1203 21:50:47.573131 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 03 21:50:47.573555 master-0 kubenswrapper[9136]: I1203 21:50:47.573554 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="e5aa1b71-5054-4695-a4e2-817ce5274ba9" containerName="installer" containerID="cri-o://7db16df4a826110868fed85df42888a8aa70e542af4ac113bfb0d52af03cbab5" gracePeriod=30 Dec 03 21:50:47.784928 master-0 kubenswrapper[9136]: I1203 21:50:47.779928 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" podStartSLOduration=4.142021549 podStartE2EDuration="6.779901543s" podCreationTimestamp="2025-12-03 21:50:41 +0000 UTC" firstStartedPulling="2025-12-03 21:50:43.208070356 +0000 UTC m=+49.483246728" lastFinishedPulling="2025-12-03 21:50:45.84595034 
+0000 UTC m=+52.121126722" observedRunningTime="2025-12-03 21:50:47.773144823 +0000 UTC m=+54.048321215" watchObservedRunningTime="2025-12-03 21:50:47.779901543 +0000 UTC m=+54.055077925" Dec 03 21:50:47.784928 master-0 kubenswrapper[9136]: I1203 21:50:47.783698 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6"] Dec 03 21:50:47.785639 master-0 kubenswrapper[9136]: I1203 21:50:47.785439 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:47.796819 master-0 kubenswrapper[9136]: I1203 21:50:47.793592 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zj6wp" Dec 03 21:50:47.796819 master-0 kubenswrapper[9136]: I1203 21:50:47.795999 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx"] Dec 03 21:50:47.797228 master-0 kubenswrapper[9136]: I1203 21:50:47.797210 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:47.799589 master-0 kubenswrapper[9136]: I1203 21:50:47.797868 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 03 21:50:47.799589 master-0 kubenswrapper[9136]: I1203 21:50:47.798141 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 03 21:50:47.799589 master-0 kubenswrapper[9136]: I1203 21:50:47.798803 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 03 21:50:47.808796 master-0 kubenswrapper[9136]: I1203 21:50:47.804599 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 03 21:50:47.808796 master-0 kubenswrapper[9136]: I1203 21:50:47.805040 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-x894t" Dec 03 21:50:47.808796 master-0 kubenswrapper[9136]: I1203 21:50:47.806720 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 03 21:50:47.808796 master-0 kubenswrapper[9136]: I1203 21:50:47.806896 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Dec 03 21:50:47.817824 master-0 kubenswrapper[9136]: I1203 21:50:47.817761 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6"] Dec 03 21:50:47.825899 master-0 kubenswrapper[9136]: I1203 21:50:47.824191 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx"] Dec 03 21:50:47.911992 master-0 kubenswrapper[9136]: I1203 21:50:47.911885 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:47.911992 master-0 kubenswrapper[9136]: I1203 21:50:47.911993 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/858384f3-5741-4e67-8669-2eb2b2dcaf7f-cert\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:47.912254 master-0 kubenswrapper[9136]: I1203 21:50:47.912017 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/858384f3-5741-4e67-8669-2eb2b2dcaf7f-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:47.912254 master-0 kubenswrapper[9136]: I1203 21:50:47.912070 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj79k\" (UniqueName: \"kubernetes.io/projected/fa9b5917-d4f3-4372-a200-45b57412f92f-kube-api-access-pj79k\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:47.912254 master-0 kubenswrapper[9136]: I1203 21:50:47.912106 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-config\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:47.912931 master-0 kubenswrapper[9136]: I1203 21:50:47.912438 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cert\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:47.912931 master-0 kubenswrapper[9136]: I1203 21:50:47.912653 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-images\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:47.912931 master-0 kubenswrapper[9136]: I1203 21:50:47.912781 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgrbd\" (UniqueName: \"kubernetes.io/projected/858384f3-5741-4e67-8669-2eb2b2dcaf7f-kube-api-access-qgrbd\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:48.013640 master-0 kubenswrapper[9136]: I1203 21:50:48.013560 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-config\") pod 
\"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.013909 master-0 kubenswrapper[9136]: I1203 21:50:48.013666 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cert\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.013909 master-0 kubenswrapper[9136]: I1203 21:50:48.013733 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-images\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.013909 master-0 kubenswrapper[9136]: I1203 21:50:48.013827 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgrbd\" (UniqueName: \"kubernetes.io/projected/858384f3-5741-4e67-8669-2eb2b2dcaf7f-kube-api-access-qgrbd\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:48.013909 master-0 kubenswrapper[9136]: I1203 21:50:48.013874 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.014061 master-0 kubenswrapper[9136]: I1203 21:50:48.013914 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/858384f3-5741-4e67-8669-2eb2b2dcaf7f-cert\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:48.014061 master-0 kubenswrapper[9136]: I1203 21:50:48.013960 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/858384f3-5741-4e67-8669-2eb2b2dcaf7f-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:48.014061 master-0 kubenswrapper[9136]: I1203 21:50:48.014013 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj79k\" (UniqueName: \"kubernetes.io/projected/fa9b5917-d4f3-4372-a200-45b57412f92f-kube-api-access-pj79k\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.015298 master-0 kubenswrapper[9136]: I1203 21:50:48.014361 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-config\") pod 
\"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.015298 master-0 kubenswrapper[9136]: I1203 21:50:48.015225 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-images\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.017859 master-0 kubenswrapper[9136]: I1203 21:50:48.017806 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/858384f3-5741-4e67-8669-2eb2b2dcaf7f-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:48.019837 master-0 kubenswrapper[9136]: I1203 21:50:48.019803 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.034332 master-0 kubenswrapper[9136]: I1203 21:50:48.034274 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/858384f3-5741-4e67-8669-2eb2b2dcaf7f-cert\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:48.034634 master-0 kubenswrapper[9136]: I1203 21:50:48.034601 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cert\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.268663 master-0 kubenswrapper[9136]: I1203 21:50:48.268610 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-59d99f9b7b-x4tfh"] Dec 03 21:50:48.269622 master-0 kubenswrapper[9136]: I1203 21:50:48.269597 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.276809 master-0 kubenswrapper[9136]: I1203 21:50:48.272344 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d"] Dec 03 21:50:48.276809 master-0 kubenswrapper[9136]: I1203 21:50:48.275568 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.284356 master-0 kubenswrapper[9136]: I1203 21:50:48.284020 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 21:50:48.284356 master-0 kubenswrapper[9136]: I1203 21:50:48.284166 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-2jjj9" Dec 03 21:50:48.284356 master-0 kubenswrapper[9136]: I1203 21:50:48.284238 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 03 21:50:48.284356 master-0 kubenswrapper[9136]: I1203 21:50:48.284244 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 03 21:50:48.290349 master-0 kubenswrapper[9136]: I1203 21:50:48.290290 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 03 21:50:48.290576 master-0 kubenswrapper[9136]: I1203 21:50:48.290474 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 21:50:48.290576 master-0 kubenswrapper[9136]: I1203 21:50:48.290539 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-6lwtj" Dec 03 21:50:48.290736 master-0 kubenswrapper[9136]: I1203 21:50:48.290716 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 21:50:48.292176 master-0 kubenswrapper[9136]: I1203 21:50:48.290972 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 03 21:50:48.311124 master-0 kubenswrapper[9136]: I1203 21:50:48.310944 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 03 21:50:48.315915 master-0 kubenswrapper[9136]: I1203 21:50:48.315870 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgrbd\" (UniqueName: \"kubernetes.io/projected/858384f3-5741-4e67-8669-2eb2b2dcaf7f-kube-api-access-qgrbd\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.317689 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq"] Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.318607 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.320231 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578b2d03-b8b3-4c75-adde-73899c472ad7-serving-cert\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.320292 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfd7g\" (UniqueName: \"kubernetes.io/projected/578b2d03-b8b3-4c75-adde-73899c472ad7-kube-api-access-gfd7g\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.320334 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5q4k\" (UniqueName: \"kubernetes.io/projected/e50b85a6-7767-4fca-8133-8243bdd85e5d-kube-api-access-z5q4k\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.320357 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjtq\" (UniqueName: \"kubernetes.io/projected/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-kube-api-access-hzjtq\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.320380 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.320399 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-srv-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.320454 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/578b2d03-b8b3-4c75-adde-73899c472ad7-snapshots\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.323019 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-service-ca-bundle\") pod 
\"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.323065 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e50b85a6-7767-4fca-8133-8243bdd85e5d-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.323104 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50b85a6-7767-4fca-8133-8243bdd85e5d-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.323226 master-0 kubenswrapper[9136]: I1203 21:50:48.323136 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.323974 master-0 kubenswrapper[9136]: I1203 21:50:48.323341 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-59d99f9b7b-x4tfh"] Dec 03 21:50:48.325733 master-0 kubenswrapper[9136]: I1203 21:50:48.325446 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-c2z8w" Dec 03 21:50:48.325733 master-0 kubenswrapper[9136]: I1203 21:50:48.325684 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 21:50:48.340107 master-0 kubenswrapper[9136]: I1203 21:50:48.327466 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 21:50:48.340107 master-0 kubenswrapper[9136]: I1203 21:50:48.337144 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d"] Dec 03 21:50:48.350828 master-0 kubenswrapper[9136]: I1203 21:50:48.341849 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj79k\" (UniqueName: \"kubernetes.io/projected/fa9b5917-d4f3-4372-a200-45b57412f92f-kube-api-access-pj79k\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.372237 master-0 kubenswrapper[9136]: I1203 21:50:48.372168 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd"] Dec 03 21:50:48.373950 master-0 kubenswrapper[9136]: I1203 21:50:48.373915 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.381271 master-0 kubenswrapper[9136]: I1203 21:50:48.378283 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 21:50:48.413595 master-0 kubenswrapper[9136]: I1203 21:50:48.409403 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq"] Dec 03 21:50:48.416883 master-0 kubenswrapper[9136]: I1203 21:50:48.416516 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8"] Dec 03 21:50:48.418445 master-0 kubenswrapper[9136]: I1203 21:50:48.418414 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd"] Dec 03 21:50:48.418655 master-0 kubenswrapper[9136]: I1203 21:50:48.418628 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 21:50:48.425948 master-0 kubenswrapper[9136]: I1203 21:50:48.425490 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8"] Dec 03 21:50:48.426861 master-0 kubenswrapper[9136]: I1203 21:50:48.426711 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-ml7r7" Dec 03 21:50:48.426861 master-0 kubenswrapper[9136]: I1203 21:50:48.426842 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 03 21:50:48.427760 master-0 kubenswrapper[9136]: I1203 21:50:48.427656 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/578b2d03-b8b3-4c75-adde-73899c472ad7-snapshots\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.427760 master-0 kubenswrapper[9136]: I1203 21:50:48.427702 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.427760 master-0 kubenswrapper[9136]: I1203 21:50:48.427738 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e50b85a6-7767-4fca-8133-8243bdd85e5d-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.427970 master-0 kubenswrapper[9136]: I1203 21:50:48.427784 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50b85a6-7767-4fca-8133-8243bdd85e5d-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " 
pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.427970 master-0 kubenswrapper[9136]: I1203 21:50:48.427813 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.427970 master-0 kubenswrapper[9136]: I1203 21:50:48.427855 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578b2d03-b8b3-4c75-adde-73899c472ad7-serving-cert\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.427970 master-0 kubenswrapper[9136]: I1203 21:50:48.427888 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfd7g\" (UniqueName: \"kubernetes.io/projected/578b2d03-b8b3-4c75-adde-73899c472ad7-kube-api-access-gfd7g\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.427970 master-0 kubenswrapper[9136]: I1203 21:50:48.427912 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5q4k\" (UniqueName: \"kubernetes.io/projected/e50b85a6-7767-4fca-8133-8243bdd85e5d-kube-api-access-z5q4k\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.427970 master-0 kubenswrapper[9136]: I1203 21:50:48.427969 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjtq\" (UniqueName: \"kubernetes.io/projected/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-kube-api-access-hzjtq\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.428245 master-0 kubenswrapper[9136]: I1203 21:50:48.428005 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.428245 master-0 kubenswrapper[9136]: I1203 21:50:48.428031 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-srv-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.428972 master-0 kubenswrapper[9136]: I1203 21:50:48.428641 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 
21:50:48.429542 master-0 kubenswrapper[9136]: I1203 21:50:48.429519 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/578b2d03-b8b3-4c75-adde-73899c472ad7-snapshots\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.434265 master-0 kubenswrapper[9136]: I1203 21:50:48.434201 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5"] Dec 03 21:50:48.434936 master-0 kubenswrapper[9136]: I1203 21:50:48.434877 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e50b85a6-7767-4fca-8133-8243bdd85e5d-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.436793 master-0 kubenswrapper[9136]: I1203 21:50:48.436750 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.446019 master-0 kubenswrapper[9136]: I1203 21:50:48.442399 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.446489 master-0 kubenswrapper[9136]: I1203 21:50:48.446456 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578b2d03-b8b3-4c75-adde-73899c472ad7-serving-cert\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.446652 master-0 kubenswrapper[9136]: I1203 21:50:48.446633 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50b85a6-7767-4fca-8133-8243bdd85e5d-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.446997 master-0 kubenswrapper[9136]: I1203 21:50:48.446966 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.447320 master-0 kubenswrapper[9136]: I1203 21:50:48.447303 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-srv-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.453104 master-0 kubenswrapper[9136]: I1203 21:50:48.453048 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5"] Dec 03 21:50:48.458281 master-0 kubenswrapper[9136]: I1203 21:50:48.458231 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-tphv8" Dec 03 21:50:48.458673 master-0 kubenswrapper[9136]: I1203 21:50:48.458603 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 21:50:48.465371 master-0 kubenswrapper[9136]: I1203 21:50:48.465318 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 21:50:48.465507 master-0 kubenswrapper[9136]: I1203 21:50:48.465436 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 21:50:48.465571 master-0 kubenswrapper[9136]: I1203 21:50:48.465318 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 21:50:48.470646 master-0 kubenswrapper[9136]: I1203 21:50:48.470242 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 21:50:48.471163 master-0 kubenswrapper[9136]: I1203 21:50:48.471034 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 21:50:48.474592 master-0 kubenswrapper[9136]: I1203 21:50:48.474547 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfd7g\" (UniqueName: \"kubernetes.io/projected/578b2d03-b8b3-4c75-adde-73899c472ad7-kube-api-access-gfd7g\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.479969 master-0 kubenswrapper[9136]: I1203 21:50:48.479919 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 21:50:48.480157 master-0 kubenswrapper[9136]: I1203 21:50:48.480058 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5q4k\" (UniqueName: \"kubernetes.io/projected/e50b85a6-7767-4fca-8133-8243bdd85e5d-kube-api-access-z5q4k\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.482954 master-0 kubenswrapper[9136]: I1203 21:50:48.481808 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:50:48.487448 master-0 kubenswrapper[9136]: I1203 21:50:48.485633 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjtq\" (UniqueName: \"kubernetes.io/projected/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-kube-api-access-hzjtq\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.532642 master-0 kubenswrapper[9136]: I1203 21:50:48.531227 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.532642 master-0 kubenswrapper[9136]: I1203 21:50:48.531306 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvzqm\" (UniqueName: \"kubernetes.io/projected/f9a3f900-60e4-49c2-85ec-88d19852d1b9-kube-api-access-jvzqm\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.532642 master-0 kubenswrapper[9136]: I1203 21:50:48.531335 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8rjd\" (UniqueName: \"kubernetes.io/projected/6e96335e-1866-41c8-b128-b95e783a9be4-kube-api-access-v8rjd\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 21:50:48.532642 master-0 kubenswrapper[9136]: I1203 21:50:48.531389 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e96335e-1866-41c8-b128-b95e783a9be4-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 21:50:48.532642 master-0 kubenswrapper[9136]: I1203 21:50:48.531423 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-srv-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.553569 master-0 kubenswrapper[9136]: I1203 21:50:48.553521 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:48.633676 master-0 kubenswrapper[9136]: I1203 21:50:48.632524 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvzqm\" (UniqueName: \"kubernetes.io/projected/f9a3f900-60e4-49c2-85ec-88d19852d1b9-kube-api-access-jvzqm\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.633676 master-0 kubenswrapper[9136]: I1203 21:50:48.632587 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8rjd\" (UniqueName: \"kubernetes.io/projected/6e96335e-1866-41c8-b128-b95e783a9be4-kube-api-access-v8rjd\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 21:50:48.633676 master-0 kubenswrapper[9136]: I1203 21:50:48.632611 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e96335e-1866-41c8-b128-b95e783a9be4-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 21:50:48.633676 master-0 kubenswrapper[9136]: I1203 21:50:48.632645 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.633676 master-0 kubenswrapper[9136]: I1203 21:50:48.632687 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-srv-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.633676 master-0 kubenswrapper[9136]: I1203 21:50:48.632703 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-proxy-tls\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.633676 master-0 kubenswrapper[9136]: I1203 21:50:48.632725 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thgv2\" (UniqueName: \"kubernetes.io/projected/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-kube-api-access-thgv2\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.633676 master-0 kubenswrapper[9136]: I1203 21:50:48.632754 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.633676 master-0 kubenswrapper[9136]: I1203 21:50:48.632813 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-images\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.639749 master-0 kubenswrapper[9136]: I1203 21:50:48.639653 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.642536 master-0 kubenswrapper[9136]: I1203 21:50:48.642486 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-srv-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.643439 master-0 kubenswrapper[9136]: I1203 21:50:48.643385 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e96335e-1866-41c8-b128-b95e783a9be4-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 21:50:48.661923 master-0 kubenswrapper[9136]: I1203 21:50:48.660892 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvzqm\" (UniqueName: \"kubernetes.io/projected/f9a3f900-60e4-49c2-85ec-88d19852d1b9-kube-api-access-jvzqm\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.670634 master-0 kubenswrapper[9136]: I1203 21:50:48.664968 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8rjd\" (UniqueName: \"kubernetes.io/projected/6e96335e-1866-41c8-b128-b95e783a9be4-kube-api-access-v8rjd\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 21:50:48.735480 master-0 kubenswrapper[9136]: I1203 21:50:48.727333 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 21:50:48.735480 master-0 kubenswrapper[9136]: I1203 21:50:48.733751 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.735480 master-0 kubenswrapper[9136]: I1203 21:50:48.733841 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-proxy-tls\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.735480 master-0 kubenswrapper[9136]: I1203 21:50:48.733907 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thgv2\" (UniqueName: \"kubernetes.io/projected/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-kube-api-access-thgv2\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.735480 master-0 kubenswrapper[9136]: I1203 21:50:48.733996 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-images\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.735480 master-0 kubenswrapper[9136]: I1203 21:50:48.734400 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.737885 master-0 kubenswrapper[9136]: I1203 21:50:48.737584 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-proxy-tls\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.757886 master-0 kubenswrapper[9136]: I1203 21:50:48.757132 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-images\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.778956 master-0 kubenswrapper[9136]: I1203 21:50:48.766887 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_e5aa1b71-5054-4695-a4e2-817ce5274ba9/installer/0.log" Dec 03 21:50:48.778956 master-0 kubenswrapper[9136]: I1203 21:50:48.766944 9136 generic.go:334] "Generic (PLEG): container finished" 
podID="e5aa1b71-5054-4695-a4e2-817ce5274ba9" containerID="7db16df4a826110868fed85df42888a8aa70e542af4ac113bfb0d52af03cbab5" exitCode=1 Dec 03 21:50:48.778956 master-0 kubenswrapper[9136]: I1203 21:50:48.766989 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"e5aa1b71-5054-4695-a4e2-817ce5274ba9","Type":"ContainerDied","Data":"7db16df4a826110868fed85df42888a8aa70e542af4ac113bfb0d52af03cbab5"} Dec 03 21:50:48.778956 master-0 kubenswrapper[9136]: I1203 21:50:48.767027 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"e5aa1b71-5054-4695-a4e2-817ce5274ba9","Type":"ContainerDied","Data":"9090a7aa6d01c7ba20e48285d09098429c32c2728e0577602eacfc9483748bc2"} Dec 03 21:50:48.778956 master-0 kubenswrapper[9136]: I1203 21:50:48.767039 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9090a7aa6d01c7ba20e48285d09098429c32c2728e0577602eacfc9483748bc2" Dec 03 21:50:48.778956 master-0 kubenswrapper[9136]: I1203 21:50:48.775540 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_e5aa1b71-5054-4695-a4e2-817ce5274ba9/installer/0.log" Dec 03 21:50:48.778956 master-0 kubenswrapper[9136]: I1203 21:50:48.775640 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:48.782470 master-0 kubenswrapper[9136]: I1203 21:50:48.782417 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thgv2\" (UniqueName: \"kubernetes.io/projected/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-kube-api-access-thgv2\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.855657 master-0 kubenswrapper[9136]: I1203 21:50:48.855245 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:48.870366 master-0 kubenswrapper[9136]: I1203 21:50:48.870316 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 21:50:48.905668 master-0 kubenswrapper[9136]: I1203 21:50:48.905476 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 21:50:48.939522 master-0 kubenswrapper[9136]: I1203 21:50:48.939283 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-var-lock\") pod \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " Dec 03 21:50:48.939522 master-0 kubenswrapper[9136]: I1203 21:50:48.939311 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-var-lock" (OuterVolumeSpecName: "var-lock") pod "e5aa1b71-5054-4695-a4e2-817ce5274ba9" (UID: "e5aa1b71-5054-4695-a4e2-817ce5274ba9"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:50:48.939684 master-0 kubenswrapper[9136]: I1203 21:50:48.939389 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kubelet-dir\") pod \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " Dec 03 21:50:48.939684 master-0 kubenswrapper[9136]: I1203 21:50:48.939444 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e5aa1b71-5054-4695-a4e2-817ce5274ba9" (UID: "e5aa1b71-5054-4695-a4e2-817ce5274ba9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:50:48.939684 master-0 kubenswrapper[9136]: I1203 21:50:48.939642 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kube-api-access\") pod \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\" (UID: \"e5aa1b71-5054-4695-a4e2-817ce5274ba9\") " Dec 03 21:50:48.939978 master-0 kubenswrapper[9136]: I1203 21:50:48.939941 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:48.940068 master-0 kubenswrapper[9136]: I1203 21:50:48.939990 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:48.943613 master-0 kubenswrapper[9136]: I1203 21:50:48.943574 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e5aa1b71-5054-4695-a4e2-817ce5274ba9" (UID: "e5aa1b71-5054-4695-a4e2-817ce5274ba9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:50:48.980554 master-0 kubenswrapper[9136]: I1203 21:50:48.980497 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d"] Dec 03 21:50:49.041740 master-0 kubenswrapper[9136]: I1203 21:50:49.041577 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5aa1b71-5054-4695-a4e2-817ce5274ba9-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 21:50:49.073799 master-0 kubenswrapper[9136]: I1203 21:50:49.073722 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq"] Dec 03 21:50:49.083780 master-0 kubenswrapper[9136]: I1203 21:50:49.083704 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6"] Dec 03 21:50:49.088674 master-0 kubenswrapper[9136]: I1203 21:50:49.088624 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx"] Dec 03 21:50:49.777413 master-0 kubenswrapper[9136]: I1203 21:50:49.777324 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Dec 03 21:50:51.467714 master-0 kubenswrapper[9136]: I1203 21:50:51.467649 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Dec 03 21:50:51.468290 master-0 kubenswrapper[9136]: E1203 21:50:51.467883 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5aa1b71-5054-4695-a4e2-817ce5274ba9" containerName="installer" Dec 03 21:50:51.468290 master-0 kubenswrapper[9136]: I1203 21:50:51.467897 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5aa1b71-5054-4695-a4e2-817ce5274ba9" containerName="installer" Dec 03 21:50:51.468290 master-0 kubenswrapper[9136]: I1203 21:50:51.467983 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5aa1b71-5054-4695-a4e2-817ce5274ba9" containerName="installer" Dec 03 21:50:51.468384 master-0 kubenswrapper[9136]: I1203 21:50:51.468367 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.470328 master-0 kubenswrapper[9136]: I1203 21:50:51.470280 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-m76lw" Dec 03 21:50:51.575700 master-0 kubenswrapper[9136]: I1203 21:50:51.575647 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-var-lock\") pod \"installer-4-master-0\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.575700 master-0 kubenswrapper[9136]: I1203 21:50:51.575702 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.575964 master-0 kubenswrapper[9136]: I1203 21:50:51.575744 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0269ada6-cb6e-4c98-bd24-752ae0286498-kube-api-access\") pod \"installer-4-master-0\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.693924 master-0 kubenswrapper[9136]: I1203 21:50:51.683638 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0269ada6-cb6e-4c98-bd24-752ae0286498-kube-api-access\") pod \"installer-4-master-0\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.693924 master-0 kubenswrapper[9136]: I1203 21:50:51.683752 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-var-lock\") pod \"installer-4-master-0\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.693924 master-0 kubenswrapper[9136]: I1203 21:50:51.683800 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.693924 master-0 kubenswrapper[9136]: I1203 21:50:51.683895 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.693924 master-0 kubenswrapper[9136]: I1203 21:50:51.684325 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-var-lock\") pod \"installer-4-master-0\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.693924 master-0 kubenswrapper[9136]: I1203 21:50:51.691380 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-59d99f9b7b-x4tfh"] Dec 03 21:50:51.699381 master-0 kubenswrapper[9136]: I1203 21:50:51.695979 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Dec 03 21:50:51.719814 master-0 kubenswrapper[9136]: W1203 21:50:51.719755 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac3d3235_531e_4c7d_9fc9_e65c97970d0f.slice/crio-dd3ce2cf375c8b287635e37da192dc2c907d3dab5e23c6437d445a1478d56fac WatchSource:0}: Error finding container dd3ce2cf375c8b287635e37da192dc2c907d3dab5e23c6437d445a1478d56fac: Status 404 returned error can't find the container with id dd3ce2cf375c8b287635e37da192dc2c907d3dab5e23c6437d445a1478d56fac Dec 03 21:50:51.720728 master-0 kubenswrapper[9136]: W1203 21:50:51.720694 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode50b85a6_7767_4fca_8133_8243bdd85e5d.slice/crio-5b457e4b5344867e1c07054947dfb70a261a415f7e6f2a317dd28940b878f21e WatchSource:0}: Error finding container 5b457e4b5344867e1c07054947dfb70a261a415f7e6f2a317dd28940b878f21e: Status 404 returned error can't find the container with id 5b457e4b5344867e1c07054947dfb70a261a415f7e6f2a317dd28940b878f21e Dec 03 21:50:51.727300 master-0 kubenswrapper[9136]: W1203 21:50:51.727269 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod578b2d03_b8b3_4c75_adde_73899c472ad7.slice/crio-b8716994da0d4c9873141d2aa21edd03e4ae031b26b1f8173e339c8acb9d84d8 WatchSource:0}: Error finding container b8716994da0d4c9873141d2aa21edd03e4ae031b26b1f8173e339c8acb9d84d8: Status 404 returned error can't find the container with id b8716994da0d4c9873141d2aa21edd03e4ae031b26b1f8173e339c8acb9d84d8 Dec 03 21:50:51.787594 master-0 kubenswrapper[9136]: I1203 21:50:51.787533 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" event={"ID":"ac3d3235-531e-4c7d-9fc9-e65c97970d0f","Type":"ContainerStarted","Data":"dd3ce2cf375c8b287635e37da192dc2c907d3dab5e23c6437d445a1478d56fac"} Dec 03 21:50:51.788778 master-0 kubenswrapper[9136]: I1203 21:50:51.788720 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" event={"ID":"858384f3-5741-4e67-8669-2eb2b2dcaf7f","Type":"ContainerStarted","Data":"7f5e855d5ea32f73759be5b29eddccce01f7fec70b6614f70377310dbf597215"} Dec 03 21:50:51.790533 master-0 kubenswrapper[9136]: I1203 21:50:51.790501 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerStarted","Data":"5b457e4b5344867e1c07054947dfb70a261a415f7e6f2a317dd28940b878f21e"} Dec 03 21:50:51.791416 master-0 kubenswrapper[9136]: I1203 21:50:51.791395 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" event={"ID":"578b2d03-b8b3-4c75-adde-73899c472ad7","Type":"ContainerStarted","Data":"b8716994da0d4c9873141d2aa21edd03e4ae031b26b1f8173e339c8acb9d84d8"} Dec 03 21:50:51.792321 master-0 kubenswrapper[9136]: I1203 21:50:51.792168 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerStarted","Data":"5703a6e2ab5e982bbe6de5731742c9b48a97f1ee525f754cfdb8e3ab5ff893fc"} Dec 03 21:50:51.853137 master-0 kubenswrapper[9136]: I1203 21:50:51.853100 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 03 21:50:51.878525 master-0 kubenswrapper[9136]: I1203 21:50:51.869078 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0269ada6-cb6e-4c98-bd24-752ae0286498-kube-api-access\") pod \"installer-4-master-0\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:51.941388 master-0 kubenswrapper[9136]: I1203 21:50:51.941305 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Dec 03 21:50:51.952623 master-0 kubenswrapper[9136]: I1203 21:50:51.952578 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf"] Dec 03 21:50:51.953443 master-0 kubenswrapper[9136]: I1203 21:50:51.953410 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:51.956450 master-0 kubenswrapper[9136]: I1203 21:50:51.956006 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 21:50:51.956450 master-0 kubenswrapper[9136]: I1203 21:50:51.956177 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 03 21:50:51.956844 master-0 kubenswrapper[9136]: I1203 21:50:51.956821 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 03 21:50:51.956970 master-0 kubenswrapper[9136]: I1203 21:50:51.956949 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 03 21:50:51.957087 master-0 kubenswrapper[9136]: I1203 21:50:51.957068 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 03 21:50:51.958040 master-0 kubenswrapper[9136]: I1203 21:50:51.958012 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-w6qk9" Dec 03 21:50:52.090399 master-0 kubenswrapper[9136]: I1203 21:50:52.090323 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.090399 master-0 kubenswrapper[9136]: I1203 21:50:52.090383 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m4mf\" (UniqueName: \"kubernetes.io/projected/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-kube-api-access-9m4mf\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.090830 master-0 kubenswrapper[9136]: I1203 21:50:52.090621 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.090830 master-0 kubenswrapper[9136]: I1203 21:50:52.090760 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-images\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.090830 master-0 kubenswrapper[9136]: I1203 21:50:52.090813 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.103154 master-0 kubenswrapper[9136]: I1203 21:50:52.103079 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:50:52.192492 master-0 kubenswrapper[9136]: I1203 21:50:52.192427 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.192597 master-0 kubenswrapper[9136]: I1203 21:50:52.192514 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m4mf\" (UniqueName: \"kubernetes.io/projected/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-kube-api-access-9m4mf\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.192597 master-0 kubenswrapper[9136]: I1203 21:50:52.192597 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.192763 master-0 kubenswrapper[9136]: I1203 21:50:52.192704 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-images\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.192763 master-0 kubenswrapper[9136]: I1203 21:50:52.192746 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.192886 master-0 kubenswrapper[9136]: I1203 21:50:52.192837 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.193923 master-0 kubenswrapper[9136]: I1203 21:50:52.193869 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.194517 master-0 kubenswrapper[9136]: I1203 21:50:52.194484 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-images\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.196723 master-0 kubenswrapper[9136]: I1203 21:50:52.196695 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.371443 master-0 kubenswrapper[9136]: I1203 21:50:52.371403 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd"] Dec 03 21:50:52.382966 master-0 kubenswrapper[9136]: W1203 21:50:52.382914 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a3f900_60e4_49c2_85ec_88d19852d1b9.slice/crio-4ffb33ed0c17e44b8ef16a80bc6cabdd86e0fa10c888372300239522e481a90e WatchSource:0}: Error finding container 4ffb33ed0c17e44b8ef16a80bc6cabdd86e0fa10c888372300239522e481a90e: Status 404 returned error can't find the container with id 4ffb33ed0c17e44b8ef16a80bc6cabdd86e0fa10c888372300239522e481a90e Dec 03 21:50:52.383760 master-0 kubenswrapper[9136]: I1203 21:50:52.383716 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m4mf\" (UniqueName: \"kubernetes.io/projected/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-kube-api-access-9m4mf\") pod \"cluster-cloud-controller-manager-operator-76f56467d7-c4ngf\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.409549 master-0 kubenswrapper[9136]: I1203 21:50:52.409480 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5"] Dec 03 21:50:52.417860 master-0 kubenswrapper[9136]: I1203 21:50:52.417808 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8"] Dec 03 21:50:52.571668 master-0 kubenswrapper[9136]: I1203 
21:50:52.570998 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:50:52.634727 master-0 kubenswrapper[9136]: I1203 21:50:52.634221 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Dec 03 21:50:52.660439 master-0 kubenswrapper[9136]: W1203 21:50:52.660368 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0269ada6_cb6e_4c98_bd24_752ae0286498.slice/crio-5aa221876e2ecf7e39b3a5aa7aae7fe9d5609e05e0b403c5225db362cc84b785 WatchSource:0}: Error finding container 5aa221876e2ecf7e39b3a5aa7aae7fe9d5609e05e0b403c5225db362cc84b785: Status 404 returned error can't find the container with id 5aa221876e2ecf7e39b3a5aa7aae7fe9d5609e05e0b403c5225db362cc84b785 Dec 03 21:50:52.668028 master-0 kubenswrapper[9136]: W1203 21:50:52.667979 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef057c6e_7d96_4db8_ab3c_8e81d6f29df7.slice/crio-f657f4b1134dbf93e45130449fa579d92c7ed47b59015d49fe477b69dc4d5559 WatchSource:0}: Error finding container f657f4b1134dbf93e45130449fa579d92c7ed47b59015d49fe477b69dc4d5559: Status 404 returned error can't find the container with id f657f4b1134dbf93e45130449fa579d92c7ed47b59015d49fe477b69dc4d5559 Dec 03 21:50:52.823854 master-0 kubenswrapper[9136]: I1203 21:50:52.822262 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" event={"ID":"858384f3-5741-4e67-8669-2eb2b2dcaf7f","Type":"ContainerStarted","Data":"eee4b67cde658ce7ee81b1e3cd07ced5b5f2d8119486be4f33ca9c1e9629f4be"} Dec 03 21:50:52.832563 master-0 kubenswrapper[9136]: I1203 21:50:52.830907 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" event={"ID":"6e96335e-1866-41c8-b128-b95e783a9be4","Type":"ContainerStarted","Data":"ee51a7c7f756f465a21b62d37220086693d4cf6214b3eb70fad749bc4447332b"} Dec 03 21:50:52.860935 master-0 kubenswrapper[9136]: I1203 21:50:52.858211 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-7486ff55f-w9xk2"] Dec 03 21:50:52.860935 master-0 kubenswrapper[9136]: I1203 21:50:52.858961 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:52.866016 master-0 kubenswrapper[9136]: I1203 21:50:52.863272 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-5skvl" Dec 03 21:50:52.866016 master-0 kubenswrapper[9136]: I1203 21:50:52.863493 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 21:50:52.866016 master-0 kubenswrapper[9136]: I1203 21:50:52.863632 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 21:50:52.866016 master-0 kubenswrapper[9136]: I1203 21:50:52.863758 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 21:50:52.883801 master-0 kubenswrapper[9136]: I1203 21:50:52.882970 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-7486ff55f-w9xk2"] Dec 03 21:50:52.909286 master-0 kubenswrapper[9136]: I1203 21:50:52.908054 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" event={"ID":"a9940ff5-36a6-4c04-a51d-66f7d83bea7c","Type":"ContainerStarted","Data":"7ce3a1f2c204d02e0dfbdb6c0e194411b3c9a50718e6f590216f87c2ae16d270"} Dec 03 21:50:52.909286 master-0 kubenswrapper[9136]: I1203 21:50:52.908121 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" event={"ID":"a9940ff5-36a6-4c04-a51d-66f7d83bea7c","Type":"ContainerStarted","Data":"4b5dd9433f686e30d70877cae31ac505a20b05bd1de2322e270f022f7fa31aa9"} Dec 03 21:50:52.970974 master-0 kubenswrapper[9136]: I1203 21:50:52.970499 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" event={"ID":"ac3d3235-531e-4c7d-9fc9-e65c97970d0f","Type":"ContainerStarted","Data":"589a98fdc63e90bfabaadebac762af17a10facb966a0f828148dab8128c92d39"} Dec 03 21:50:52.971296 master-0 kubenswrapper[9136]: I1203 21:50:52.971248 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:52.982935 master-0 kubenswrapper[9136]: I1203 21:50:52.982872 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" event={"ID":"62b43fe1-63f5-4d29-90a2-f36cb9e880ff","Type":"ContainerStarted","Data":"15267d4a428d74345987ccca9857cf67222c533495cb71df0a79781432c9ad0d"} Dec 03 21:50:52.982935 master-0 kubenswrapper[9136]: I1203 21:50:52.982932 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" event={"ID":"62b43fe1-63f5-4d29-90a2-f36cb9e880ff","Type":"ContainerStarted","Data":"702b23c81fdfd66bf6056f4c870e87c41e5b76893faf7d1072a7990c28cf4864"} Dec 03 21:50:52.987335 master-0 kubenswrapper[9136]: I1203 21:50:52.986037 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 21:50:52.995832 master-0 kubenswrapper[9136]: I1203 21:50:52.994088 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" 
event={"ID":"0269ada6-cb6e-4c98-bd24-752ae0286498","Type":"ContainerStarted","Data":"5aa221876e2ecf7e39b3a5aa7aae7fe9d5609e05e0b403c5225db362cc84b785"} Dec 03 21:50:52.998097 master-0 kubenswrapper[9136]: I1203 21:50:52.998043 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" event={"ID":"f9a3f900-60e4-49c2-85ec-88d19852d1b9","Type":"ContainerStarted","Data":"774f73669e8715d22803904ec9ed4c6e3c746e2aee521b5906614387e121f355"} Dec 03 21:50:52.998097 master-0 kubenswrapper[9136]: I1203 21:50:52.998103 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" event={"ID":"f9a3f900-60e4-49c2-85ec-88d19852d1b9","Type":"ContainerStarted","Data":"4ffb33ed0c17e44b8ef16a80bc6cabdd86e0fa10c888372300239522e481a90e"} Dec 03 21:50:53.004185 master-0 kubenswrapper[9136]: I1203 21:50:53.000950 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:53.018655 master-0 kubenswrapper[9136]: I1203 21:50:53.011215 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-images\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.018655 master-0 kubenswrapper[9136]: I1203 21:50:53.013228 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b56c318-09b7-47f0-a7bf-32eb96e836ca-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.018655 master-0 kubenswrapper[9136]: I1203 21:50:53.013586 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtx6m\" (UniqueName: \"kubernetes.io/projected/8b56c318-09b7-47f0-a7bf-32eb96e836ca-kube-api-access-qtx6m\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.018655 master-0 kubenswrapper[9136]: I1203 21:50:53.013884 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-config\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.018655 master-0 kubenswrapper[9136]: I1203 21:50:53.015348 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 21:50:53.028866 master-0 kubenswrapper[9136]: I1203 21:50:53.026110 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" event={"ID":"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7","Type":"ContainerStarted","Data":"f657f4b1134dbf93e45130449fa579d92c7ed47b59015d49fe477b69dc4d5559"} Dec 03 21:50:53.044154 master-0 kubenswrapper[9136]: 
I1203 21:50:53.044018 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" podStartSLOduration=5.043994853 podStartE2EDuration="5.043994853s" podCreationTimestamp="2025-12-03 21:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:53.006487947 +0000 UTC m=+59.281664329" watchObservedRunningTime="2025-12-03 21:50:53.043994853 +0000 UTC m=+59.319171225" Dec 03 21:50:53.102472 master-0 kubenswrapper[9136]: I1203 21:50:53.099523 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" podStartSLOduration=5.09949467 podStartE2EDuration="5.09949467s" podCreationTimestamp="2025-12-03 21:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:53.046421291 +0000 UTC m=+59.321597693" watchObservedRunningTime="2025-12-03 21:50:53.09949467 +0000 UTC m=+59.374671052" Dec 03 21:50:53.121809 master-0 kubenswrapper[9136]: I1203 21:50:53.119355 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-images\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.121809 master-0 kubenswrapper[9136]: I1203 21:50:53.119410 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b56c318-09b7-47f0-a7bf-32eb96e836ca-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.121809 master-0 kubenswrapper[9136]: I1203 21:50:53.119467 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtx6m\" (UniqueName: \"kubernetes.io/projected/8b56c318-09b7-47f0-a7bf-32eb96e836ca-kube-api-access-qtx6m\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.121809 master-0 kubenswrapper[9136]: I1203 21:50:53.119540 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-config\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.121809 master-0 kubenswrapper[9136]: I1203 21:50:53.120438 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-config\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.121809 master-0 kubenswrapper[9136]: I1203 21:50:53.121154 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-images\") pod 
\"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.132797 master-0 kubenswrapper[9136]: I1203 21:50:53.126635 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" podStartSLOduration=2.249351757 podStartE2EDuration="8.126620128s" podCreationTimestamp="2025-12-03 21:50:45 +0000 UTC" firstStartedPulling="2025-12-03 21:50:46.31491378 +0000 UTC m=+52.590090162" lastFinishedPulling="2025-12-03 21:50:52.192182151 +0000 UTC m=+58.467358533" observedRunningTime="2025-12-03 21:50:53.12418623 +0000 UTC m=+59.399362612" watchObservedRunningTime="2025-12-03 21:50:53.126620128 +0000 UTC m=+59.401796510" Dec 03 21:50:53.132797 master-0 kubenswrapper[9136]: I1203 21:50:53.129406 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b56c318-09b7-47f0-a7bf-32eb96e836ca-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.150745 master-0 kubenswrapper[9136]: I1203 21:50:53.150635 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtx6m\" (UniqueName: \"kubernetes.io/projected/8b56c318-09b7-47f0-a7bf-32eb96e836ca-kube-api-access-qtx6m\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.231541 master-0 kubenswrapper[9136]: I1203 21:50:53.231466 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 21:50:53.598450 master-0 kubenswrapper[9136]: W1203 21:50:53.598403 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b56c318_09b7_47f0_a7bf_32eb96e836ca.slice/crio-42c8513568caf77d747657eb0bb6e5e7f16c48cd244adc21c473c785507cff41 WatchSource:0}: Error finding container 42c8513568caf77d747657eb0bb6e5e7f16c48cd244adc21c473c785507cff41: Status 404 returned error can't find the container with id 42c8513568caf77d747657eb0bb6e5e7f16c48cd244adc21c473c785507cff41 Dec 03 21:50:53.599485 master-0 kubenswrapper[9136]: I1203 21:50:53.599438 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-7486ff55f-w9xk2"] Dec 03 21:50:53.924457 master-0 kubenswrapper[9136]: I1203 21:50:53.924396 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5aa1b71-5054-4695-a4e2-817ce5274ba9" path="/var/lib/kubelet/pods/e5aa1b71-5054-4695-a4e2-817ce5274ba9/volumes" Dec 03 21:50:54.035526 master-0 kubenswrapper[9136]: I1203 21:50:54.035479 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"0269ada6-cb6e-4c98-bd24-752ae0286498","Type":"ContainerStarted","Data":"828e1c847e23b1857152d987d6c054b5de0c2ac4dd44ef540ab230ae90795ebb"} Dec 03 21:50:54.038443 master-0 kubenswrapper[9136]: I1203 21:50:54.038402 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" event={"ID":"a9940ff5-36a6-4c04-a51d-66f7d83bea7c","Type":"ContainerStarted","Data":"ffffe47517b3c82fae2ca875e8e2dbb1ff9388612a3ebe0e9a78a16affbebb5a"} Dec 03 21:50:54.040841 master-0 kubenswrapper[9136]: I1203 21:50:54.040811 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" event={"ID":"8b56c318-09b7-47f0-a7bf-32eb96e836ca","Type":"ContainerStarted","Data":"569719ad3d0850ae8ae5f921fc6ac8bc3e8b816f5531231cd6ff32bff236442a"} Dec 03 21:50:54.040841 master-0 kubenswrapper[9136]: I1203 21:50:54.040842 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" event={"ID":"8b56c318-09b7-47f0-a7bf-32eb96e836ca","Type":"ContainerStarted","Data":"42c8513568caf77d747657eb0bb6e5e7f16c48cd244adc21c473c785507cff41"} Dec 03 21:50:54.745262 master-0 kubenswrapper[9136]: I1203 21:50:54.745194 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kp794"] Dec 03 21:50:54.746471 master-0 kubenswrapper[9136]: I1203 21:50:54.746435 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:54.747521 master-0 kubenswrapper[9136]: I1203 21:50:54.747477 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k98b2"] Dec 03 21:50:54.748217 master-0 kubenswrapper[9136]: I1203 21:50:54.748190 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-gk9qh" Dec 03 21:50:54.748991 master-0 kubenswrapper[9136]: I1203 21:50:54.748954 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:54.750638 master-0 kubenswrapper[9136]: I1203 21:50:54.750613 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-5qqsp" Dec 03 21:50:54.844584 master-0 kubenswrapper[9136]: I1203 21:50:54.844494 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-utilities\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:54.844954 master-0 kubenswrapper[9136]: I1203 21:50:54.844615 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-catalog-content\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:54.844954 master-0 kubenswrapper[9136]: I1203 21:50:54.844648 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7dfd\" (UniqueName: \"kubernetes.io/projected/0a49c320-f31d-4f6d-98c3-48d24346b873-kube-api-access-s7dfd\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:54.844954 master-0 kubenswrapper[9136]: I1203 21:50:54.844682 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkg95\" (UniqueName: \"kubernetes.io/projected/e403ab42-1840-4292-a37c-a8d4feeb54ca-kube-api-access-tkg95\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:54.844954 master-0 kubenswrapper[9136]: I1203 21:50:54.844839 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-catalog-content\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:54.844954 master-0 kubenswrapper[9136]: I1203 21:50:54.844942 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-utilities\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:54.945950 master-0 kubenswrapper[9136]: I1203 21:50:54.945875 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-utilities\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:54.945950 master-0 kubenswrapper[9136]: I1203 21:50:54.945961 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-catalog-content\") pod \"certified-operators-kp794\" 
(UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:54.945950 master-0 kubenswrapper[9136]: I1203 21:50:54.945989 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dfd\" (UniqueName: \"kubernetes.io/projected/0a49c320-f31d-4f6d-98c3-48d24346b873-kube-api-access-s7dfd\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:54.946480 master-0 kubenswrapper[9136]: I1203 21:50:54.946013 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkg95\" (UniqueName: \"kubernetes.io/projected/e403ab42-1840-4292-a37c-a8d4feeb54ca-kube-api-access-tkg95\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:54.946480 master-0 kubenswrapper[9136]: I1203 21:50:54.946048 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-catalog-content\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:54.946480 master-0 kubenswrapper[9136]: I1203 21:50:54.946068 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-utilities\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:54.946603 master-0 kubenswrapper[9136]: I1203 21:50:54.946546 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-utilities\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:54.947195 master-0 kubenswrapper[9136]: I1203 21:50:54.947015 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-utilities\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:54.947195 master-0 kubenswrapper[9136]: I1203 21:50:54.947077 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-catalog-content\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:54.948322 master-0 kubenswrapper[9136]: I1203 21:50:54.947240 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-catalog-content\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:55.026699 master-0 kubenswrapper[9136]: I1203 21:50:55.026549 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kp794"] Dec 03 21:50:55.067892 master-0 
kubenswrapper[9136]: I1203 21:50:55.067855 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkg95\" (UniqueName: \"kubernetes.io/projected/e403ab42-1840-4292-a37c-a8d4feeb54ca-kube-api-access-tkg95\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:55.071810 master-0 kubenswrapper[9136]: I1203 21:50:55.071742 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dfd\" (UniqueName: \"kubernetes.io/projected/0a49c320-f31d-4f6d-98c3-48d24346b873-kube-api-access-s7dfd\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:55.075708 master-0 kubenswrapper[9136]: I1203 21:50:55.075656 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:50:55.098292 master-0 kubenswrapper[9136]: I1203 21:50:55.098229 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:50:55.140657 master-0 kubenswrapper[9136]: I1203 21:50:55.140609 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k98b2"] Dec 03 21:50:55.277445 master-0 kubenswrapper[9136]: I1203 21:50:55.277118 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc"] Dec 03 21:50:55.278575 master-0 kubenswrapper[9136]: I1203 21:50:55.278542 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.282280 master-0 kubenswrapper[9136]: I1203 21:50:55.282224 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 21:50:55.292438 master-0 kubenswrapper[9136]: I1203 21:50:55.291397 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc"] Dec 03 21:50:55.356742 master-0 kubenswrapper[9136]: I1203 21:50:55.356684 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-webhook-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.356742 master-0 kubenswrapper[9136]: I1203 21:50:55.356745 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s87hj\" (UniqueName: \"kubernetes.io/projected/77e36f4e-845b-4b82-8abc-b634636c087a-kube-api-access-s87hj\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.357051 master-0 kubenswrapper[9136]: I1203 21:50:55.356812 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/77e36f4e-845b-4b82-8abc-b634636c087a-tmpfs\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " 
pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.357051 master-0 kubenswrapper[9136]: I1203 21:50:55.356854 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-apiservice-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.458474 master-0 kubenswrapper[9136]: I1203 21:50:55.457826 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/77e36f4e-845b-4b82-8abc-b634636c087a-tmpfs\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.458474 master-0 kubenswrapper[9136]: I1203 21:50:55.457899 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-apiservice-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.458474 master-0 kubenswrapper[9136]: I1203 21:50:55.457955 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-webhook-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.458474 master-0 kubenswrapper[9136]: I1203 21:50:55.457978 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s87hj\" (UniqueName: \"kubernetes.io/projected/77e36f4e-845b-4b82-8abc-b634636c087a-kube-api-access-s87hj\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.458845 master-0 kubenswrapper[9136]: I1203 21:50:55.458797 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/77e36f4e-845b-4b82-8abc-b634636c087a-tmpfs\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.461750 master-0 kubenswrapper[9136]: I1203 21:50:55.461703 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-apiservice-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.464574 master-0 kubenswrapper[9136]: I1203 21:50:55.464507 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-webhook-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.516631 master-0 kubenswrapper[9136]: I1203 
21:50:55.516570 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 03 21:50:55.516956 master-0 kubenswrapper[9136]: I1203 21:50:55.516814 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="4f66f491-f82d-4e8c-8929-4675f99aa5b7" containerName="installer" containerID="cri-o://f69b2fadd7aa21980afa8fbd2988f6315c3c2d241badb575affba3e4afbe34e8" gracePeriod=30 Dec 03 21:50:55.573302 master-0 kubenswrapper[9136]: I1203 21:50:55.573170 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s87hj\" (UniqueName: \"kubernetes.io/projected/77e36f4e-845b-4b82-8abc-b634636c087a-kube-api-access-s87hj\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.613676 master-0 kubenswrapper[9136]: I1203 21:50:55.613595 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nwjnw"] Dec 03 21:50:55.614897 master-0 kubenswrapper[9136]: I1203 21:50:55.614869 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.624425 master-0 kubenswrapper[9136]: I1203 21:50:55.621180 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:50:55.624425 master-0 kubenswrapper[9136]: I1203 21:50:55.621554 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-88jcx" Dec 03 21:50:55.624425 master-0 kubenswrapper[9136]: I1203 21:50:55.624354 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwjnw"] Dec 03 21:50:55.649611 master-0 kubenswrapper[9136]: I1203 21:50:55.649488 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" podStartSLOduration=7.649454326 podStartE2EDuration="7.649454326s" podCreationTimestamp="2025-12-03 21:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:55.644086412 +0000 UTC m=+61.919262814" watchObservedRunningTime="2025-12-03 21:50:55.649454326 +0000 UTC m=+61.924630708" Dec 03 21:50:55.671714 master-0 kubenswrapper[9136]: I1203 21:50:55.671650 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-utilities\") pod \"redhat-marketplace-nwjnw\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.671714 master-0 kubenswrapper[9136]: I1203 21:50:55.671715 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-catalog-content\") pod \"redhat-marketplace-nwjnw\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.671943 master-0 kubenswrapper[9136]: I1203 21:50:55.671743 9136 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm7kd\" (UniqueName: \"kubernetes.io/projected/da0d36d7-fb62-4254-95b0-fb81dcf372cd-kube-api-access-rm7kd\") pod \"redhat-marketplace-nwjnw\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.773026 master-0 kubenswrapper[9136]: I1203 21:50:55.772953 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-utilities\") pod \"redhat-marketplace-nwjnw\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.773026 master-0 kubenswrapper[9136]: I1203 21:50:55.773028 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-catalog-content\") pod \"redhat-marketplace-nwjnw\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.773672 master-0 kubenswrapper[9136]: I1203 21:50:55.773052 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm7kd\" (UniqueName: \"kubernetes.io/projected/da0d36d7-fb62-4254-95b0-fb81dcf372cd-kube-api-access-rm7kd\") pod \"redhat-marketplace-nwjnw\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.773672 master-0 kubenswrapper[9136]: I1203 21:50:55.773479 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-catalog-content\") pod \"redhat-marketplace-nwjnw\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.773672 master-0 kubenswrapper[9136]: I1203 21:50:55.773478 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-utilities\") pod \"redhat-marketplace-nwjnw\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.883830 master-0 kubenswrapper[9136]: I1203 21:50:55.881191 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=6.881173471 podStartE2EDuration="6.881173471s" podCreationTimestamp="2025-12-03 21:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:50:55.878094411 +0000 UTC m=+62.153270823" watchObservedRunningTime="2025-12-03 21:50:55.881173471 +0000 UTC m=+62.156349853" Dec 03 21:50:55.941904 master-0 kubenswrapper[9136]: I1203 21:50:55.941846 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm7kd\" (UniqueName: \"kubernetes.io/projected/da0d36d7-fb62-4254-95b0-fb81dcf372cd-kube-api-access-rm7kd\") pod \"redhat-marketplace-nwjnw\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:55.949433 master-0 kubenswrapper[9136]: I1203 21:50:55.949386 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-j9wwr"] Dec 03 21:50:55.951478 
master-0 kubenswrapper[9136]: I1203 21:50:55.951443 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:55.953045 master-0 kubenswrapper[9136]: I1203 21:50:55.953019 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-7n9xc" Dec 03 21:50:55.954734 master-0 kubenswrapper[9136]: I1203 21:50:55.954693 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 21:50:56.079258 master-0 kubenswrapper[9136]: I1203 21:50:56.078674 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bd18a700-53b2-430c-a34f-dbb6331cfbe5-rootfs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.079258 master-0 kubenswrapper[9136]: I1203 21:50:56.078837 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd18a700-53b2-430c-a34f-dbb6331cfbe5-proxy-tls\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.079258 master-0 kubenswrapper[9136]: I1203 21:50:56.078901 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd18a700-53b2-430c-a34f-dbb6331cfbe5-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.079258 master-0 kubenswrapper[9136]: I1203 21:50:56.078987 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll9bs\" (UniqueName: \"kubernetes.io/projected/bd18a700-53b2-430c-a34f-dbb6331cfbe5-kube-api-access-ll9bs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.193967 master-0 kubenswrapper[9136]: I1203 21:50:56.180610 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd18a700-53b2-430c-a34f-dbb6331cfbe5-proxy-tls\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.193967 master-0 kubenswrapper[9136]: I1203 21:50:56.180698 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd18a700-53b2-430c-a34f-dbb6331cfbe5-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.193967 master-0 kubenswrapper[9136]: I1203 21:50:56.180758 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll9bs\" (UniqueName: \"kubernetes.io/projected/bd18a700-53b2-430c-a34f-dbb6331cfbe5-kube-api-access-ll9bs\") pod \"machine-config-daemon-j9wwr\" (UID: 
\"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.193967 master-0 kubenswrapper[9136]: I1203 21:50:56.180853 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bd18a700-53b2-430c-a34f-dbb6331cfbe5-rootfs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.193967 master-0 kubenswrapper[9136]: I1203 21:50:56.181440 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bd18a700-53b2-430c-a34f-dbb6331cfbe5-rootfs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.193967 master-0 kubenswrapper[9136]: I1203 21:50:56.185612 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd18a700-53b2-430c-a34f-dbb6331cfbe5-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.198809 master-0 kubenswrapper[9136]: I1203 21:50:56.198659 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd18a700-53b2-430c-a34f-dbb6331cfbe5-proxy-tls\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.198809 master-0 kubenswrapper[9136]: I1203 21:50:56.198742 9136 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 03 21:50:56.199148 master-0 kubenswrapper[9136]: I1203 21:50:56.198991 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcdctl" containerID="cri-o://d089dbd1d00b21007f2d7f87058c4506fbc7b6ac8e4051768c22497e2ce3a3f4" gracePeriod=30 Dec 03 21:50:56.199198 master-0 kubenswrapper[9136]: I1203 21:50:56.199174 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcd" containerID="cri-o://6c365c0b34d496df5874092d2be756d0f4503f99c75f8af416569e515090fd7c" gracePeriod=30 Dec 03 21:50:56.219819 master-0 kubenswrapper[9136]: I1203 21:50:56.200193 9136 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Dec 03 21:50:56.219819 master-0 kubenswrapper[9136]: E1203 21:50:56.200449 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcdctl" Dec 03 21:50:56.219819 master-0 kubenswrapper[9136]: I1203 21:50:56.200465 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcdctl" Dec 03 21:50:56.219819 master-0 kubenswrapper[9136]: E1203 21:50:56.200478 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcd" Dec 03 21:50:56.219819 master-0 kubenswrapper[9136]: I1203 21:50:56.200485 9136 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="41b95a38663dd6fe34e183818a475977" containerName="etcd" Dec 03 21:50:56.219819 master-0 kubenswrapper[9136]: I1203 21:50:56.200612 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcd" Dec 03 21:50:56.219819 master-0 kubenswrapper[9136]: I1203 21:50:56.200622 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b95a38663dd6fe34e183818a475977" containerName="etcdctl" Dec 03 21:50:56.219819 master-0 kubenswrapper[9136]: I1203 21:50:56.202213 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.219819 master-0 kubenswrapper[9136]: I1203 21:50:56.210270 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll9bs\" (UniqueName: \"kubernetes.io/projected/bd18a700-53b2-430c-a34f-dbb6331cfbe5-kube-api-access-ll9bs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.244801 master-0 kubenswrapper[9136]: I1203 21:50:56.244088 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:50:56.278840 master-0 kubenswrapper[9136]: I1203 21:50:56.278792 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:50:56.282162 master-0 kubenswrapper[9136]: I1203 21:50:56.282133 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-static-pod-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.282237 master-0 kubenswrapper[9136]: I1203 21:50:56.282182 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-cert-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.282237 master-0 kubenswrapper[9136]: I1203 21:50:56.282209 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-resource-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.282333 master-0 kubenswrapper[9136]: I1203 21:50:56.282244 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-data-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.282333 master-0 kubenswrapper[9136]: I1203 21:50:56.282279 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-usr-local-bin\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.282333 master-0 kubenswrapper[9136]: I1203 21:50:56.282297 9136 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-log-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.383900 master-0 kubenswrapper[9136]: I1203 21:50:56.383843 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-static-pod-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.383900 master-0 kubenswrapper[9136]: I1203 21:50:56.383903 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-cert-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384166 master-0 kubenswrapper[9136]: I1203 21:50:56.383928 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-resource-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384166 master-0 kubenswrapper[9136]: I1203 21:50:56.383969 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-data-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384166 master-0 kubenswrapper[9136]: I1203 21:50:56.384009 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-usr-local-bin\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384166 master-0 kubenswrapper[9136]: I1203 21:50:56.384029 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-log-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384166 master-0 kubenswrapper[9136]: I1203 21:50:56.384119 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-log-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384166 master-0 kubenswrapper[9136]: I1203 21:50:56.384163 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-static-pod-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384396 master-0 kubenswrapper[9136]: I1203 21:50:56.384189 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-cert-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384396 master-0 kubenswrapper[9136]: I1203 
21:50:56.384215 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-resource-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384396 master-0 kubenswrapper[9136]: I1203 21:50:56.384241 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-data-dir\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:56.384396 master-0 kubenswrapper[9136]: I1203 21:50:56.384265 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-usr-local-bin\") pod \"etcd-master-0\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " pod="openshift-etcd/etcd-master-0" Dec 03 21:50:57.065295 master-0 kubenswrapper[9136]: I1203 21:50:57.065189 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75/installer/0.log" Dec 03 21:50:57.065295 master-0 kubenswrapper[9136]: I1203 21:50:57.065235 9136 generic.go:334] "Generic (PLEG): container finished" podID="ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75" containerID="d1b7ddaa40a55179e8a8084845290bc3ea8a9cdf63eaebcf45106950ba9b8650" exitCode=1 Dec 03 21:50:57.065295 master-0 kubenswrapper[9136]: I1203 21:50:57.065264 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75","Type":"ContainerDied","Data":"d1b7ddaa40a55179e8a8084845290bc3ea8a9cdf63eaebcf45106950ba9b8650"} Dec 03 21:51:08.147686 master-0 kubenswrapper[9136]: I1203 21:51:08.147566 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/0.log" Dec 03 21:51:08.147686 master-0 kubenswrapper[9136]: I1203 21:51:08.147678 9136 generic.go:334] "Generic (PLEG): container finished" podID="6976b503-87da-48fc-b097-d1b315fbee3f" containerID="ad5b230b5b0a6050c3e00f60f378eb9862c13b3605b44eb150bd273e89f0bd98" exitCode=1 Dec 03 21:51:08.148933 master-0 kubenswrapper[9136]: I1203 21:51:08.147739 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerDied","Data":"ad5b230b5b0a6050c3e00f60f378eb9862c13b3605b44eb150bd273e89f0bd98"} Dec 03 21:51:08.148933 master-0 kubenswrapper[9136]: I1203 21:51:08.148859 9136 scope.go:117] "RemoveContainer" containerID="ad5b230b5b0a6050c3e00f60f378eb9862c13b3605b44eb150bd273e89f0bd98" Dec 03 21:51:09.281065 master-0 kubenswrapper[9136]: E1203 21:51:09.280931 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 03 21:51:09.281890 master-0 kubenswrapper[9136]: I1203 21:51:09.281584 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 03 21:51:09.785687 master-0 kubenswrapper[9136]: I1203 21:51:09.785536 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 21:51:11.162960 master-0 kubenswrapper[9136]: I1203 21:51:11.162891 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75/installer/0.log" Dec 03 21:51:11.163837 master-0 kubenswrapper[9136]: I1203 21:51:11.162980 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:51:11.166701 master-0 kubenswrapper[9136]: I1203 21:51:11.166664 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75/installer/0.log" Dec 03 21:51:11.166817 master-0 kubenswrapper[9136]: I1203 21:51:11.166784 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75","Type":"ContainerDied","Data":"d73dffef2d54911dd7ba3e993e7fcb953c22745a8d407d8bc378e0d66df9f829"} Dec 03 21:51:11.166817 master-0 kubenswrapper[9136]: I1203 21:51:11.166800 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Dec 03 21:51:11.167147 master-0 kubenswrapper[9136]: I1203 21:51:11.167099 9136 scope.go:117] "RemoveContainer" containerID="d1b7ddaa40a55179e8a8084845290bc3ea8a9cdf63eaebcf45106950ba9b8650" Dec 03 21:51:11.169498 master-0 kubenswrapper[9136]: I1203 21:51:11.169466 9136 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133" exitCode=1 Dec 03 21:51:11.169580 master-0 kubenswrapper[9136]: I1203 21:51:11.169546 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133"} Dec 03 21:51:11.170031 master-0 kubenswrapper[9136]: I1203 21:51:11.169993 9136 scope.go:117] "RemoveContainer" containerID="aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133" Dec 03 21:51:11.171705 master-0 kubenswrapper[9136]: I1203 21:51:11.171657 9136 generic.go:334] "Generic (PLEG): container finished" podID="3497f5dd-4c6f-4108-a948-481cef475ba9" containerID="e15ea152ef1a26d4cb376f11826bf4ceac9e8245aad4cfc2e16ac02f57a9e91c" exitCode=0 Dec 03 21:51:11.171705 master-0 kubenswrapper[9136]: I1203 21:51:11.171697 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"3497f5dd-4c6f-4108-a948-481cef475ba9","Type":"ContainerDied","Data":"e15ea152ef1a26d4cb376f11826bf4ceac9e8245aad4cfc2e16ac02f57a9e91c"} Dec 03 21:51:11.201943 master-0 kubenswrapper[9136]: I1203 21:51:11.201862 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kube-api-access\") pod 
\"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " Dec 03 21:51:11.201943 master-0 kubenswrapper[9136]: I1203 21:51:11.201939 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kubelet-dir\") pod \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " Dec 03 21:51:11.202279 master-0 kubenswrapper[9136]: I1203 21:51:11.201963 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-var-lock\") pod \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\" (UID: \"ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75\") " Dec 03 21:51:11.202279 master-0 kubenswrapper[9136]: I1203 21:51:11.201990 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75" (UID: "ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:11.202279 master-0 kubenswrapper[9136]: I1203 21:51:11.202048 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-var-lock" (OuterVolumeSpecName: "var-lock") pod "ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75" (UID: "ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:11.202279 master-0 kubenswrapper[9136]: I1203 21:51:11.202222 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:11.202279 master-0 kubenswrapper[9136]: I1203 21:51:11.202239 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:11.209562 master-0 kubenswrapper[9136]: I1203 21:51:11.209506 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75" (UID: "ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:51:11.303716 master-0 kubenswrapper[9136]: I1203 21:51:11.303568 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:11.509011 master-0 kubenswrapper[9136]: E1203 21:51:11.508939 9136 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod4f66f491_f82d_4e8c_8929_4675f99aa5b7.slice/crio-conmon-f69b2fadd7aa21980afa8fbd2988f6315c3c2d241badb575affba3e4afbe34e8.scope\": RecentStats: unable to find data in memory cache]" Dec 03 21:51:12.178829 master-0 kubenswrapper[9136]: I1203 21:51:12.178744 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_4f66f491-f82d-4e8c-8929-4675f99aa5b7/installer/0.log" Dec 03 21:51:12.179566 master-0 kubenswrapper[9136]: I1203 21:51:12.178838 9136 generic.go:334] "Generic (PLEG): container finished" podID="4f66f491-f82d-4e8c-8929-4675f99aa5b7" containerID="f69b2fadd7aa21980afa8fbd2988f6315c3c2d241badb575affba3e4afbe34e8" exitCode=1 Dec 03 21:51:12.179566 master-0 kubenswrapper[9136]: I1203 21:51:12.178941 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"4f66f491-f82d-4e8c-8929-4675f99aa5b7","Type":"ContainerDied","Data":"f69b2fadd7aa21980afa8fbd2988f6315c3c2d241badb575affba3e4afbe34e8"} Dec 03 21:51:13.184875 master-0 kubenswrapper[9136]: I1203 21:51:13.184826 9136 generic.go:334] "Generic (PLEG): container finished" podID="d78739a7694769882b7e47ea5ac08a10" containerID="a5db70d8c04a9fdadcb37e94b566026a87f2170f5e692e5d3ce48ba377b79800" exitCode=1 Dec 03 21:51:13.184875 master-0 kubenswrapper[9136]: I1203 21:51:13.184876 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerDied","Data":"a5db70d8c04a9fdadcb37e94b566026a87f2170f5e692e5d3ce48ba377b79800"} Dec 03 21:51:13.185485 master-0 kubenswrapper[9136]: I1203 21:51:13.185347 9136 scope.go:117] "RemoveContainer" containerID="a5db70d8c04a9fdadcb37e94b566026a87f2170f5e692e5d3ce48ba377b79800" Dec 03 21:51:14.688609 master-0 kubenswrapper[9136]: I1203 21:51:14.688553 9136 scope.go:117] "RemoveContainer" containerID="0d4f1a1f19ffb465b7ec7db2a36e7c6f6dc0dc2fd64d69d37c5c1b12205b376b" Dec 03 21:51:14.795744 master-0 kubenswrapper[9136]: I1203 21:51:14.795614 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 03 21:51:14.850639 master-0 kubenswrapper[9136]: I1203 21:51:14.850588 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-kubelet-dir\") pod \"3497f5dd-4c6f-4108-a948-481cef475ba9\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " Dec 03 21:51:14.850730 master-0 kubenswrapper[9136]: I1203 21:51:14.850677 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-var-lock\") pod \"3497f5dd-4c6f-4108-a948-481cef475ba9\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " Dec 03 21:51:14.850730 master-0 kubenswrapper[9136]: I1203 21:51:14.850696 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3497f5dd-4c6f-4108-a948-481cef475ba9" (UID: "3497f5dd-4c6f-4108-a948-481cef475ba9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:14.850843 master-0 kubenswrapper[9136]: I1203 21:51:14.850731 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3497f5dd-4c6f-4108-a948-481cef475ba9-kube-api-access\") pod \"3497f5dd-4c6f-4108-a948-481cef475ba9\" (UID: \"3497f5dd-4c6f-4108-a948-481cef475ba9\") " Dec 03 21:51:14.850843 master-0 kubenswrapper[9136]: I1203 21:51:14.850780 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-var-lock" (OuterVolumeSpecName: "var-lock") pod "3497f5dd-4c6f-4108-a948-481cef475ba9" (UID: "3497f5dd-4c6f-4108-a948-481cef475ba9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:14.851251 master-0 kubenswrapper[9136]: I1203 21:51:14.851221 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:14.851294 master-0 kubenswrapper[9136]: I1203 21:51:14.851250 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3497f5dd-4c6f-4108-a948-481cef475ba9-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:14.857848 master-0 kubenswrapper[9136]: I1203 21:51:14.857816 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3497f5dd-4c6f-4108-a948-481cef475ba9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3497f5dd-4c6f-4108-a948-481cef475ba9" (UID: "3497f5dd-4c6f-4108-a948-481cef475ba9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:51:14.952422 master-0 kubenswrapper[9136]: I1203 21:51:14.952362 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3497f5dd-4c6f-4108-a948-481cef475ba9-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:15.198946 master-0 kubenswrapper[9136]: I1203 21:51:15.198798 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"3497f5dd-4c6f-4108-a948-481cef475ba9","Type":"ContainerDied","Data":"1d372df0702f388fb4c029dbd78740e3df1431045d11f281419ebde9b5e3ba62"} Dec 03 21:51:15.198946 master-0 kubenswrapper[9136]: I1203 21:51:15.198862 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d372df0702f388fb4c029dbd78740e3df1431045d11f281419ebde9b5e3ba62" Dec 03 21:51:15.198946 master-0 kubenswrapper[9136]: I1203 21:51:15.198865 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 03 21:51:15.200386 master-0 kubenswrapper[9136]: I1203 21:51:15.200313 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerStarted","Data":"8e96e7577e535d3d67b656e498443858cd44974787e324f77ff42ce65db5b502"} Dec 03 21:51:15.202922 master-0 kubenswrapper[9136]: I1203 21:51:15.202805 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" event={"ID":"bd18a700-53b2-430c-a34f-dbb6331cfbe5","Type":"ContainerStarted","Data":"b3ab62262f6fcfe451dadb0b4353828f4d4962cfc03eadf6453b600925623b4d"} Dec 03 21:51:15.860234 master-0 kubenswrapper[9136]: E1203 21:51:15.860138 9136 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:16.960355 master-0 kubenswrapper[9136]: I1203 21:51:16.960262 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 21:51:17.920273 master-0 kubenswrapper[9136]: I1203 21:51:17.920170 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_4f66f491-f82d-4e8c-8929-4675f99aa5b7/installer/0.log" Dec 03 21:51:17.920273 master-0 kubenswrapper[9136]: I1203 21:51:17.920264 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:51:17.995829 master-0 kubenswrapper[9136]: I1203 21:51:17.995693 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kube-api-access\") pod \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " Dec 03 21:51:17.995829 master-0 kubenswrapper[9136]: I1203 21:51:17.995817 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-var-lock\") pod \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " Dec 03 21:51:17.996736 master-0 kubenswrapper[9136]: I1203 21:51:17.995903 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-var-lock" (OuterVolumeSpecName: "var-lock") pod "4f66f491-f82d-4e8c-8929-4675f99aa5b7" (UID: "4f66f491-f82d-4e8c-8929-4675f99aa5b7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:17.996736 master-0 kubenswrapper[9136]: I1203 21:51:17.995945 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kubelet-dir\") pod \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\" (UID: \"4f66f491-f82d-4e8c-8929-4675f99aa5b7\") " Dec 03 21:51:17.996736 master-0 kubenswrapper[9136]: I1203 21:51:17.996038 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4f66f491-f82d-4e8c-8929-4675f99aa5b7" (UID: "4f66f491-f82d-4e8c-8929-4675f99aa5b7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:17.997652 master-0 kubenswrapper[9136]: I1203 21:51:17.996908 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:17.997652 master-0 kubenswrapper[9136]: I1203 21:51:17.996937 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:18.000967 master-0 kubenswrapper[9136]: I1203 21:51:18.000903 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4f66f491-f82d-4e8c-8929-4675f99aa5b7" (UID: "4f66f491-f82d-4e8c-8929-4675f99aa5b7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:51:18.099166 master-0 kubenswrapper[9136]: I1203 21:51:18.098892 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f66f491-f82d-4e8c-8929-4675f99aa5b7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:18.227730 master-0 kubenswrapper[9136]: I1203 21:51:18.227666 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_4f66f491-f82d-4e8c-8929-4675f99aa5b7/installer/0.log" Dec 03 21:51:18.228015 master-0 kubenswrapper[9136]: I1203 21:51:18.227753 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"4f66f491-f82d-4e8c-8929-4675f99aa5b7","Type":"ContainerDied","Data":"53fd5d5cbc26d76aa9e3df18e6da9dd666b3449013285f841eaf2b67b16c171a"} Dec 03 21:51:18.228015 master-0 kubenswrapper[9136]: I1203 21:51:18.227864 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Dec 03 21:51:18.323596 master-0 kubenswrapper[9136]: I1203 21:51:18.323456 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:51:18.394902 master-0 kubenswrapper[9136]: I1203 21:51:18.394733 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:51:19.785131 master-0 kubenswrapper[9136]: I1203 21:51:19.785036 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:51:22.006281 master-0 kubenswrapper[9136]: I1203 21:51:22.005339 9136 scope.go:117] "RemoveContainer" containerID="f69b2fadd7aa21980afa8fbd2988f6315c3c2d241badb575affba3e4afbe34e8" Dec 03 21:51:23.273723 master-0 kubenswrapper[9136]: I1203 21:51:23.273668 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" event={"ID":"858384f3-5741-4e67-8669-2eb2b2dcaf7f","Type":"ContainerStarted","Data":"4c2adc08380436f319ebce0bff4387679c426ba1840c8e9241539270a64e7dab"} Dec 03 21:51:23.276215 master-0 kubenswrapper[9136]: I1203 21:51:23.276151 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerStarted","Data":"08f0994ce641aa8480f422d38c06dab9bf39acafe61488448bfa2be6e9c006dc"} Dec 03 21:51:23.277998 master-0 kubenswrapper[9136]: I1203 21:51:23.277938 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" event={"ID":"578b2d03-b8b3-4c75-adde-73899c472ad7","Type":"ContainerStarted","Data":"69c66e8c33554080901dde3e7449b542159057d7baa15426e32c9d8a63162fa3"} Dec 03 21:51:23.281086 master-0 kubenswrapper[9136]: I1203 21:51:23.281036 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/0.log" Dec 03 21:51:23.281205 master-0 kubenswrapper[9136]: I1203 21:51:23.281142 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerStarted","Data":"ed377a768f3ae617a354166679d4faec90c18a1d576ddc666702319bbce47318"} Dec 03 21:51:23.284129 master-0 kubenswrapper[9136]: I1203 21:51:23.284092 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b"} Dec 03 21:51:23.286188 master-0 kubenswrapper[9136]: I1203 21:51:23.286148 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" event={"ID":"bd18a700-53b2-430c-a34f-dbb6331cfbe5","Type":"ContainerStarted","Data":"70f67b41dc589699749acb86409088e137724e0bbf05196ad9f606340dc1c95e"} Dec 03 21:51:23.287849 master-0 kubenswrapper[9136]: I1203 21:51:23.287803 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerStarted","Data":"b8ff15b93fc508f3e0b23a351d0dc90be313d555a284b59cf318e88e1e041e14"} Dec 03 21:51:24.307629 master-0 kubenswrapper[9136]: I1203 21:51:24.307450 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerStarted","Data":"4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1"} Dec 03 21:51:24.321383 master-0 kubenswrapper[9136]: I1203 21:51:24.321328 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26"} Dec 03 21:51:24.323705 master-0 kubenswrapper[9136]: I1203 21:51:24.323646 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" event={"ID":"3a7e0eea-3da8-43de-87bc-d10231e7c239","Type":"ContainerStarted","Data":"4449d628e9f899d870a78c826bdbbb73dda6c4015e8f36280c0b88ef4a51d663"} Dec 03 21:51:24.325213 master-0 kubenswrapper[9136]: I1203 21:51:24.325160 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" event={"ID":"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7","Type":"ContainerStarted","Data":"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7"} Dec 03 21:51:24.327058 master-0 kubenswrapper[9136]: I1203 21:51:24.327014 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" event={"ID":"6e96335e-1866-41c8-b128-b95e783a9be4","Type":"ContainerStarted","Data":"2e3a2cbe5d1d0f700f443f5a9e3ce8b40d65f0655e3c96ce54030b35eb469f7b"} Dec 03 21:51:24.328527 master-0 kubenswrapper[9136]: I1203 21:51:24.328495 9136 generic.go:334] "Generic (PLEG): container finished" podID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerID="08f0994ce641aa8480f422d38c06dab9bf39acafe61488448bfa2be6e9c006dc" exitCode=0 Dec 03 21:51:24.328527 master-0 kubenswrapper[9136]: I1203 21:51:24.328526 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" 
event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerDied","Data":"08f0994ce641aa8480f422d38c06dab9bf39acafe61488448bfa2be6e9c006dc"} Dec 03 21:51:25.338371 master-0 kubenswrapper[9136]: I1203 21:51:25.338299 9136 generic.go:334] "Generic (PLEG): container finished" podID="41b95a38663dd6fe34e183818a475977" containerID="6c365c0b34d496df5874092d2be756d0f4503f99c75f8af416569e515090fd7c" exitCode=0 Dec 03 21:51:25.341892 master-0 kubenswrapper[9136]: I1203 21:51:25.341500 9136 generic.go:334] "Generic (PLEG): container finished" podID="ebf07eb54db570834b7c9a90b6b07403" containerID="b8ff15b93fc508f3e0b23a351d0dc90be313d555a284b59cf318e88e1e041e14" exitCode=0 Dec 03 21:51:25.341892 master-0 kubenswrapper[9136]: I1203 21:51:25.341571 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerDied","Data":"b8ff15b93fc508f3e0b23a351d0dc90be313d555a284b59cf318e88e1e041e14"} Dec 03 21:51:25.862313 master-0 kubenswrapper[9136]: E1203 21:51:25.862133 9136 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:25.920630 master-0 kubenswrapper[9136]: E1203 21:51:25.920446 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:51:15Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:51:15Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:51:15Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:51:15Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d86
6f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4df9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"
names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\"],\\\"sizeBytes\\\":462741734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9f4724570795357eb097251a021f20c94c79b3054f3adb3bc0812143ba791dc1\\\"],\\\"sizeBytes\\\":461716546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\\\"],\\\"sizeBytes\\\":459566623},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:607e31ebb2c85f53775455b38a607a68cb2bdab1e369f03c57e715a4ebb88831\\\"],\\\"sizeBytes\\\":458183681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36fa1378b9c26de6d45187b1e7352f3b1147109427fab3669b107d81fd967601\\\"],\\\"sizeBytes\\\":452603646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eac937aae64688cb47b38ad2cbba5aa7e6d41c691df1f3ca4ff81e5117084d1e\\\"],\\\"sizeBytes\\\":451053419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:912759ba49a70e63f7585b351b1deed008b5815d275f478f052c8c2880101d3c\\\"],\\\"sizeBytes\\\":449985691},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7825952834ade266ce08d1a9eb0665e4661dea0a40647d3e1de2cf6266665e9d\\\"],\\\"sizeBytes\\\":443305841},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6dd80564094a262c1bb53c037288c9c69a46b22dc7dd3ee5c52384404ebfdc81\\\"],\\\"sizeBytes\\\":442523452},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:68dbccdff76515d5b659c9c2d031235073d292cb56a5385f8e69d24ac5f48b8f\\\"],\\\"sizeBytes\\\":437751308},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee896bce586a3fcd37b4be8165cf1b4a83e88b5d47667de10475ec43e31b7926\\\"],\\\"sizeBytes\\\":406067436},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2f86d9ffe13cbab06ff676496b50a26bbc4819d8b81b98fbacca6aee9b56792f\\\"],\\\"sizeBytes\\\":401824348},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fff930cf757e23d388d86d0594
2b76e44d3bda5e387b299c239e4d12545d26dd\\\"],\\\"sizeBytes\\\":391002580}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:26.350535 master-0 kubenswrapper[9136]: I1203 21:51:26.350431 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" event={"ID":"bd18a700-53b2-430c-a34f-dbb6331cfbe5","Type":"ContainerStarted","Data":"4ec998a5b166308a47b0f47ae48714238a633c357c19b101040a47cbf3083dde"} Dec 03 21:51:26.352403 master-0 kubenswrapper[9136]: I1203 21:51:26.352335 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4c2364d3-47b2-4784-9c42-76bf2547b797/installer/0.log" Dec 03 21:51:26.352533 master-0 kubenswrapper[9136]: I1203 21:51:26.352448 9136 generic.go:334] "Generic (PLEG): container finished" podID="4c2364d3-47b2-4784-9c42-76bf2547b797" containerID="5850f56760e2a61c527c84ff5f17d82ffc3198cbe6b6606b3d9f5f38cfe114d6" exitCode=1 Dec 03 21:51:26.352603 master-0 kubenswrapper[9136]: I1203 21:51:26.352551 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4c2364d3-47b2-4784-9c42-76bf2547b797","Type":"ContainerDied","Data":"5850f56760e2a61c527c84ff5f17d82ffc3198cbe6b6606b3d9f5f38cfe114d6"} Dec 03 21:51:26.354433 master-0 kubenswrapper[9136]: I1203 21:51:26.354366 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" event={"ID":"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7","Type":"ContainerStarted","Data":"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4"} Dec 03 21:51:26.356150 master-0 kubenswrapper[9136]: I1203 21:51:26.356097 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerStarted","Data":"2bde1579de05ada97e585d0ce6d0d39ebdf48e9c221db2db2566818b9002dffa"} Dec 03 21:51:26.463251 master-0 kubenswrapper[9136]: I1203 21:51:26.463132 9136 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-fqnsm container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" start-of-body= Dec 03 21:51:26.463251 master-0 kubenswrapper[9136]: I1203 21:51:26.463219 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" Dec 03 21:51:27.369563 master-0 kubenswrapper[9136]: I1203 21:51:27.369469 9136 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_41b95a38663dd6fe34e183818a475977/etcdctl/0.log" Dec 03 21:51:27.369563 master-0 kubenswrapper[9136]: I1203 21:51:27.369553 9136 generic.go:334] "Generic (PLEG): container finished" podID="41b95a38663dd6fe34e183818a475977" containerID="d089dbd1d00b21007f2d7f87058c4506fbc7b6ac8e4051768c22497e2ce3a3f4" exitCode=137 Dec 03 21:51:28.324262 master-0 kubenswrapper[9136]: I1203 21:51:28.324116 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:51:28.394452 master-0 kubenswrapper[9136]: I1203 21:51:28.394348 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:51:30.230501 master-0 kubenswrapper[9136]: E1203 21:51:30.230206 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.187dd3070ad685e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:41b95a38663dd6fe34e183818a475977,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:50:56.199165411 +0000 UTC m=+62.474341793,LastTimestamp:2025-12-03 21:50:56.199165411 +0000 UTC m=+62.474341793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:51:31.324588 master-0 kubenswrapper[9136]: I1203 21:51:31.324476 9136 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:35.863045 master-0 kubenswrapper[9136]: E1203 21:51:35.862810 9136 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:35.921495 master-0 kubenswrapper[9136]: E1203 21:51:35.921401 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:36.463740 master-0 kubenswrapper[9136]: I1203 21:51:36.463492 9136 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-fqnsm container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" start-of-body= Dec 03 21:51:36.469994 master-0 kubenswrapper[9136]: I1203 21:51:36.464091 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" 
containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" Dec 03 21:51:38.351258 master-0 kubenswrapper[9136]: E1203 21:51:38.351199 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 03 21:51:41.324138 master-0 kubenswrapper[9136]: I1203 21:51:41.323998 9136 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:42.469819 master-0 kubenswrapper[9136]: I1203 21:51:42.469650 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/0.log" Dec 03 21:51:42.469819 master-0 kubenswrapper[9136]: I1203 21:51:42.469713 9136 generic.go:334] "Generic (PLEG): container finished" podID="892d5611-debf-402f-abc5-3f99aa080159" containerID="d1b75fc955087be48690894945186bc731e2bed0635b14d30a5226a6cb7dbae4" exitCode=255 Dec 03 21:51:42.469819 master-0 kubenswrapper[9136]: I1203 21:51:42.469751 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerDied","Data":"d1b75fc955087be48690894945186bc731e2bed0635b14d30a5226a6cb7dbae4"} Dec 03 21:51:42.471295 master-0 kubenswrapper[9136]: I1203 21:51:42.470241 9136 scope.go:117] "RemoveContainer" containerID="d1b75fc955087be48690894945186bc731e2bed0635b14d30a5226a6cb7dbae4" Dec 03 21:51:44.016608 master-0 kubenswrapper[9136]: I1203 21:51:44.016549 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4c2364d3-47b2-4784-9c42-76bf2547b797/installer/0.log" Dec 03 21:51:44.018909 master-0 kubenswrapper[9136]: I1203 21:51:44.016688 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:51:44.094819 master-0 kubenswrapper[9136]: I1203 21:51:44.094726 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-kubelet-dir\") pod \"4c2364d3-47b2-4784-9c42-76bf2547b797\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " Dec 03 21:51:44.095239 master-0 kubenswrapper[9136]: I1203 21:51:44.094847 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c2364d3-47b2-4784-9c42-76bf2547b797-kube-api-access\") pod \"4c2364d3-47b2-4784-9c42-76bf2547b797\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " Dec 03 21:51:44.095239 master-0 kubenswrapper[9136]: I1203 21:51:44.094979 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-var-lock\") pod \"4c2364d3-47b2-4784-9c42-76bf2547b797\" (UID: \"4c2364d3-47b2-4784-9c42-76bf2547b797\") " Dec 03 21:51:44.095239 master-0 kubenswrapper[9136]: I1203 21:51:44.095103 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4c2364d3-47b2-4784-9c42-76bf2547b797" (UID: "4c2364d3-47b2-4784-9c42-76bf2547b797"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:44.095239 master-0 kubenswrapper[9136]: I1203 21:51:44.095135 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-var-lock" (OuterVolumeSpecName: "var-lock") pod "4c2364d3-47b2-4784-9c42-76bf2547b797" (UID: "4c2364d3-47b2-4784-9c42-76bf2547b797"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:44.095416 master-0 kubenswrapper[9136]: I1203 21:51:44.095382 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:44.095416 master-0 kubenswrapper[9136]: I1203 21:51:44.095404 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4c2364d3-47b2-4784-9c42-76bf2547b797-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:44.100079 master-0 kubenswrapper[9136]: I1203 21:51:44.099994 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c2364d3-47b2-4784-9c42-76bf2547b797-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4c2364d3-47b2-4784-9c42-76bf2547b797" (UID: "4c2364d3-47b2-4784-9c42-76bf2547b797"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:51:44.196826 master-0 kubenswrapper[9136]: I1203 21:51:44.196611 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4c2364d3-47b2-4784-9c42-76bf2547b797-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:44.502682 master-0 kubenswrapper[9136]: I1203 21:51:44.502489 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4c2364d3-47b2-4784-9c42-76bf2547b797/installer/0.log" Dec 03 21:51:44.502682 master-0 kubenswrapper[9136]: I1203 21:51:44.502606 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"4c2364d3-47b2-4784-9c42-76bf2547b797","Type":"ContainerDied","Data":"2e5c32836cb7467d081521f4c6877fa57e211ffc219e0b13d86a9ec21ed11c0c"} Dec 03 21:51:44.502682 master-0 kubenswrapper[9136]: I1203 21:51:44.502666 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e5c32836cb7467d081521f4c6877fa57e211ffc219e0b13d86a9ec21ed11c0c" Dec 03 21:51:44.503138 master-0 kubenswrapper[9136]: I1203 21:51:44.502865 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 21:51:45.864452 master-0 kubenswrapper[9136]: E1203 21:51:45.864280 9136 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io master-0)" Dec 03 21:51:45.922222 master-0 kubenswrapper[9136]: E1203 21:51:45.922083 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:46.463347 master-0 kubenswrapper[9136]: I1203 21:51:46.463266 9136 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-fqnsm container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" start-of-body= Dec 03 21:51:46.463347 master-0 kubenswrapper[9136]: I1203 21:51:46.463333 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" Dec 03 21:51:46.463828 master-0 kubenswrapper[9136]: I1203 21:51:46.463381 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:51:46.464155 master-0 kubenswrapper[9136]: I1203 21:51:46.464072 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce"} pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Dec 03 21:51:46.464155 master-0 kubenswrapper[9136]: I1203 21:51:46.464129 
9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerName="authentication-operator" containerID="cri-o://2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce" gracePeriod=30 Dec 03 21:51:50.468096 master-0 kubenswrapper[9136]: I1203 21:51:50.468041 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_41b95a38663dd6fe34e183818a475977/etcdctl/0.log" Dec 03 21:51:50.468964 master-0 kubenswrapper[9136]: I1203 21:51:50.468156 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:51:50.544991 master-0 kubenswrapper[9136]: I1203 21:51:50.544844 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_41b95a38663dd6fe34e183818a475977/etcdctl/0.log" Dec 03 21:51:50.544991 master-0 kubenswrapper[9136]: I1203 21:51:50.544921 9136 scope.go:117] "RemoveContainer" containerID="6c365c0b34d496df5874092d2be756d0f4503f99c75f8af416569e515090fd7c" Dec 03 21:51:50.545321 master-0 kubenswrapper[9136]: I1203 21:51:50.545055 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:51:50.585214 master-0 kubenswrapper[9136]: I1203 21:51:50.585148 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") pod \"41b95a38663dd6fe34e183818a475977\" (UID: \"41b95a38663dd6fe34e183818a475977\") " Dec 03 21:51:50.585214 master-0 kubenswrapper[9136]: I1203 21:51:50.585219 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") pod \"41b95a38663dd6fe34e183818a475977\" (UID: \"41b95a38663dd6fe34e183818a475977\") " Dec 03 21:51:50.585640 master-0 kubenswrapper[9136]: I1203 21:51:50.585459 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs" (OuterVolumeSpecName: "certs") pod "41b95a38663dd6fe34e183818a475977" (UID: "41b95a38663dd6fe34e183818a475977"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:50.585640 master-0 kubenswrapper[9136]: I1203 21:51:50.585492 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir" (OuterVolumeSpecName: "data-dir") pod "41b95a38663dd6fe34e183818a475977" (UID: "41b95a38663dd6fe34e183818a475977"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:51:50.686280 master-0 kubenswrapper[9136]: I1203 21:51:50.686202 9136 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-data-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:50.686280 master-0 kubenswrapper[9136]: I1203 21:51:50.686254 9136 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/41b95a38663dd6fe34e183818a475977-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 21:51:51.324427 master-0 kubenswrapper[9136]: I1203 21:51:51.324345 9136 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:51.324823 master-0 kubenswrapper[9136]: I1203 21:51:51.324803 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:51:51.325618 master-0 kubenswrapper[9136]: I1203 21:51:51.325594 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 03 21:51:51.325803 master-0 kubenswrapper[9136]: I1203 21:51:51.325764 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" containerID="cri-o://311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b" gracePeriod=30 Dec 03 21:51:51.556398 master-0 kubenswrapper[9136]: I1203 21:51:51.556189 9136 generic.go:334] "Generic (PLEG): container finished" podID="c8da5d44-680e-4169-abc6-607bdc37a64d" containerID="129bb34723f1a3b3bc6376ce7e4a5163dea17767bc888149557663566136439c" exitCode=0 Dec 03 21:51:51.556398 master-0 kubenswrapper[9136]: I1203 21:51:51.556353 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerDied","Data":"129bb34723f1a3b3bc6376ce7e4a5163dea17767bc888149557663566136439c"} Dec 03 21:51:51.557286 master-0 kubenswrapper[9136]: I1203 21:51:51.557157 9136 scope.go:117] "RemoveContainer" containerID="129bb34723f1a3b3bc6376ce7e4a5163dea17767bc888149557663566136439c" Dec 03 21:51:51.558782 master-0 kubenswrapper[9136]: I1203 21:51:51.558728 9136 generic.go:334] "Generic (PLEG): container finished" podID="82055cfc-b4ce-4a00-a51d-141059947693" containerID="7abc4d8635b4469a4776710f14e691f06a0b7b60d5e937f6ea27f069d519024a" exitCode=0 Dec 03 21:51:51.559067 master-0 kubenswrapper[9136]: I1203 21:51:51.558822 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerDied","Data":"7abc4d8635b4469a4776710f14e691f06a0b7b60d5e937f6ea27f069d519024a"} Dec 03 21:51:51.560148 master-0 
kubenswrapper[9136]: I1203 21:51:51.560112 9136 scope.go:117] "RemoveContainer" containerID="7abc4d8635b4469a4776710f14e691f06a0b7b60d5e937f6ea27f069d519024a" Dec 03 21:51:51.561571 master-0 kubenswrapper[9136]: I1203 21:51:51.561513 9136 generic.go:334] "Generic (PLEG): container finished" podID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" containerID="930ace1c47675e8f2e46f1361fdd688d5b7098c0fe077d099c56f05d2371a225" exitCode=0 Dec 03 21:51:51.561663 master-0 kubenswrapper[9136]: I1203 21:51:51.561584 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerDied","Data":"930ace1c47675e8f2e46f1361fdd688d5b7098c0fe077d099c56f05d2371a225"} Dec 03 21:51:51.562609 master-0 kubenswrapper[9136]: I1203 21:51:51.562555 9136 scope.go:117] "RemoveContainer" containerID="930ace1c47675e8f2e46f1361fdd688d5b7098c0fe077d099c56f05d2371a225" Dec 03 21:51:51.564416 master-0 kubenswrapper[9136]: I1203 21:51:51.564343 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r24k4_1a0f647a-0260-4737-8ae2-cc90d01d33d1/approver/0.log" Dec 03 21:51:51.565076 master-0 kubenswrapper[9136]: I1203 21:51:51.565018 9136 generic.go:334] "Generic (PLEG): container finished" podID="1a0f647a-0260-4737-8ae2-cc90d01d33d1" containerID="9a23336ef4cbe48c00b2e6685616a19fe24f57f37d3cf40c5d7d0d8dc909c159" exitCode=1 Dec 03 21:51:51.565208 master-0 kubenswrapper[9136]: I1203 21:51:51.565090 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r24k4" event={"ID":"1a0f647a-0260-4737-8ae2-cc90d01d33d1","Type":"ContainerDied","Data":"9a23336ef4cbe48c00b2e6685616a19fe24f57f37d3cf40c5d7d0d8dc909c159"} Dec 03 21:51:51.565964 master-0 kubenswrapper[9136]: I1203 21:51:51.565901 9136 scope.go:117] "RemoveContainer" containerID="9a23336ef4cbe48c00b2e6685616a19fe24f57f37d3cf40c5d7d0d8dc909c159" Dec 03 21:51:51.567365 master-0 kubenswrapper[9136]: I1203 21:51:51.567307 9136 generic.go:334] "Generic (PLEG): container finished" podID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" containerID="f1f81a88c3e9df6920f7399941c169fb03094cedcb1ae2463c0147ff10995db1" exitCode=0 Dec 03 21:51:51.567468 master-0 kubenswrapper[9136]: I1203 21:51:51.567395 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerDied","Data":"f1f81a88c3e9df6920f7399941c169fb03094cedcb1ae2463c0147ff10995db1"} Dec 03 21:51:51.567856 master-0 kubenswrapper[9136]: I1203 21:51:51.567787 9136 scope.go:117] "RemoveContainer" containerID="f1f81a88c3e9df6920f7399941c169fb03094cedcb1ae2463c0147ff10995db1" Dec 03 21:51:51.570416 master-0 kubenswrapper[9136]: I1203 21:51:51.570351 9136 generic.go:334] "Generic (PLEG): container finished" podID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" containerID="bfa48bdaae3ed0aa3b0f3696051d4e5dea6449cb42ff81d3ed5c9026e7ac1908" exitCode=0 Dec 03 21:51:51.570527 master-0 kubenswrapper[9136]: I1203 21:51:51.570442 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerDied","Data":"bfa48bdaae3ed0aa3b0f3696051d4e5dea6449cb42ff81d3ed5c9026e7ac1908"} Dec 03 
21:51:51.571362 master-0 kubenswrapper[9136]: I1203 21:51:51.571257 9136 scope.go:117] "RemoveContainer" containerID="bfa48bdaae3ed0aa3b0f3696051d4e5dea6449cb42ff81d3ed5c9026e7ac1908" Dec 03 21:51:51.573835 master-0 kubenswrapper[9136]: I1203 21:51:51.573741 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/0.log" Dec 03 21:51:51.573948 master-0 kubenswrapper[9136]: I1203 21:51:51.573865 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerStarted","Data":"db4f2041dbd606d645ef97beb6cb4387e780516338543969beb7168c5691d4f3"} Dec 03 21:51:51.603087 master-0 kubenswrapper[9136]: I1203 21:51:51.603001 9136 scope.go:117] "RemoveContainer" containerID="d089dbd1d00b21007f2d7f87058c4506fbc7b6ac8e4051768c22497e2ce3a3f4" Dec 03 21:51:51.921287 master-0 kubenswrapper[9136]: I1203 21:51:51.921211 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b95a38663dd6fe34e183818a475977" path="/var/lib/kubelet/pods/41b95a38663dd6fe34e183818a475977/volumes" Dec 03 21:51:51.922092 master-0 kubenswrapper[9136]: I1203 21:51:51.922054 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 21:51:52.591763 master-0 kubenswrapper[9136]: I1203 21:51:52.591679 9136 generic.go:334] "Generic (PLEG): container finished" podID="08432be8-0086-48d2-a93d-7a474e96749d" containerID="ffe4f10866cf1bc36713ebdb04a86f2cd5ff92ba34f253339cffa02ccd5a5e66" exitCode=0 Dec 03 21:51:52.596983 master-0 kubenswrapper[9136]: I1203 21:51:52.596912 9136 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b" exitCode=2 Dec 03 21:51:52.603374 master-0 kubenswrapper[9136]: I1203 21:51:52.603268 9136 generic.go:334] "Generic (PLEG): container finished" podID="ebf07eb54db570834b7c9a90b6b07403" containerID="8c01bfaede47fa025c4081862076a0aa6db537aee1b97a2732cbc64498e3dbda" exitCode=0 Dec 03 21:51:54.632213 master-0 kubenswrapper[9136]: I1203 21:51:54.632128 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r24k4_1a0f647a-0260-4737-8ae2-cc90d01d33d1/approver/0.log" Dec 03 21:51:55.652817 master-0 kubenswrapper[9136]: I1203 21:51:55.652684 9136 generic.go:334] "Generic (PLEG): container finished" podID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerID="2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce" exitCode=0 Dec 03 21:51:55.866186 master-0 kubenswrapper[9136]: E1203 21:51:55.865801 9136 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Dec 03 21:51:55.866186 master-0 kubenswrapper[9136]: I1203 21:51:55.865879 9136 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 21:51:55.922308 master-0 kubenswrapper[9136]: E1203 21:51:55.922143 9136 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/349c4959cc7ef2e778bda8ac811d0aed0a0cde34a8e4c5331e4546cf73ab1f7f/diff" to get inode usage: stat 
/var/lib/containers/storage/overlay/349c4959cc7ef2e778bda8ac811d0aed0a0cde34a8e4c5331e4546cf73ab1f7f/diff: no such file or directory, extraDiskErr: Dec 03 21:51:55.922927 master-0 kubenswrapper[9136]: E1203 21:51:55.922358 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:51:56.281692 master-0 kubenswrapper[9136]: E1203 21:51:56.281514 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" is forbidden: the server was unable to return a response in the time allotted, but may still be processing the request (get limitranges)" pod="openshift-etcd/etcd-master-0" Dec 03 21:51:56.660426 master-0 kubenswrapper[9136]: I1203 21:51:56.660323 9136 generic.go:334] "Generic (PLEG): container finished" podID="5f088999-ec66-402e-9634-8c762206d6b4" containerID="7abe60a608c2243a0171688d4d7d094c30c03b0afcc0cdcafa8f846a9926432b" exitCode=0 Dec 03 21:51:58.681421 master-0 kubenswrapper[9136]: I1203 21:51:58.681306 9136 generic.go:334] "Generic (PLEG): container finished" podID="ebf07eb54db570834b7c9a90b6b07403" containerID="f5cb9e42c952aa7425c595831eb5aaa25eadc2bda2a857d2db27d386e674a3ef" exitCode=0 Dec 03 21:52:00.706070 master-0 kubenswrapper[9136]: I1203 21:52:00.705970 9136 generic.go:334] "Generic (PLEG): container finished" podID="f59094ec-47dd-4547-ad41-b15a7933f461" containerID="f8f51cc4e951397e53befcebae88ea2052970977c20019cfe472b3c9dcc46778" exitCode=0 Dec 03 21:52:04.233630 master-0 kubenswrapper[9136]: E1203 21:52:04.233283 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd30a34a813c1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:51:09.785666497 +0000 UTC m=+76.060842949,LastTimestamp:2025-12-03 21:51:09.785666497 +0000 UTC m=+76.060842949,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:52:05.866683 master-0 kubenswrapper[9136]: E1203 21:52:05.866534 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Dec 03 21:52:05.923242 master-0 kubenswrapper[9136]: E1203 21:52:05.923149 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:52:05.923242 master-0 kubenswrapper[9136]: E1203 21:52:05.923210 9136 
kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 21:52:08.150175 master-0 kubenswrapper[9136]: I1203 21:52:08.150041 9136 status_manager.go:851] "Failed to get status for pod" podUID="6976b503-87da-48fc-b097-d1b315fbee3f" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods openshift-controller-manager-operator-7c4697b5f5-458zh)" Dec 03 21:52:12.798765 master-0 kubenswrapper[9136]: I1203 21:52:12.797904 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_0269ada6-cb6e-4c98-bd24-752ae0286498/installer/0.log" Dec 03 21:52:12.798765 master-0 kubenswrapper[9136]: I1203 21:52:12.797985 9136 generic.go:334] "Generic (PLEG): container finished" podID="0269ada6-cb6e-4c98-bd24-752ae0286498" containerID="828e1c847e23b1857152d987d6c054b5de0c2ac4dd44ef540ab230ae90795ebb" exitCode=1 Dec 03 21:52:15.314900 master-0 kubenswrapper[9136]: E1203 21:52:15.314842 9136 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:52:15.314900 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-kp794_openshift-marketplace_0a49c320-f31d-4f6d-98c3-48d24346b873_0(32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf): error adding pod openshift-marketplace_certified-operators-kp794 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf" Netns:"/var/run/netns/dde5e048-e5ba-4cf2-b581-e5b290bb0a41" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-kp794;K8S_POD_INFRA_CONTAINER_ID=32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf;K8S_POD_UID=0a49c320-f31d-4f6d-98c3-48d24346b873" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-kp794] networking: Multus: [openshift-marketplace/certified-operators-kp794/0a49c320-f31d-4f6d-98c3-48d24346b873]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-kp794 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-kp794 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kp794?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.314900 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.314900 master-0 kubenswrapper[9136]: > Dec 03 21:52:15.315559 master-0 kubenswrapper[9136]: E1203 21:52:15.314938 9136 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:52:15.315559 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_certified-operators-kp794_openshift-marketplace_0a49c320-f31d-4f6d-98c3-48d24346b873_0(32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf): error adding pod openshift-marketplace_certified-operators-kp794 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf" Netns:"/var/run/netns/dde5e048-e5ba-4cf2-b581-e5b290bb0a41" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-kp794;K8S_POD_INFRA_CONTAINER_ID=32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf;K8S_POD_UID=0a49c320-f31d-4f6d-98c3-48d24346b873" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-kp794] networking: Multus: [openshift-marketplace/certified-operators-kp794/0a49c320-f31d-4f6d-98c3-48d24346b873]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-kp794 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-kp794 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kp794?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.315559 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.315559 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:52:15.315559 master-0 kubenswrapper[9136]: E1203 21:52:15.314961 9136 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:52:15.315559 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-kp794_openshift-marketplace_0a49c320-f31d-4f6d-98c3-48d24346b873_0(32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf): error adding pod openshift-marketplace_certified-operators-kp794 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf" Netns:"/var/run/netns/dde5e048-e5ba-4cf2-b581-e5b290bb0a41" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-kp794;K8S_POD_INFRA_CONTAINER_ID=32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf;K8S_POD_UID=0a49c320-f31d-4f6d-98c3-48d24346b873" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-kp794] networking: Multus: [openshift-marketplace/certified-operators-kp794/0a49c320-f31d-4f6d-98c3-48d24346b873]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-kp794 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-kp794 in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kp794?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.315559 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.315559 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:52:15.315559 master-0 kubenswrapper[9136]: E1203 21:52:15.315035 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-kp794_openshift-marketplace(0a49c320-f31d-4f6d-98c3-48d24346b873)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-kp794_openshift-marketplace(0a49c320-f31d-4f6d-98c3-48d24346b873)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-kp794_openshift-marketplace_0a49c320-f31d-4f6d-98c3-48d24346b873_0(32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf): error adding pod openshift-marketplace_certified-operators-kp794 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf\\\" Netns:\\\"/var/run/netns/dde5e048-e5ba-4cf2-b581-e5b290bb0a41\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-kp794;K8S_POD_INFRA_CONTAINER_ID=32fde28cf2df39bbbc78309ff81628c00081ae1835699235ec97201c3ab1b7bf;K8S_POD_UID=0a49c320-f31d-4f6d-98c3-48d24346b873\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/certified-operators-kp794] networking: Multus: [openshift-marketplace/certified-operators-kp794/0a49c320-f31d-4f6d-98c3-48d24346b873]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-kp794 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-kp794 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-kp794?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/certified-operators-kp794" podUID="0a49c320-f31d-4f6d-98c3-48d24346b873" Dec 03 21:52:15.531761 master-0 kubenswrapper[9136]: E1203 21:52:15.531679 9136 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:52:15.531761 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = 
failed to create pod network sandbox k8s_redhat-marketplace-nwjnw_openshift-marketplace_da0d36d7-fb62-4254-95b0-fb81dcf372cd_0(b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa): error adding pod openshift-marketplace_redhat-marketplace-nwjnw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa" Netns:"/var/run/netns/74e085a1-fd74-4fbd-bc46-dc2c8e1219b8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-nwjnw;K8S_POD_INFRA_CONTAINER_ID=b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa;K8S_POD_UID=da0d36d7-fb62-4254-95b0-fb81dcf372cd" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-nwjnw] networking: Multus: [openshift-marketplace/redhat-marketplace-nwjnw/da0d36d7-fb62-4254-95b0-fb81dcf372cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nwjnw?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.531761 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.531761 master-0 kubenswrapper[9136]: > Dec 03 21:52:15.531941 master-0 kubenswrapper[9136]: E1203 21:52:15.531814 9136 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:52:15.531941 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-nwjnw_openshift-marketplace_da0d36d7-fb62-4254-95b0-fb81dcf372cd_0(b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa): error adding pod openshift-marketplace_redhat-marketplace-nwjnw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa" Netns:"/var/run/netns/74e085a1-fd74-4fbd-bc46-dc2c8e1219b8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-nwjnw;K8S_POD_INFRA_CONTAINER_ID=b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa;K8S_POD_UID=da0d36d7-fb62-4254-95b0-fb81dcf372cd" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-nwjnw] networking: Multus: [openshift-marketplace/redhat-marketplace-nwjnw/da0d36d7-fb62-4254-95b0-fb81dcf372cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: status update failed for pod /: Get 
"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nwjnw?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.531941 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.531941 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:52:15.531941 master-0 kubenswrapper[9136]: E1203 21:52:15.531844 9136 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:52:15.531941 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-nwjnw_openshift-marketplace_da0d36d7-fb62-4254-95b0-fb81dcf372cd_0(b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa): error adding pod openshift-marketplace_redhat-marketplace-nwjnw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa" Netns:"/var/run/netns/74e085a1-fd74-4fbd-bc46-dc2c8e1219b8" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-nwjnw;K8S_POD_INFRA_CONTAINER_ID=b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa;K8S_POD_UID=da0d36d7-fb62-4254-95b0-fb81dcf372cd" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-nwjnw] networking: Multus: [openshift-marketplace/redhat-marketplace-nwjnw/da0d36d7-fb62-4254-95b0-fb81dcf372cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nwjnw?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.531941 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.531941 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:52:15.532167 master-0 kubenswrapper[9136]: E1203 21:52:15.531921 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-nwjnw_openshift-marketplace(da0d36d7-fb62-4254-95b0-fb81dcf372cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-nwjnw_openshift-marketplace(da0d36d7-fb62-4254-95b0-fb81dcf372cd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_redhat-marketplace-nwjnw_openshift-marketplace_da0d36d7-fb62-4254-95b0-fb81dcf372cd_0(b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa): error adding pod openshift-marketplace_redhat-marketplace-nwjnw to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa\\\" Netns:\\\"/var/run/netns/74e085a1-fd74-4fbd-bc46-dc2c8e1219b8\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-nwjnw;K8S_POD_INFRA_CONTAINER_ID=b15b03d588c3d2451b9e915396310006bb1355dcb4a18af6f0957e336a9ec8aa;K8S_POD_UID=da0d36d7-fb62-4254-95b0-fb81dcf372cd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-nwjnw] networking: Multus: [openshift-marketplace/redhat-marketplace-nwjnw/da0d36d7-fb62-4254-95b0-fb81dcf372cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nwjnw?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/redhat-marketplace-nwjnw" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" Dec 03 21:52:15.540928 master-0 kubenswrapper[9136]: E1203 21:52:15.540852 9136 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:52:15.540928 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-k98b2_openshift-marketplace_e403ab42-1840-4292-a37c-a8d4feeb54ca_0(842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e): error adding pod openshift-marketplace_community-operators-k98b2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e" Netns:"/var/run/netns/d97d4a1e-eb39-4c21-9edd-e6e9b94014b0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-k98b2;K8S_POD_INFRA_CONTAINER_ID=842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e;K8S_POD_UID=e403ab42-1840-4292-a37c-a8d4feeb54ca" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-k98b2] networking: Multus: [openshift-marketplace/community-operators-k98b2/e403ab42-1840-4292-a37c-a8d4feeb54ca]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-k98b2 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-k98b2 in out of cluster comm: 
status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-k98b2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.540928 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.540928 master-0 kubenswrapper[9136]: > Dec 03 21:52:15.541117 master-0 kubenswrapper[9136]: E1203 21:52:15.540953 9136 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:52:15.541117 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-k98b2_openshift-marketplace_e403ab42-1840-4292-a37c-a8d4feeb54ca_0(842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e): error adding pod openshift-marketplace_community-operators-k98b2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e" Netns:"/var/run/netns/d97d4a1e-eb39-4c21-9edd-e6e9b94014b0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-k98b2;K8S_POD_INFRA_CONTAINER_ID=842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e;K8S_POD_UID=e403ab42-1840-4292-a37c-a8d4feeb54ca" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-k98b2] networking: Multus: [openshift-marketplace/community-operators-k98b2/e403ab42-1840-4292-a37c-a8d4feeb54ca]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-k98b2 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-k98b2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-k98b2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.541117 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.541117 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:52:15.541117 master-0 kubenswrapper[9136]: E1203 21:52:15.540983 9136 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:52:15.541117 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-k98b2_openshift-marketplace_e403ab42-1840-4292-a37c-a8d4feeb54ca_0(842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e): error adding pod openshift-marketplace_community-operators-k98b2 to CNI network "multus-cni-network": plugin type="multus-shim" 
name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e" Netns:"/var/run/netns/d97d4a1e-eb39-4c21-9edd-e6e9b94014b0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-k98b2;K8S_POD_INFRA_CONTAINER_ID=842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e;K8S_POD_UID=e403ab42-1840-4292-a37c-a8d4feeb54ca" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-k98b2] networking: Multus: [openshift-marketplace/community-operators-k98b2/e403ab42-1840-4292-a37c-a8d4feeb54ca]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-k98b2 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-k98b2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-k98b2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.541117 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.541117 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:52:15.541117 master-0 kubenswrapper[9136]: E1203 21:52:15.541063 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-k98b2_openshift-marketplace(e403ab42-1840-4292-a37c-a8d4feeb54ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"community-operators-k98b2_openshift-marketplace(e403ab42-1840-4292-a37c-a8d4feeb54ca)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-k98b2_openshift-marketplace_e403ab42-1840-4292-a37c-a8d4feeb54ca_0(842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e): error adding pod openshift-marketplace_community-operators-k98b2 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e\\\" Netns:\\\"/var/run/netns/d97d4a1e-eb39-4c21-9edd-e6e9b94014b0\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-k98b2;K8S_POD_INFRA_CONTAINER_ID=842fac15f61f779c638c5d50798bcc09b0b6d61cb707efde79d7179b51a1ea6e;K8S_POD_UID=e403ab42-1840-4292-a37c-a8d4feeb54ca\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/community-operators-k98b2] networking: Multus: [openshift-marketplace/community-operators-k98b2/e403ab42-1840-4292-a37c-a8d4feeb54ca]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-k98b2 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-k98b2 in out of cluster comm: status update failed for pod /: Get 
\\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-k98b2?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/community-operators-k98b2" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" Dec 03 21:52:15.553561 master-0 kubenswrapper[9136]: E1203 21:52:15.553482 9136 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:52:15.553561 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager_77e36f4e-845b-4b82-8abc-b634636c087a_0(594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4): error adding pod openshift-operator-lifecycle-manager_packageserver-684c49c488-fpmzc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4" Netns:"/var/run/netns/74dc8a94-bbea-41cd-ad9e-d4d2b3910d25" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-684c49c488-fpmzc;K8S_POD_INFRA_CONTAINER_ID=594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4;K8S_POD_UID=77e36f4e-845b-4b82-8abc-b634636c087a" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc/77e36f4e-845b-4b82-8abc-b634636c087a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-684c49c488-fpmzc?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.553561 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.553561 master-0 kubenswrapper[9136]: > Dec 03 21:52:15.553682 master-0 kubenswrapper[9136]: E1203 21:52:15.553584 9136 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:52:15.553682 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager_77e36f4e-845b-4b82-8abc-b634636c087a_0(594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4): error adding pod openshift-operator-lifecycle-manager_packageserver-684c49c488-fpmzc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4" Netns:"/var/run/netns/74dc8a94-bbea-41cd-ad9e-d4d2b3910d25" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-684c49c488-fpmzc;K8S_POD_INFRA_CONTAINER_ID=594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4;K8S_POD_UID=77e36f4e-845b-4b82-8abc-b634636c087a" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc/77e36f4e-845b-4b82-8abc-b634636c087a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-684c49c488-fpmzc?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.553682 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.553682 master-0 kubenswrapper[9136]: > pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:52:15.553682 master-0 kubenswrapper[9136]: E1203 21:52:15.553612 9136 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:52:15.553682 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager_77e36f4e-845b-4b82-8abc-b634636c087a_0(594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4): error adding pod openshift-operator-lifecycle-manager_packageserver-684c49c488-fpmzc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4" Netns:"/var/run/netns/74dc8a94-bbea-41cd-ad9e-d4d2b3910d25" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-684c49c488-fpmzc;K8S_POD_INFRA_CONTAINER_ID=594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4;K8S_POD_UID=77e36f4e-845b-4b82-8abc-b634636c087a" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc/77e36f4e-845b-4b82-8abc-b634636c087a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to 
update the pod packageserver-684c49c488-fpmzc in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-684c49c488-fpmzc?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:52:15.553682 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:52:15.553682 master-0 kubenswrapper[9136]: > pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:52:15.553895 master-0 kubenswrapper[9136]: E1203 21:52:15.553704 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager(77e36f4e-845b-4b82-8abc-b634636c087a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager(77e36f4e-845b-4b82-8abc-b634636c087a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager_77e36f4e-845b-4b82-8abc-b634636c087a_0(594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4): error adding pod openshift-operator-lifecycle-manager_packageserver-684c49c488-fpmzc to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4\\\" Netns:\\\"/var/run/netns/74dc8a94-bbea-41cd-ad9e-d4d2b3910d25\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-684c49c488-fpmzc;K8S_POD_INFRA_CONTAINER_ID=594de8f1c049a1e991703a0567a1fe7b3bf6a7f28cc8d1bbcaa3beb44eb732b4;K8S_POD_UID=77e36f4e-845b-4b82-8abc-b634636c087a\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc/77e36f4e-845b-4b82-8abc-b634636c087a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-684c49c488-fpmzc?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" podUID="77e36f4e-845b-4b82-8abc-b634636c087a" Dec 03 21:52:15.819058 master-0 kubenswrapper[9136]: I1203 21:52:15.818827 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:52:15.819058 master-0 kubenswrapper[9136]: I1203 21:52:15.818901 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:52:15.819058 master-0 kubenswrapper[9136]: I1203 21:52:15.818953 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:52:15.819431 master-0 kubenswrapper[9136]: I1203 21:52:15.818965 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:52:15.819802 master-0 kubenswrapper[9136]: I1203 21:52:15.819710 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:52:15.820080 master-0 kubenswrapper[9136]: I1203 21:52:15.820027 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:52:15.820156 master-0 kubenswrapper[9136]: I1203 21:52:15.820115 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:52:15.820222 master-0 kubenswrapper[9136]: I1203 21:52:15.820174 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:52:16.068389 master-0 kubenswrapper[9136]: E1203 21:52:16.068328 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Dec 03 21:52:23.872223 master-0 kubenswrapper[9136]: I1203 21:52:23.872182 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/0.log" Dec 03 21:52:23.872869 master-0 kubenswrapper[9136]: I1203 21:52:23.872235 9136 generic.go:334] "Generic (PLEG): container finished" podID="fa9b5917-d4f3-4372-a200-45b57412f92f" containerID="4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1" exitCode=1 Dec 03 21:52:25.926848 master-0 kubenswrapper[9136]: E1203 21:52:25.925122 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:52:25.926848 master-0 kubenswrapper[9136]: E1203 21:52:25.925378 9136 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.018s" Dec 03 21:52:25.926848 master-0 kubenswrapper[9136]: I1203 21:52:25.925412 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerStarted","Data":"14ce5aa72512f80cbae852a84578e1ef46771d28aa2cb1798dd058e96128786a"} Dec 03 21:52:25.926848 master-0 kubenswrapper[9136]: I1203 21:52:25.925562 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" event={"ID":"8b56c318-09b7-47f0-a7bf-32eb96e836ca","Type":"ContainerStarted","Data":"dc9a686d307f5dc3dced989067ef4ced2ce1f8af42bcaa948a56153b4e7018c7"} Dec 03 21:52:25.940365 master-0 kubenswrapper[9136]: I1203 21:52:25.940303 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 21:52:26.222381 master-0 kubenswrapper[9136]: E1203 21:52:26.222077 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:52:16Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:52:16Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:52:16Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:52:16Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4df9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33c
c1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\"],\\\"sizeBytes\\\":462741734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9f4724570795357eb097251a021f20c94c79b3054f3adb3bc0812143ba791dc1\\\"],\\\"sizeBytes\\\":461716546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\\\"],\\\"sizeBytes\\\":459566623},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:607e31ebb2c85f53775455b38a607a68cb2bdab1e369f03c57e715a4ebb88831\\\"],\\\"sizeBytes\\\":458183681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36fa1378b9c26de6d45187b1e7352f3b1147109427fab3669b107d81fd967601\\\"],\\\"sizeBytes\\\":452603646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eac937aae64688cb47b38ad2cbba5aa7e6d41c691df1f3ca4ff81e5117084d1e\\\"],\\\"sizeBytes\\\":451053419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2d41c3e944e86b73b4ba0d037ff016562211988f3206b9deb6cc7dccca708248\\\"],\\\"sizeBytes\\\":450855746}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:52:26.469803 master-0 kubenswrapper[9136]: E1203 21:52:26.469698 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Dec 03 21:52:30.918889 master-0 kubenswrapper[9136]: I1203 21:52:30.918796 9136 generic.go:334] "Generic (PLEG): container finished" podID="578b2d03-b8b3-4c75-adde-73899c472ad7" containerID="69c66e8c33554080901dde3e7449b542159057d7baa15426e32c9d8a63162fa3" exitCode=0 Dec 03 21:52:36.223655 master-0 kubenswrapper[9136]: E1203 21:52:36.223566 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:52:37.271261 master-0 kubenswrapper[9136]: E1203 21:52:37.271104 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Dec 03 21:52:38.237611 master-0 kubenswrapper[9136]: E1203 21:52:38.237391 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{openshift-controller-manager-operator-7c4697b5f5-458zh.187dd2f8e5e6e842 openshift-controller-manager-operator 3729 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager-operator,Name:openshift-controller-manager-operator-7c4697b5f5-458zh,UID:6976b503-87da-48fc-b097-d1b315fbee3f,APIVersion:v1,ResourceVersion:3558,FieldPath:spec.containers{openshift-controller-manager-operator},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:49:55 +0000 UTC,LastTimestamp:2025-12-03 21:51:10.82464133 +0000 UTC m=+77.099817752,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:52:45.010643 master-0 kubenswrapper[9136]: I1203 21:52:45.010439 9136 generic.go:334] "Generic (PLEG): container finished" podID="a4399d20-f9a6-4ab1-86be-e2845394eaba" containerID="82722f9afea36ecc1e8162d281159bb72f29817202c15f48a80157dddaa3525e" exitCode=0 Dec 03 21:52:46.224559 master-0 kubenswrapper[9136]: E1203 21:52:46.224443 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:52:46.949963 master-0 kubenswrapper[9136]: I1203 21:52:46.949845 9136 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-4jd6d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.19:8080/healthz\": dial tcp 10.128.0.19:8080: connect: connection refused" start-of-body= Dec 03 21:52:46.949963 master-0 kubenswrapper[9136]: I1203 21:52:46.949901 9136 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-4jd6d container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.19:8080/healthz\": dial tcp 10.128.0.19:8080: connect: connection refused" start-of-body= Dec 03 21:52:46.949963 master-0 kubenswrapper[9136]: I1203 21:52:46.949952 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" podUID="a4399d20-f9a6-4ab1-86be-e2845394eaba" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.19:8080/healthz\": dial tcp 10.128.0.19:8080: connect: connection refused" Dec 03 21:52:46.950475 master-0 
kubenswrapper[9136]: I1203 21:52:46.949997 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" podUID="a4399d20-f9a6-4ab1-86be-e2845394eaba" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.19:8080/healthz\": dial tcp 10.128.0.19:8080: connect: connection refused" Dec 03 21:52:48.872371 master-0 kubenswrapper[9136]: E1203 21:52:48.872131 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="3.2s" Dec 03 21:52:54.075969 master-0 kubenswrapper[9136]: I1203 21:52:54.075890 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/1.log" Dec 03 21:52:54.077294 master-0 kubenswrapper[9136]: I1203 21:52:54.077227 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/0.log" Dec 03 21:52:54.077397 master-0 kubenswrapper[9136]: I1203 21:52:54.077315 9136 generic.go:334] "Generic (PLEG): container finished" podID="6976b503-87da-48fc-b097-d1b315fbee3f" containerID="ed377a768f3ae617a354166679d4faec90c18a1d576ddc666702319bbce47318" exitCode=255 Dec 03 21:52:54.080159 master-0 kubenswrapper[9136]: I1203 21:52:54.080083 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-hv5z8_6e96335e-1866-41c8-b128-b95e783a9be4/cluster-storage-operator/0.log" Dec 03 21:52:54.080330 master-0 kubenswrapper[9136]: I1203 21:52:54.080158 9136 generic.go:334] "Generic (PLEG): container finished" podID="6e96335e-1866-41c8-b128-b95e783a9be4" containerID="2e3a2cbe5d1d0f700f443f5a9e3ce8b40d65f0655e3c96ce54030b35eb469f7b" exitCode=255 Dec 03 21:52:55.090199 master-0 kubenswrapper[9136]: I1203 21:52:55.089978 9136 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="fdc23b60da9292f04f004b2100cec48c57024e9040275eb930fc44673f8ac8bd" exitCode=1 Dec 03 21:52:55.092608 master-0 kubenswrapper[9136]: I1203 21:52:55.092557 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/0.log" Dec 03 21:52:55.092709 master-0 kubenswrapper[9136]: I1203 21:52:55.092612 9136 generic.go:334] "Generic (PLEG): container finished" podID="0869de9b-6f5b-4c31-81ad-02a9c8888193" containerID="6c63c3c05a429e1c2a2724aa6046ef16e3d07844020b76ab9c6e4dd4aedf14d1" exitCode=1 Dec 03 21:52:56.225136 master-0 kubenswrapper[9136]: E1203 21:52:56.225009 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:52:56.949940 master-0 kubenswrapper[9136]: I1203 21:52:56.949881 9136 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-4jd6d container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get 
\"http://10.128.0.19:8080/healthz\": dial tcp 10.128.0.19:8080: connect: connection refused" start-of-body= Dec 03 21:52:56.950275 master-0 kubenswrapper[9136]: I1203 21:52:56.950244 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" podUID="a4399d20-f9a6-4ab1-86be-e2845394eaba" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.19:8080/healthz\": dial tcp 10.128.0.19:8080: connect: connection refused" Dec 03 21:52:56.950361 master-0 kubenswrapper[9136]: I1203 21:52:56.949970 9136 patch_prober.go:28] interesting pod/marketplace-operator-7d67745bb7-4jd6d container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.19:8080/healthz\": dial tcp 10.128.0.19:8080: connect: connection refused" start-of-body= Dec 03 21:52:56.950433 master-0 kubenswrapper[9136]: I1203 21:52:56.950419 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" podUID="a4399d20-f9a6-4ab1-86be-e2845394eaba" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.19:8080/healthz\": dial tcp 10.128.0.19:8080: connect: connection refused" Dec 03 21:52:59.943721 master-0 kubenswrapper[9136]: E1203 21:52:59.943630 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:52:59.944909 master-0 kubenswrapper[9136]: E1203 21:52:59.943954 9136 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.018s" Dec 03 21:52:59.945060 master-0 kubenswrapper[9136]: I1203 21:52:59.945007 9136 scope.go:117] "RemoveContainer" containerID="f8f51cc4e951397e53befcebae88ea2052970977c20019cfe472b3c9dcc46778" Dec 03 21:52:59.946042 master-0 kubenswrapper[9136]: I1203 21:52:59.945985 9136 scope.go:117] "RemoveContainer" containerID="2e3a2cbe5d1d0f700f443f5a9e3ce8b40d65f0655e3c96ce54030b35eb469f7b" Dec 03 21:52:59.946180 master-0 kubenswrapper[9136]: I1203 21:52:59.946122 9136 scope.go:117] "RemoveContainer" containerID="69c66e8c33554080901dde3e7449b542159057d7baa15426e32c9d8a63162fa3" Dec 03 21:52:59.949176 master-0 kubenswrapper[9136]: I1203 21:52:59.949126 9136 scope.go:117] "RemoveContainer" containerID="4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1" Dec 03 21:52:59.951193 master-0 kubenswrapper[9136]: I1203 21:52:59.951138 9136 scope.go:117] "RemoveContainer" containerID="82722f9afea36ecc1e8162d281159bb72f29817202c15f48a80157dddaa3525e" Dec 03 21:52:59.951988 master-0 kubenswrapper[9136]: I1203 21:52:59.951929 9136 scope.go:117] "RemoveContainer" containerID="7abe60a608c2243a0171688d4d7d094c30c03b0afcc0cdcafa8f846a9926432b" Dec 03 21:52:59.952262 master-0 kubenswrapper[9136]: I1203 21:52:59.952194 9136 scope.go:117] "RemoveContainer" containerID="ffe4f10866cf1bc36713ebdb04a86f2cd5ff92ba34f253339cffa02ccd5a5e66" Dec 03 21:52:59.958715 master-0 kubenswrapper[9136]: I1203 21:52:59.958661 9136 scope.go:117] "RemoveContainer" containerID="ed377a768f3ae617a354166679d4faec90c18a1d576ddc666702319bbce47318" Dec 03 21:52:59.958895 master-0 kubenswrapper[9136]: I1203 21:52:59.958854 9136 scope.go:117] "RemoveContainer" containerID="6c63c3c05a429e1c2a2724aa6046ef16e3d07844020b76ab9c6e4dd4aedf14d1" Dec 03 21:52:59.959253 master-0 kubenswrapper[9136]: E1203 
21:52:59.959073 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator(6976b503-87da-48fc-b097-d1b315fbee3f)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" podUID="6976b503-87da-48fc-b097-d1b315fbee3f" Dec 03 21:52:59.960537 master-0 kubenswrapper[9136]: I1203 21:52:59.960482 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 21:53:00.420234 master-0 kubenswrapper[9136]: I1203 21:53:00.420184 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_0269ada6-cb6e-4c98-bd24-752ae0286498/installer/0.log" Dec 03 21:53:00.420404 master-0 kubenswrapper[9136]: I1203 21:53:00.420309 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:53:00.574525 master-0 kubenswrapper[9136]: I1203 21:53:00.574384 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-var-lock\") pod \"0269ada6-cb6e-4c98-bd24-752ae0286498\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " Dec 03 21:53:00.574525 master-0 kubenswrapper[9136]: I1203 21:53:00.574467 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-kubelet-dir\") pod \"0269ada6-cb6e-4c98-bd24-752ae0286498\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " Dec 03 21:53:00.574525 master-0 kubenswrapper[9136]: I1203 21:53:00.574506 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0269ada6-cb6e-4c98-bd24-752ae0286498-kube-api-access\") pod \"0269ada6-cb6e-4c98-bd24-752ae0286498\" (UID: \"0269ada6-cb6e-4c98-bd24-752ae0286498\") " Dec 03 21:53:00.574873 master-0 kubenswrapper[9136]: I1203 21:53:00.574582 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-var-lock" (OuterVolumeSpecName: "var-lock") pod "0269ada6-cb6e-4c98-bd24-752ae0286498" (UID: "0269ada6-cb6e-4c98-bd24-752ae0286498"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:53:00.574873 master-0 kubenswrapper[9136]: I1203 21:53:00.574587 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0269ada6-cb6e-4c98-bd24-752ae0286498" (UID: "0269ada6-cb6e-4c98-bd24-752ae0286498"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:53:00.574873 master-0 kubenswrapper[9136]: I1203 21:53:00.574820 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 21:53:00.574873 master-0 kubenswrapper[9136]: I1203 21:53:00.574836 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0269ada6-cb6e-4c98-bd24-752ae0286498-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 21:53:00.941805 master-0 kubenswrapper[9136]: I1203 21:53:00.941485 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0269ada6-cb6e-4c98-bd24-752ae0286498-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0269ada6-cb6e-4c98-bd24-752ae0286498" (UID: "0269ada6-cb6e-4c98-bd24-752ae0286498"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:53:00.980868 master-0 kubenswrapper[9136]: I1203 21:53:00.980760 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0269ada6-cb6e-4c98-bd24-752ae0286498-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 21:53:01.134103 master-0 kubenswrapper[9136]: I1203 21:53:01.134032 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_0269ada6-cb6e-4c98-bd24-752ae0286498/installer/0.log" Dec 03 21:53:01.134393 master-0 kubenswrapper[9136]: I1203 21:53:01.134199 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 21:53:01.146928 master-0 kubenswrapper[9136]: I1203 21:53:01.146878 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/0.log" Dec 03 21:53:01.150980 master-0 kubenswrapper[9136]: I1203 21:53:01.150936 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-hv5z8_6e96335e-1866-41c8-b128-b95e783a9be4/cluster-storage-operator/0.log" Dec 03 21:53:01.158388 master-0 kubenswrapper[9136]: I1203 21:53:01.158357 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/0.log" Dec 03 21:53:02.073191 master-0 kubenswrapper[9136]: E1203 21:53:02.073095 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 03 21:53:04.916820 master-0 kubenswrapper[9136]: I1203 21:53:04.916674 9136 patch_prober.go:28] interesting pod/operator-controller-controller-manager-5f78c89466-kz8nk container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.32:8081/readyz\": dial tcp 10.128.0.32:8081: connect: connection refused" start-of-body= Dec 03 21:53:04.916820 master-0 kubenswrapper[9136]: I1203 21:53:04.916760 9136 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" podUID="da949cf7-ab12-43ff-8e45-da1c2fd46e20" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.32:8081/readyz\": dial tcp 10.128.0.32:8081: connect: connection refused" Dec 03 21:53:05.189432 master-0 kubenswrapper[9136]: I1203 21:53:05.189380 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/0.log" Dec 03 21:53:05.190202 master-0 kubenswrapper[9136]: I1203 21:53:05.190130 9136 generic.go:334] "Generic (PLEG): container finished" podID="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" containerID="0f6a85e4a73afc226173f2b4c67fa571e667d3c81985c4b5d23669be9018152c" exitCode=1 Dec 03 21:53:05.193237 master-0 kubenswrapper[9136]: I1203 21:53:05.193199 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-kz8nk_da949cf7-ab12-43ff-8e45-da1c2fd46e20/manager/0.log" Dec 03 21:53:05.193304 master-0 kubenswrapper[9136]: I1203 21:53:05.193246 9136 generic.go:334] "Generic (PLEG): container finished" podID="da949cf7-ab12-43ff-8e45-da1c2fd46e20" containerID="5ae5503bf1205bc663cc1204fe09557a6737e824df9d6d84a86d54a184a45a47" exitCode=1 Dec 03 21:53:06.203147 master-0 kubenswrapper[9136]: I1203 21:53:06.203057 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/0.log" Dec 03 21:53:06.204156 master-0 kubenswrapper[9136]: I1203 21:53:06.203146 9136 generic.go:334] "Generic (PLEG): container finished" podID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" containerID="12901db2ae1fd13e3c0aef0f6572f021784e96590f0cc7bfafbbf5cbabb162c5" exitCode=1 Dec 03 21:53:06.225455 master-0 kubenswrapper[9136]: E1203 21:53:06.225362 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:53:06.225455 master-0 kubenswrapper[9136]: E1203 21:53:06.225418 9136 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 21:53:08.156632 master-0 kubenswrapper[9136]: I1203 21:53:08.156498 9136 status_manager.go:851] "Failed to get status for pod" podUID="4c2364d3-47b2-4784-9c42-76bf2547b797" pod="openshift-kube-apiserver/installer-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-1-master-0)" Dec 03 21:53:12.240591 master-0 kubenswrapper[9136]: E1203 21:53:12.240345 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{openshift-config-operator-68c95b6cf5-2cs5d.187dd30a848c6717 openshift-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-config-operator,Name:openshift-config-operator-68c95b6cf5-2cs5d,UID:e50b85a6-7767-4fca-8133-8243bdd85e5d,APIVersion:v1,ResourceVersion:8641,FieldPath:spec.initContainers{openshift-api},},Reason:Pulled,Message:Successfully pulled image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e0e3400f1cb68a205bfb841b6b1a78045e7d80703830aa64979d46418d19c835\" in 19.402s (19.402s including waiting). Image size: 433128028 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:51:11.126030103 +0000 UTC m=+77.401206525,LastTimestamp:2025-12-03 21:51:11.126030103 +0000 UTC m=+77.401206525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:53:12.369881 master-0 kubenswrapper[9136]: I1203 21:53:12.369752 9136 patch_prober.go:28] interesting pod/catalogd-controller-manager-754cfd84-bnstw container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.31:8081/readyz\": dial tcp 10.128.0.31:8081: connect: connection refused" start-of-body= Dec 03 21:53:12.370316 master-0 kubenswrapper[9136]: I1203 21:53:12.370265 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" podUID="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.31:8081/readyz\": dial tcp 10.128.0.31:8081: connect: connection refused" Dec 03 21:53:14.137286 master-0 kubenswrapper[9136]: I1203 21:53:14.137213 9136 patch_prober.go:28] interesting pod/etcd-operator-7978bf889c-w8hsm container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.20:8443/healthz\": net/http: TLS handshake timeout" start-of-body= Dec 03 21:53:14.137990 master-0 kubenswrapper[9136]: I1203 21:53:14.137310 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.20:8443/healthz\": net/http: TLS handshake timeout" Dec 03 21:53:14.916611 master-0 kubenswrapper[9136]: I1203 21:53:14.916549 9136 patch_prober.go:28] interesting pod/operator-controller-controller-manager-5f78c89466-kz8nk container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.32:8081/readyz\": dial tcp 10.128.0.32:8081: connect: connection refused" start-of-body= Dec 03 21:53:14.916997 master-0 kubenswrapper[9136]: I1203 21:53:14.916932 9136 patch_prober.go:28] interesting pod/operator-controller-controller-manager-5f78c89466-kz8nk container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.32:8081/healthz\": dial tcp 10.128.0.32:8081: connect: connection refused" start-of-body= Dec 03 21:53:14.917100 master-0 kubenswrapper[9136]: I1203 21:53:14.917025 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" podUID="da949cf7-ab12-43ff-8e45-da1c2fd46e20" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.32:8081/healthz\": dial tcp 10.128.0.32:8081: connect: connection refused" Dec 03 21:53:14.917230 master-0 kubenswrapper[9136]: I1203 21:53:14.917176 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" podUID="da949cf7-ab12-43ff-8e45-da1c2fd46e20" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.32:8081/readyz\": dial tcp 
10.128.0.32:8081: connect: connection refused" Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: E1203 21:53:16.850598 9136 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-k98b2_openshift-marketplace_e403ab42-1840-4292-a37c-a8d4feeb54ca_0(8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35): error adding pod openshift-marketplace_community-operators-k98b2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35" Netns:"/var/run/netns/df9411d1-1744-4b01-966d-b85e087a91ed" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-k98b2;K8S_POD_INFRA_CONTAINER_ID=8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35;K8S_POD_UID=e403ab42-1840-4292-a37c-a8d4feeb54ca" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-k98b2] networking: Multus: [openshift-marketplace/community-operators-k98b2/e403ab42-1840-4292-a37c-a8d4feeb54ca]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-k98b2 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-k98b2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-k98b2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: > Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: E1203 21:53:16.850691 9136 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-k98b2_openshift-marketplace_e403ab42-1840-4292-a37c-a8d4feeb54ca_0(8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35): error adding pod openshift-marketplace_community-operators-k98b2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35" Netns:"/var/run/netns/df9411d1-1744-4b01-966d-b85e087a91ed" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-k98b2;K8S_POD_INFRA_CONTAINER_ID=8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35;K8S_POD_UID=e403ab42-1840-4292-a37c-a8d4feeb54ca" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-k98b2] networking: Multus: [openshift-marketplace/community-operators-k98b2/e403ab42-1840-4292-a37c-a8d4feeb54ca]: error setting the networks status: SetPodNetworkStatusAnnotation: 
failed to update the pod community-operators-k98b2 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-k98b2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-k98b2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: E1203 21:53:16.850722 9136 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-k98b2_openshift-marketplace_e403ab42-1840-4292-a37c-a8d4feeb54ca_0(8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35): error adding pod openshift-marketplace_community-operators-k98b2 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35" Netns:"/var/run/netns/df9411d1-1744-4b01-966d-b85e087a91ed" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-k98b2;K8S_POD_INFRA_CONTAINER_ID=8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35;K8S_POD_UID=e403ab42-1840-4292-a37c-a8d4feeb54ca" Path:"" ERRORED: error configuring pod [openshift-marketplace/community-operators-k98b2] networking: Multus: [openshift-marketplace/community-operators-k98b2/e403ab42-1840-4292-a37c-a8d4feeb54ca]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-k98b2 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-k98b2 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-k98b2?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:53:16.851519 master-0 kubenswrapper[9136]: E1203 21:53:16.850831 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"community-operators-k98b2_openshift-marketplace(e403ab42-1840-4292-a37c-a8d4feeb54ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"community-operators-k98b2_openshift-marketplace(e403ab42-1840-4292-a37c-a8d4feeb54ca)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_community-operators-k98b2_openshift-marketplace_e403ab42-1840-4292-a37c-a8d4feeb54ca_0(8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35): error adding pod openshift-marketplace_community-operators-k98b2 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35\\\" Netns:\\\"/var/run/netns/df9411d1-1744-4b01-966d-b85e087a91ed\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=community-operators-k98b2;K8S_POD_INFRA_CONTAINER_ID=8484bf2edb5826be30790e806854c56a75bdc9410c62276c9936f76d1723cd35;K8S_POD_UID=e403ab42-1840-4292-a37c-a8d4feeb54ca\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/community-operators-k98b2] networking: Multus: [openshift-marketplace/community-operators-k98b2/e403ab42-1840-4292-a37c-a8d4feeb54ca]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod community-operators-k98b2 in out of cluster comm: SetNetworkStatus: failed to update the pod community-operators-k98b2 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-k98b2?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/community-operators-k98b2" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" Dec 03 21:53:16.863416 master-0 kubenswrapper[9136]: E1203 21:53:16.863362 9136 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:53:16.863416 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager_77e36f4e-845b-4b82-8abc-b634636c087a_0(d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb): error adding pod openshift-operator-lifecycle-manager_packageserver-684c49c488-fpmzc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb" Netns:"/var/run/netns/081937ec-9656-47d9-9d60-a7b59ad73689" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-684c49c488-fpmzc;K8S_POD_INFRA_CONTAINER_ID=d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb;K8S_POD_UID=77e36f4e-845b-4b82-8abc-b634636c087a" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc] networking: Multus: 
[openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc/77e36f4e-845b-4b82-8abc-b634636c087a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-684c49c488-fpmzc?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:53:16.863416 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.863416 master-0 kubenswrapper[9136]: > Dec 03 21:53:16.863632 master-0 kubenswrapper[9136]: E1203 21:53:16.863442 9136 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:53:16.863632 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager_77e36f4e-845b-4b82-8abc-b634636c087a_0(d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb): error adding pod openshift-operator-lifecycle-manager_packageserver-684c49c488-fpmzc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb" Netns:"/var/run/netns/081937ec-9656-47d9-9d60-a7b59ad73689" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-684c49c488-fpmzc;K8S_POD_INFRA_CONTAINER_ID=d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb;K8S_POD_UID=77e36f4e-845b-4b82-8abc-b634636c087a" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc/77e36f4e-845b-4b82-8abc-b634636c087a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-684c49c488-fpmzc?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:53:16.863632 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.863632 master-0 kubenswrapper[9136]: > pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:53:16.863632 master-0 
kubenswrapper[9136]: E1203 21:53:16.863466 9136 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:53:16.863632 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager_77e36f4e-845b-4b82-8abc-b634636c087a_0(d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb): error adding pod openshift-operator-lifecycle-manager_packageserver-684c49c488-fpmzc to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb" Netns:"/var/run/netns/081937ec-9656-47d9-9d60-a7b59ad73689" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-684c49c488-fpmzc;K8S_POD_INFRA_CONTAINER_ID=d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb;K8S_POD_UID=77e36f4e-845b-4b82-8abc-b634636c087a" Path:"" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc/77e36f4e-845b-4b82-8abc-b634636c087a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-684c49c488-fpmzc?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:53:16.863632 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.863632 master-0 kubenswrapper[9136]: > pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:53:16.863632 master-0 kubenswrapper[9136]: E1203 21:53:16.863551 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager(77e36f4e-845b-4b82-8abc-b634636c087a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager(77e36f4e-845b-4b82-8abc-b634636c087a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_packageserver-684c49c488-fpmzc_openshift-operator-lifecycle-manager_77e36f4e-845b-4b82-8abc-b634636c087a_0(d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb): error adding pod openshift-operator-lifecycle-manager_packageserver-684c49c488-fpmzc to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb\\\" Netns:\\\"/var/run/netns/081937ec-9656-47d9-9d60-a7b59ad73689\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-operator-lifecycle-manager;K8S_POD_NAME=packageserver-684c49c488-fpmzc;K8S_POD_INFRA_CONTAINER_ID=d15562a60680493e877bb0400d04b70c08f8f2a458a63531880b5b05adfd68bb;K8S_POD_UID=77e36f4e-845b-4b82-8abc-b634636c087a\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc] networking: Multus: [openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc/77e36f4e-845b-4b82-8abc-b634636c087a]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: SetNetworkStatus: failed to update the pod packageserver-684c49c488-fpmzc in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/pods/packageserver-684c49c488-fpmzc?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" podUID="77e36f4e-845b-4b82-8abc-b634636c087a" Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: E1203 21:53:16.872640 9136 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-kp794_openshift-marketplace_0a49c320-f31d-4f6d-98c3-48d24346b873_0(695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1): error adding pod openshift-marketplace_certified-operators-kp794 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1" Netns:"/var/run/netns/6c90e29a-fdaa-44e8-a962-91009b76c567" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-kp794;K8S_POD_INFRA_CONTAINER_ID=695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1;K8S_POD_UID=0a49c320-f31d-4f6d-98c3-48d24346b873" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-kp794] networking: Multus: [openshift-marketplace/certified-operators-kp794/0a49c320-f31d-4f6d-98c3-48d24346b873]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-kp794 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-kp794 in out of cluster comm: status update failed for pod /: the server was unable to return a response in the time allotted, but may still be processing the request (get pods certified-operators-kp794) Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: > Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: E1203 21:53:16.872710 9136 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-kp794_openshift-marketplace_0a49c320-f31d-4f6d-98c3-48d24346b873_0(695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1): error adding pod openshift-marketplace_certified-operators-kp794 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1" Netns:"/var/run/netns/6c90e29a-fdaa-44e8-a962-91009b76c567" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-kp794;K8S_POD_INFRA_CONTAINER_ID=695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1;K8S_POD_UID=0a49c320-f31d-4f6d-98c3-48d24346b873" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-kp794] networking: Multus: [openshift-marketplace/certified-operators-kp794/0a49c320-f31d-4f6d-98c3-48d24346b873]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-kp794 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-kp794 in out of cluster comm: status update failed for pod /: the server was unable to return a response in the time allotted, but may still be processing the request (get pods certified-operators-kp794) Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: E1203 21:53:16.872735 9136 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-kp794_openshift-marketplace_0a49c320-f31d-4f6d-98c3-48d24346b873_0(695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1): error adding pod openshift-marketplace_certified-operators-kp794 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1" Netns:"/var/run/netns/6c90e29a-fdaa-44e8-a962-91009b76c567" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-kp794;K8S_POD_INFRA_CONTAINER_ID=695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1;K8S_POD_UID=0a49c320-f31d-4f6d-98c3-48d24346b873" Path:"" ERRORED: error configuring pod [openshift-marketplace/certified-operators-kp794] networking: Multus: [openshift-marketplace/certified-operators-kp794/0a49c320-f31d-4f6d-98c3-48d24346b873]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-kp794 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-kp794 in out of cluster comm: status update failed for pod /: the server was unable to return a response in the time allotted, but may still be processing the request (get pods certified-operators-kp794) Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:53:16.872897 master-0 kubenswrapper[9136]: E1203 21:53:16.872833 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-kp794_openshift-marketplace(0a49c320-f31d-4f6d-98c3-48d24346b873)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-kp794_openshift-marketplace(0a49c320-f31d-4f6d-98c3-48d24346b873)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-kp794_openshift-marketplace_0a49c320-f31d-4f6d-98c3-48d24346b873_0(695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1): error adding pod openshift-marketplace_certified-operators-kp794 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1\\\" Netns:\\\"/var/run/netns/6c90e29a-fdaa-44e8-a962-91009b76c567\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=certified-operators-kp794;K8S_POD_INFRA_CONTAINER_ID=695e96a4204e01e876ce4a61ff1cd8b3c37c01e75b6ba45f086fcca48c8488e1;K8S_POD_UID=0a49c320-f31d-4f6d-98c3-48d24346b873\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/certified-operators-kp794] networking: Multus: [openshift-marketplace/certified-operators-kp794/0a49c320-f31d-4f6d-98c3-48d24346b873]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod certified-operators-kp794 in out of cluster comm: SetNetworkStatus: failed to update the pod certified-operators-kp794 in out of cluster comm: status update failed for pod /: the server was unable to return a response in the time allotted, but may still be processing the request (get pods certified-operators-kp794)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/certified-operators-kp794" podUID="0a49c320-f31d-4f6d-98c3-48d24346b873" Dec 03 21:53:16.878289 master-0 kubenswrapper[9136]: E1203 21:53:16.878216 9136 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 21:53:16.878289 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-nwjnw_openshift-marketplace_da0d36d7-fb62-4254-95b0-fb81dcf372cd_0(89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba): error adding pod openshift-marketplace_redhat-marketplace-nwjnw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba" Netns:"/var/run/netns/41e41613-b144-4804-8dec-b860faadd853" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-nwjnw;K8S_POD_INFRA_CONTAINER_ID=89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba;K8S_POD_UID=da0d36d7-fb62-4254-95b0-fb81dcf372cd" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-nwjnw] networking: Multus: [openshift-marketplace/redhat-marketplace-nwjnw/da0d36d7-fb62-4254-95b0-fb81dcf372cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nwjnw?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:53:16.878289 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.878289 master-0 kubenswrapper[9136]: > Dec 03 21:53:16.878414 master-0 kubenswrapper[9136]: E1203 21:53:16.878324 9136 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 21:53:16.878414 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-nwjnw_openshift-marketplace_da0d36d7-fb62-4254-95b0-fb81dcf372cd_0(89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba): error adding pod openshift-marketplace_redhat-marketplace-nwjnw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba" Netns:"/var/run/netns/41e41613-b144-4804-8dec-b860faadd853" 
IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-nwjnw;K8S_POD_INFRA_CONTAINER_ID=89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba;K8S_POD_UID=da0d36d7-fb62-4254-95b0-fb81dcf372cd" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-nwjnw] networking: Multus: [openshift-marketplace/redhat-marketplace-nwjnw/da0d36d7-fb62-4254-95b0-fb81dcf372cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nwjnw?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:53:16.878414 master-0 kubenswrapper[9136]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.878414 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:53:16.878414 master-0 kubenswrapper[9136]: E1203 21:53:16.878356 9136 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 21:53:16.878414 master-0 kubenswrapper[9136]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-nwjnw_openshift-marketplace_da0d36d7-fb62-4254-95b0-fb81dcf372cd_0(89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba): error adding pod openshift-marketplace_redhat-marketplace-nwjnw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba" Netns:"/var/run/netns/41e41613-b144-4804-8dec-b860faadd853" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-nwjnw;K8S_POD_INFRA_CONTAINER_ID=89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba;K8S_POD_UID=da0d36d7-fb62-4254-95b0-fb81dcf372cd" Path:"" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-nwjnw] networking: Multus: [openshift-marketplace/redhat-marketplace-nwjnw/da0d36d7-fb62-4254-95b0-fb81dcf372cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nwjnw?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Dec 03 21:53:16.878414 master-0 kubenswrapper[9136]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 21:53:16.878414 master-0 kubenswrapper[9136]: > pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:53:16.878655 master-0 kubenswrapper[9136]: E1203 21:53:16.878459 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"redhat-marketplace-nwjnw_openshift-marketplace(da0d36d7-fb62-4254-95b0-fb81dcf372cd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"redhat-marketplace-nwjnw_openshift-marketplace(da0d36d7-fb62-4254-95b0-fb81dcf372cd)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_redhat-marketplace-nwjnw_openshift-marketplace_da0d36d7-fb62-4254-95b0-fb81dcf372cd_0(89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba): error adding pod openshift-marketplace_redhat-marketplace-nwjnw to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba\\\" Netns:\\\"/var/run/netns/41e41613-b144-4804-8dec-b860faadd853\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=redhat-marketplace-nwjnw;K8S_POD_INFRA_CONTAINER_ID=89dd8eee130b2148c259d4b405b49c7b390bb6ce487e43c845198e9464eac5ba;K8S_POD_UID=da0d36d7-fb62-4254-95b0-fb81dcf372cd\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/redhat-marketplace-nwjnw] networking: Multus: [openshift-marketplace/redhat-marketplace-nwjnw/da0d36d7-fb62-4254-95b0-fb81dcf372cd]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: SetNetworkStatus: failed to update the pod redhat-marketplace-nwjnw in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nwjnw?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/redhat-marketplace-nwjnw" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" Dec 03 21:53:17.281639 master-0 kubenswrapper[9136]: I1203 21:53:17.281496 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:53:17.281639 master-0 kubenswrapper[9136]: I1203 21:53:17.281573 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:53:17.281639 master-0 kubenswrapper[9136]: I1203 21:53:17.281520 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:53:17.282356 master-0 kubenswrapper[9136]: I1203 21:53:17.281520 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:53:17.282707 master-0 kubenswrapper[9136]: I1203 21:53:17.282529 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:53:17.282918 master-0 kubenswrapper[9136]: I1203 21:53:17.282809 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:53:17.282918 master-0 kubenswrapper[9136]: I1203 21:53:17.282532 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:53:17.283119 master-0 kubenswrapper[9136]: I1203 21:53:17.282979 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:53:18.474757 master-0 kubenswrapper[9136]: E1203 21:53:18.474591 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 21:53:22.322615 master-0 kubenswrapper[9136]: I1203 21:53:22.322529 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/1.log" Dec 03 21:53:22.323731 master-0 kubenswrapper[9136]: I1203 21:53:22.323646 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/0.log" Dec 03 21:53:22.324000 master-0 kubenswrapper[9136]: I1203 21:53:22.323722 9136 generic.go:334] "Generic (PLEG): container finished" podID="892d5611-debf-402f-abc5-3f99aa080159" containerID="db4f2041dbd606d645ef97beb6cb4387e780516338543969beb7168c5691d4f3" exitCode=255 Dec 03 21:53:22.369730 master-0 kubenswrapper[9136]: I1203 21:53:22.369550 9136 patch_prober.go:28] interesting pod/catalogd-controller-manager-754cfd84-bnstw container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.31:8081/healthz\": dial tcp 10.128.0.31:8081: connect: connection refused" start-of-body= Dec 03 21:53:22.369730 master-0 kubenswrapper[9136]: I1203 21:53:22.369642 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" podUID="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.31:8081/healthz\": dial tcp 10.128.0.31:8081: connect: connection refused" Dec 03 21:53:22.369994 master-0 kubenswrapper[9136]: I1203 21:53:22.369873 9136 patch_prober.go:28] interesting pod/catalogd-controller-manager-754cfd84-bnstw container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.31:8081/readyz\": dial 
tcp 10.128.0.31:8081: connect: connection refused" start-of-body= Dec 03 21:53:22.370065 master-0 kubenswrapper[9136]: I1203 21:53:22.369982 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" podUID="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.31:8081/readyz\": dial tcp 10.128.0.31:8081: connect: connection refused" Dec 03 21:53:24.340000 master-0 kubenswrapper[9136]: I1203 21:53:24.339843 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm_82055cfc-b4ce-4a00-a51d-141059947693/etcd-operator/1.log" Dec 03 21:53:24.340693 master-0 kubenswrapper[9136]: I1203 21:53:24.340619 9136 generic.go:334] "Generic (PLEG): container finished" podID="82055cfc-b4ce-4a00-a51d-141059947693" containerID="14ce5aa72512f80cbae852a84578e1ef46771d28aa2cb1798dd058e96128786a" exitCode=255 Dec 03 21:53:24.915516 master-0 kubenswrapper[9136]: I1203 21:53:24.915430 9136 patch_prober.go:28] interesting pod/operator-controller-controller-manager-5f78c89466-kz8nk container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.32:8081/readyz\": dial tcp 10.128.0.32:8081: connect: connection refused" start-of-body= Dec 03 21:53:24.915516 master-0 kubenswrapper[9136]: I1203 21:53:24.915521 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" podUID="da949cf7-ab12-43ff-8e45-da1c2fd46e20" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.32:8081/readyz\": dial tcp 10.128.0.32:8081: connect: connection refused" Dec 03 21:53:25.349171 master-0 kubenswrapper[9136]: I1203 21:53:25.349061 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh_c8da5d44-680e-4169-abc6-607bdc37a64d/cluster-olm-operator/1.log" Dec 03 21:53:25.350337 master-0 kubenswrapper[9136]: I1203 21:53:25.350279 9136 generic.go:334] "Generic (PLEG): container finished" podID="c8da5d44-680e-4169-abc6-607bdc37a64d" containerID="d544b409fd7217f0540c9141cdb2568edb09ef4b3389d770b5af4b42fbce8988" exitCode=255 Dec 03 21:53:25.352105 master-0 kubenswrapper[9136]: I1203 21:53:25.352061 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-6mvwr_c1ee4db7-f2d3-4064-a189-f66fd0a021eb/kube-scheduler-operator-container/1.log" Dec 03 21:53:25.352456 master-0 kubenswrapper[9136]: I1203 21:53:25.352413 9136 generic.go:334] "Generic (PLEG): container finished" podID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" containerID="b5c51e812810828f3868a9b23aec61a5698eeb487f28ca92e49346f8d56c307a" exitCode=255 Dec 03 21:53:25.353899 master-0 kubenswrapper[9136]: I1203 21:53:25.353858 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh_fdfbaebe-d655-4c1e-a039-08802c5c35c5/kube-controller-manager-operator/1.log" Dec 03 21:53:25.354344 master-0 kubenswrapper[9136]: I1203 21:53:25.354291 9136 generic.go:334] "Generic (PLEG): container finished" podID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" containerID="ac757c18056258c4d53cfe54ce6d097e1c1edb7723c2d58dc025a8fed4ae755c" exitCode=255 Dec 03 21:53:25.356233 master-0 kubenswrapper[9136]: I1203 21:53:25.356177 
9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-vcd7b_50076985-bbaa-4bcf-9d1a-cc25bed016a7/kube-storage-version-migrator-operator/1.log" Dec 03 21:53:25.356820 master-0 kubenswrapper[9136]: I1203 21:53:25.356732 9136 generic.go:334] "Generic (PLEG): container finished" podID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" containerID="8820c33df939aa1daa4c748a1b14d13d60d4ea6314f7234562766dc5cbdcadf8" exitCode=255 Dec 03 21:53:26.280368 master-0 kubenswrapper[9136]: I1203 21:53:26.280254 9136 patch_prober.go:28] interesting pod/machine-config-daemon-j9wwr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 21:53:26.280368 master-0 kubenswrapper[9136]: I1203 21:53:26.280333 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" podUID="bd18a700-53b2-430c-a34f-dbb6331cfbe5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 21:53:26.307795 master-0 kubenswrapper[9136]: E1203 21:53:26.307177 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:53:16Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:53:16Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:53:16Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T21:53:16Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a17e9d83aeb6de5f0851aaacd1a9ebddbc8a4ac3ece2e4af8670aa0c33b8fc9c\\\"],\\\"sizeBytes\\\":1631769045},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98ce2d349f8bc693d76d9a68097b758b987cf17ea3beb66bbd09d12fa78b4d0c\\\"],\\\"sizeBytes\\\":1232076476},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a31af646ce5587c051459a88df413dc30be81e7f15399aa909e19effa7de772a\\\"],\\\"sizeBytes\\\":983731853},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\"],\\\"sizeBytes\\\":938321573},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dfc0403f71f7c926db1084c7fb5fb4f19007271213ee34f6f3d3eecdbe817d6b\\\"],\\\"sizeBytes\\\":874839630},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e8f313372fe49afad871cc56225dcd4d31bed249abeab55fb288e1f854138fbf\\\"],\\\"sizeBytes\\\":870581225},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc72da7f7930eb09abf6f8dbe577bb537e3a2a59dc0e14a4319b42c0212218d1\\\"],\\\"sizeBytes\\\":857083855},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f8a38d71a75c4fa803249cc709d60039d14878e218afd88a86083526ee8f78ad\\\"],\\\"sizeBytes\\\":856674149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0e5c0acdd03dc840d7345ae397feaf6147a
32a8fef89a0ac2ddc8d14b068c9ff\\\"],\\\"sizeBytes\\\":767313881},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:184239929f74bb7c56c1cf5b94b5f91dd4013a87034fe04b9fa1027d2bb6c5a4\\\"],\\\"sizeBytes\\\":682385666},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0d866f93bed16cfebd8019ad6b89a4dd4abedfc20ee5d28d7edad045e7df0fda\\\"],\\\"sizeBytes\\\":677540255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b99ce0f31213291444482af4af36345dc93acdbe965868073e8232797b8a2f14\\\"],\\\"sizeBytes\\\":672854011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff94e909d3b037c815e8ae67989a7616936e67195b758abac6b5d3f0d59562c8\\\"],\\\"sizeBytes\\\":616123373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:da6f62afd2795d1b0af69532a5534c099bbb81d4e7abd2616b374db191552c51\\\"],\\\"sizeBytes\\\":583850203},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:51a4c20765f54b6a6b5513f97cf54bb99631c2abe860949293456886a74f87fe\\\"],\\\"sizeBytes\\\":576621883},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc9758be9f0f0a480fb5e119ecb1e1101ef807bdc765a155212a8188d79b9e60\\\"],\\\"sizeBytes\\\":552687886},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32236659da74056138c839429f304a96ba36dd304d7eefb6b2618ecfdf6308e3\\\"],\\\"sizeBytes\\\":551903461},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e8903affdf29401b9a86b9f58795c9f445f34194960c7b2734f30601c48cbdf\\\"],\\\"sizeBytes\\\":543241813},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c921698d30c8175da0c124f72748e93551d6903b0f34d26743b60cb12d25cb1\\\"],\\\"sizeBytes\\\":532668041},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ceaa4102b35e54be54e23c8ea73bb0dac4978cffb54105ad00b51393f47595da\\\"],\\\"sizeBytes\\\":532338751},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ca4933b9ba55069205ea53970128c4e8c4b46560ef721c8aaee00aaf736664b5\\\"],\\\"sizeBytes\\\":512852463},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:98c80d92a2ef8d44ee625b229b77b7bfdb1b06cbfe0d4df9e2ca2cba904467f7\\\"],\\\"sizeBytes\\\":512468025},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\\\"],\\\"sizeBytes\\\":509451797},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\\\"],\\\"sizeBytes\\\":508056015},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a2ef63f356c11ba629d8038474ab287797340de1219b4fee97c386975689110\\\"],\\\"sizeBytes\\\":507701628},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:84a52132860e74998981b76c08d38543561197c3da77836c670fa8e394c5ec17\\\"],\\\"sizeBytes\\\":506755373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:492103a8365ef9a1d5f237b4ba90aff87369167ec91db29ff0251ba5aab2b419\\\"],\\\"sizeBytes\\\":505663073},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c2b518cb834a0b6ca50d73eceb5f8e64aefb09094d39e4ba0d8e4632f6cdf908\\\"],\\\"sizeBytes\\\":505642108},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\"],\\\"sizeBytes\\\":503354646},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:eefdc67602b8bc3941001b030ab95d82e10432f814634b80eb8ce45bc9ebd3de\\\"],\\\"sizeBytes\\\":503025552},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3051af3343018fecbf3a6edacea69de841fc5211c09e7fb6a2499188dc979395\\\"],\\\"sizeBytes\\\":502450335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4cb6ecfb89e53653b69ae494ebc940b9fcf7b7db317b156e186435cc541589d9\\\"],\\\"sizeBytes\\\":500957387},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d835ce07d1bec4a4b13f0bca5ea20ea5c781ea7853d7b42310f4ad8aeba6d7c\\\"],\\\"sizeBytes\\\":500863090},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49a6a3308d885301c7718a465f1af2d08a617abbdff23352d5422d1ae4af33cf\\\"],\\\"sizeBytes\\\":499812475},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\"],\\\"sizeBytes\\\":499719811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:44e82a51fce7b5996b183c10c44bd79b0e1ae2257fd5809345fbca1c50aaa08f\\\"],\\\"sizeBytes\\\":499138950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:93145fd0c004dc4fca21435a32c7e55e962f321aff260d702f387cfdebee92a5\\\"],\\\"sizeBytes\\\":499096673},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0c6de747539dd00ede882fb4f73cead462bf0a7efda7173fd5d443ef7a00251\\\"],\\\"sizeBytes\\\":490470354},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6199be91b821875ba2609cf7fa886b74b9a8b573622fe33cc1bc39cd55acac08\\\"],\\\"sizeBytes\\\":489542560},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebd79294a663cb38370ae81f9cda91cef7fb1370ec5b495b4bdb95e77272e6a8\\\"],\\\"sizeBytes\\\":481573011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b4e0b20fdb38d516e871ff5d593c4273cc9933cb6a65ec93e727ca4a7777fd20\\\"],\\\"sizeBytes\\\":478931717},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a3e2790bda8898df5e4e9cf1878103ac483ea1633819d76ea68976b0b2062b6\\\"],\\\"sizeBytes\\\":478655954},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b294511902fd7a80e135b23895a944570932dc0fab1ee22f296523840740332e\\\"],\\\"sizeBytes\\\":465302163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23aa409d98c18a25b5dd3c14b4c5a88eba2c793d020f2deb3bafd58a2225c328\\\"],\\\"sizeBytes\\\":465158513},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:656fe650bac2929182cd0cf7d7e566d089f69e06541b8329c6d40b89346c03ca\\\"],\\\"sizeBytes\\\":462741734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9f4724570795357eb097251a021f20c94c79b3054f3adb3bc0812143ba791dc1\\\"],\\\"sizeBytes\\\":461716546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b03d2897e7cc0e8d0c306acb68ca3d9396d502882c14942faadfdb16bc40e17d\\\"],\\\"sizeBytes\\\":459566623},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:607e31ebb2c85f53775455b38a607a68cb2bdab1e369f03c57e715a4ebb88831\\\"],\\\"sizeBytes\\\":458183681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36fa1378b9c26de6d45187b1e7352f3b1147109427fab3669b107d81fd967601\\\"],\\\"sizeBytes\\\":452603646},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eac937aae64688cb47b38ad2cbba5aa7e6d41c691df1f3ca4ff81e511
7084d1e\\\"],\\\"sizeBytes\\\":451053419}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:53:28.383529 master-0 kubenswrapper[9136]: I1203 21:53:28.383453 9136 generic.go:334] "Generic (PLEG): container finished" podID="29ac4a9d-1228-49c7-9051-338e7dc98a38" containerID="efd056e2ae598223549439d740b1aee6a580ddac5417acae41c47f9e3c130bb0" exitCode=0 Dec 03 21:53:28.386580 master-0 kubenswrapper[9136]: I1203 21:53:28.386517 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-fqnsm_785612fc-3f78-4f1a-bc83-7afe5d3b8056/authentication-operator/1.log" Dec 03 21:53:28.387653 master-0 kubenswrapper[9136]: I1203 21:53:28.387568 9136 generic.go:334] "Generic (PLEG): container finished" podID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerID="0e38d42aee36ec27dd6129e81cdc5f8517f478bd68d3edd07a620c1ebb8b1c14" exitCode=255 Dec 03 21:53:30.410470 master-0 kubenswrapper[9136]: I1203 21:53:30.410377 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-jlq49_ba624ed0-32cc-4c87-81a5-708a8a8a7f88/control-plane-machine-set-operator/0.log" Dec 03 21:53:30.410470 master-0 kubenswrapper[9136]: I1203 21:53:30.410455 9136 generic.go:334] "Generic (PLEG): container finished" podID="ba624ed0-32cc-4c87-81a5-708a8a8a7f88" containerID="6f91acf82d50566dd54057e9b1be20f0d89f3f616993406ed044d52651283dc2" exitCode=1 Dec 03 21:53:31.426000 master-0 kubenswrapper[9136]: I1203 21:53:31.425913 9136 generic.go:334] "Generic (PLEG): container finished" podID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerID="3bfa8deaf08a2a1e0e9189e6bd019c4a1fa728ac1cad25a37ff4255d9c02c3f0" exitCode=0 Dec 03 21:53:31.427855 master-0 kubenswrapper[9136]: I1203 21:53:31.427813 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/0.log" Dec 03 21:53:31.428205 master-0 kubenswrapper[9136]: I1203 21:53:31.428168 9136 generic.go:334] "Generic (PLEG): container finished" podID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerID="3870fd945d6202d34fa9215ae4b7aebad43b7e8cbdd1682883399a063c91b9d2" exitCode=255 Dec 03 21:53:32.371047 master-0 kubenswrapper[9136]: I1203 21:53:32.370925 9136 patch_prober.go:28] interesting pod/catalogd-controller-manager-754cfd84-bnstw container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.31:8081/readyz\": dial tcp 10.128.0.31:8081: connect: connection refused" start-of-body= Dec 03 21:53:32.371047 master-0 kubenswrapper[9136]: I1203 21:53:32.371022 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" podUID="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.31:8081/readyz\": dial tcp 10.128.0.31:8081: connect: connection refused" Dec 03 
21:53:32.942836 master-0 kubenswrapper[9136]: I1203 21:53:32.942682 9136 patch_prober.go:28] interesting pod/controller-manager-6b48b87d7b-m7hgj container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" start-of-body= Dec 03 21:53:32.942836 master-0 kubenswrapper[9136]: I1203 21:53:32.942794 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" Dec 03 21:53:32.944105 master-0 kubenswrapper[9136]: I1203 21:53:32.942695 9136 patch_prober.go:28] interesting pod/controller-manager-6b48b87d7b-m7hgj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" start-of-body= Dec 03 21:53:32.944105 master-0 kubenswrapper[9136]: I1203 21:53:32.942952 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" Dec 03 21:53:33.450165 master-0 kubenswrapper[9136]: I1203 21:53:33.450060 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5775bfbf6d-f54vs_5a5ef85a-d878-4253-ba90-9306c9490e3c/machine-approver-controller/0.log" Dec 03 21:53:33.451035 master-0 kubenswrapper[9136]: I1203 21:53:33.450949 9136 generic.go:334] "Generic (PLEG): container finished" podID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerID="f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036" exitCode=255 Dec 03 21:53:33.964038 master-0 kubenswrapper[9136]: E1203 21:53:33.963941 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:53:33.964995 master-0 kubenswrapper[9136]: E1203 21:53:33.964184 9136 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.019s" Dec 03 21:53:33.964995 master-0 kubenswrapper[9136]: I1203 21:53:33.964370 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" event={"ID":"08432be8-0086-48d2-a93d-7a474e96749d","Type":"ContainerDied","Data":"ffe4f10866cf1bc36713ebdb04a86f2cd5ff92ba34f253339cffa02ccd5a5e66"} Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.965634 9136 scope.go:117] "RemoveContainer" containerID="ac757c18056258c4d53cfe54ce6d097e1c1edb7723c2d58dc025a8fed4ae755c" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.965821 9136 scope.go:117] "RemoveContainer" containerID="8820c33df939aa1daa4c748a1b14d13d60d4ea6314f7234562766dc5cbdcadf8" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: E1203 21:53:33.966067 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: 
\"back-off 10s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator(fdfbaebe-d655-4c1e-a039-08802c5c35c5)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" podUID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.966121 9136 scope.go:117] "RemoveContainer" containerID="0f6a85e4a73afc226173f2b4c67fa571e667d3c81985c4b5d23669be9018152c" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.966238 9136 scope.go:117] "RemoveContainer" containerID="0e38d42aee36ec27dd6129e81cdc5f8517f478bd68d3edd07a620c1ebb8b1c14" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.966392 9136 scope.go:117] "RemoveContainer" containerID="5ae5503bf1205bc663cc1204fe09557a6737e824df9d6d84a86d54a184a45a47" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.966592 9136 scope.go:117] "RemoveContainer" containerID="3870fd945d6202d34fa9215ae4b7aebad43b7e8cbdd1682883399a063c91b9d2" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.966650 9136 scope.go:117] "RemoveContainer" containerID="ed377a768f3ae617a354166679d4faec90c18a1d576ddc666702319bbce47318" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: E1203 21:53:33.966230 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-vcd7b_openshift-kube-storage-version-migrator-operator(50076985-bbaa-4bcf-9d1a-cc25bed016a7)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" podUID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.967003 9136 scope.go:117] "RemoveContainer" containerID="db4f2041dbd606d645ef97beb6cb4387e780516338543969beb7168c5691d4f3" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.967400 9136 scope.go:117] "RemoveContainer" containerID="fdc23b60da9292f04f004b2100cec48c57024e9040275eb930fc44673f8ac8bd" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: I1203 21:53:33.967594 9136 scope.go:117] "RemoveContainer" containerID="b5c51e812810828f3868a9b23aec61a5698eeb487f28ca92e49346f8d56c307a" Dec 03 21:53:33.968601 master-0 kubenswrapper[9136]: E1203 21:53:33.967975 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-6mvwr_openshift-kube-scheduler-operator(c1ee4db7-f2d3-4064-a189-f66fd0a021eb)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" podUID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" Dec 03 21:53:33.980029 master-0 kubenswrapper[9136]: I1203 21:53:33.979833 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 21:53:34.461200 master-0 kubenswrapper[9136]: I1203 21:53:34.461143 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/0.log" Dec 03 
21:53:34.467582 master-0 kubenswrapper[9136]: I1203 21:53:34.467496 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/1.log" Dec 03 21:53:34.474652 master-0 kubenswrapper[9136]: I1203 21:53:34.474161 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/0.log" Dec 03 21:53:34.483470 master-0 kubenswrapper[9136]: I1203 21:53:34.483433 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-kz8nk_da949cf7-ab12-43ff-8e45-da1c2fd46e20/manager/0.log" Dec 03 21:53:34.486561 master-0 kubenswrapper[9136]: I1203 21:53:34.486520 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-fqnsm_785612fc-3f78-4f1a-bc83-7afe5d3b8056/authentication-operator/1.log" Dec 03 21:53:34.493998 master-0 kubenswrapper[9136]: I1203 21:53:34.493959 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/1.log" Dec 03 21:53:34.494607 master-0 kubenswrapper[9136]: I1203 21:53:34.494581 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/0.log" Dec 03 21:53:34.497343 master-0 kubenswrapper[9136]: I1203 21:53:34.497312 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/0.log" Dec 03 21:53:35.476465 master-0 kubenswrapper[9136]: E1203 21:53:35.476338 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 21:53:36.308800 master-0 kubenswrapper[9136]: E1203 21:53:36.308691 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:53:42.893573 master-0 kubenswrapper[9136]: I1203 21:53:42.893502 9136 patch_prober.go:28] interesting pod/controller-manager-6b48b87d7b-m7hgj container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" start-of-body= Dec 03 21:53:42.894646 master-0 kubenswrapper[9136]: I1203 21:53:42.893553 9136 patch_prober.go:28] interesting pod/controller-manager-6b48b87d7b-m7hgj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" start-of-body= Dec 03 21:53:42.894646 master-0 kubenswrapper[9136]: I1203 21:53:42.894315 9136 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" Dec 03 21:53:42.894646 master-0 kubenswrapper[9136]: I1203 21:53:42.894364 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" Dec 03 21:53:46.244294 master-0 kubenswrapper[9136]: E1203 21:53:46.244060 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-storage-operator-f84784664-hv5z8.187dd30a85aad749 openshift-cluster-storage-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-cluster-storage-operator,Name:cluster-storage-operator-f84784664-hv5z8,UID:6e96335e-1866-41c8-b128-b95e783a9be4,APIVersion:v1,ResourceVersion:8691,FieldPath:spec.containers{cluster-storage-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae8c6193ace2c439dd93d8129f68f3704727650851a628c906bff9290940ef03\" in 18.686s (18.686s including waiting). Image size: 508056015 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:51:11.144802121 +0000 UTC m=+77.419978523,LastTimestamp:2025-12-03 21:51:11.144802121 +0000 UTC m=+77.419978523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:53:46.310254 master-0 kubenswrapper[9136]: E1203 21:53:46.310173 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:53:46.979224 master-0 kubenswrapper[9136]: E1203 21:53:46.979113 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Dec 03 21:53:52.478642 master-0 kubenswrapper[9136]: E1203 21:53:52.478517 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 21:53:52.893706 master-0 kubenswrapper[9136]: I1203 21:53:52.893605 9136 patch_prober.go:28] interesting pod/controller-manager-6b48b87d7b-m7hgj container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" start-of-body= Dec 03 21:53:52.893706 master-0 kubenswrapper[9136]: I1203 21:53:52.893687 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" Dec 03 21:53:52.894098 master-0 kubenswrapper[9136]: I1203 21:53:52.893644 9136 patch_prober.go:28] interesting pod/controller-manager-6b48b87d7b-m7hgj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" start-of-body= Dec 03 21:53:52.894098 master-0 kubenswrapper[9136]: I1203 21:53:52.893848 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" Dec 03 21:53:55.459197 master-0 kubenswrapper[9136]: E1203 21:53:55.459094 9136 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" Dec 03 21:53:55.460155 master-0 kubenswrapper[9136]: E1203 21:53:55.460080 9136 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:53:55.460240 master-0 kubenswrapper[9136]: E1203 21:53:55.460158 9136 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = DeadlineExceeded desc = context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:53:55.460315 master-0 kubenswrapper[9136]: E1203 21:53:55.460252 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"network-check-target-78hts_openshift-network-diagnostics(e6d5d61a-c5de-4619-9afb-7fad63ba0525)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"network-check-target-78hts_openshift-network-diagnostics(e6d5d61a-c5de-4619-9afb-7fad63ba0525)\\\": rpc error: code = DeadlineExceeded desc = context deadline exceeded\"" pod="openshift-network-diagnostics/network-check-target-78hts" podUID="e6d5d61a-c5de-4619-9afb-7fad63ba0525" Dec 03 21:53:55.664036 master-0 kubenswrapper[9136]: I1203 21:53:55.663936 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:53:55.664693 master-0 kubenswrapper[9136]: I1203 21:53:55.664646 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:53:55.923030 master-0 kubenswrapper[9136]: E1203 21:53:55.922931 9136 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/349c4959cc7ef2e778bda8ac811d0aed0a0cde34a8e4c5331e4546cf73ab1f7f/diff" to get inode usage: stat /var/lib/containers/storage/overlay/349c4959cc7ef2e778bda8ac811d0aed0a0cde34a8e4c5331e4546cf73ab1f7f/diff: no such file or directory, extraDiskErr: Dec 03 21:53:56.280805 master-0 kubenswrapper[9136]: I1203 21:53:56.279747 9136 patch_prober.go:28] interesting pod/machine-config-daemon-j9wwr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 21:53:56.280805 master-0 kubenswrapper[9136]: I1203 21:53:56.279864 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" podUID="bd18a700-53b2-430c-a34f-dbb6331cfbe5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 21:53:56.310901 master-0 kubenswrapper[9136]: E1203 21:53:56.310711 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 21:54:01.709510 master-0 kubenswrapper[9136]: I1203 21:54:01.709423 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/1.log" Dec 03 21:54:01.710304 master-0 kubenswrapper[9136]: I1203 21:54:01.710248 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/0.log" Dec 03 21:54:01.710304 master-0 kubenswrapper[9136]: I1203 21:54:01.710273 9136 generic.go:334] "Generic (PLEG): container finished" podID="fa9b5917-d4f3-4372-a200-45b57412f92f" containerID="391a76f62114231d834f9c2db03252c4e7685cf51b83cad3819b8f095737b885" exitCode=1 Dec 03 21:54:02.894248 master-0 kubenswrapper[9136]: I1203 21:54:02.894124 9136 patch_prober.go:28] interesting pod/controller-manager-6b48b87d7b-m7hgj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" start-of-body= Dec 03 21:54:02.894248 master-0 kubenswrapper[9136]: I1203 21:54:02.894267 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" Dec 03 21:54:04.095676 master-0 kubenswrapper[9136]: I1203 21:54:04.095606 9136 patch_prober.go:28] interesting pod/etcd-operator-7978bf889c-w8hsm container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.20:8443/healthz\": dial tcp 
10.128.0.20:8443: connect: connection refused" start-of-body= Dec 03 21:54:04.096298 master-0 kubenswrapper[9136]: I1203 21:54:04.095695 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.20:8443/healthz\": dial tcp 10.128.0.20:8443: connect: connection refused" Dec 03 21:54:07.983644 master-0 kubenswrapper[9136]: E1203 21:54:07.983550 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Dec 03 21:54:07.984575 master-0 kubenswrapper[9136]: E1203 21:54:07.983883 9136 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.019s" Dec 03 21:54:07.996450 master-0 kubenswrapper[9136]: I1203 21:54:07.996367 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 21:54:08.158677 master-0 kubenswrapper[9136]: I1203 21:54:08.158546 9136 status_manager.go:851] "Failed to get status for pod" podUID="c8da5d44-680e-4169-abc6-607bdc37a64d" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods cluster-olm-operator-589f5cdc9d-25qxh)" Dec 03 21:54:12.761079 master-0 kubenswrapper[9136]: E1203 21:54:12.760566 9136 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="4.777s" Dec 03 21:54:12.761079 master-0 kubenswrapper[9136]: I1203 21:54:12.760629 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:54:12.782501 master-0 kubenswrapper[9136]: I1203 21:54:12.782430 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Dec 03 21:54:12.784824 master-0 kubenswrapper[9136]: W1203 21:54:12.784746 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a49c320_f31d_4f6d_98c3_48d24346b873.slice/crio-23abcfb0261fbbab58bbff5305c7ff7c6aa8a2b495f632704ea4c9372088e68e WatchSource:0}: Error finding container 23abcfb0261fbbab58bbff5305c7ff7c6aa8a2b495f632704ea4c9372088e68e: Status 404 returned error can't find the container with id 23abcfb0261fbbab58bbff5305c7ff7c6aa8a2b495f632704ea4c9372088e68e Dec 03 21:54:12.788220 master-0 kubenswrapper[9136]: W1203 21:54:12.788175 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda0d36d7_fb62_4254_95b0_fb81dcf372cd.slice/crio-da2e3f2530639eb26a748339c2a8743a14a7f76380a63f9bb6ef6739b88a428b WatchSource:0}: Error finding container da2e3f2530639eb26a748339c2a8743a14a7f76380a63f9bb6ef6739b88a428b: Status 404 returned error can't find the container with id da2e3f2530639eb26a748339c2a8743a14a7f76380a63f9bb6ef6739b88a428b Dec 03 21:54:12.790590 master-0 kubenswrapper[9136]: I1203 21:54:12.790468 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b"} 
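----- editor's annotation (illustrative, not part of the captured journal) -----
Two error signatures dominate this excerpt: kubelet requests to https://api-int.sno.openstack.lab:6443 failing with "Client.Timeout exceeded while awaiting headers", and liveness/readiness probes failing with "connect: connection refused" once the probed container is down. The Go sketch below is a minimal, standard-library-only reproduction of both error classes; the stalled httptest server, the 500 ms client timeout, and the 127.0.0.1:1 probe target are assumptions chosen for the demo, not values taken from this cluster.

package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	// 1. Client-side timeout: the server stalls before sending response headers
	// and the client gives up first -- the same class of failure as the
	// kubelet's lease and node-status requests in the journal above.
	slow := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Second) // longer than the client timeout below
	}))
	defer slow.Close()

	client := &http.Client{Timeout: 500 * time.Millisecond}
	if _, err := client.Get(slow.URL); err != nil {
		// The returned error mentions "Client.Timeout exceeded while awaiting headers".
		fmt.Println("timeout class:", err)
	}

	// 2. Connection refused: probing a port with no listener -- the same class
	// of failure as the HTTP probes against 10.128.0.x after the container exited.
	if _, err := http.Get("http://127.0.0.1:1/healthz"); err != nil {
		// On most hosts this error ends in "connect: connection refused".
		fmt.Println("probe class:", err)
	}
}

One consistent reading of this excerpt, offered as interpretation rather than as a fact stated by the log: the API server behind api-int.sno.openstack.lab:6443 is not answering, so Multus cannot update pod network-status annotations and sandbox creation fails, while operator containers that have exited keep refusing their probes and the kubelet retries its lease and node-status updates.
----- end annotation -----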
Dec 03 21:54:12.790590 master-0 kubenswrapper[9136]: I1203 21:54:12.790532 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" event={"ID":"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7","Type":"ContainerStarted","Data":"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1"} Dec 03 21:54:12.790590 master-0 kubenswrapper[9136]: I1203 21:54:12.790552 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerDied","Data":"8c01bfaede47fa025c4081862076a0aa6db537aee1b97a2732cbc64498e3dbda"} Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790669 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790694 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790712 9136 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="b4967a26-442a-4493-8a9e-8cb24269f811" Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790731 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790751 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790826 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kp794"] Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790857 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790878 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790897 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerStarted","Data":"d544b409fd7217f0540c9141cdb2568edb09ef4b3389d770b5af4b42fbce8988"} Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.790919 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerStarted","Data":"b5c51e812810828f3868a9b23aec61a5698eeb487f28ca92e49346f8d56c307a"} Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.791042 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.791106 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Dec 03 21:54:12.791162 master-0 
kubenswrapper[9136]: I1203 21:54:12.791136 9136 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="b4967a26-442a-4493-8a9e-8cb24269f811" Dec 03 21:54:12.791162 master-0 kubenswrapper[9136]: I1203 21:54:12.791171 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k98b2"] Dec 03 21:54:12.792855 master-0 kubenswrapper[9136]: I1203 21:54:12.791208 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r24k4" event={"ID":"1a0f647a-0260-4737-8ae2-cc90d01d33d1","Type":"ContainerStarted","Data":"afb5b53ec6fc0fe7cc9eca316bae9563bccc04673dbe4d97d1079ce470fa5a03"} Dec 03 21:54:12.792855 master-0 kubenswrapper[9136]: I1203 21:54:12.791249 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"fdc23b60da9292f04f004b2100cec48c57024e9040275eb930fc44673f8ac8bd"} Dec 03 21:54:12.792855 master-0 kubenswrapper[9136]: I1203 21:54:12.791271 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerStarted","Data":"ac757c18056258c4d53cfe54ce6d097e1c1edb7723c2d58dc025a8fed4ae755c"} Dec 03 21:54:12.792855 master-0 kubenswrapper[9136]: I1203 21:54:12.791292 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:54:12.792855 master-0 kubenswrapper[9136]: I1203 21:54:12.791312 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerStarted","Data":"8820c33df939aa1daa4c748a1b14d13d60d4ea6314f7234562766dc5cbdcadf8"} Dec 03 21:54:12.792855 master-0 kubenswrapper[9136]: I1203 21:54:12.791286 9136 scope.go:117] "RemoveContainer" containerID="311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b" Dec 03 21:54:12.818753 master-0 kubenswrapper[9136]: I1203 21:54:12.818700 9136 status_manager.go:317] "Container readiness changed for unknown container" pod="kube-system/bootstrap-kube-controller-manager-master-0" containerID="cri-o://fdc23b60da9292f04f004b2100cec48c57024e9040275eb930fc44673f8ac8bd" Dec 03 21:54:12.818753 master-0 kubenswrapper[9136]: I1203 21:54:12.818751 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.818826 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.818856 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwjnw"] Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.818941 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.818964 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerDied","Data":"2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.818999 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819021 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" event={"ID":"5f088999-ec66-402e-9634-8c762206d6b4","Type":"ContainerDied","Data":"7abe60a608c2243a0171688d4d7d094c30c03b0afcc0cdcafa8f846a9926432b"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819047 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerStarted","Data":"0e38d42aee36ec27dd6129e81cdc5f8517f478bd68d3edd07a620c1ebb8b1c14"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819068 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerDied","Data":"f5cb9e42c952aa7425c595831eb5aaa25eadc2bda2a857d2db27d386e674a3ef"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819106 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819128 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerStarted","Data":"3870fd945d6202d34fa9215ae4b7aebad43b7e8cbdd1682883399a063c91b9d2"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819148 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" event={"ID":"f59094ec-47dd-4547-ad41-b15a7933f461","Type":"ContainerDied","Data":"f8f51cc4e951397e53befcebae88ea2052970977c20019cfe472b3c9dcc46778"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819171 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819190 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"0269ada6-cb6e-4c98-bd24-752ae0286498","Type":"ContainerDied","Data":"828e1c847e23b1857152d987d6c054b5de0c2ac4dd44ef540ab230ae90795ebb"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819212 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerDied","Data":"4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819253 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819274 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819295 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" event={"ID":"578b2d03-b8b3-4c75-adde-73899c472ad7","Type":"ContainerDied","Data":"69c66e8c33554080901dde3e7449b542159057d7baa15426e32c9d8a63162fa3"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819321 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819344 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" event={"ID":"a4399d20-f9a6-4ab1-86be-e2845394eaba","Type":"ContainerDied","Data":"82722f9afea36ecc1e8162d281159bb72f29817202c15f48a80157dddaa3525e"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819367 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerDied","Data":"ed377a768f3ae617a354166679d4faec90c18a1d576ddc666702319bbce47318"} Dec 03 21:54:12.819339 master-0 kubenswrapper[9136]: I1203 21:54:12.819407 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" event={"ID":"6e96335e-1866-41c8-b128-b95e783a9be4","Type":"ContainerDied","Data":"2e3a2cbe5d1d0f700f443f5a9e3ce8b40d65f0655e3c96ce54030b35eb469f7b"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819432 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerDied","Data":"fdc23b60da9292f04f004b2100cec48c57024e9040275eb930fc44673f8ac8bd"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819456 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerDied","Data":"6c63c3c05a429e1c2a2724aa6046ef16e3d07844020b76ab9c6e4dd4aedf14d1"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819482 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"0269ada6-cb6e-4c98-bd24-752ae0286498","Type":"ContainerDied","Data":"5aa221876e2ecf7e39b3a5aa7aae7fe9d5609e05e0b403c5225db362cc84b785"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819506 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa221876e2ecf7e39b3a5aa7aae7fe9d5609e05e0b403c5225db362cc84b785" Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819524 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" event={"ID":"578b2d03-b8b3-4c75-adde-73899c472ad7","Type":"ContainerStarted","Data":"3445ebe854f42ae0ad53b657352b66491efe14d790f93fa83d0efffb68d3546d"} Dec 03 21:54:12.821737 master-0 
kubenswrapper[9136]: I1203 21:54:12.819544 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" event={"ID":"5f088999-ec66-402e-9634-8c762206d6b4","Type":"ContainerStarted","Data":"257bfb23352e3c96e97fc399573f91ccd44a49ae756eac88f5f3eb6b84b4bf2e"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819572 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerStarted","Data":"6e8e854c9ec7a5043d35c55ccb3f34d12a4db391501f26bb9ce132cf680165af"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819592 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" event={"ID":"a4399d20-f9a6-4ab1-86be-e2845394eaba","Type":"ContainerStarted","Data":"8d502fe6a4b80a8d86ec823cb141773d9b14b3a8b0f0a224e12fcddbb0204483"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819642 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" event={"ID":"6e96335e-1866-41c8-b128-b95e783a9be4","Type":"ContainerStarted","Data":"575a6b466c21bed63405fda68f3a17896616983e54bbfaf59e5919bc2381d15d"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819666 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" event={"ID":"08432be8-0086-48d2-a93d-7a474e96749d","Type":"ContainerStarted","Data":"d1bdd8b92f1b53e27dbdc370a3e552cba7eb4014b85281c160413adf1ac3135b"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819689 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" event={"ID":"f59094ec-47dd-4547-ad41-b15a7933f461","Type":"ContainerStarted","Data":"e2c788e798e45c9b371c7db0d83b84623d2d9cabb25ab2e95a2d107b202c0add"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819709 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerStarted","Data":"391a76f62114231d834f9c2db03252c4e7685cf51b83cad3819b8f095737b885"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819739 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" event={"ID":"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6","Type":"ContainerDied","Data":"0f6a85e4a73afc226173f2b4c67fa571e667d3c81985c4b5d23669be9018152c"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819762 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" event={"ID":"da949cf7-ab12-43ff-8e45-da1c2fd46e20","Type":"ContainerDied","Data":"5ae5503bf1205bc663cc1204fe09557a6737e824df9d6d84a86d54a184a45a47"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819812 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerDied","Data":"12901db2ae1fd13e3c0aef0f6572f021784e96590f0cc7bfafbbf5cbabb162c5"} Dec 03 
21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819838 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerDied","Data":"db4f2041dbd606d645ef97beb6cb4387e780516338543969beb7168c5691d4f3"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819864 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerDied","Data":"14ce5aa72512f80cbae852a84578e1ef46771d28aa2cb1798dd058e96128786a"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819888 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerDied","Data":"d544b409fd7217f0540c9141cdb2568edb09ef4b3389d770b5af4b42fbce8988"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819923 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerDied","Data":"b5c51e812810828f3868a9b23aec61a5698eeb487f28ca92e49346f8d56c307a"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819945 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerDied","Data":"ac757c18056258c4d53cfe54ce6d097e1c1edb7723c2d58dc025a8fed4ae755c"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819970 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerDied","Data":"8820c33df939aa1daa4c748a1b14d13d60d4ea6314f7234562766dc5cbdcadf8"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.819991 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" event={"ID":"29ac4a9d-1228-49c7-9051-338e7dc98a38","Type":"ContainerDied","Data":"efd056e2ae598223549439d740b1aee6a580ddac5417acae41c47f9e3c130bb0"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820010 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerDied","Data":"0e38d42aee36ec27dd6129e81cdc5f8517f478bd68d3edd07a620c1ebb8b1c14"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820030 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" event={"ID":"ba624ed0-32cc-4c87-81a5-708a8a8a7f88","Type":"ContainerDied","Data":"6f91acf82d50566dd54057e9b1be20f0d89f3f616993406ed044d52651283dc2"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820060 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" 
event={"ID":"89b8bdab-89ba-4a7d-a464-71a2f240e3c7","Type":"ContainerDied","Data":"3bfa8deaf08a2a1e0e9189e6bd019c4a1fa728ac1cad25a37ff4255d9c02c3f0"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820080 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerDied","Data":"3870fd945d6202d34fa9215ae4b7aebad43b7e8cbdd1682883399a063c91b9d2"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820102 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" event={"ID":"5a5ef85a-d878-4253-ba90-9306c9490e3c","Type":"ContainerDied","Data":"f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820123 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerStarted","Data":"1b9a68b725c78d3ab9d3fdd39dac94fca8569e0861e77bcec964ab3b7762206f"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820142 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerStarted","Data":"bfbf4959ede0c62e98f1d09bfdef8ec6864f4ae84b25850b8bd4a7f61685eda6"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820160 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" event={"ID":"da949cf7-ab12-43ff-8e45-da1c2fd46e20","Type":"ContainerStarted","Data":"b0ed7d6d004009e716a466752b32cae3db21211bdd49340bde3ae4a9d649b34e"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820190 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerStarted","Data":"4529f099d568ac74685af2719c7672b1e1cfe8cd2cc0c1cccee0c35e98d451a0"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820211 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"7bce50c457ac1f4721bc81a570dd238a","Type":"ContainerStarted","Data":"7077c81bca08e96587337a4956b3fcf6545b925a4a38770d09ecbdc14b1ceaa4"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820231 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerStarted","Data":"ced847df1e024d8eb90b3c6eada46b7c6dd7f30888b68484b877e825feca3e2e"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820250 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" event={"ID":"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6","Type":"ContainerStarted","Data":"9902d273d5ff7cb502b4442eacaaf41f2951f19d1c2e446bfccd73ed96a23e7a"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820269 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerStarted","Data":"c0dc443f1527bf49aa993d4373b075ba2cac81ea231c3894c047c835a26b4c9f"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820288 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerStarted","Data":"03a6c4cf49b5dfdb537d453011bfa5fccbeecd91abc60d2409d33529fbb25f01"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820305 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerStarted","Data":"e1b125fe53f3c1e5501ca76af42c64d1123972595dfade5a4e4e7d7f0045d4fa"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820330 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerStarted","Data":"2cc1c5605242df3437ae984a9e1e4d7f667ba732f16f6113d93fc241030ae6be"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820348 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"ebf07eb54db570834b7c9a90b6b07403","Type":"ContainerStarted","Data":"26f2c19baa42c22f852104341244a584ce87ef1be47ee995c3a898bbf27ea037"} Dec 03 21:54:12.821737 master-0 kubenswrapper[9136]: I1203 21:54:12.820365 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerDied","Data":"391a76f62114231d834f9c2db03252c4e7685cf51b83cad3819b8f095737b885"} Dec 03 21:54:12.831084 master-0 kubenswrapper[9136]: I1203 21:54:12.831023 9136 scope.go:117] "RemoveContainer" containerID="3bfa8deaf08a2a1e0e9189e6bd019c4a1fa728ac1cad25a37ff4255d9c02c3f0" Dec 03 21:54:12.831816 master-0 kubenswrapper[9136]: I1203 21:54:12.831750 9136 scope.go:117] "RemoveContainer" containerID="391a76f62114231d834f9c2db03252c4e7685cf51b83cad3819b8f095737b885" Dec 03 21:54:12.834108 master-0 kubenswrapper[9136]: I1203 21:54:12.834043 9136 scope.go:117] "RemoveContainer" containerID="6f91acf82d50566dd54057e9b1be20f0d89f3f616993406ed044d52651283dc2" Dec 03 21:54:12.841116 master-0 kubenswrapper[9136]: I1203 21:54:12.836720 9136 scope.go:117] "RemoveContainer" containerID="8820c33df939aa1daa4c748a1b14d13d60d4ea6314f7234562766dc5cbdcadf8" Dec 03 21:54:12.841116 master-0 kubenswrapper[9136]: I1203 21:54:12.836874 9136 scope.go:117] "RemoveContainer" containerID="ac757c18056258c4d53cfe54ce6d097e1c1edb7723c2d58dc025a8fed4ae755c" Dec 03 21:54:12.841116 master-0 kubenswrapper[9136]: I1203 21:54:12.837666 9136 scope.go:117] "RemoveContainer" containerID="d544b409fd7217f0540c9141cdb2568edb09ef4b3389d770b5af4b42fbce8988" Dec 03 21:54:12.841116 master-0 kubenswrapper[9136]: I1203 21:54:12.838190 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc"] Dec 03 21:54:12.841116 master-0 kubenswrapper[9136]: I1203 21:54:12.838261 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" podStartSLOduration=183.370335263 podStartE2EDuration="3m25.838240358s" podCreationTimestamp="2025-12-03 21:50:47 +0000 UTC" firstStartedPulling="2025-12-03 21:50:52.168586916 +0000 UTC m=+58.443763298" 
lastFinishedPulling="2025-12-03 21:51:14.636491971 +0000 UTC m=+80.911668393" observedRunningTime="2025-12-03 21:54:12.818736963 +0000 UTC m=+259.093913415" watchObservedRunningTime="2025-12-03 21:54:12.838240358 +0000 UTC m=+259.113416750" Dec 03 21:54:12.841116 master-0 kubenswrapper[9136]: I1203 21:54:12.838342 9136 scope.go:117] "RemoveContainer" containerID="14ce5aa72512f80cbae852a84578e1ef46771d28aa2cb1798dd058e96128786a" Dec 03 21:54:12.841116 master-0 kubenswrapper[9136]: I1203 21:54:12.839453 9136 scope.go:117] "RemoveContainer" containerID="b5c51e812810828f3868a9b23aec61a5698eeb487f28ca92e49346f8d56c307a" Dec 03 21:54:12.841116 master-0 kubenswrapper[9136]: I1203 21:54:12.840554 9136 scope.go:117] "RemoveContainer" containerID="f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036" Dec 03 21:54:12.841116 master-0 kubenswrapper[9136]: I1203 21:54:12.840631 9136 scope.go:117] "RemoveContainer" containerID="12901db2ae1fd13e3c0aef0f6572f021784e96590f0cc7bfafbbf5cbabb162c5" Dec 03 21:54:12.842238 master-0 kubenswrapper[9136]: I1203 21:54:12.841603 9136 scope.go:117] "RemoveContainer" containerID="efd056e2ae598223549439d740b1aee6a580ddac5417acae41c47f9e3c130bb0" Dec 03 21:54:12.864812 master-0 kubenswrapper[9136]: I1203 21:54:12.864726 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 03 21:54:12.870458 master-0 kubenswrapper[9136]: I1203 21:54:12.870386 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-78hts" event={"ID":"e6d5d61a-c5de-4619-9afb-7fad63ba0525","Type":"ContainerStarted","Data":"670cc7960481086b8965fb2da3afeafd568cc7eac708174b6ff365eec4bad5b9"} Dec 03 21:54:12.871979 master-0 kubenswrapper[9136]: I1203 21:54:12.871937 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwjnw" event={"ID":"da0d36d7-fb62-4254-95b0-fb81dcf372cd","Type":"ContainerStarted","Data":"da2e3f2530639eb26a748339c2a8743a14a7f76380a63f9bb6ef6739b88a428b"} Dec 03 21:54:12.873672 master-0 kubenswrapper[9136]: I1203 21:54:12.873626 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" event={"ID":"77e36f4e-845b-4b82-8abc-b634636c087a","Type":"ContainerStarted","Data":"9c6d88ab184c0e7fbb3e1e5fe4f3291f9c0461b4a5747d2c7087ffe16e8ba75d"} Dec 03 21:54:12.874710 master-0 kubenswrapper[9136]: I1203 21:54:12.874668 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k98b2" event={"ID":"e403ab42-1840-4292-a37c-a8d4feeb54ca","Type":"ContainerStarted","Data":"27cf3d4301968602620f0710474b0cd1874a47ae80ca26e646bde5b1b38a2e9d"} Dec 03 21:54:12.876255 master-0 kubenswrapper[9136]: I1203 21:54:12.876216 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp794" event={"ID":"0a49c320-f31d-4f6d-98c3-48d24346b873","Type":"ContainerStarted","Data":"23abcfb0261fbbab58bbff5305c7ff7c6aa8a2b495f632704ea4c9372088e68e"} Dec 03 21:54:12.882239 master-0 kubenswrapper[9136]: I1203 21:54:12.881930 9136 scope.go:117] "RemoveContainer" containerID="aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133" Dec 03 21:54:12.892418 master-0 kubenswrapper[9136]: I1203 21:54:12.892387 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:54:12.906750 master-0 kubenswrapper[9136]: I1203 21:54:12.906703 9136 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 03 21:54:12.909793 master-0 kubenswrapper[9136]: I1203 21:54:12.909746 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Dec 03 21:54:12.949822 master-0 kubenswrapper[9136]: I1203 21:54:12.948243 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" podStartSLOduration=197.948216663 podStartE2EDuration="3m17.948216663s" podCreationTimestamp="2025-12-03 21:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:54:12.946466944 +0000 UTC m=+259.221643346" watchObservedRunningTime="2025-12-03 21:54:12.948216663 +0000 UTC m=+259.223393045" Dec 03 21:54:12.998320 master-0 kubenswrapper[9136]: I1203 21:54:12.998238 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" podStartSLOduration=186.311362111 podStartE2EDuration="3m24.998213023s" podCreationTimestamp="2025-12-03 21:50:48 +0000 UTC" firstStartedPulling="2025-12-03 21:50:52.457911578 +0000 UTC m=+58.733087960" lastFinishedPulling="2025-12-03 21:51:11.14476248 +0000 UTC m=+77.419938872" observedRunningTime="2025-12-03 21:54:12.997297782 +0000 UTC m=+259.272474174" watchObservedRunningTime="2025-12-03 21:54:12.998213023 +0000 UTC m=+259.273389415" Dec 03 21:54:13.033200 master-0 kubenswrapper[9136]: I1203 21:54:13.032179 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 21:54:13.037336 master-0 kubenswrapper[9136]: I1203 21:54:13.036831 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Dec 03 21:54:13.076001 master-0 kubenswrapper[9136]: I1203 21:54:13.075917 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" podStartSLOduration=180.842591382 podStartE2EDuration="3m29.075896623s" podCreationTimestamp="2025-12-03 21:50:44 +0000 UTC" firstStartedPulling="2025-12-03 21:50:46.511611851 +0000 UTC m=+52.786788233" lastFinishedPulling="2025-12-03 21:51:14.744917092 +0000 UTC m=+81.020093474" observedRunningTime="2025-12-03 21:54:13.074388623 +0000 UTC m=+259.349565045" watchObservedRunningTime="2025-12-03 21:54:13.075896623 +0000 UTC m=+259.351073035" Dec 03 21:54:13.102830 master-0 kubenswrapper[9136]: I1203 21:54:13.102468 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" podStartSLOduration=183.197557336 podStartE2EDuration="3m26.102445105s" podCreationTimestamp="2025-12-03 21:50:47 +0000 UTC" firstStartedPulling="2025-12-03 21:50:51.732015355 +0000 UTC m=+58.007191737" lastFinishedPulling="2025-12-03 21:51:14.636903114 +0000 UTC m=+80.912079506" observedRunningTime="2025-12-03 21:54:13.097843791 +0000 UTC m=+259.373020223" watchObservedRunningTime="2025-12-03 21:54:13.102445105 +0000 UTC m=+259.377621507" Dec 03 21:54:13.138833 master-0 kubenswrapper[9136]: I1203 21:54:13.138762 9136 scope.go:117] "RemoveContainer" containerID="2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce" Dec 03 21:54:13.227294 master-0 kubenswrapper[9136]: I1203 21:54:13.227220 9136 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" podStartSLOduration=180.24945395 podStartE2EDuration="3m22.227202217s" podCreationTimestamp="2025-12-03 21:50:51 +0000 UTC" firstStartedPulling="2025-12-03 21:50:52.669959477 +0000 UTC m=+58.945135859" lastFinishedPulling="2025-12-03 21:51:14.647707744 +0000 UTC m=+80.922884126" observedRunningTime="2025-12-03 21:54:13.22313618 +0000 UTC m=+259.498312592" watchObservedRunningTime="2025-12-03 21:54:13.227202217 +0000 UTC m=+259.502378599" Dec 03 21:54:13.268757 master-0 kubenswrapper[9136]: I1203 21:54:13.268154 9136 scope.go:117] "RemoveContainer" containerID="4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1" Dec 03 21:54:13.286094 master-0 kubenswrapper[9136]: I1203 21:54:13.285982 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" podStartSLOduration=145.565412099 podStartE2EDuration="3m21.285966222s" podCreationTimestamp="2025-12-03 21:50:52 +0000 UTC" firstStartedPulling="2025-12-03 21:50:54.071704881 +0000 UTC m=+60.346881263" lastFinishedPulling="2025-12-03 21:51:49.792258994 +0000 UTC m=+116.067435386" observedRunningTime="2025-12-03 21:54:13.285640211 +0000 UTC m=+259.560816603" watchObservedRunningTime="2025-12-03 21:54:13.285966222 +0000 UTC m=+259.561142604" Dec 03 21:54:13.312338 master-0 kubenswrapper[9136]: I1203 21:54:13.312233 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" podStartSLOduration=183.389432106 podStartE2EDuration="3m26.312216814s" podCreationTimestamp="2025-12-03 21:50:47 +0000 UTC" firstStartedPulling="2025-12-03 21:50:51.716377479 +0000 UTC m=+57.991553861" lastFinishedPulling="2025-12-03 21:51:14.639162137 +0000 UTC m=+80.914338569" observedRunningTime="2025-12-03 21:54:13.308901363 +0000 UTC m=+259.584077755" watchObservedRunningTime="2025-12-03 21:54:13.312216814 +0000 UTC m=+259.587393196" Dec 03 21:54:13.323112 master-0 kubenswrapper[9136]: I1203 21:54:13.323053 9136 scope.go:117] "RemoveContainer" containerID="ad5b230b5b0a6050c3e00f60f378eb9862c13b3605b44eb150bd273e89f0bd98" Dec 03 21:54:13.434163 master-0 kubenswrapper[9136]: I1203 21:54:13.434072 9136 scope.go:117] "RemoveContainer" containerID="311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b" Dec 03 21:54:13.434898 master-0 kubenswrapper[9136]: E1203 21:54:13.434847 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b\": container with ID starting with 311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b not found: ID does not exist" containerID="311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b" Dec 03 21:54:13.434948 master-0 kubenswrapper[9136]: I1203 21:54:13.434917 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b"} err="failed to get container status \"311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b\": rpc error: code = NotFound desc = could not find container \"311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b\": container with ID starting with 
311dc55506086da9301593a49b8172ac4a99ec7de9822aa371ca907002801f3b not found: ID does not exist" Dec 03 21:54:13.434985 master-0 kubenswrapper[9136]: I1203 21:54:13.434955 9136 scope.go:117] "RemoveContainer" containerID="aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133" Dec 03 21:54:13.435584 master-0 kubenswrapper[9136]: E1203 21:54:13.435549 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133\": container with ID starting with aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133 not found: ID does not exist" containerID="aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133" Dec 03 21:54:13.435617 master-0 kubenswrapper[9136]: I1203 21:54:13.435579 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133"} err="failed to get container status \"aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133\": rpc error: code = NotFound desc = could not find container \"aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133\": container with ID starting with aa58f3895f62ec09d5ef9e6b34fc0e312185cf52148407e549b354fa1c0ab133 not found: ID does not exist" Dec 03 21:54:13.435617 master-0 kubenswrapper[9136]: I1203 21:54:13.435597 9136 scope.go:117] "RemoveContainer" containerID="d1b75fc955087be48690894945186bc731e2bed0635b14d30a5226a6cb7dbae4" Dec 03 21:54:13.481731 master-0 kubenswrapper[9136]: I1203 21:54:13.481687 9136 scope.go:117] "RemoveContainer" containerID="7abc4d8635b4469a4776710f14e691f06a0b7b60d5e937f6ea27f069d519024a" Dec 03 21:54:13.599562 master-0 kubenswrapper[9136]: I1203 21:54:13.595422 9136 scope.go:117] "RemoveContainer" containerID="129bb34723f1a3b3bc6376ce7e4a5163dea17767bc888149557663566136439c" Dec 03 21:54:13.670576 master-0 kubenswrapper[9136]: I1203 21:54:13.670535 9136 scope.go:117] "RemoveContainer" containerID="930ace1c47675e8f2e46f1361fdd688d5b7098c0fe077d099c56f05d2371a225" Dec 03 21:54:13.723616 master-0 kubenswrapper[9136]: I1203 21:54:13.723562 9136 scope.go:117] "RemoveContainer" containerID="f1f81a88c3e9df6920f7399941c169fb03094cedcb1ae2463c0147ff10995db1" Dec 03 21:54:13.755309 master-0 kubenswrapper[9136]: I1203 21:54:13.755244 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podStartSLOduration=138.388984077 podStartE2EDuration="3m26.755227009s" podCreationTimestamp="2025-12-03 21:50:47 +0000 UTC" firstStartedPulling="2025-12-03 21:50:51.723799559 +0000 UTC m=+57.998975941" lastFinishedPulling="2025-12-03 21:52:00.090042461 +0000 UTC m=+126.365218873" observedRunningTime="2025-12-03 21:54:13.731569065 +0000 UTC m=+260.006745467" watchObservedRunningTime="2025-12-03 21:54:13.755227009 +0000 UTC m=+260.030403391" Dec 03 21:54:13.775363 master-0 kubenswrapper[9136]: I1203 21:54:13.775034 9136 scope.go:117] "RemoveContainer" containerID="bfa48bdaae3ed0aa3b0f3696051d4e5dea6449cb42ff81d3ed5c9026e7ac1908" Dec 03 21:54:13.836935 master-0 kubenswrapper[9136]: I1203 21:54:13.836899 9136 scope.go:117] "RemoveContainer" containerID="2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce" Dec 03 21:54:13.840695 master-0 kubenswrapper[9136]: E1203 21:54:13.840657 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce\": container with ID starting with 2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce not found: ID does not exist" containerID="2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce" Dec 03 21:54:13.840875 master-0 kubenswrapper[9136]: I1203 21:54:13.840848 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce"} err="failed to get container status \"2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce\": rpc error: code = NotFound desc = could not find container \"2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce\": container with ID starting with 2fceab910d9371338a7dd0385481d8d7ec4a4f5c735799673f28d7383395f7ce not found: ID does not exist" Dec 03 21:54:13.840955 master-0 kubenswrapper[9136]: I1203 21:54:13.840944 9136 scope.go:117] "RemoveContainer" containerID="4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1" Dec 03 21:54:13.847871 master-0 kubenswrapper[9136]: E1203 21:54:13.842951 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1\": container with ID starting with 4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1 not found: ID does not exist" containerID="4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1" Dec 03 21:54:13.847871 master-0 kubenswrapper[9136]: I1203 21:54:13.843009 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1"} err="failed to get container status \"4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1\": rpc error: code = NotFound desc = could not find container \"4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1\": container with ID starting with 4c07d4e4369b8c0d699450d33f504a86ecf0d4386a624ddfc9430b48ea5d04e1 not found: ID does not exist" Dec 03 21:54:13.848898 master-0 kubenswrapper[9136]: I1203 21:54:13.848842 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.848820554 podStartE2EDuration="1.848820554s" podCreationTimestamp="2025-12-03 21:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:54:13.843374431 +0000 UTC m=+260.118550823" watchObservedRunningTime="2025-12-03 21:54:13.848820554 +0000 UTC m=+260.123996926" Dec 03 21:54:13.883896 master-0 kubenswrapper[9136]: I1203 21:54:13.883688 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" event={"ID":"77e36f4e-845b-4b82-8abc-b634636c087a","Type":"ContainerStarted","Data":"d36d56534c3866c225831dd1b270c32d6dd192916468e16e605ea1b73877a5b0"} Dec 03 21:54:13.884203 master-0 kubenswrapper[9136]: I1203 21:54:13.884170 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:54:13.892246 master-0 kubenswrapper[9136]: I1203 21:54:13.892218 9136 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-fqnsm_785612fc-3f78-4f1a-bc83-7afe5d3b8056/authentication-operator/1.log" Dec 03 21:54:13.893529 master-0 kubenswrapper[9136]: I1203 21:54:13.893501 9136 generic.go:334] "Generic (PLEG): container finished" podID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerID="28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237" exitCode=0 Dec 03 21:54:13.893609 master-0 kubenswrapper[9136]: I1203 21:54:13.893555 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwjnw" event={"ID":"da0d36d7-fb62-4254-95b0-fb81dcf372cd","Type":"ContainerDied","Data":"28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237"} Dec 03 21:54:13.900166 master-0 kubenswrapper[9136]: I1203 21:54:13.900143 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm_82055cfc-b4ce-4a00-a51d-141059947693/etcd-operator/1.log" Dec 03 21:54:13.906344 master-0 kubenswrapper[9136]: I1203 21:54:13.906267 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerStarted","Data":"bcd8c2f49452655817240e666a6a44f7aa1bd99557f7056f662406437b17e8fd"} Dec 03 21:54:13.959727 master-0 kubenswrapper[9136]: I1203 21:54:13.959324 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f66f491-f82d-4e8c-8929-4675f99aa5b7" path="/var/lib/kubelet/pods/4f66f491-f82d-4e8c-8929-4675f99aa5b7/volumes" Dec 03 21:54:13.960431 master-0 kubenswrapper[9136]: I1203 21:54:13.960388 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75" path="/var/lib/kubelet/pods/ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75/volumes" Dec 03 21:54:13.961086 master-0 kubenswrapper[9136]: I1203 21:54:13.961049 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:54:13.961148 master-0 kubenswrapper[9136]: I1203 21:54:13.961087 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-78hts" event={"ID":"e6d5d61a-c5de-4619-9afb-7fad63ba0525","Type":"ContainerStarted","Data":"30fa10e223c77166008902988e471fc75f67dd65dafadb5d2eec03b240181e3f"} Dec 03 21:54:13.967100 master-0 kubenswrapper[9136]: I1203 21:54:13.966298 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" event={"ID":"29ac4a9d-1228-49c7-9051-338e7dc98a38","Type":"ContainerStarted","Data":"5c6cff0db5f54508702a0378b5a2fcfe25e7d02bb8251b1a5dc1e5b0d7ac5a5a"} Dec 03 21:54:13.985235 master-0 kubenswrapper[9136]: I1203 21:54:13.985179 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" event={"ID":"89b8bdab-89ba-4a7d-a464-71a2f240e3c7","Type":"ContainerStarted","Data":"9f4678d3801f7b92e605525d79efc364684c9662011dd7ae5dd7a458afa02c37"} Dec 03 21:54:13.986554 master-0 kubenswrapper[9136]: I1203 21:54:13.985673 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:54:13.992038 master-0 kubenswrapper[9136]: I1203 21:54:13.991997 9136 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh_c8da5d44-680e-4169-abc6-607bdc37a64d/cluster-olm-operator/1.log" Dec 03 21:54:13.992548 master-0 kubenswrapper[9136]: I1203 21:54:13.992521 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:54:13.992786 master-0 kubenswrapper[9136]: I1203 21:54:13.992737 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerStarted","Data":"2e9dd252ba63a20079bc2c561a2c3f98ba9d8e4bd546eae6aff13854e70071f7"} Dec 03 21:54:14.000785 master-0 kubenswrapper[9136]: I1203 21:54:13.997694 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5775bfbf6d-f54vs_5a5ef85a-d878-4253-ba90-9306c9490e3c/machine-approver-controller/0.log" Dec 03 21:54:14.000785 master-0 kubenswrapper[9136]: I1203 21:54:13.998066 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" event={"ID":"5a5ef85a-d878-4253-ba90-9306c9490e3c","Type":"ContainerStarted","Data":"d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2"} Dec 03 21:54:14.007226 master-0 kubenswrapper[9136]: I1203 21:54:14.007179 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh_fdfbaebe-d655-4c1e-a039-08802c5c35c5/kube-controller-manager-operator/1.log" Dec 03 21:54:14.007812 master-0 kubenswrapper[9136]: I1203 21:54:14.007743 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerStarted","Data":"5170b74c433fdae3d3f431b680472bc1c20377225f200538e8e7e7d355669e04"} Dec 03 21:54:14.021379 master-0 kubenswrapper[9136]: I1203 21:54:14.021332 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-vcd7b_50076985-bbaa-4bcf-9d1a-cc25bed016a7/kube-storage-version-migrator-operator/1.log" Dec 03 21:54:14.021801 master-0 kubenswrapper[9136]: I1203 21:54:14.021743 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerStarted","Data":"48ff7474453eab80959871a93dccaeb97ff6c697edec9c60c6932a4cf955d08f"} Dec 03 21:54:14.028562 master-0 kubenswrapper[9136]: I1203 21:54:14.027814 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-jlq49_ba624ed0-32cc-4c87-81a5-708a8a8a7f88/control-plane-machine-set-operator/0.log" Dec 03 21:54:14.028562 master-0 kubenswrapper[9136]: I1203 21:54:14.028489 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" event={"ID":"ba624ed0-32cc-4c87-81a5-708a8a8a7f88","Type":"ContainerStarted","Data":"3be403144fdd17b31c23c14e8eecf3d61113ddb620955d19d68fd287cf34269c"} Dec 03 21:54:14.046790 master-0 kubenswrapper[9136]: I1203 21:54:14.043913 9136 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/0.log" Dec 03 21:54:14.046790 master-0 kubenswrapper[9136]: I1203 21:54:14.044052 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerStarted","Data":"6c77deafc27c8d81ee5c6b61c8ac0bccd46b80a3eeff9275f94b1bb228702268"} Dec 03 21:54:14.055607 master-0 kubenswrapper[9136]: I1203 21:54:14.055552 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-6mvwr_c1ee4db7-f2d3-4064-a189-f66fd0a021eb/kube-scheduler-operator-container/1.log" Dec 03 21:54:14.055850 master-0 kubenswrapper[9136]: I1203 21:54:14.055668 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerStarted","Data":"4d0e6309b7fa9a40d508e6cabe45feb51aab58957078aefd318e41e113083569"} Dec 03 21:54:14.081279 master-0 kubenswrapper[9136]: I1203 21:54:14.081077 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/1.log" Dec 03 21:54:14.085636 master-0 kubenswrapper[9136]: I1203 21:54:14.084962 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerStarted","Data":"2cc2d1edcdc07582d326dd4cbd41cc55661ae846989d317ed59433e25fa1cc16"} Dec 03 21:54:14.093877 master-0 kubenswrapper[9136]: I1203 21:54:14.090123 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/1.log" Dec 03 21:54:14.093877 master-0 kubenswrapper[9136]: I1203 21:54:14.091939 9136 generic.go:334] "Generic (PLEG): container finished" podID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerID="60c50abe2ec8c7459d390c08606126e779403c662dcf37b0171073aa9b774934" exitCode=0 Dec 03 21:54:14.093877 master-0 kubenswrapper[9136]: I1203 21:54:14.091980 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k98b2" event={"ID":"e403ab42-1840-4292-a37c-a8d4feeb54ca","Type":"ContainerDied","Data":"60c50abe2ec8c7459d390c08606126e779403c662dcf37b0171073aa9b774934"} Dec 03 21:54:14.100142 master-0 kubenswrapper[9136]: I1203 21:54:14.100100 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/1.log" Dec 03 21:54:14.129892 master-0 kubenswrapper[9136]: I1203 21:54:14.120982 9136 generic.go:334] "Generic (PLEG): container finished" podID="0a49c320-f31d-4f6d-98c3-48d24346b873" containerID="3a30bbf848145aedbf5309536c3bdd4398da29c24efe84cdd51cfed325666388" exitCode=0 Dec 03 21:54:14.129892 master-0 kubenswrapper[9136]: I1203 21:54:14.122036 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp794" 
event={"ID":"0a49c320-f31d-4f6d-98c3-48d24346b873","Type":"ContainerDied","Data":"3a30bbf848145aedbf5309536c3bdd4398da29c24efe84cdd51cfed325666388"} Dec 03 21:54:14.129892 master-0 kubenswrapper[9136]: I1203 21:54:14.123526 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:54:14.129892 master-0 kubenswrapper[9136]: I1203 21:54:14.123607 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:54:14.129892 master-0 kubenswrapper[9136]: I1203 21:54:14.123626 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:54:14.129892 master-0 kubenswrapper[9136]: I1203 21:54:14.125352 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 21:54:14.129892 master-0 kubenswrapper[9136]: I1203 21:54:14.128222 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 21:54:14.136603 master-0 kubenswrapper[9136]: I1203 21:54:14.136480 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 21:54:14.137864 master-0 kubenswrapper[9136]: E1203 21:54:14.137175 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Dec 03 21:54:14.196261 master-0 kubenswrapper[9136]: I1203 21:54:14.196209 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 21:54:14.281944 master-0 kubenswrapper[9136]: I1203 21:54:14.281877 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 03 21:54:14.670632 master-0 kubenswrapper[9136]: I1203 21:54:14.670528 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" podStartSLOduration=199.670502353 podStartE2EDuration="3m19.670502353s" podCreationTimestamp="2025-12-03 21:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:54:14.670287125 +0000 UTC m=+260.945463517" watchObservedRunningTime="2025-12-03 21:54:14.670502353 +0000 UTC m=+260.945678745" Dec 03 21:54:18.323993 master-0 kubenswrapper[9136]: I1203 21:54:18.323856 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:54:18.329919 master-0 kubenswrapper[9136]: I1203 21:54:18.329895 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:54:18.395393 master-0 kubenswrapper[9136]: I1203 21:54:18.395336 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:54:18.401061 master-0 kubenswrapper[9136]: I1203 21:54:18.401016 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 21:54:19.282512 master-0 kubenswrapper[9136]: I1203 21:54:19.282438 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 03 21:54:19.318695 master-0 kubenswrapper[9136]: I1203 21:54:19.318558 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 03 21:54:20.174682 master-0 kubenswrapper[9136]: I1203 21:54:20.174607 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Dec 03 21:54:20.247170 master-0 kubenswrapper[9136]: E1203 21:54:20.246974 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dd30b277303dd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:51:13.859052509 +0000 UTC m=+80.134228931,LastTimestamp:2025-12-03 21:51:13.859052509 +0000 UTC m=+80.134228931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:54:26.280213 master-0 kubenswrapper[9136]: I1203 21:54:26.280099 9136 patch_prober.go:28] interesting pod/machine-config-daemon-j9wwr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 21:54:26.280213 master-0 kubenswrapper[9136]: I1203 21:54:26.280197 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" podUID="bd18a700-53b2-430c-a34f-dbb6331cfbe5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 21:54:26.281534 master-0 kubenswrapper[9136]: I1203 21:54:26.280264 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 21:54:26.281534 master-0 kubenswrapper[9136]: I1203 21:54:26.281178 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"70f67b41dc589699749acb86409088e137724e0bbf05196ad9f606340dc1c95e"} pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 21:54:26.281534 master-0 kubenswrapper[9136]: I1203 21:54:26.281284 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" podUID="bd18a700-53b2-430c-a34f-dbb6331cfbe5" containerName="machine-config-daemon" 
containerID="cri-o://70f67b41dc589699749acb86409088e137724e0bbf05196ad9f606340dc1c95e" gracePeriod=600 Dec 03 21:54:29.226381 master-0 kubenswrapper[9136]: I1203 21:54:29.226262 9136 generic.go:334] "Generic (PLEG): container finished" podID="bd18a700-53b2-430c-a34f-dbb6331cfbe5" containerID="70f67b41dc589699749acb86409088e137724e0bbf05196ad9f606340dc1c95e" exitCode=0 Dec 03 21:54:29.226381 master-0 kubenswrapper[9136]: I1203 21:54:29.226351 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" event={"ID":"bd18a700-53b2-430c-a34f-dbb6331cfbe5","Type":"ContainerDied","Data":"70f67b41dc589699749acb86409088e137724e0bbf05196ad9f606340dc1c95e"} Dec 03 21:54:36.287657 master-0 kubenswrapper[9136]: I1203 21:54:36.287574 9136 generic.go:334] "Generic (PLEG): container finished" podID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerID="7458eacc3a5edc54c5cf843060c75af4d4324f599075c78c8fcbd5a674afd301" exitCode=0 Dec 03 21:54:36.288412 master-0 kubenswrapper[9136]: I1203 21:54:36.287713 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k98b2" event={"ID":"e403ab42-1840-4292-a37c-a8d4feeb54ca","Type":"ContainerDied","Data":"7458eacc3a5edc54c5cf843060c75af4d4324f599075c78c8fcbd5a674afd301"} Dec 03 21:54:36.294308 master-0 kubenswrapper[9136]: I1203 21:54:36.294194 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" event={"ID":"bd18a700-53b2-430c-a34f-dbb6331cfbe5","Type":"ContainerStarted","Data":"8a7922823397254005a95e0f928275b569c8c140e0f4cf80ce28060edd824b73"} Dec 03 21:54:36.297101 master-0 kubenswrapper[9136]: I1203 21:54:36.297041 9136 generic.go:334] "Generic (PLEG): container finished" podID="0a49c320-f31d-4f6d-98c3-48d24346b873" containerID="c96d65a2700e3dd19239b4928664815aa3051ca2f5ecf0c1ae3c50129c0815b7" exitCode=0 Dec 03 21:54:36.297226 master-0 kubenswrapper[9136]: I1203 21:54:36.297137 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp794" event={"ID":"0a49c320-f31d-4f6d-98c3-48d24346b873","Type":"ContainerDied","Data":"c96d65a2700e3dd19239b4928664815aa3051ca2f5ecf0c1ae3c50129c0815b7"} Dec 03 21:54:36.302603 master-0 kubenswrapper[9136]: I1203 21:54:36.302541 9136 generic.go:334] "Generic (PLEG): container finished" podID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerID="821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d" exitCode=0 Dec 03 21:54:36.302719 master-0 kubenswrapper[9136]: I1203 21:54:36.302607 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwjnw" event={"ID":"da0d36d7-fb62-4254-95b0-fb81dcf372cd","Type":"ContainerDied","Data":"821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d"} Dec 03 21:54:44.356174 master-0 kubenswrapper[9136]: I1203 21:54:44.356110 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/1.log" Dec 03 21:54:44.362239 master-0 kubenswrapper[9136]: I1203 21:54:44.356668 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/0.log" Dec 03 21:54:44.362239 master-0 kubenswrapper[9136]: I1203 21:54:44.356725 9136 generic.go:334] "Generic (PLEG): container 
finished" podID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" containerID="6c77deafc27c8d81ee5c6b61c8ac0bccd46b80a3eeff9275f94b1bb228702268" exitCode=1 Dec 03 21:54:44.362239 master-0 kubenswrapper[9136]: I1203 21:54:44.356865 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerDied","Data":"6c77deafc27c8d81ee5c6b61c8ac0bccd46b80a3eeff9275f94b1bb228702268"} Dec 03 21:54:44.362239 master-0 kubenswrapper[9136]: I1203 21:54:44.356921 9136 scope.go:117] "RemoveContainer" containerID="12901db2ae1fd13e3c0aef0f6572f021784e96590f0cc7bfafbbf5cbabb162c5" Dec 03 21:54:44.362239 master-0 kubenswrapper[9136]: I1203 21:54:44.357625 9136 scope.go:117] "RemoveContainer" containerID="6c77deafc27c8d81ee5c6b61c8ac0bccd46b80a3eeff9275f94b1bb228702268" Dec 03 21:54:44.362239 master-0 kubenswrapper[9136]: E1203 21:54:44.357945 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 21:54:44.362239 master-0 kubenswrapper[9136]: I1203 21:54:44.361762 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kp794" event={"ID":"0a49c320-f31d-4f6d-98c3-48d24346b873","Type":"ContainerStarted","Data":"d5105685f05de10042bfa4e0087ae657a4fb61cb4c6a5418a722a8a50a66fa3c"} Dec 03 21:54:44.367836 master-0 kubenswrapper[9136]: I1203 21:54:44.367757 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwjnw" event={"ID":"da0d36d7-fb62-4254-95b0-fb81dcf372cd","Type":"ContainerStarted","Data":"1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9"} Dec 03 21:54:44.648142 master-0 kubenswrapper[9136]: I1203 21:54:44.648031 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kp794" podStartSLOduration=201.405132829 podStartE2EDuration="3m50.648000047s" podCreationTimestamp="2025-12-03 21:50:54 +0000 UTC" firstStartedPulling="2025-12-03 21:54:14.122967836 +0000 UTC m=+260.398144218" lastFinishedPulling="2025-12-03 21:54:43.365835024 +0000 UTC m=+289.641011436" observedRunningTime="2025-12-03 21:54:44.642888994 +0000 UTC m=+290.918065446" watchObservedRunningTime="2025-12-03 21:54:44.648000047 +0000 UTC m=+290.923176449" Dec 03 21:54:44.682845 master-0 kubenswrapper[9136]: I1203 21:54:44.682741 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nwjnw" podStartSLOduration=200.143799417 podStartE2EDuration="3m49.682719192s" podCreationTimestamp="2025-12-03 21:50:55 +0000 UTC" firstStartedPulling="2025-12-03 21:54:13.900054386 +0000 UTC m=+260.175230768" lastFinishedPulling="2025-12-03 21:54:43.438974141 +0000 UTC m=+289.714150543" observedRunningTime="2025-12-03 21:54:44.680622982 +0000 UTC m=+290.955799374" watchObservedRunningTime="2025-12-03 21:54:44.682719192 +0000 UTC m=+290.957895574" Dec 03 21:54:45.077305 master-0 kubenswrapper[9136]: I1203 21:54:45.077224 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:54:45.077305 master-0 kubenswrapper[9136]: I1203 21:54:45.077312 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:54:45.378052 master-0 kubenswrapper[9136]: I1203 21:54:45.377993 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/1.log" Dec 03 21:54:45.379364 master-0 kubenswrapper[9136]: I1203 21:54:45.379299 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 21:54:45.382102 master-0 kubenswrapper[9136]: I1203 21:54:45.382021 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k98b2" event={"ID":"e403ab42-1840-4292-a37c-a8d4feeb54ca","Type":"ContainerStarted","Data":"a4cced4dc8dcac3e7254d9fd507acb0e83f426face2c2e3f878759a96cd3bd74"} Dec 03 21:54:45.582177 master-0 kubenswrapper[9136]: I1203 21:54:45.582075 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k98b2" podStartSLOduration=201.480210371 podStartE2EDuration="3m51.582050541s" podCreationTimestamp="2025-12-03 21:50:54 +0000 UTC" firstStartedPulling="2025-12-03 21:54:14.093142014 +0000 UTC m=+260.368318396" lastFinishedPulling="2025-12-03 21:54:44.194982194 +0000 UTC m=+290.470158566" observedRunningTime="2025-12-03 21:54:45.580877381 +0000 UTC m=+291.856053803" watchObservedRunningTime="2025-12-03 21:54:45.582050541 +0000 UTC m=+291.857226933" Dec 03 21:54:46.139654 master-0 kubenswrapper[9136]: I1203 21:54:46.139527 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kp794" podUID="0a49c320-f31d-4f6d-98c3-48d24346b873" containerName="registry-server" probeResult="failure" output=< Dec 03 21:54:46.139654 master-0 kubenswrapper[9136]: timeout: failed to connect service ":50051" within 1s Dec 03 21:54:46.139654 master-0 kubenswrapper[9136]: > Dec 03 21:54:46.245481 master-0 kubenswrapper[9136]: I1203 21:54:46.245365 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:54:46.245481 master-0 kubenswrapper[9136]: I1203 21:54:46.245440 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:54:46.287619 master-0 kubenswrapper[9136]: I1203 21:54:46.282995 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:54:54.022761 master-0 kubenswrapper[9136]: I1203 21:54:54.022638 9136 scope.go:117] "RemoveContainer" containerID="5e573ccc03a6c280986237abcd7396968c1017f2190b689fe94d1f1000629079" Dec 03 21:54:54.250627 master-0 kubenswrapper[9136]: E1203 21:54:54.250462 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.187dd30b27749020 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:7bce50c457ac1f4721bc81a570dd238a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:51:13.859153952 +0000 UTC m=+80.134330354,LastTimestamp:2025-12-03 21:51:13.859153952 +0000 UTC m=+80.134330354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:54:55.099111 master-0 kubenswrapper[9136]: I1203 21:54:55.099007 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:54:55.099111 master-0 kubenswrapper[9136]: I1203 21:54:55.099108 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:54:55.140822 master-0 kubenswrapper[9136]: I1203 21:54:55.140718 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:54:55.161513 master-0 kubenswrapper[9136]: I1203 21:54:55.161445 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:54:55.210917 master-0 kubenswrapper[9136]: I1203 21:54:55.210823 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kp794" Dec 03 21:54:55.501112 master-0 kubenswrapper[9136]: I1203 21:54:55.500982 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k98b2" Dec 03 21:54:56.300684 master-0 kubenswrapper[9136]: I1203 21:54:56.300617 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:54:58.907741 master-0 kubenswrapper[9136]: I1203 21:54:58.907650 9136 scope.go:117] "RemoveContainer" containerID="6c77deafc27c8d81ee5c6b61c8ac0bccd46b80a3eeff9275f94b1bb228702268" Dec 03 21:54:59.488803 master-0 kubenswrapper[9136]: I1203 21:54:59.488734 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/1.log" Dec 03 21:54:59.489088 master-0 kubenswrapper[9136]: I1203 21:54:59.488864 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerStarted","Data":"a05a4eebdeab07c94caa9823ffeece53541343f1149e78fde7e1048cdda7e1b6"} Dec 03 21:55:01.503566 master-0 kubenswrapper[9136]: I1203 21:55:01.503491 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/1.log" Dec 03 21:55:01.504548 master-0 kubenswrapper[9136]: I1203 21:55:01.504516 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/0.log" Dec 03 
21:55:01.504641 master-0 kubenswrapper[9136]: I1203 21:55:01.504568 9136 generic.go:334] "Generic (PLEG): container finished" podID="0869de9b-6f5b-4c31-81ad-02a9c8888193" containerID="6e8e854c9ec7a5043d35c55ccb3f34d12a4db391501f26bb9ce132cf680165af" exitCode=1 Dec 03 21:55:01.504641 master-0 kubenswrapper[9136]: I1203 21:55:01.504600 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerDied","Data":"6e8e854c9ec7a5043d35c55ccb3f34d12a4db391501f26bb9ce132cf680165af"} Dec 03 21:55:01.504641 master-0 kubenswrapper[9136]: I1203 21:55:01.504637 9136 scope.go:117] "RemoveContainer" containerID="6c63c3c05a429e1c2a2724aa6046ef16e3d07844020b76ab9c6e4dd4aedf14d1" Dec 03 21:55:01.505058 master-0 kubenswrapper[9136]: I1203 21:55:01.505016 9136 scope.go:117] "RemoveContainer" containerID="6e8e854c9ec7a5043d35c55ccb3f34d12a4db391501f26bb9ce132cf680165af" Dec 03 21:55:01.505233 master-0 kubenswrapper[9136]: E1203 21:55:01.505196 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 21:55:02.514130 master-0 kubenswrapper[9136]: I1203 21:55:02.514028 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/1.log" Dec 03 21:55:13.912661 master-0 kubenswrapper[9136]: I1203 21:55:13.912488 9136 scope.go:117] "RemoveContainer" containerID="6e8e854c9ec7a5043d35c55ccb3f34d12a4db391501f26bb9ce132cf680165af" Dec 03 21:55:14.616016 master-0 kubenswrapper[9136]: I1203 21:55:14.615919 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/2.log" Dec 03 21:55:14.617064 master-0 kubenswrapper[9136]: I1203 21:55:14.616982 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/1.log" Dec 03 21:55:14.617971 master-0 kubenswrapper[9136]: I1203 21:55:14.617913 9136 generic.go:334] "Generic (PLEG): container finished" podID="fa9b5917-d4f3-4372-a200-45b57412f92f" containerID="2cc2d1edcdc07582d326dd4cbd41cc55661ae846989d317ed59433e25fa1cc16" exitCode=1 Dec 03 21:55:14.618136 master-0 kubenswrapper[9136]: I1203 21:55:14.618015 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerDied","Data":"2cc2d1edcdc07582d326dd4cbd41cc55661ae846989d317ed59433e25fa1cc16"} Dec 03 21:55:14.618136 master-0 kubenswrapper[9136]: I1203 21:55:14.618123 9136 scope.go:117] "RemoveContainer" containerID="391a76f62114231d834f9c2db03252c4e7685cf51b83cad3819b8f095737b885" Dec 03 21:55:14.619139 master-0 kubenswrapper[9136]: I1203 21:55:14.619047 9136 scope.go:117] "RemoveContainer" containerID="2cc2d1edcdc07582d326dd4cbd41cc55661ae846989d317ed59433e25fa1cc16" Dec 03 21:55:14.619734 master-0 kubenswrapper[9136]: E1203 
21:55:14.619467 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5fdc576499-q9tf6_openshift-machine-api(fa9b5917-d4f3-4372-a200-45b57412f92f)\"" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" podUID="fa9b5917-d4f3-4372-a200-45b57412f92f" Dec 03 21:55:14.623006 master-0 kubenswrapper[9136]: I1203 21:55:14.622925 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/1.log" Dec 03 21:55:14.625421 master-0 kubenswrapper[9136]: I1203 21:55:14.623564 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerStarted","Data":"ee25dcf2b655ead4c6f49b566e41979c025f7cde0a2815dc7389e10f4849f7af"} Dec 03 21:55:15.632429 master-0 kubenswrapper[9136]: I1203 21:55:15.632360 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/2.log" Dec 03 21:55:25.907664 master-0 kubenswrapper[9136]: I1203 21:55:25.907567 9136 scope.go:117] "RemoveContainer" containerID="2cc2d1edcdc07582d326dd4cbd41cc55661ae846989d317ed59433e25fa1cc16" Dec 03 21:55:25.908599 master-0 kubenswrapper[9136]: E1203 21:55:25.907992 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5fdc576499-q9tf6_openshift-machine-api(fa9b5917-d4f3-4372-a200-45b57412f92f)\"" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" podUID="fa9b5917-d4f3-4372-a200-45b57412f92f" Dec 03 21:55:28.253310 master-0 kubenswrapper[9136]: E1203 21:55:28.253145 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cluster-autoscaler-operator-7f88444875-kb5rx.187dd30b55caf5eb openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:cluster-autoscaler-operator-7f88444875-kb5rx,UID:858384f3-5741-4e67-8669-2eb2b2dcaf7f,APIVersion:v1,ResourceVersion:8567,FieldPath:spec.containers{cluster-autoscaler-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2d41c3e944e86b73b4ba0d037ff016562211988f3206b9deb6cc7dccca708248\" in 22.467s (22.467s including waiting). 
Image size: 450855746 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:51:14.636568043 +0000 UTC m=+80.911744505,LastTimestamp:2025-12-03 21:51:14.636568043 +0000 UTC m=+80.911744505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 21:55:29.731422 master-0 kubenswrapper[9136]: I1203 21:55:29.731352 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/2.log" Dec 03 21:55:29.732316 master-0 kubenswrapper[9136]: I1203 21:55:29.732269 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/1.log" Dec 03 21:55:29.732371 master-0 kubenswrapper[9136]: I1203 21:55:29.732341 9136 generic.go:334] "Generic (PLEG): container finished" podID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" containerID="a05a4eebdeab07c94caa9823ffeece53541343f1149e78fde7e1048cdda7e1b6" exitCode=1 Dec 03 21:55:29.732411 master-0 kubenswrapper[9136]: I1203 21:55:29.732382 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerDied","Data":"a05a4eebdeab07c94caa9823ffeece53541343f1149e78fde7e1048cdda7e1b6"} Dec 03 21:55:29.732448 master-0 kubenswrapper[9136]: I1203 21:55:29.732427 9136 scope.go:117] "RemoveContainer" containerID="6c77deafc27c8d81ee5c6b61c8ac0bccd46b80a3eeff9275f94b1bb228702268" Dec 03 21:55:29.733121 master-0 kubenswrapper[9136]: I1203 21:55:29.733075 9136 scope.go:117] "RemoveContainer" containerID="a05a4eebdeab07c94caa9823ffeece53541343f1149e78fde7e1048cdda7e1b6" Dec 03 21:55:29.733432 master-0 kubenswrapper[9136]: E1203 21:55:29.733388 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 21:55:30.742090 master-0 kubenswrapper[9136]: I1203 21:55:30.742030 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/2.log" Dec 03 21:55:39.908056 master-0 kubenswrapper[9136]: I1203 21:55:39.908017 9136 scope.go:117] "RemoveContainer" containerID="2cc2d1edcdc07582d326dd4cbd41cc55661ae846989d317ed59433e25fa1cc16" Dec 03 21:55:40.819318 master-0 kubenswrapper[9136]: I1203 21:55:40.819139 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/2.log" Dec 03 21:55:40.819835 master-0 kubenswrapper[9136]: I1203 21:55:40.819720 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" 
event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerStarted","Data":"2c8f1554a8a69b7abb2ac2fb40ac35726b3a427f042bc6ab03c716500f2824ff"} Dec 03 21:55:42.908543 master-0 kubenswrapper[9136]: I1203 21:55:42.908465 9136 scope.go:117] "RemoveContainer" containerID="a05a4eebdeab07c94caa9823ffeece53541343f1149e78fde7e1048cdda7e1b6" Dec 03 21:55:42.909464 master-0 kubenswrapper[9136]: E1203 21:55:42.908910 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 21:55:43.813188 master-0 kubenswrapper[9136]: I1203 21:55:43.813108 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwjnw"] Dec 03 21:55:43.813629 master-0 kubenswrapper[9136]: I1203 21:55:43.813577 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nwjnw" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerName="registry-server" containerID="cri-o://1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9" gracePeriod=2 Dec 03 21:55:44.177221 master-0 kubenswrapper[9136]: I1203 21:55:44.177129 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tcqzq"] Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: E1203 21:55:44.177476 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f66f491-f82d-4e8c-8929-4675f99aa5b7" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: I1203 21:55:44.177505 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f66f491-f82d-4e8c-8929-4675f99aa5b7" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: E1203 21:55:44.177551 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2364d3-47b2-4784-9c42-76bf2547b797" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: I1203 21:55:44.177564 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2364d3-47b2-4784-9c42-76bf2547b797" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: E1203 21:55:44.177589 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: I1203 21:55:44.177602 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: E1203 21:55:44.177626 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3497f5dd-4c6f-4108-a948-481cef475ba9" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: I1203 21:55:44.177638 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="3497f5dd-4c6f-4108-a948-481cef475ba9" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: E1203 21:55:44.177654 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0269ada6-cb6e-4c98-bd24-752ae0286498" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: I1203 21:55:44.177666 9136 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0269ada6-cb6e-4c98-bd24-752ae0286498" containerName="installer" Dec 03 21:55:44.177854 master-0 kubenswrapper[9136]: I1203 21:55:44.177869 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca8a36ae-e8fb-4a96-b42d-0e39b51fdc75" containerName="installer" Dec 03 21:55:44.178416 master-0 kubenswrapper[9136]: I1203 21:55:44.177906 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="0269ada6-cb6e-4c98-bd24-752ae0286498" containerName="installer" Dec 03 21:55:44.178416 master-0 kubenswrapper[9136]: I1203 21:55:44.177927 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f66f491-f82d-4e8c-8929-4675f99aa5b7" containerName="installer" Dec 03 21:55:44.178416 master-0 kubenswrapper[9136]: I1203 21:55:44.177948 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c2364d3-47b2-4784-9c42-76bf2547b797" containerName="installer" Dec 03 21:55:44.178416 master-0 kubenswrapper[9136]: I1203 21:55:44.177962 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="3497f5dd-4c6f-4108-a948-481cef475ba9" containerName="installer" Dec 03 21:55:44.179364 master-0 kubenswrapper[9136]: I1203 21:55:44.179311 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.195804 master-0 kubenswrapper[9136]: I1203 21:55:44.195706 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tcqzq"] Dec 03 21:55:44.296008 master-0 kubenswrapper[9136]: I1203 21:55:44.295952 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:55:44.316726 master-0 kubenswrapper[9136]: I1203 21:55:44.316288 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-utilities\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.316726 master-0 kubenswrapper[9136]: I1203 21:55:44.316418 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9pcw\" (UniqueName: \"kubernetes.io/projected/10fc6516-cd4d-4291-a26d-8376ba0affef-kube-api-access-h9pcw\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.316726 master-0 kubenswrapper[9136]: I1203 21:55:44.316489 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-catalog-content\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.418018 master-0 kubenswrapper[9136]: I1203 21:55:44.417957 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-catalog-content\") pod \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " Dec 03 21:55:44.418263 master-0 kubenswrapper[9136]: I1203 21:55:44.418062 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-utilities\") pod \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " Dec 03 21:55:44.418263 master-0 kubenswrapper[9136]: I1203 21:55:44.418114 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm7kd\" (UniqueName: \"kubernetes.io/projected/da0d36d7-fb62-4254-95b0-fb81dcf372cd-kube-api-access-rm7kd\") pod \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\" (UID: \"da0d36d7-fb62-4254-95b0-fb81dcf372cd\") " Dec 03 21:55:44.418349 master-0 kubenswrapper[9136]: I1203 21:55:44.418307 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-utilities\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.418404 master-0 kubenswrapper[9136]: I1203 21:55:44.418366 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-catalog-content\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.418404 master-0 kubenswrapper[9136]: I1203 21:55:44.418391 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9pcw\" (UniqueName: \"kubernetes.io/projected/10fc6516-cd4d-4291-a26d-8376ba0affef-kube-api-access-h9pcw\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.419161 master-0 kubenswrapper[9136]: I1203 21:55:44.419005 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-catalog-content\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.419311 master-0 kubenswrapper[9136]: I1203 21:55:44.419290 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-utilities\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.419550 master-0 kubenswrapper[9136]: I1203 21:55:44.419517 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-utilities" (OuterVolumeSpecName: "utilities") pod "da0d36d7-fb62-4254-95b0-fb81dcf372cd" (UID: "da0d36d7-fb62-4254-95b0-fb81dcf372cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 21:55:44.422203 master-0 kubenswrapper[9136]: I1203 21:55:44.422177 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da0d36d7-fb62-4254-95b0-fb81dcf372cd-kube-api-access-rm7kd" (OuterVolumeSpecName: "kube-api-access-rm7kd") pod "da0d36d7-fb62-4254-95b0-fb81dcf372cd" (UID: "da0d36d7-fb62-4254-95b0-fb81dcf372cd"). InnerVolumeSpecName "kube-api-access-rm7kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:55:44.433341 master-0 kubenswrapper[9136]: I1203 21:55:44.433315 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9pcw\" (UniqueName: \"kubernetes.io/projected/10fc6516-cd4d-4291-a26d-8376ba0affef-kube-api-access-h9pcw\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.443851 master-0 kubenswrapper[9136]: I1203 21:55:44.443797 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da0d36d7-fb62-4254-95b0-fb81dcf372cd" (UID: "da0d36d7-fb62-4254-95b0-fb81dcf372cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 21:55:44.519236 master-0 kubenswrapper[9136]: I1203 21:55:44.519148 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:44.520401 master-0 kubenswrapper[9136]: I1203 21:55:44.520375 9136 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:44.520454 master-0 kubenswrapper[9136]: I1203 21:55:44.520407 9136 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da0d36d7-fb62-4254-95b0-fb81dcf372cd-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:44.520454 master-0 kubenswrapper[9136]: I1203 21:55:44.520418 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm7kd\" (UniqueName: \"kubernetes.io/projected/da0d36d7-fb62-4254-95b0-fb81dcf372cd-kube-api-access-rm7kd\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:44.790267 master-0 kubenswrapper[9136]: I1203 21:55:44.790096 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qht46"] Dec 03 21:55:44.790528 master-0 kubenswrapper[9136]: E1203 21:55:44.790430 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerName="registry-server" Dec 03 21:55:44.790528 master-0 kubenswrapper[9136]: I1203 21:55:44.790453 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerName="registry-server" Dec 03 21:55:44.790528 master-0 kubenswrapper[9136]: E1203 21:55:44.790476 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerName="extract-utilities" Dec 03 21:55:44.790528 master-0 kubenswrapper[9136]: I1203 21:55:44.790490 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerName="extract-utilities" Dec 03 21:55:44.790528 master-0 kubenswrapper[9136]: E1203 21:55:44.790516 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerName="extract-content" Dec 03 21:55:44.790528 master-0 kubenswrapper[9136]: I1203 21:55:44.790530 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerName="extract-content" Dec 03 21:55:44.790995 master-0 kubenswrapper[9136]: I1203 21:55:44.790727 9136 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerName="registry-server" Dec 03 21:55:44.792867 master-0 kubenswrapper[9136]: I1203 21:55:44.792814 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.795815 master-0 kubenswrapper[9136]: I1203 21:55:44.795745 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-mss7s" Dec 03 21:55:44.810049 master-0 kubenswrapper[9136]: I1203 21:55:44.809896 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qht46"] Dec 03 21:55:44.824246 master-0 kubenswrapper[9136]: I1203 21:55:44.824178 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-utilities\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.824440 master-0 kubenswrapper[9136]: I1203 21:55:44.824309 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-catalog-content\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.824440 master-0 kubenswrapper[9136]: I1203 21:55:44.824366 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlw7s\" (UniqueName: \"kubernetes.io/projected/c807d487-5b8f-4747-87ee-df0637e2e11f-kube-api-access-nlw7s\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.851584 master-0 kubenswrapper[9136]: I1203 21:55:44.851519 9136 generic.go:334] "Generic (PLEG): container finished" podID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" containerID="1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9" exitCode=0 Dec 03 21:55:44.851584 master-0 kubenswrapper[9136]: I1203 21:55:44.851580 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwjnw" event={"ID":"da0d36d7-fb62-4254-95b0-fb81dcf372cd","Type":"ContainerDied","Data":"1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9"} Dec 03 21:55:44.851938 master-0 kubenswrapper[9136]: I1203 21:55:44.851619 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nwjnw" event={"ID":"da0d36d7-fb62-4254-95b0-fb81dcf372cd","Type":"ContainerDied","Data":"da2e3f2530639eb26a748339c2a8743a14a7f76380a63f9bb6ef6739b88a428b"} Dec 03 21:55:44.851938 master-0 kubenswrapper[9136]: I1203 21:55:44.851652 9136 scope.go:117] "RemoveContainer" containerID="1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9" Dec 03 21:55:44.851938 master-0 kubenswrapper[9136]: I1203 21:55:44.851840 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nwjnw" Dec 03 21:55:44.874617 master-0 kubenswrapper[9136]: I1203 21:55:44.874557 9136 scope.go:117] "RemoveContainer" containerID="821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d" Dec 03 21:55:44.899461 master-0 kubenswrapper[9136]: I1203 21:55:44.899402 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwjnw"] Dec 03 21:55:44.899595 master-0 kubenswrapper[9136]: I1203 21:55:44.899542 9136 scope.go:117] "RemoveContainer" containerID="28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237" Dec 03 21:55:44.909145 master-0 kubenswrapper[9136]: I1203 21:55:44.909078 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nwjnw"] Dec 03 21:55:44.924765 master-0 kubenswrapper[9136]: I1203 21:55:44.924716 9136 scope.go:117] "RemoveContainer" containerID="1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9" Dec 03 21:55:44.925454 master-0 kubenswrapper[9136]: I1203 21:55:44.925397 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-utilities\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.925674 master-0 kubenswrapper[9136]: E1203 21:55:44.925466 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9\": container with ID starting with 1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9 not found: ID does not exist" containerID="1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9" Dec 03 21:55:44.925674 master-0 kubenswrapper[9136]: I1203 21:55:44.925520 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9"} err="failed to get container status \"1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9\": rpc error: code = NotFound desc = could not find container \"1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9\": container with ID starting with 1591dd90489be20d151e83e80dcc236083797fdbbe54ea75191732813bd21cc9 not found: ID does not exist" Dec 03 21:55:44.925674 master-0 kubenswrapper[9136]: I1203 21:55:44.925560 9136 scope.go:117] "RemoveContainer" containerID="821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d" Dec 03 21:55:44.926003 master-0 kubenswrapper[9136]: I1203 21:55:44.925949 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-catalog-content\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.926003 master-0 kubenswrapper[9136]: I1203 21:55:44.925995 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlw7s\" (UniqueName: \"kubernetes.io/projected/c807d487-5b8f-4747-87ee-df0637e2e11f-kube-api-access-nlw7s\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.926137 master-0 kubenswrapper[9136]: 
E1203 21:55:44.926050 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d\": container with ID starting with 821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d not found: ID does not exist" containerID="821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d" Dec 03 21:55:44.926204 master-0 kubenswrapper[9136]: I1203 21:55:44.926128 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d"} err="failed to get container status \"821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d\": rpc error: code = NotFound desc = could not find container \"821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d\": container with ID starting with 821d52f2ac2ba57fb30c42b4f964432bd0ea66df4d1d6342d12e4cb9f361077d not found: ID does not exist" Dec 03 21:55:44.926204 master-0 kubenswrapper[9136]: I1203 21:55:44.926162 9136 scope.go:117] "RemoveContainer" containerID="28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237" Dec 03 21:55:44.926514 master-0 kubenswrapper[9136]: E1203 21:55:44.926477 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237\": container with ID starting with 28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237 not found: ID does not exist" containerID="28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237" Dec 03 21:55:44.926616 master-0 kubenswrapper[9136]: I1203 21:55:44.926515 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237"} err="failed to get container status \"28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237\": rpc error: code = NotFound desc = could not find container \"28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237\": container with ID starting with 28f848d6aa8ee349d70a3202852dc240a6e4938f9e13c3967bc7dc28bda06237 not found: ID does not exist" Dec 03 21:55:44.926616 master-0 kubenswrapper[9136]: I1203 21:55:44.926545 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-utilities\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.927482 master-0 kubenswrapper[9136]: I1203 21:55:44.927418 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-catalog-content\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.948124 master-0 kubenswrapper[9136]: I1203 21:55:44.948011 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlw7s\" (UniqueName: \"kubernetes.io/projected/c807d487-5b8f-4747-87ee-df0637e2e11f-kube-api-access-nlw7s\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:44.948921 master-0 kubenswrapper[9136]: I1203 
21:55:44.948854 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tcqzq"] Dec 03 21:55:44.957154 master-0 kubenswrapper[9136]: W1203 21:55:44.956960 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10fc6516_cd4d_4291_a26d_8376ba0affef.slice/crio-5a958a4641ec8ffd3efa6fee5b0f524a6a79e473a45d1d057045bc0b833ac5d9 WatchSource:0}: Error finding container 5a958a4641ec8ffd3efa6fee5b0f524a6a79e473a45d1d057045bc0b833ac5d9: Status 404 returned error can't find the container with id 5a958a4641ec8ffd3efa6fee5b0f524a6a79e473a45d1d057045bc0b833ac5d9 Dec 03 21:55:45.134388 master-0 kubenswrapper[9136]: I1203 21:55:45.134304 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:55:45.587945 master-0 kubenswrapper[9136]: I1203 21:55:45.587873 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qht46"] Dec 03 21:55:45.596033 master-0 kubenswrapper[9136]: W1203 21:55:45.595975 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc807d487_5b8f_4747_87ee_df0637e2e11f.slice/crio-5c77db9cd763bf167454cf085e8ce314f6fbd68437892b13394158027937a983 WatchSource:0}: Error finding container 5c77db9cd763bf167454cf085e8ce314f6fbd68437892b13394158027937a983: Status 404 returned error can't find the container with id 5c77db9cd763bf167454cf085e8ce314f6fbd68437892b13394158027937a983 Dec 03 21:55:45.863301 master-0 kubenswrapper[9136]: I1203 21:55:45.863116 9136 generic.go:334] "Generic (PLEG): container finished" podID="10fc6516-cd4d-4291-a26d-8376ba0affef" containerID="824ae9e8313e0e1638464c8629a2eb0b5c0bd9befe3d9076d5a3df5161cfa30c" exitCode=0 Dec 03 21:55:45.863301 master-0 kubenswrapper[9136]: I1203 21:55:45.863176 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tcqzq" event={"ID":"10fc6516-cd4d-4291-a26d-8376ba0affef","Type":"ContainerDied","Data":"824ae9e8313e0e1638464c8629a2eb0b5c0bd9befe3d9076d5a3df5161cfa30c"} Dec 03 21:55:45.863301 master-0 kubenswrapper[9136]: I1203 21:55:45.863244 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tcqzq" event={"ID":"10fc6516-cd4d-4291-a26d-8376ba0affef","Type":"ContainerStarted","Data":"5a958a4641ec8ffd3efa6fee5b0f524a6a79e473a45d1d057045bc0b833ac5d9"} Dec 03 21:55:45.865056 master-0 kubenswrapper[9136]: I1203 21:55:45.865002 9136 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 21:55:45.866031 master-0 kubenswrapper[9136]: I1203 21:55:45.865978 9136 generic.go:334] "Generic (PLEG): container finished" podID="c807d487-5b8f-4747-87ee-df0637e2e11f" containerID="c36651d25828b26f062b98a2d24259d3a418e128fa8528be38ba41a00ecbed18" exitCode=0 Dec 03 21:55:45.866124 master-0 kubenswrapper[9136]: I1203 21:55:45.866066 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qht46" event={"ID":"c807d487-5b8f-4747-87ee-df0637e2e11f","Type":"ContainerDied","Data":"c36651d25828b26f062b98a2d24259d3a418e128fa8528be38ba41a00ecbed18"} Dec 03 21:55:45.866124 master-0 kubenswrapper[9136]: I1203 21:55:45.866106 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qht46" 
event={"ID":"c807d487-5b8f-4747-87ee-df0637e2e11f","Type":"ContainerStarted","Data":"5c77db9cd763bf167454cf085e8ce314f6fbd68437892b13394158027937a983"} Dec 03 21:55:45.919417 master-0 kubenswrapper[9136]: I1203 21:55:45.919346 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da0d36d7-fb62-4254-95b0-fb81dcf372cd" path="/var/lib/kubelet/pods/da0d36d7-fb62-4254-95b0-fb81dcf372cd/volumes" Dec 03 21:55:46.716437 master-0 kubenswrapper[9136]: I1203 21:55:46.716340 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs"] Dec 03 21:55:46.716437 master-0 kubenswrapper[9136]: I1203 21:55:46.716676 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="kube-rbac-proxy" containerID="cri-o://9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86" gracePeriod=30 Dec 03 21:55:46.717934 master-0 kubenswrapper[9136]: I1203 21:55:46.716889 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="machine-approver-controller" containerID="cri-o://d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2" gracePeriod=30 Dec 03 21:55:46.874009 master-0 kubenswrapper[9136]: I1203 21:55:46.873798 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5775bfbf6d-f54vs_5a5ef85a-d878-4253-ba90-9306c9490e3c/machine-approver-controller/0.log" Dec 03 21:55:46.874437 master-0 kubenswrapper[9136]: I1203 21:55:46.874399 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:55:46.877039 master-0 kubenswrapper[9136]: I1203 21:55:46.876445 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-5775bfbf6d-f54vs_5a5ef85a-d878-4253-ba90-9306c9490e3c/machine-approver-controller/0.log" Dec 03 21:55:46.878524 master-0 kubenswrapper[9136]: I1203 21:55:46.877264 9136 generic.go:334] "Generic (PLEG): container finished" podID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerID="d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2" exitCode=0 Dec 03 21:55:46.878524 master-0 kubenswrapper[9136]: I1203 21:55:46.877324 9136 generic.go:334] "Generic (PLEG): container finished" podID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerID="9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86" exitCode=0 Dec 03 21:55:46.878524 master-0 kubenswrapper[9136]: I1203 21:55:46.877366 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" event={"ID":"5a5ef85a-d878-4253-ba90-9306c9490e3c","Type":"ContainerDied","Data":"d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2"} Dec 03 21:55:46.878524 master-0 kubenswrapper[9136]: I1203 21:55:46.877431 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" event={"ID":"5a5ef85a-d878-4253-ba90-9306c9490e3c","Type":"ContainerDied","Data":"9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86"} Dec 03 21:55:46.878524 master-0 kubenswrapper[9136]: I1203 21:55:46.877459 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" event={"ID":"5a5ef85a-d878-4253-ba90-9306c9490e3c","Type":"ContainerDied","Data":"c99595cdd7fb8a317ad7460b612a48324cc29b45b511573083700dbd1f3293bf"} Dec 03 21:55:46.878524 master-0 kubenswrapper[9136]: I1203 21:55:46.877492 9136 scope.go:117] "RemoveContainer" containerID="d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2" Dec 03 21:55:46.881757 master-0 kubenswrapper[9136]: I1203 21:55:46.881622 9136 generic.go:334] "Generic (PLEG): container finished" podID="10fc6516-cd4d-4291-a26d-8376ba0affef" containerID="508b98fabad3f511f41e1f4fccc27dd67c19dfb8a3b5dfe228e24e0f17fa5a05" exitCode=0 Dec 03 21:55:46.881757 master-0 kubenswrapper[9136]: I1203 21:55:46.881665 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tcqzq" event={"ID":"10fc6516-cd4d-4291-a26d-8376ba0affef","Type":"ContainerDied","Data":"508b98fabad3f511f41e1f4fccc27dd67c19dfb8a3b5dfe228e24e0f17fa5a05"} Dec 03 21:55:46.900816 master-0 kubenswrapper[9136]: I1203 21:55:46.900790 9136 scope.go:117] "RemoveContainer" containerID="f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036" Dec 03 21:55:46.924801 master-0 kubenswrapper[9136]: I1203 21:55:46.924645 9136 scope.go:117] "RemoveContainer" containerID="9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86" Dec 03 21:55:46.952167 master-0 kubenswrapper[9136]: I1203 21:55:46.952028 9136 scope.go:117] "RemoveContainer" containerID="d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2" Dec 03 21:55:46.952676 master-0 kubenswrapper[9136]: E1203 21:55:46.952567 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2\": container with ID starting with d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2 not found: ID does not exist" containerID="d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2" Dec 03 21:55:46.952676 master-0 kubenswrapper[9136]: I1203 21:55:46.952602 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2"} err="failed to get container status \"d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2\": rpc error: code = NotFound desc = could not find container \"d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2\": container with ID starting with d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2 not found: ID does not exist" Dec 03 21:55:46.952676 master-0 kubenswrapper[9136]: I1203 21:55:46.952631 9136 scope.go:117] "RemoveContainer" containerID="f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036" Dec 03 21:55:46.953662 master-0 kubenswrapper[9136]: E1203 21:55:46.953234 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036\": container with ID starting with f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036 not found: ID does not exist" containerID="f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036" Dec 03 21:55:46.953662 master-0 kubenswrapper[9136]: I1203 21:55:46.953295 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036"} err="failed to get container status \"f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036\": rpc error: code = NotFound desc = could not find container \"f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036\": container with ID starting with f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036 not found: ID does not exist" Dec 03 21:55:46.953662 master-0 kubenswrapper[9136]: I1203 21:55:46.953336 9136 scope.go:117] "RemoveContainer" containerID="9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86" Dec 03 21:55:46.954097 master-0 kubenswrapper[9136]: E1203 21:55:46.954051 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86\": container with ID starting with 9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86 not found: ID does not exist" containerID="9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86" Dec 03 21:55:46.954151 master-0 kubenswrapper[9136]: I1203 21:55:46.954108 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86"} err="failed to get container status \"9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86\": rpc error: code = NotFound desc = could not find container \"9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86\": container with ID starting with 9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86 not found: ID does not exist" Dec 03 21:55:46.954151 master-0 kubenswrapper[9136]: I1203 21:55:46.954140 9136 scope.go:117] "RemoveContainer" 
containerID="d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2" Dec 03 21:55:46.955382 master-0 kubenswrapper[9136]: I1203 21:55:46.955339 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2"} err="failed to get container status \"d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2\": rpc error: code = NotFound desc = could not find container \"d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2\": container with ID starting with d295c15672d0c6b933f81ea3966fc92c64efeba1b90104ebbabf62a163ffa8e2 not found: ID does not exist" Dec 03 21:55:46.955382 master-0 kubenswrapper[9136]: I1203 21:55:46.955369 9136 scope.go:117] "RemoveContainer" containerID="f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036" Dec 03 21:55:46.955693 master-0 kubenswrapper[9136]: I1203 21:55:46.955660 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036"} err="failed to get container status \"f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036\": rpc error: code = NotFound desc = could not find container \"f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036\": container with ID starting with f7aab83dc3ab2473e76d6814fe5ce94d46e777358d15dd41ef4310c14d40e036 not found: ID does not exist" Dec 03 21:55:46.955693 master-0 kubenswrapper[9136]: I1203 21:55:46.955682 9136 scope.go:117] "RemoveContainer" containerID="9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86" Dec 03 21:55:46.956903 master-0 kubenswrapper[9136]: I1203 21:55:46.955920 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86"} err="failed to get container status \"9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86\": rpc error: code = NotFound desc = could not find container \"9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86\": container with ID starting with 9ff2d8f07ca354fb627df3d83d6d4567eb126dd571cca4464c981e3db9823d86 not found: ID does not exist" Dec 03 21:55:47.055269 master-0 kubenswrapper[9136]: I1203 21:55:47.055220 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5a5ef85a-d878-4253-ba90-9306c9490e3c-machine-approver-tls\") pod \"5a5ef85a-d878-4253-ba90-9306c9490e3c\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " Dec 03 21:55:47.055700 master-0 kubenswrapper[9136]: I1203 21:55:47.055682 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-config\") pod \"5a5ef85a-d878-4253-ba90-9306c9490e3c\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " Dec 03 21:55:47.056347 master-0 kubenswrapper[9136]: I1203 21:55:47.056190 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-config" (OuterVolumeSpecName: "config") pod "5a5ef85a-d878-4253-ba90-9306c9490e3c" (UID: "5a5ef85a-d878-4253-ba90-9306c9490e3c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:55:47.056493 master-0 kubenswrapper[9136]: I1203 21:55:47.056479 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q57t2\" (UniqueName: \"kubernetes.io/projected/5a5ef85a-d878-4253-ba90-9306c9490e3c-kube-api-access-q57t2\") pod \"5a5ef85a-d878-4253-ba90-9306c9490e3c\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " Dec 03 21:55:47.056675 master-0 kubenswrapper[9136]: I1203 21:55:47.056661 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-auth-proxy-config\") pod \"5a5ef85a-d878-4253-ba90-9306c9490e3c\" (UID: \"5a5ef85a-d878-4253-ba90-9306c9490e3c\") " Dec 03 21:55:47.057293 master-0 kubenswrapper[9136]: I1203 21:55:47.057181 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "5a5ef85a-d878-4253-ba90-9306c9490e3c" (UID: "5a5ef85a-d878-4253-ba90-9306c9490e3c"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:55:47.058154 master-0 kubenswrapper[9136]: I1203 21:55:47.058139 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:47.058243 master-0 kubenswrapper[9136]: I1203 21:55:47.058230 9136 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5a5ef85a-d878-4253-ba90-9306c9490e3c-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:47.059782 master-0 kubenswrapper[9136]: I1203 21:55:47.059717 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a5ef85a-d878-4253-ba90-9306c9490e3c-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "5a5ef85a-d878-4253-ba90-9306c9490e3c" (UID: "5a5ef85a-d878-4253-ba90-9306c9490e3c"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:55:47.060184 master-0 kubenswrapper[9136]: I1203 21:55:47.060147 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a5ef85a-d878-4253-ba90-9306c9490e3c-kube-api-access-q57t2" (OuterVolumeSpecName: "kube-api-access-q57t2") pod "5a5ef85a-d878-4253-ba90-9306c9490e3c" (UID: "5a5ef85a-d878-4253-ba90-9306c9490e3c"). InnerVolumeSpecName "kube-api-access-q57t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:55:47.159528 master-0 kubenswrapper[9136]: I1203 21:55:47.159464 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q57t2\" (UniqueName: \"kubernetes.io/projected/5a5ef85a-d878-4253-ba90-9306c9490e3c-kube-api-access-q57t2\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:47.159528 master-0 kubenswrapper[9136]: I1203 21:55:47.159506 9136 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5a5ef85a-d878-4253-ba90-9306c9490e3c-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:47.890062 master-0 kubenswrapper[9136]: I1203 21:55:47.890009 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs" Dec 03 21:55:47.894721 master-0 kubenswrapper[9136]: I1203 21:55:47.894690 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tcqzq" event={"ID":"10fc6516-cd4d-4291-a26d-8376ba0affef","Type":"ContainerStarted","Data":"d408321f44658648241a087a35a6c70a749a8f569c70bbc9331e384c8002e47a"} Dec 03 21:55:47.926711 master-0 kubenswrapper[9136]: I1203 21:55:47.926637 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tcqzq" podStartSLOduration=2.479976888 podStartE2EDuration="3.926614268s" podCreationTimestamp="2025-12-03 21:55:44 +0000 UTC" firstStartedPulling="2025-12-03 21:55:45.864913099 +0000 UTC m=+352.140089491" lastFinishedPulling="2025-12-03 21:55:47.311550479 +0000 UTC m=+353.586726871" observedRunningTime="2025-12-03 21:55:47.919101838 +0000 UTC m=+354.194278250" watchObservedRunningTime="2025-12-03 21:55:47.926614268 +0000 UTC m=+354.201790660" Dec 03 21:55:47.942578 master-0 kubenswrapper[9136]: I1203 21:55:47.942517 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs"] Dec 03 21:55:47.953551 master-0 kubenswrapper[9136]: I1203 21:55:47.953501 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-5775bfbf6d-f54vs"] Dec 03 21:55:47.986028 master-0 kubenswrapper[9136]: I1203 21:55:47.985899 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd"] Dec 03 21:55:47.986279 master-0 kubenswrapper[9136]: E1203 21:55:47.986232 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="kube-rbac-proxy" Dec 03 21:55:47.986279 master-0 kubenswrapper[9136]: I1203 21:55:47.986251 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="kube-rbac-proxy" Dec 03 21:55:47.986279 master-0 kubenswrapper[9136]: E1203 21:55:47.986264 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="machine-approver-controller" Dec 03 21:55:47.986279 master-0 kubenswrapper[9136]: I1203 21:55:47.986272 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="machine-approver-controller" Dec 03 21:55:47.986279 master-0 kubenswrapper[9136]: E1203 21:55:47.986280 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="machine-approver-controller" Dec 03 21:55:47.986279 master-0 kubenswrapper[9136]: I1203 21:55:47.986287 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="machine-approver-controller" Dec 03 21:55:47.986960 master-0 kubenswrapper[9136]: I1203 21:55:47.986392 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="machine-approver-controller" Dec 03 21:55:47.986960 master-0 kubenswrapper[9136]: I1203 21:55:47.986954 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="kube-rbac-proxy" Dec 03 21:55:47.987124 master-0 kubenswrapper[9136]: I1203 21:55:47.986971 9136 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" containerName="machine-approver-controller" Dec 03 21:55:47.988132 master-0 kubenswrapper[9136]: I1203 21:55:47.987925 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:47.991034 master-0 kubenswrapper[9136]: I1203 21:55:47.990937 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 21:55:47.991034 master-0 kubenswrapper[9136]: I1203 21:55:47.990982 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 21:55:47.991034 master-0 kubenswrapper[9136]: I1203 21:55:47.990982 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 21:55:47.991213 master-0 kubenswrapper[9136]: I1203 21:55:47.990941 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 21:55:47.991213 master-0 kubenswrapper[9136]: I1203 21:55:47.991060 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-k4746" Dec 03 21:55:47.991511 master-0 kubenswrapper[9136]: I1203 21:55:47.991449 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 21:55:48.179429 master-0 kubenswrapper[9136]: I1203 21:55:48.178300 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.179429 master-0 kubenswrapper[9136]: I1203 21:55:48.178368 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.179429 master-0 kubenswrapper[9136]: I1203 21:55:48.178411 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v28xw\" (UniqueName: \"kubernetes.io/projected/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-kube-api-access-v28xw\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.179429 master-0 kubenswrapper[9136]: I1203 21:55:48.178953 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.282742 master-0 kubenswrapper[9136]: I1203 21:55:48.282638 9136 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.282742 master-0 kubenswrapper[9136]: I1203 21:55:48.282741 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v28xw\" (UniqueName: \"kubernetes.io/projected/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-kube-api-access-v28xw\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.283031 master-0 kubenswrapper[9136]: I1203 21:55:48.282813 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.283031 master-0 kubenswrapper[9136]: I1203 21:55:48.282928 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.283819 master-0 kubenswrapper[9136]: I1203 21:55:48.283758 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.285415 master-0 kubenswrapper[9136]: I1203 21:55:48.285359 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.290009 master-0 kubenswrapper[9136]: I1203 21:55:48.289962 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.304594 master-0 kubenswrapper[9136]: I1203 21:55:48.304509 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v28xw\" (UniqueName: \"kubernetes.io/projected/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-kube-api-access-v28xw\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.314377 master-0 kubenswrapper[9136]: I1203 21:55:48.314319 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 21:55:48.341847 master-0 kubenswrapper[9136]: W1203 21:55:48.341263 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01b80ad5_7d7c_4ecd_90b0_2913d4559b5f.slice/crio-5225581315ea5da37a467d8053b6d7cca0c7b3fadc88ca04e9c3e4145300fad4 WatchSource:0}: Error finding container 5225581315ea5da37a467d8053b6d7cca0c7b3fadc88ca04e9c3e4145300fad4: Status 404 returned error can't find the container with id 5225581315ea5da37a467d8053b6d7cca0c7b3fadc88ca04e9c3e4145300fad4 Dec 03 21:55:48.902757 master-0 kubenswrapper[9136]: I1203 21:55:48.902683 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" event={"ID":"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f","Type":"ContainerStarted","Data":"627c6aedb1aa983e6e6dc6c1e7265386ba60120ffcae5ff232be95dd9b9911e5"} Dec 03 21:55:48.902757 master-0 kubenswrapper[9136]: I1203 21:55:48.902743 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" event={"ID":"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f","Type":"ContainerStarted","Data":"b09aa2ce0974e79950ebb97a2bca6a2822137b86e33ed2883782ce907d8ee39b"} Dec 03 21:55:48.902757 master-0 kubenswrapper[9136]: I1203 21:55:48.902757 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" event={"ID":"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f","Type":"ContainerStarted","Data":"5225581315ea5da37a467d8053b6d7cca0c7b3fadc88ca04e9c3e4145300fad4"} Dec 03 21:55:48.923258 master-0 kubenswrapper[9136]: I1203 21:55:48.923147 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" podStartSLOduration=1.923124823 podStartE2EDuration="1.923124823s" podCreationTimestamp="2025-12-03 21:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:55:48.919991269 +0000 UTC m=+355.195167661" watchObservedRunningTime="2025-12-03 21:55:48.923124823 +0000 UTC m=+355.198301205" Dec 03 21:55:49.920548 master-0 kubenswrapper[9136]: I1203 21:55:49.919618 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a5ef85a-d878-4253-ba90-9306c9490e3c" path="/var/lib/kubelet/pods/5a5ef85a-d878-4253-ba90-9306c9490e3c/volumes" Dec 03 21:55:53.110490 master-0 kubenswrapper[9136]: I1203 21:55:53.110418 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56"] Dec 03 21:55:53.112637 master-0 kubenswrapper[9136]: I1203 21:55:53.111611 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.113327 master-0 kubenswrapper[9136]: I1203 21:55:53.113287 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-5q257" Dec 03 21:55:53.113570 master-0 kubenswrapper[9136]: I1203 21:55:53.113542 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 21:55:53.132197 master-0 kubenswrapper[9136]: I1203 21:55:53.132137 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56"] Dec 03 21:55:53.261742 master-0 kubenswrapper[9136]: I1203 21:55:53.261621 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a124c14f-20c6-4df3-956f-a858de0c73c9-proxy-tls\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.261952 master-0 kubenswrapper[9136]: I1203 21:55:53.261716 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq55c\" (UniqueName: \"kubernetes.io/projected/a124c14f-20c6-4df3-956f-a858de0c73c9-kube-api-access-fq55c\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.261952 master-0 kubenswrapper[9136]: I1203 21:55:53.261836 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a124c14f-20c6-4df3-956f-a858de0c73c9-mcc-auth-proxy-config\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.363457 master-0 kubenswrapper[9136]: I1203 21:55:53.363333 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a124c14f-20c6-4df3-956f-a858de0c73c9-proxy-tls\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.363457 master-0 kubenswrapper[9136]: I1203 21:55:53.363386 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq55c\" (UniqueName: \"kubernetes.io/projected/a124c14f-20c6-4df3-956f-a858de0c73c9-kube-api-access-fq55c\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.363457 master-0 kubenswrapper[9136]: I1203 21:55:53.363410 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a124c14f-20c6-4df3-956f-a858de0c73c9-mcc-auth-proxy-config\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 
21:55:53.364868 master-0 kubenswrapper[9136]: I1203 21:55:53.364841 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a124c14f-20c6-4df3-956f-a858de0c73c9-mcc-auth-proxy-config\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.369159 master-0 kubenswrapper[9136]: I1203 21:55:53.369099 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a124c14f-20c6-4df3-956f-a858de0c73c9-proxy-tls\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.383519 master-0 kubenswrapper[9136]: I1203 21:55:53.382954 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq55c\" (UniqueName: \"kubernetes.io/projected/a124c14f-20c6-4df3-956f-a858de0c73c9-kube-api-access-fq55c\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.438261 master-0 kubenswrapper[9136]: I1203 21:55:53.438202 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 21:55:53.470368 master-0 kubenswrapper[9136]: I1203 21:55:53.468002 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf"] Dec 03 21:55:53.470368 master-0 kubenswrapper[9136]: I1203 21:55:53.468385 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="cluster-cloud-controller-manager" containerID="cri-o://cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7" gracePeriod=30 Dec 03 21:55:53.470368 master-0 kubenswrapper[9136]: I1203 21:55:53.468483 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="kube-rbac-proxy" containerID="cri-o://1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1" gracePeriod=30 Dec 03 21:55:53.470368 master-0 kubenswrapper[9136]: I1203 21:55:53.468497 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="config-sync-controllers" containerID="cri-o://f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4" gracePeriod=30 Dec 03 21:55:53.625737 master-0 kubenswrapper[9136]: I1203 21:55:53.625691 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:55:53.769325 master-0 kubenswrapper[9136]: I1203 21:55:53.769243 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m4mf\" (UniqueName: \"kubernetes.io/projected/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-kube-api-access-9m4mf\") pod \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " Dec 03 21:55:53.769559 master-0 kubenswrapper[9136]: I1203 21:55:53.769366 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-host-etc-kube\") pod \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " Dec 03 21:55:53.769559 master-0 kubenswrapper[9136]: I1203 21:55:53.769414 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-cloud-controller-manager-operator-tls\") pod \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " Dec 03 21:55:53.769559 master-0 kubenswrapper[9136]: I1203 21:55:53.769461 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-auth-proxy-config\") pod \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " Dec 03 21:55:53.769559 master-0 kubenswrapper[9136]: I1203 21:55:53.769483 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" (UID: "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 21:55:53.769559 master-0 kubenswrapper[9136]: I1203 21:55:53.769513 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-images\") pod \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\" (UID: \"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7\") " Dec 03 21:55:53.770069 master-0 kubenswrapper[9136]: I1203 21:55:53.770041 9136 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:53.770372 master-0 kubenswrapper[9136]: I1203 21:55:53.770314 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" (UID: "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:55:53.770420 master-0 kubenswrapper[9136]: I1203 21:55:53.770387 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-images" (OuterVolumeSpecName: "images") pod "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" (UID: "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7"). 
InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:55:53.772230 master-0 kubenswrapper[9136]: I1203 21:55:53.772200 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-kube-api-access-9m4mf" (OuterVolumeSpecName: "kube-api-access-9m4mf") pod "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" (UID: "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7"). InnerVolumeSpecName "kube-api-access-9m4mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:55:53.773913 master-0 kubenswrapper[9136]: I1203 21:55:53.773880 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" (UID: "ef057c6e-7d96-4db8-ab3c-8e81d6f29df7"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:55:53.848289 master-0 kubenswrapper[9136]: I1203 21:55:53.848212 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56"] Dec 03 21:55:53.864270 master-0 kubenswrapper[9136]: W1203 21:55:53.864206 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda124c14f_20c6_4df3_956f_a858de0c73c9.slice/crio-ceb2ba2506d4333662e8bd2c848d863fe6f93e5f0c5666caa6e24d3ad37cad8a WatchSource:0}: Error finding container ceb2ba2506d4333662e8bd2c848d863fe6f93e5f0c5666caa6e24d3ad37cad8a: Status 404 returned error can't find the container with id ceb2ba2506d4333662e8bd2c848d863fe6f93e5f0c5666caa6e24d3ad37cad8a Dec 03 21:55:53.871870 master-0 kubenswrapper[9136]: I1203 21:55:53.871843 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m4mf\" (UniqueName: \"kubernetes.io/projected/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-kube-api-access-9m4mf\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:53.871974 master-0 kubenswrapper[9136]: I1203 21:55:53.871963 9136 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:53.872040 master-0 kubenswrapper[9136]: I1203 21:55:53.872028 9136 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:53.872099 master-0 kubenswrapper[9136]: I1203 21:55:53.872090 9136 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7-images\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:53.911256 master-0 kubenswrapper[9136]: I1203 21:55:53.911218 9136 scope.go:117] "RemoveContainer" containerID="a05a4eebdeab07c94caa9823ffeece53541343f1149e78fde7e1048cdda7e1b6" Dec 03 21:55:53.933491 master-0 kubenswrapper[9136]: I1203 21:55:53.933440 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" 
event={"ID":"a124c14f-20c6-4df3-956f-a858de0c73c9","Type":"ContainerStarted","Data":"ceb2ba2506d4333662e8bd2c848d863fe6f93e5f0c5666caa6e24d3ad37cad8a"} Dec 03 21:55:53.936285 master-0 kubenswrapper[9136]: I1203 21:55:53.936242 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qht46" event={"ID":"c807d487-5b8f-4747-87ee-df0637e2e11f","Type":"ContainerStarted","Data":"a2eed5b6aacd84387e58a76d08c7f60cf4fc01bda67cb81b8e95dd99934d41f2"} Dec 03 21:55:53.939614 master-0 kubenswrapper[9136]: I1203 21:55:53.939562 9136 generic.go:334] "Generic (PLEG): container finished" podID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerID="1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1" exitCode=0 Dec 03 21:55:53.939614 master-0 kubenswrapper[9136]: I1203 21:55:53.939600 9136 generic.go:334] "Generic (PLEG): container finished" podID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerID="f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4" exitCode=0 Dec 03 21:55:53.939614 master-0 kubenswrapper[9136]: I1203 21:55:53.939610 9136 generic.go:334] "Generic (PLEG): container finished" podID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerID="cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7" exitCode=0 Dec 03 21:55:53.939890 master-0 kubenswrapper[9136]: I1203 21:55:53.939634 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" event={"ID":"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7","Type":"ContainerDied","Data":"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1"} Dec 03 21:55:53.939890 master-0 kubenswrapper[9136]: I1203 21:55:53.939665 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" event={"ID":"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7","Type":"ContainerDied","Data":"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4"} Dec 03 21:55:53.939890 master-0 kubenswrapper[9136]: I1203 21:55:53.939674 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" event={"ID":"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7","Type":"ContainerDied","Data":"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7"} Dec 03 21:55:53.939890 master-0 kubenswrapper[9136]: I1203 21:55:53.939683 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" event={"ID":"ef057c6e-7d96-4db8-ab3c-8e81d6f29df7","Type":"ContainerDied","Data":"f657f4b1134dbf93e45130449fa579d92c7ed47b59015d49fe477b69dc4d5559"} Dec 03 21:55:53.939890 master-0 kubenswrapper[9136]: I1203 21:55:53.939699 9136 scope.go:117] "RemoveContainer" containerID="1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1" Dec 03 21:55:53.940260 master-0 kubenswrapper[9136]: I1203 21:55:53.940228 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf" Dec 03 21:55:53.969789 master-0 kubenswrapper[9136]: I1203 21:55:53.969739 9136 scope.go:117] "RemoveContainer" containerID="f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4" Dec 03 21:55:54.021832 master-0 kubenswrapper[9136]: I1203 21:55:54.021739 9136 scope.go:117] "RemoveContainer" containerID="cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7" Dec 03 21:55:54.059030 master-0 kubenswrapper[9136]: I1203 21:55:54.058990 9136 scope.go:117] "RemoveContainer" containerID="1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1" Dec 03 21:55:54.059469 master-0 kubenswrapper[9136]: E1203 21:55:54.059439 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1\": container with ID starting with 1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1 not found: ID does not exist" containerID="1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1" Dec 03 21:55:54.059599 master-0 kubenswrapper[9136]: I1203 21:55:54.059470 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1"} err="failed to get container status \"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1\": rpc error: code = NotFound desc = could not find container \"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1\": container with ID starting with 1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1 not found: ID does not exist" Dec 03 21:55:54.059599 master-0 kubenswrapper[9136]: I1203 21:55:54.059492 9136 scope.go:117] "RemoveContainer" containerID="f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4" Dec 03 21:55:54.059806 master-0 kubenswrapper[9136]: E1203 21:55:54.059742 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4\": container with ID starting with f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4 not found: ID does not exist" containerID="f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4" Dec 03 21:55:54.059806 master-0 kubenswrapper[9136]: I1203 21:55:54.059763 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4"} err="failed to get container status \"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4\": rpc error: code = NotFound desc = could not find container \"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4\": container with ID starting with f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4 not found: ID does not exist" Dec 03 21:55:54.059806 master-0 kubenswrapper[9136]: I1203 21:55:54.059790 9136 scope.go:117] "RemoveContainer" containerID="cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7" Dec 03 21:55:54.060121 master-0 kubenswrapper[9136]: E1203 21:55:54.059984 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7\": container with ID starting 
with cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7 not found: ID does not exist" containerID="cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7" Dec 03 21:55:54.060121 master-0 kubenswrapper[9136]: I1203 21:55:54.060011 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7"} err="failed to get container status \"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7\": rpc error: code = NotFound desc = could not find container \"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7\": container with ID starting with cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7 not found: ID does not exist" Dec 03 21:55:54.060121 master-0 kubenswrapper[9136]: I1203 21:55:54.060028 9136 scope.go:117] "RemoveContainer" containerID="1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1" Dec 03 21:55:54.060486 master-0 kubenswrapper[9136]: I1203 21:55:54.060378 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1"} err="failed to get container status \"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1\": rpc error: code = NotFound desc = could not find container \"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1\": container with ID starting with 1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1 not found: ID does not exist" Dec 03 21:55:54.060486 master-0 kubenswrapper[9136]: I1203 21:55:54.060447 9136 scope.go:117] "RemoveContainer" containerID="f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4" Dec 03 21:55:54.061062 master-0 kubenswrapper[9136]: I1203 21:55:54.060792 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4"} err="failed to get container status \"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4\": rpc error: code = NotFound desc = could not find container \"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4\": container with ID starting with f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4 not found: ID does not exist" Dec 03 21:55:54.061062 master-0 kubenswrapper[9136]: I1203 21:55:54.060820 9136 scope.go:117] "RemoveContainer" containerID="cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7" Dec 03 21:55:54.061062 master-0 kubenswrapper[9136]: I1203 21:55:54.061969 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7"} err="failed to get container status \"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7\": rpc error: code = NotFound desc = could not find container \"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7\": container with ID starting with cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7 not found: ID does not exist" Dec 03 21:55:54.061062 master-0 kubenswrapper[9136]: I1203 21:55:54.062051 9136 scope.go:117] "RemoveContainer" containerID="1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1" Dec 03 21:55:54.061062 master-0 kubenswrapper[9136]: I1203 21:55:54.062827 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf"] Dec 03 21:55:54.061062 master-0 kubenswrapper[9136]: I1203 21:55:54.062884 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1"} err="failed to get container status \"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1\": rpc error: code = NotFound desc = could not find container \"1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1\": container with ID starting with 1b9e6f264599f031a44c56d1cca0bf26db70d3ce8a94ecbcdb29b4fd31245dd1 not found: ID does not exist" Dec 03 21:55:54.061062 master-0 kubenswrapper[9136]: I1203 21:55:54.062912 9136 scope.go:117] "RemoveContainer" containerID="f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4" Dec 03 21:55:54.065590 master-0 kubenswrapper[9136]: I1203 21:55:54.065111 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-76f56467d7-c4ngf"] Dec 03 21:55:54.067822 master-0 kubenswrapper[9136]: I1203 21:55:54.067679 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4"} err="failed to get container status \"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4\": rpc error: code = NotFound desc = could not find container \"f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4\": container with ID starting with f56246dd14ce9f4daa6dcca08c75be514ab4b98fc1333ab20d38c9255544fda4 not found: ID does not exist" Dec 03 21:55:54.067822 master-0 kubenswrapper[9136]: I1203 21:55:54.067722 9136 scope.go:117] "RemoveContainer" containerID="cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7" Dec 03 21:55:54.068325 master-0 kubenswrapper[9136]: I1203 21:55:54.068274 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7"} err="failed to get container status \"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7\": rpc error: code = NotFound desc = could not find container \"cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7\": container with ID starting with cd805615b5dd7fa7a0fa614f6788abe9ebe5fbc166994f183b95da154f7d6bc7 not found: ID does not exist" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: I1203 21:55:54.116441 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5"] Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: E1203 21:55:54.116725 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="config-sync-controllers" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: I1203 21:55:54.116741 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="config-sync-controllers" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: E1203 21:55:54.116753 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="cluster-cloud-controller-manager" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: I1203 21:55:54.116762 9136 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="cluster-cloud-controller-manager" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: E1203 21:55:54.116815 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="kube-rbac-proxy" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: I1203 21:55:54.116825 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="kube-rbac-proxy" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: I1203 21:55:54.116951 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="config-sync-controllers" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: I1203 21:55:54.116972 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="kube-rbac-proxy" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: I1203 21:55:54.116982 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" containerName="cluster-cloud-controller-manager" Dec 03 21:55:54.117696 master-0 kubenswrapper[9136]: I1203 21:55:54.117442 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" Dec 03 21:55:54.120335 master-0 kubenswrapper[9136]: I1203 21:55:54.118898 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-54f97f57-xq6ch"] Dec 03 21:55:54.120335 master-0 kubenswrapper[9136]: I1203 21:55:54.119842 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.122889 master-0 kubenswrapper[9136]: I1203 21:55:54.122841 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv"] Dec 03 21:55:54.126787 master-0 kubenswrapper[9136]: I1203 21:55:54.123672 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.126787 master-0 kubenswrapper[9136]: I1203 21:55:54.123748 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 21:55:54.126787 master-0 kubenswrapper[9136]: I1203 21:55:54.123886 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 21:55:54.126787 master-0 kubenswrapper[9136]: I1203 21:55:54.123985 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 21:55:54.126787 master-0 kubenswrapper[9136]: I1203 21:55:54.124009 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 21:55:54.126787 master-0 kubenswrapper[9136]: I1203 21:55:54.124080 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 21:55:54.126787 master-0 kubenswrapper[9136]: I1203 21:55:54.124221 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 21:55:54.126787 master-0 kubenswrapper[9136]: I1203 21:55:54.126282 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 21:55:54.129227 master-0 kubenswrapper[9136]: I1203 21:55:54.128537 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb"] Dec 03 21:55:54.133783 master-0 kubenswrapper[9136]: I1203 21:55:54.129411 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 21:55:54.133783 master-0 kubenswrapper[9136]: I1203 21:55:54.133495 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 03 21:55:54.133783 master-0 kubenswrapper[9136]: I1203 21:55:54.133723 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx"] Dec 03 21:55:54.139793 master-0 kubenswrapper[9136]: I1203 21:55:54.135244 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.139793 master-0 kubenswrapper[9136]: I1203 21:55:54.138166 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 03 21:55:54.139793 master-0 kubenswrapper[9136]: I1203 21:55:54.138492 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-w6qk9" Dec 03 21:55:54.139793 master-0 kubenswrapper[9136]: I1203 21:55:54.138502 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 03 21:55:54.139793 master-0 kubenswrapper[9136]: I1203 21:55:54.138500 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 21:55:54.139793 master-0 kubenswrapper[9136]: I1203 21:55:54.138884 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 03 21:55:54.140083 master-0 kubenswrapper[9136]: I1203 21:55:54.139896 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 03 21:55:54.141402 master-0 kubenswrapper[9136]: I1203 21:55:54.141358 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv"] Dec 03 21:55:54.145298 master-0 kubenswrapper[9136]: I1203 21:55:54.145250 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb"] Dec 03 21:55:54.150916 master-0 kubenswrapper[9136]: I1203 21:55:54.148989 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5"] Dec 03 21:55:54.282981 master-0 kubenswrapper[9136]: I1203 21:55:54.282878 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dcf886d-2028-4acd-83ac-850c4b278810-config-volume\") pod \"collect-profiles-29413305-sztmv\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.283296 master-0 kubenswrapper[9136]: I1203 21:55:54.283128 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-default-certificate\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.283296 master-0 kubenswrapper[9136]: I1203 21:55:54.283154 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzl8x\" (UniqueName: \"kubernetes.io/projected/787c50e1-35b5-43d7-9c26-8dd5399693d3-kube-api-access-jzl8x\") pod \"network-check-source-6964bb78b7-lntt5\" (UID: \"787c50e1-35b5-43d7-9c26-8dd5399693d3\") " pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" Dec 03 21:55:54.283635 master-0 kubenswrapper[9136]: I1203 21:55:54.283589 9136 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/698e6d87-1a58-493c-8b69-d22c89d26ac5-service-ca-bundle\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.283947 master-0 kubenswrapper[9136]: I1203 21:55:54.283914 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvkk\" (UniqueName: \"kubernetes.io/projected/def52ba3-77c1-4e0c-8a0d-44ff4d677607-kube-api-access-dcvkk\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.284026 master-0 kubenswrapper[9136]: I1203 21:55:54.283998 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-stats-auth\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.284078 master-0 kubenswrapper[9136]: I1203 21:55:54.284027 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-metrics-certs\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.284121 master-0 kubenswrapper[9136]: I1203 21:55:54.284089 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dcf886d-2028-4acd-83ac-850c4b278810-secret-volume\") pod \"collect-profiles-29413305-sztmv\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.284172 master-0 kubenswrapper[9136]: I1203 21:55:54.284120 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/def52ba3-77c1-4e0c-8a0d-44ff4d677607-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.284221 master-0 kubenswrapper[9136]: I1203 21:55:54.284173 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.284221 master-0 kubenswrapper[9136]: I1203 21:55:54.284215 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/922419d4-b528-472e-8215-4a55a96dab08-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-6d4cbfb4b-rqszb\" (UID: \"922419d4-b528-472e-8215-4a55a96dab08\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 21:55:54.284356 master-0 kubenswrapper[9136]: I1203 21:55:54.284267 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97kqz\" (UniqueName: \"kubernetes.io/projected/698e6d87-1a58-493c-8b69-d22c89d26ac5-kube-api-access-97kqz\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.284413 master-0 kubenswrapper[9136]: I1203 21:55:54.284382 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf9wf\" (UniqueName: \"kubernetes.io/projected/2dcf886d-2028-4acd-83ac-850c4b278810-kube-api-access-rf9wf\") pod \"collect-profiles-29413305-sztmv\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.284465 master-0 kubenswrapper[9136]: I1203 21:55:54.284448 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.284516 master-0 kubenswrapper[9136]: I1203 21:55:54.284496 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/def52ba3-77c1-4e0c-8a0d-44ff4d677607-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.385286 master-0 kubenswrapper[9136]: I1203 21:55:54.385152 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvkk\" (UniqueName: \"kubernetes.io/projected/def52ba3-77c1-4e0c-8a0d-44ff4d677607-kube-api-access-dcvkk\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.385286 master-0 kubenswrapper[9136]: I1203 21:55:54.385207 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-stats-auth\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.385286 master-0 kubenswrapper[9136]: I1203 21:55:54.385234 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-metrics-certs\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.385286 master-0 kubenswrapper[9136]: I1203 21:55:54.385260 9136 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dcf886d-2028-4acd-83ac-850c4b278810-secret-volume\") pod \"collect-profiles-29413305-sztmv\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.385286 master-0 kubenswrapper[9136]: I1203 21:55:54.385284 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/def52ba3-77c1-4e0c-8a0d-44ff4d677607-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385314 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385349 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/922419d4-b528-472e-8215-4a55a96dab08-tls-certificates\") pod \"prometheus-operator-admission-webhook-6d4cbfb4b-rqszb\" (UID: \"922419d4-b528-472e-8215-4a55a96dab08\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385372 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97kqz\" (UniqueName: \"kubernetes.io/projected/698e6d87-1a58-493c-8b69-d22c89d26ac5-kube-api-access-97kqz\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385399 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf9wf\" (UniqueName: \"kubernetes.io/projected/2dcf886d-2028-4acd-83ac-850c4b278810-kube-api-access-rf9wf\") pod \"collect-profiles-29413305-sztmv\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385435 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385461 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/def52ba3-77c1-4e0c-8a0d-44ff4d677607-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: 
\"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385488 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dcf886d-2028-4acd-83ac-850c4b278810-config-volume\") pod \"collect-profiles-29413305-sztmv\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385527 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-default-certificate\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385559 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzl8x\" (UniqueName: \"kubernetes.io/projected/787c50e1-35b5-43d7-9c26-8dd5399693d3-kube-api-access-jzl8x\") pod \"network-check-source-6964bb78b7-lntt5\" (UID: \"787c50e1-35b5-43d7-9c26-8dd5399693d3\") " pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385599 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/698e6d87-1a58-493c-8b69-d22c89d26ac5-service-ca-bundle\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.385650 master-0 kubenswrapper[9136]: I1203 21:55:54.385641 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/def52ba3-77c1-4e0c-8a0d-44ff4d677607-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.387685 master-0 kubenswrapper[9136]: I1203 21:55:54.387643 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/698e6d87-1a58-493c-8b69-d22c89d26ac5-service-ca-bundle\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.388227 master-0 kubenswrapper[9136]: I1203 21:55:54.388127 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.388548 master-0 kubenswrapper[9136]: I1203 21:55:54.388517 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/def52ba3-77c1-4e0c-8a0d-44ff4d677607-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.388762 master-0 kubenswrapper[9136]: I1203 21:55:54.388708 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.388854 master-0 kubenswrapper[9136]: I1203 21:55:54.388804 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dcf886d-2028-4acd-83ac-850c4b278810-secret-volume\") pod \"collect-profiles-29413305-sztmv\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.389182 master-0 kubenswrapper[9136]: I1203 21:55:54.389153 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dcf886d-2028-4acd-83ac-850c4b278810-config-volume\") pod \"collect-profiles-29413305-sztmv\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.389469 master-0 kubenswrapper[9136]: I1203 21:55:54.389440 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-default-certificate\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.390644 master-0 kubenswrapper[9136]: I1203 21:55:54.390589 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/922419d4-b528-472e-8215-4a55a96dab08-tls-certificates\") pod \"prometheus-operator-admission-webhook-6d4cbfb4b-rqszb\" (UID: \"922419d4-b528-472e-8215-4a55a96dab08\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 21:55:54.391043 master-0 kubenswrapper[9136]: I1203 21:55:54.390984 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-stats-auth\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.391244 master-0 kubenswrapper[9136]: I1203 21:55:54.391206 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-metrics-certs\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.479508 master-0 kubenswrapper[9136]: I1203 21:55:54.479417 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzl8x\" (UniqueName: 
\"kubernetes.io/projected/787c50e1-35b5-43d7-9c26-8dd5399693d3-kube-api-access-jzl8x\") pod \"network-check-source-6964bb78b7-lntt5\" (UID: \"787c50e1-35b5-43d7-9c26-8dd5399693d3\") " pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" Dec 03 21:55:54.488795 master-0 kubenswrapper[9136]: I1203 21:55:54.488707 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf9wf\" (UniqueName: \"kubernetes.io/projected/2dcf886d-2028-4acd-83ac-850c4b278810-kube-api-access-rf9wf\") pod \"collect-profiles-29413305-sztmv\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.493724 master-0 kubenswrapper[9136]: I1203 21:55:54.493575 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97kqz\" (UniqueName: \"kubernetes.io/projected/698e6d87-1a58-493c-8b69-d22c89d26ac5-kube-api-access-97kqz\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.493724 master-0 kubenswrapper[9136]: I1203 21:55:54.493664 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvkk\" (UniqueName: \"kubernetes.io/projected/def52ba3-77c1-4e0c-8a0d-44ff4d677607-kube-api-access-dcvkk\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.499907 master-0 kubenswrapper[9136]: I1203 21:55:54.499850 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 21:55:54.519003 master-0 kubenswrapper[9136]: I1203 21:55:54.518911 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 21:55:54.519649 master-0 kubenswrapper[9136]: I1203 21:55:54.519589 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:54.519649 master-0 kubenswrapper[9136]: I1203 21:55:54.519651 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:54.558966 master-0 kubenswrapper[9136]: W1203 21:55:54.558912 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef52ba3_77c1_4e0c_8a0d_44ff4d677607.slice/crio-66e225cef033a22130cb863563d5cf26aea983f2cbfcbc0547f78c06ef7d3704 WatchSource:0}: Error finding container 66e225cef033a22130cb863563d5cf26aea983f2cbfcbc0547f78c06ef7d3704: Status 404 returned error can't find the container with id 66e225cef033a22130cb863563d5cf26aea983f2cbfcbc0547f78c06ef7d3704 Dec 03 21:55:54.575030 master-0 kubenswrapper[9136]: I1203 21:55:54.573160 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:54.738614 master-0 kubenswrapper[9136]: I1203 21:55:54.738553 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" Dec 03 21:55:54.759938 master-0 kubenswrapper[9136]: I1203 21:55:54.759873 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:54.783373 master-0 kubenswrapper[9136]: I1203 21:55:54.783323 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:54.784120 master-0 kubenswrapper[9136]: W1203 21:55:54.784068 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod698e6d87_1a58_493c_8b69_d22c89d26ac5.slice/crio-2e64dfdf0acb39b47217a06b11c64af7640d431c0b9e7c2bcb9ff0068e0337e8 WatchSource:0}: Error finding container 2e64dfdf0acb39b47217a06b11c64af7640d431c0b9e7c2bcb9ff0068e0337e8: Status 404 returned error can't find the container with id 2e64dfdf0acb39b47217a06b11c64af7640d431c0b9e7c2bcb9ff0068e0337e8 Dec 03 21:55:54.961649 master-0 kubenswrapper[9136]: I1203 21:55:54.961598 9136 generic.go:334] "Generic (PLEG): container finished" podID="c807d487-5b8f-4747-87ee-df0637e2e11f" containerID="a2eed5b6aacd84387e58a76d08c7f60cf4fc01bda67cb81b8e95dd99934d41f2" exitCode=0 Dec 03 21:55:54.961955 master-0 kubenswrapper[9136]: I1203 21:55:54.961671 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qht46" event={"ID":"c807d487-5b8f-4747-87ee-df0637e2e11f","Type":"ContainerDied","Data":"a2eed5b6aacd84387e58a76d08c7f60cf4fc01bda67cb81b8e95dd99934d41f2"} Dec 03 21:55:54.966506 master-0 kubenswrapper[9136]: I1203 21:55:54.966471 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" event={"ID":"def52ba3-77c1-4e0c-8a0d-44ff4d677607","Type":"ContainerStarted","Data":"e063c475fe48713685c7da56849e11be35ce10c982e2192d9447df0278644182"} Dec 03 21:55:54.966657 master-0 kubenswrapper[9136]: I1203 21:55:54.966509 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" event={"ID":"def52ba3-77c1-4e0c-8a0d-44ff4d677607","Type":"ContainerStarted","Data":"66e225cef033a22130cb863563d5cf26aea983f2cbfcbc0547f78c06ef7d3704"} Dec 03 21:55:54.973925 master-0 kubenswrapper[9136]: I1203 21:55:54.973862 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerStarted","Data":"2e64dfdf0acb39b47217a06b11c64af7640d431c0b9e7c2bcb9ff0068e0337e8"} Dec 03 21:55:54.985235 master-0 kubenswrapper[9136]: I1203 21:55:54.985151 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb"] Dec 03 21:55:54.989998 master-0 kubenswrapper[9136]: I1203 21:55:54.987109 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/2.log" Dec 03 21:55:54.989998 master-0 kubenswrapper[9136]: I1203 21:55:54.987231 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" 
event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerStarted","Data":"e0793e31f5fb489b6b15bff62f7f4a34ffd9e051976a5af997f6a70e4540749e"} Dec 03 21:55:54.997890 master-0 kubenswrapper[9136]: I1203 21:55:54.997841 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" event={"ID":"a124c14f-20c6-4df3-956f-a858de0c73c9","Type":"ContainerStarted","Data":"3cdab5224c164e1b7ef2ca7c855c71f5526977aef147afa869e4b3250b7d7b1c"} Dec 03 21:55:54.997890 master-0 kubenswrapper[9136]: I1203 21:55:54.997890 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" event={"ID":"a124c14f-20c6-4df3-956f-a858de0c73c9","Type":"ContainerStarted","Data":"af6236462e324978e8cd817baf06fb75710668886258289a9caf50532bbe9141"} Dec 03 21:55:55.000623 master-0 kubenswrapper[9136]: W1203 21:55:55.000580 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod922419d4_b528_472e_8215_4a55a96dab08.slice/crio-302f177926043f4c13db47cefcd709d813569fccfeda81cb1d3c813d238c6d9a WatchSource:0}: Error finding container 302f177926043f4c13db47cefcd709d813569fccfeda81cb1d3c813d238c6d9a: Status 404 returned error can't find the container with id 302f177926043f4c13db47cefcd709d813569fccfeda81cb1d3c813d238c6d9a Dec 03 21:55:55.055878 master-0 kubenswrapper[9136]: I1203 21:55:55.050540 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" podStartSLOduration=2.050506511 podStartE2EDuration="2.050506511s" podCreationTimestamp="2025-12-03 21:55:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:55:55.041016086 +0000 UTC m=+361.316192488" watchObservedRunningTime="2025-12-03 21:55:55.050506511 +0000 UTC m=+361.325682903" Dec 03 21:55:55.055878 master-0 kubenswrapper[9136]: I1203 21:55:55.052449 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 21:55:55.217186 master-0 kubenswrapper[9136]: I1203 21:55:55.217030 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5"] Dec 03 21:55:55.231847 master-0 kubenswrapper[9136]: I1203 21:55:55.229236 9136 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 21:55:55.236851 master-0 kubenswrapper[9136]: W1203 21:55:55.235596 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod787c50e1_35b5_43d7_9c26_8dd5399693d3.slice/crio-3bea9a551a545f4989f29c1018afc27685829e9f353eed2bd20ae41b3e512f7f WatchSource:0}: Error finding container 3bea9a551a545f4989f29c1018afc27685829e9f353eed2bd20ae41b3e512f7f: Status 404 returned error can't find the container with id 3bea9a551a545f4989f29c1018afc27685829e9f353eed2bd20ae41b3e512f7f Dec 03 21:55:55.266531 master-0 kubenswrapper[9136]: I1203 21:55:55.266476 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv"] Dec 03 21:55:55.918908 master-0 kubenswrapper[9136]: I1203 21:55:55.918852 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="ef057c6e-7d96-4db8-ab3c-8e81d6f29df7" path="/var/lib/kubelet/pods/ef057c6e-7d96-4db8-ab3c-8e81d6f29df7/volumes" Dec 03 21:55:56.008048 master-0 kubenswrapper[9136]: I1203 21:55:56.007989 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qht46" event={"ID":"c807d487-5b8f-4747-87ee-df0637e2e11f","Type":"ContainerStarted","Data":"c8f04f8403773294786866c9a4cfe98107b25f55109db3f0a2fa931b67109dde"} Dec 03 21:55:56.018575 master-0 kubenswrapper[9136]: I1203 21:55:56.018520 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" event={"ID":"def52ba3-77c1-4e0c-8a0d-44ff4d677607","Type":"ContainerStarted","Data":"19b69ed5217b59325f7aa6c57b3ae16dd0d2ff97f165b6a04e1fae3a34a07dd8"} Dec 03 21:55:56.018635 master-0 kubenswrapper[9136]: I1203 21:55:56.018578 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" event={"ID":"def52ba3-77c1-4e0c-8a0d-44ff4d677607","Type":"ContainerStarted","Data":"19a9bc6e0f1f8798f87d4882e3c42bf8b1af00134db1a21061f75b7ece3744fd"} Dec 03 21:55:56.020673 master-0 kubenswrapper[9136]: I1203 21:55:56.020635 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" event={"ID":"2dcf886d-2028-4acd-83ac-850c4b278810","Type":"ContainerStarted","Data":"8e98bd7632e10184a3905755491b0bf5b7f9601daae4ea74b3820228d382dac8"} Dec 03 21:55:56.020673 master-0 kubenswrapper[9136]: I1203 21:55:56.020668 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" event={"ID":"2dcf886d-2028-4acd-83ac-850c4b278810","Type":"ContainerStarted","Data":"54fabba34f6ed1546c826bd2346dfd527db4cdc2878650997141f56d5e70ee87"} Dec 03 21:55:56.023012 master-0 kubenswrapper[9136]: I1203 21:55:56.022969 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" event={"ID":"787c50e1-35b5-43d7-9c26-8dd5399693d3","Type":"ContainerStarted","Data":"972abbd823b42065f7e2289961eedbaafb8ce5c1be34769ac0f336edbac8d67c"} Dec 03 21:55:56.023012 master-0 kubenswrapper[9136]: I1203 21:55:56.023008 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" event={"ID":"787c50e1-35b5-43d7-9c26-8dd5399693d3","Type":"ContainerStarted","Data":"3bea9a551a545f4989f29c1018afc27685829e9f353eed2bd20ae41b3e512f7f"} Dec 03 21:55:56.024739 master-0 kubenswrapper[9136]: I1203 21:55:56.024677 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" event={"ID":"922419d4-b528-472e-8215-4a55a96dab08","Type":"ContainerStarted","Data":"302f177926043f4c13db47cefcd709d813569fccfeda81cb1d3c813d238c6d9a"} Dec 03 21:55:56.035076 master-0 kubenswrapper[9136]: I1203 21:55:56.034989 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qht46" podStartSLOduration=2.486231885 podStartE2EDuration="12.034960405s" podCreationTimestamp="2025-12-03 21:55:44 +0000 UTC" firstStartedPulling="2025-12-03 21:55:45.867230116 +0000 UTC m=+352.142406508" lastFinishedPulling="2025-12-03 21:55:55.415958636 +0000 UTC m=+361.691135028" observedRunningTime="2025-12-03 
21:55:56.028067007 +0000 UTC m=+362.303243409" watchObservedRunningTime="2025-12-03 21:55:56.034960405 +0000 UTC m=+362.310136787" Dec 03 21:55:56.072607 master-0 kubenswrapper[9136]: I1203 21:55:56.070947 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" podStartSLOduration=409.070921328 podStartE2EDuration="6m49.070921328s" podCreationTimestamp="2025-12-03 21:49:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:55:56.067585318 +0000 UTC m=+362.342761700" watchObservedRunningTime="2025-12-03 21:55:56.070921328 +0000 UTC m=+362.346097730" Dec 03 21:55:56.072607 master-0 kubenswrapper[9136]: I1203 21:55:56.072025 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" podStartSLOduration=485.072020115 podStartE2EDuration="8m5.072020115s" podCreationTimestamp="2025-12-03 21:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:55:56.050362147 +0000 UTC m=+362.325538539" watchObservedRunningTime="2025-12-03 21:55:56.072020115 +0000 UTC m=+362.347196517" Dec 03 21:55:56.088177 master-0 kubenswrapper[9136]: I1203 21:55:56.088080 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" podStartSLOduration=2.088055487 podStartE2EDuration="2.088055487s" podCreationTimestamp="2025-12-03 21:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:55:56.084115357 +0000 UTC m=+362.359291749" watchObservedRunningTime="2025-12-03 21:55:56.088055487 +0000 UTC m=+362.363231869" Dec 03 21:55:57.611454 master-0 kubenswrapper[9136]: I1203 21:55:57.610065 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vgm8c"] Dec 03 21:55:57.611454 master-0 kubenswrapper[9136]: I1203 21:55:57.610979 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.614189 master-0 kubenswrapper[9136]: I1203 21:55:57.613954 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 21:55:57.614389 master-0 kubenswrapper[9136]: I1203 21:55:57.614205 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-xm8lg" Dec 03 21:55:57.614734 master-0 kubenswrapper[9136]: I1203 21:55:57.614539 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 21:55:57.643707 master-0 kubenswrapper[9136]: I1203 21:55:57.643620 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-node-bootstrap-token\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.643707 master-0 kubenswrapper[9136]: I1203 21:55:57.643709 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b658f\" (UniqueName: \"kubernetes.io/projected/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-kube-api-access-b658f\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.643994 master-0 kubenswrapper[9136]: I1203 21:55:57.643909 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-certs\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.746087 master-0 kubenswrapper[9136]: I1203 21:55:57.745864 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-certs\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.746516 master-0 kubenswrapper[9136]: I1203 21:55:57.746482 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-node-bootstrap-token\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.746750 master-0 kubenswrapper[9136]: I1203 21:55:57.746718 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b658f\" (UniqueName: \"kubernetes.io/projected/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-kube-api-access-b658f\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.751581 master-0 kubenswrapper[9136]: I1203 21:55:57.751515 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-node-bootstrap-token\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.768644 master-0 kubenswrapper[9136]: I1203 21:55:57.768558 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-certs\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.775155 master-0 kubenswrapper[9136]: I1203 21:55:57.775096 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b658f\" (UniqueName: \"kubernetes.io/projected/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-kube-api-access-b658f\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:57.984915 master-0 kubenswrapper[9136]: I1203 21:55:57.984839 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 21:55:58.019243 master-0 kubenswrapper[9136]: W1203 21:55:58.018605 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66f7b08c_52e8_4795_9cf0_74402a9cc0bb.slice/crio-27e1c0f49ae3c796e84e3254dcdc228d57bf21685fbfb51038f19f9e52199432 WatchSource:0}: Error finding container 27e1c0f49ae3c796e84e3254dcdc228d57bf21685fbfb51038f19f9e52199432: Status 404 returned error can't find the container with id 27e1c0f49ae3c796e84e3254dcdc228d57bf21685fbfb51038f19f9e52199432 Dec 03 21:55:58.048149 master-0 kubenswrapper[9136]: I1203 21:55:58.048066 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" event={"ID":"922419d4-b528-472e-8215-4a55a96dab08","Type":"ContainerStarted","Data":"385b2d474720971a6a770bb1cd450c8313430f866f88ea0de2f69d791b6efada"} Dec 03 21:55:58.048447 master-0 kubenswrapper[9136]: I1203 21:55:58.048408 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 21:55:58.051444 master-0 kubenswrapper[9136]: I1203 21:55:58.051350 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerStarted","Data":"7ecf6575752194e376e6db6b9227fb236567480b2e2793db9b6ea4ab3a895c8f"} Dec 03 21:55:58.056907 master-0 kubenswrapper[9136]: I1203 21:55:58.056840 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vgm8c" event={"ID":"66f7b08c-52e8-4795-9cf0-74402a9cc0bb","Type":"ContainerStarted","Data":"27e1c0f49ae3c796e84e3254dcdc228d57bf21685fbfb51038f19f9e52199432"} Dec 03 21:55:58.057035 master-0 kubenswrapper[9136]: I1203 21:55:58.056937 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 21:55:58.058627 master-0 kubenswrapper[9136]: I1203 21:55:58.058586 9136 generic.go:334] "Generic (PLEG): container finished" podID="2dcf886d-2028-4acd-83ac-850c4b278810" 
containerID="8e98bd7632e10184a3905755491b0bf5b7f9601daae4ea74b3820228d382dac8" exitCode=0 Dec 03 21:55:58.058729 master-0 kubenswrapper[9136]: I1203 21:55:58.058637 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" event={"ID":"2dcf886d-2028-4acd-83ac-850c4b278810","Type":"ContainerDied","Data":"8e98bd7632e10184a3905755491b0bf5b7f9601daae4ea74b3820228d382dac8"} Dec 03 21:55:58.072594 master-0 kubenswrapper[9136]: I1203 21:55:58.072499 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" podStartSLOduration=314.588691118 podStartE2EDuration="5m17.072470741s" podCreationTimestamp="2025-12-03 21:50:41 +0000 UTC" firstStartedPulling="2025-12-03 21:55:55.00558864 +0000 UTC m=+361.280765022" lastFinishedPulling="2025-12-03 21:55:57.489368253 +0000 UTC m=+363.764544645" observedRunningTime="2025-12-03 21:55:58.069940986 +0000 UTC m=+364.345117378" watchObservedRunningTime="2025-12-03 21:55:58.072470741 +0000 UTC m=+364.347647153" Dec 03 21:55:58.113861 master-0 kubenswrapper[9136]: I1203 21:55:58.113704 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-54f97f57-xq6ch" podStartSLOduration=330.420638093 podStartE2EDuration="5m33.113672048s" podCreationTimestamp="2025-12-03 21:50:25 +0000 UTC" firstStartedPulling="2025-12-03 21:55:54.793506354 +0000 UTC m=+361.068682736" lastFinishedPulling="2025-12-03 21:55:57.486540309 +0000 UTC m=+363.761716691" observedRunningTime="2025-12-03 21:55:58.108143524 +0000 UTC m=+364.383319976" watchObservedRunningTime="2025-12-03 21:55:58.113672048 +0000 UTC m=+364.388848480" Dec 03 21:55:58.760899 master-0 kubenswrapper[9136]: I1203 21:55:58.760795 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:55:58.764584 master-0 kubenswrapper[9136]: I1203 21:55:58.764515 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:55:58.764584 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:55:58.764584 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:55:58.764584 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:55:58.764816 master-0 kubenswrapper[9136]: I1203 21:55:58.764605 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:55:59.066197 master-0 kubenswrapper[9136]: I1203 21:55:59.066023 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vgm8c" event={"ID":"66f7b08c-52e8-4795-9cf0-74402a9cc0bb","Type":"ContainerStarted","Data":"79bd34a2eaaf72ea3da99d7873f87a8819ed72aa6ac1d30edef7f5a83e578083"} Dec 03 21:55:59.089880 master-0 kubenswrapper[9136]: I1203 21:55:59.085922 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vgm8c" podStartSLOduration=2.085900837 podStartE2EDuration="2.085900837s" podCreationTimestamp="2025-12-03 21:55:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:55:59.0839184 +0000 UTC m=+365.359094802" watchObservedRunningTime="2025-12-03 21:55:59.085900837 +0000 UTC m=+365.361077239" Dec 03 21:55:59.122861 master-0 kubenswrapper[9136]: I1203 21:55:59.122712 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-565bdcb8-7s9vg"] Dec 03 21:55:59.123973 master-0 kubenswrapper[9136]: I1203 21:55:59.123938 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.129161 master-0 kubenswrapper[9136]: I1203 21:55:59.129098 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 03 21:55:59.129316 master-0 kubenswrapper[9136]: I1203 21:55:59.129174 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 03 21:55:59.129316 master-0 kubenswrapper[9136]: I1203 21:55:59.129218 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 03 21:55:59.129316 master-0 kubenswrapper[9136]: I1203 21:55:59.129128 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-9g5tm" Dec 03 21:55:59.139045 master-0 kubenswrapper[9136]: I1203 21:55:59.138958 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-565bdcb8-7s9vg"] Dec 03 21:55:59.168792 master-0 kubenswrapper[9136]: I1203 21:55:59.166570 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcb88\" (UniqueName: \"kubernetes.io/projected/d6fafa97-812d-4588-95f8-7c4d85f53098-kube-api-access-gcb88\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.168792 master-0 kubenswrapper[9136]: I1203 21:55:59.166696 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.168792 master-0 kubenswrapper[9136]: I1203 21:55:59.166782 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.168792 master-0 kubenswrapper[9136]: I1203 21:55:59.166802 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6fafa97-812d-4588-95f8-7c4d85f53098-metrics-client-ca\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.269891 master-0 
kubenswrapper[9136]: I1203 21:55:59.268195 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.269891 master-0 kubenswrapper[9136]: I1203 21:55:59.268257 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6fafa97-812d-4588-95f8-7c4d85f53098-metrics-client-ca\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.269891 master-0 kubenswrapper[9136]: I1203 21:55:59.268307 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcb88\" (UniqueName: \"kubernetes.io/projected/d6fafa97-812d-4588-95f8-7c4d85f53098-kube-api-access-gcb88\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.269891 master-0 kubenswrapper[9136]: I1203 21:55:59.268375 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.269891 master-0 kubenswrapper[9136]: I1203 21:55:59.269347 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6fafa97-812d-4588-95f8-7c4d85f53098-metrics-client-ca\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.275896 master-0 kubenswrapper[9136]: I1203 21:55:59.272115 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.275896 master-0 kubenswrapper[9136]: I1203 21:55:59.272322 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.288335 master-0 kubenswrapper[9136]: I1203 21:55:59.288261 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcb88\" (UniqueName: \"kubernetes.io/projected/d6fafa97-812d-4588-95f8-7c4d85f53098-kube-api-access-gcb88\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.413207 master-0 
kubenswrapper[9136]: I1203 21:55:59.413159 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:55:59.459850 master-0 kubenswrapper[9136]: I1203 21:55:59.459520 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 21:55:59.574858 master-0 kubenswrapper[9136]: I1203 21:55:59.571417 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dcf886d-2028-4acd-83ac-850c4b278810-config-volume\") pod \"2dcf886d-2028-4acd-83ac-850c4b278810\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " Dec 03 21:55:59.574858 master-0 kubenswrapper[9136]: I1203 21:55:59.571591 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf9wf\" (UniqueName: \"kubernetes.io/projected/2dcf886d-2028-4acd-83ac-850c4b278810-kube-api-access-rf9wf\") pod \"2dcf886d-2028-4acd-83ac-850c4b278810\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " Dec 03 21:55:59.574858 master-0 kubenswrapper[9136]: I1203 21:55:59.571764 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dcf886d-2028-4acd-83ac-850c4b278810-secret-volume\") pod \"2dcf886d-2028-4acd-83ac-850c4b278810\" (UID: \"2dcf886d-2028-4acd-83ac-850c4b278810\") " Dec 03 21:55:59.574858 master-0 kubenswrapper[9136]: I1203 21:55:59.573445 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dcf886d-2028-4acd-83ac-850c4b278810-config-volume" (OuterVolumeSpecName: "config-volume") pod "2dcf886d-2028-4acd-83ac-850c4b278810" (UID: "2dcf886d-2028-4acd-83ac-850c4b278810"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:55:59.583254 master-0 kubenswrapper[9136]: I1203 21:55:59.583187 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dcf886d-2028-4acd-83ac-850c4b278810-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2dcf886d-2028-4acd-83ac-850c4b278810" (UID: "2dcf886d-2028-4acd-83ac-850c4b278810"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:55:59.583422 master-0 kubenswrapper[9136]: I1203 21:55:59.583402 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dcf886d-2028-4acd-83ac-850c4b278810-kube-api-access-rf9wf" (OuterVolumeSpecName: "kube-api-access-rf9wf") pod "2dcf886d-2028-4acd-83ac-850c4b278810" (UID: "2dcf886d-2028-4acd-83ac-850c4b278810"). InnerVolumeSpecName "kube-api-access-rf9wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:55:59.674460 master-0 kubenswrapper[9136]: I1203 21:55:59.674236 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf9wf\" (UniqueName: \"kubernetes.io/projected/2dcf886d-2028-4acd-83ac-850c4b278810-kube-api-access-rf9wf\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:59.674460 master-0 kubenswrapper[9136]: I1203 21:55:59.674325 9136 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2dcf886d-2028-4acd-83ac-850c4b278810-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:59.674460 master-0 kubenswrapper[9136]: I1203 21:55:59.674352 9136 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2dcf886d-2028-4acd-83ac-850c4b278810-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 21:55:59.763933 master-0 kubenswrapper[9136]: I1203 21:55:59.763823 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:55:59.763933 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:55:59.763933 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:55:59.763933 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:55:59.765572 master-0 kubenswrapper[9136]: I1203 21:55:59.763937 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:55:59.939266 master-0 kubenswrapper[9136]: I1203 21:55:59.939225 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-565bdcb8-7s9vg"] Dec 03 21:55:59.950291 master-0 kubenswrapper[9136]: W1203 21:55:59.950251 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6fafa97_812d_4588_95f8_7c4d85f53098.slice/crio-51af5c76c79fbb742ccd27b8ec8e1df376daaccc120618135f6c44b3a44f3b5f WatchSource:0}: Error finding container 51af5c76c79fbb742ccd27b8ec8e1df376daaccc120618135f6c44b3a44f3b5f: Status 404 returned error can't find the container with id 51af5c76c79fbb742ccd27b8ec8e1df376daaccc120618135f6c44b3a44f3b5f Dec 03 21:56:00.074683 master-0 kubenswrapper[9136]: I1203 21:56:00.074521 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" event={"ID":"d6fafa97-812d-4588-95f8-7c4d85f53098","Type":"ContainerStarted","Data":"51af5c76c79fbb742ccd27b8ec8e1df376daaccc120618135f6c44b3a44f3b5f"} Dec 03 21:56:00.076598 master-0 kubenswrapper[9136]: I1203 21:56:00.076530 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" event={"ID":"2dcf886d-2028-4acd-83ac-850c4b278810","Type":"ContainerDied","Data":"54fabba34f6ed1546c826bd2346dfd527db4cdc2878650997141f56d5e70ee87"} Dec 03 21:56:00.076698 master-0 kubenswrapper[9136]: I1203 21:56:00.076604 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54fabba34f6ed1546c826bd2346dfd527db4cdc2878650997141f56d5e70ee87" Dec 03 21:56:00.076749 master-0 kubenswrapper[9136]: I1203 21:56:00.076702 
9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 21:56:00.763256 master-0 kubenswrapper[9136]: I1203 21:56:00.763174 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:00.763256 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:00.763256 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:00.763256 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:00.763814 master-0 kubenswrapper[9136]: I1203 21:56:00.763296 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:01.764030 master-0 kubenswrapper[9136]: I1203 21:56:01.763975 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:01.764030 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:01.764030 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:01.764030 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:01.765066 master-0 kubenswrapper[9136]: I1203 21:56:01.764060 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:02.093535 master-0 kubenswrapper[9136]: I1203 21:56:02.093483 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" event={"ID":"d6fafa97-812d-4588-95f8-7c4d85f53098","Type":"ContainerStarted","Data":"a74143b0bf80ddebcc9cefcc44442e5ed45d82fb1a105d63482e48e1b3a3257f"} Dec 03 21:56:02.093853 master-0 kubenswrapper[9136]: I1203 21:56:02.093836 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" event={"ID":"d6fafa97-812d-4588-95f8-7c4d85f53098","Type":"ContainerStarted","Data":"2c4ee645155fafdf7ddd2a51e6caf08a011382c9e7a7305b1f0c04e929fef4c6"} Dec 03 21:56:02.117012 master-0 kubenswrapper[9136]: I1203 21:56:02.116888 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" podStartSLOduration=1.455977287 podStartE2EDuration="3.116861307s" podCreationTimestamp="2025-12-03 21:55:59 +0000 UTC" firstStartedPulling="2025-12-03 21:55:59.953933368 +0000 UTC m=+366.229109750" lastFinishedPulling="2025-12-03 21:56:01.614817388 +0000 UTC m=+367.889993770" observedRunningTime="2025-12-03 21:56:02.112508504 +0000 UTC m=+368.387684906" watchObservedRunningTime="2025-12-03 21:56:02.116861307 +0000 UTC m=+368.392037699" Dec 03 21:56:02.764236 master-0 kubenswrapper[9136]: I1203 21:56:02.764148 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:02.764236 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:02.764236 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:02.764236 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:02.764997 master-0 kubenswrapper[9136]: I1203 21:56:02.764239 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:03.764016 master-0 kubenswrapper[9136]: I1203 21:56:03.763493 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:03.764016 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:03.764016 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:03.764016 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:03.764016 master-0 kubenswrapper[9136]: I1203 21:56:03.763587 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:04.760333 master-0 kubenswrapper[9136]: I1203 21:56:04.760282 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:56:04.762911 master-0 kubenswrapper[9136]: I1203 21:56:04.762841 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:04.762911 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:04.762911 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:04.762911 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:04.763157 master-0 kubenswrapper[9136]: I1203 21:56:04.762941 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:05.136627 master-0 kubenswrapper[9136]: I1203 21:56:05.135202 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:56:05.136627 master-0 kubenswrapper[9136]: I1203 21:56:05.135318 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:56:05.209397 master-0 kubenswrapper[9136]: I1203 21:56:05.209336 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:56:05.763075 master-0 kubenswrapper[9136]: I1203 21:56:05.763012 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:05.763075 master-0 
kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:05.763075 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:05.763075 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:05.763370 master-0 kubenswrapper[9136]: I1203 21:56:05.763095 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:06.169514 master-0 kubenswrapper[9136]: I1203 21:56:06.169454 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 21:56:06.762891 master-0 kubenswrapper[9136]: I1203 21:56:06.762830 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:06.762891 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:06.762891 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:06.762891 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:06.763215 master-0 kubenswrapper[9136]: I1203 21:56:06.762916 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:07.046275 master-0 kubenswrapper[9136]: I1203 21:56:07.044562 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p"] Dec 03 21:56:07.046275 master-0 kubenswrapper[9136]: E1203 21:56:07.044826 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dcf886d-2028-4acd-83ac-850c4b278810" containerName="collect-profiles" Dec 03 21:56:07.046275 master-0 kubenswrapper[9136]: I1203 21:56:07.044842 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dcf886d-2028-4acd-83ac-850c4b278810" containerName="collect-profiles" Dec 03 21:56:07.046275 master-0 kubenswrapper[9136]: I1203 21:56:07.044992 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dcf886d-2028-4acd-83ac-850c4b278810" containerName="collect-profiles" Dec 03 21:56:07.046275 master-0 kubenswrapper[9136]: I1203 21:56:07.045897 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.048292 master-0 kubenswrapper[9136]: I1203 21:56:07.048233 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 03 21:56:07.048434 master-0 kubenswrapper[9136]: I1203 21:56:07.048395 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 03 21:56:07.048824 master-0 kubenswrapper[9136]: I1203 21:56:07.048518 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-8whkp" Dec 03 21:56:07.060066 master-0 kubenswrapper[9136]: I1203 21:56:07.059451 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nkjnl"] Dec 03 21:56:07.061813 master-0 kubenswrapper[9136]: I1203 21:56:07.060762 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.062814 master-0 kubenswrapper[9136]: I1203 21:56:07.062172 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 03 21:56:07.062814 master-0 kubenswrapper[9136]: I1203 21:56:07.062752 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-cwgtj" Dec 03 21:56:07.063916 master-0 kubenswrapper[9136]: I1203 21:56:07.063722 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 03 21:56:07.071574 master-0 kubenswrapper[9136]: I1203 21:56:07.071315 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p"] Dec 03 21:56:07.071574 master-0 kubenswrapper[9136]: I1203 21:56:07.071373 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9"] Dec 03 21:56:07.072583 master-0 kubenswrapper[9136]: I1203 21:56:07.072514 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.076732 master-0 kubenswrapper[9136]: I1203 21:56:07.074713 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 03 21:56:07.076732 master-0 kubenswrapper[9136]: I1203 21:56:07.076499 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 03 21:56:07.076910 master-0 kubenswrapper[9136]: I1203 21:56:07.076864 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-t6rs7" Dec 03 21:56:07.078561 master-0 kubenswrapper[9136]: I1203 21:56:07.077009 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 03 21:56:07.155538 master-0 kubenswrapper[9136]: I1203 21:56:07.151844 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9"] Dec 03 21:56:07.189561 master-0 kubenswrapper[9136]: I1203 21:56:07.189470 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189573 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189599 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hfrr\" (UniqueName: \"kubernetes.io/projected/2d592f19-c7b9-4b29-9ca2-848572067908-kube-api-access-8hfrr\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189628 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-metrics-client-ca\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189696 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189719 9136 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-root\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189761 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189811 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0be52f3-b318-4630-b4da-f3c4a57d5818-metrics-client-ca\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189828 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d592f19-c7b9-4b29-9ca2-848572067908-volume-directive-shadow\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189856 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnfgr\" (UniqueName: \"kubernetes.io/projected/d0be52f3-b318-4630-b4da-f3c4a57d5818-kube-api-access-qnfgr\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189873 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-sys\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189891 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-wtmp\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189907 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-tls\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 
21:56:07.189925 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-metrics-client-ca\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189943 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwdzk\" (UniqueName: \"kubernetes.io/projected/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-kube-api-access-dwdzk\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189962 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-tls\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.189982 master-0 kubenswrapper[9136]: I1203 21:56:07.189981 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-textfile\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.190488 master-0 kubenswrapper[9136]: I1203 21:56:07.190001 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.291531 master-0 kubenswrapper[9136]: I1203 21:56:07.291421 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-metrics-client-ca\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291546 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291581 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-root\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291610 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291656 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0be52f3-b318-4630-b4da-f3c4a57d5818-metrics-client-ca\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291676 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d592f19-c7b9-4b29-9ca2-848572067908-volume-directive-shadow\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291702 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnfgr\" (UniqueName: \"kubernetes.io/projected/d0be52f3-b318-4630-b4da-f3c4a57d5818-kube-api-access-qnfgr\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291727 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-sys\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291756 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-tls\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291807 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-wtmp\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291802 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-root\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.291910 master-0 kubenswrapper[9136]: I1203 21:56:07.291833 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-metrics-client-ca\") 
pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.291976 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwdzk\" (UniqueName: \"kubernetes.io/projected/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-kube-api-access-dwdzk\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.291972 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-sys\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.292077 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-tls\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.292145 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.292169 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-textfile\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.292209 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.292237 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-wtmp\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.292265 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.293157 master-0 
kubenswrapper[9136]: I1203 21:56:07.292377 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hfrr\" (UniqueName: \"kubernetes.io/projected/2d592f19-c7b9-4b29-9ca2-848572067908-kube-api-access-8hfrr\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: E1203 21:56:07.292416 9136 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: E1203 21:56:07.292480 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls podName:2d592f19-c7b9-4b29-9ca2-848572067908 nodeName:}" failed. No retries permitted until 2025-12-03 21:56:07.792462072 +0000 UTC m=+374.067638444 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls") pod "kube-state-metrics-7dcc7f9bd6-kldf9" (UID: "2d592f19-c7b9-4b29-9ca2-848572067908") : secret "kube-state-metrics-tls" not found Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.292651 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d592f19-c7b9-4b29-9ca2-848572067908-volume-directive-shadow\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.292831 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.293157 master-0 kubenswrapper[9136]: I1203 21:56:07.293145 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-metrics-client-ca\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.293682 master-0 kubenswrapper[9136]: I1203 21:56:07.293245 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0be52f3-b318-4630-b4da-f3c4a57d5818-metrics-client-ca\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.293682 master-0 kubenswrapper[9136]: I1203 21:56:07.293415 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-textfile\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.293910 master-0 kubenswrapper[9136]: I1203 21:56:07.293847 9136 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-metrics-client-ca\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.295741 master-0 kubenswrapper[9136]: I1203 21:56:07.295650 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-tls\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.295741 master-0 kubenswrapper[9136]: I1203 21:56:07.295666 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-tls\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.297217 master-0 kubenswrapper[9136]: I1203 21:56:07.297056 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.298606 master-0 kubenswrapper[9136]: I1203 21:56:07.298282 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.303032 master-0 kubenswrapper[9136]: I1203 21:56:07.302970 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.316841 master-0 kubenswrapper[9136]: I1203 21:56:07.316750 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnfgr\" (UniqueName: \"kubernetes.io/projected/d0be52f3-b318-4630-b4da-f3c4a57d5818-kube-api-access-qnfgr\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.317507 master-0 kubenswrapper[9136]: I1203 21:56:07.317453 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hfrr\" (UniqueName: \"kubernetes.io/projected/2d592f19-c7b9-4b29-9ca2-848572067908-kube-api-access-8hfrr\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.317673 master-0 kubenswrapper[9136]: I1203 21:56:07.317632 9136 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dwdzk\" (UniqueName: \"kubernetes.io/projected/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-kube-api-access-dwdzk\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.453612 master-0 kubenswrapper[9136]: I1203 21:56:07.453508 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 21:56:07.468537 master-0 kubenswrapper[9136]: I1203 21:56:07.468454 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 21:56:07.763461 master-0 kubenswrapper[9136]: I1203 21:56:07.763399 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:07.763461 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:07.763461 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:07.763461 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:07.763880 master-0 kubenswrapper[9136]: I1203 21:56:07.763483 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:07.799959 master-0 kubenswrapper[9136]: I1203 21:56:07.799876 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.802966 master-0 kubenswrapper[9136]: I1203 21:56:07.802889 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:07.901894 master-0 kubenswrapper[9136]: I1203 21:56:07.901830 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p"] Dec 03 21:56:07.906015 master-0 kubenswrapper[9136]: W1203 21:56:07.905972 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0be52f3_b318_4630_b4da_f3c4a57d5818.slice/crio-9a12bcf4fc9a3a0e9cde90a88b7956f38f3ab4e2e145a1f0eb84cd99548fc153 WatchSource:0}: Error finding container 9a12bcf4fc9a3a0e9cde90a88b7956f38f3ab4e2e145a1f0eb84cd99548fc153: Status 404 returned error can't find the container with id 9a12bcf4fc9a3a0e9cde90a88b7956f38f3ab4e2e145a1f0eb84cd99548fc153 Dec 03 21:56:08.083109 master-0 kubenswrapper[9136]: I1203 21:56:08.083045 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 21:56:08.162706 master-0 kubenswrapper[9136]: I1203 21:56:08.162646 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nkjnl" event={"ID":"bcbec7ef-0b98-4346-8c6b-c5fa37e90286","Type":"ContainerStarted","Data":"3446e10d150ba6b0a0d4376dd59be02274fd256ae9b67434db5d0c00d0a96a36"} Dec 03 21:56:08.163617 master-0 kubenswrapper[9136]: I1203 21:56:08.163592 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" event={"ID":"d0be52f3-b318-4630-b4da-f3c4a57d5818","Type":"ContainerStarted","Data":"f6dffb93324c5c5d4db8153c0b527b12b6f4a82fa42f29f8fb3b1929f0d555b9"} Dec 03 21:56:08.163691 master-0 kubenswrapper[9136]: I1203 21:56:08.163625 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" event={"ID":"d0be52f3-b318-4630-b4da-f3c4a57d5818","Type":"ContainerStarted","Data":"9a12bcf4fc9a3a0e9cde90a88b7956f38f3ab4e2e145a1f0eb84cd99548fc153"} Dec 03 21:56:08.763733 master-0 kubenswrapper[9136]: I1203 21:56:08.763499 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:08.763733 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:08.763733 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:08.763733 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:08.763733 master-0 kubenswrapper[9136]: I1203 21:56:08.763642 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:09.764923 master-0 kubenswrapper[9136]: I1203 21:56:09.764838 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:09.764923 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:09.764923 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:09.764923 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:09.765663 master-0 kubenswrapper[9136]: I1203 21:56:09.764944 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:10.763325 master-0 kubenswrapper[9136]: I1203 21:56:10.763226 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:10.763325 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:10.763325 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:10.763325 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:10.763325 master-0 kubenswrapper[9136]: I1203 21:56:10.763322 9136 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:11.187597 master-0 kubenswrapper[9136]: I1203 21:56:11.187500 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" event={"ID":"d0be52f3-b318-4630-b4da-f3c4a57d5818","Type":"ContainerStarted","Data":"81cfb61e05e190253c9fe7a4f62d1005511219657196c45881a0bb44d156e6e3"} Dec 03 21:56:11.765720 master-0 kubenswrapper[9136]: I1203 21:56:11.765657 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:11.765720 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:11.765720 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:11.765720 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:11.766094 master-0 kubenswrapper[9136]: I1203 21:56:11.765738 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:12.181579 master-0 kubenswrapper[9136]: I1203 21:56:12.178848 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9"] Dec 03 21:56:12.502492 master-0 kubenswrapper[9136]: W1203 21:56:12.502358 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d592f19_c7b9_4b29_9ca2_848572067908.slice/crio-3cbd73f86ab3e16fe0a3db2c1ea58704181c81848bd40dec7b7bf4aa0ba51d20 WatchSource:0}: Error finding container 3cbd73f86ab3e16fe0a3db2c1ea58704181c81848bd40dec7b7bf4aa0ba51d20: Status 404 returned error can't find the container with id 3cbd73f86ab3e16fe0a3db2c1ea58704181c81848bd40dec7b7bf4aa0ba51d20 Dec 03 21:56:12.763199 master-0 kubenswrapper[9136]: I1203 21:56:12.763040 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:12.763199 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:12.763199 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:12.763199 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:12.763199 master-0 kubenswrapper[9136]: I1203 21:56:12.763133 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:13.221497 master-0 kubenswrapper[9136]: I1203 21:56:13.221437 9136 generic.go:334] "Generic (PLEG): container finished" podID="bcbec7ef-0b98-4346-8c6b-c5fa37e90286" containerID="968964eedf27b3e40df0428a52df5b822f50c33a2dfb85064c918086871bd63f" exitCode=0 Dec 03 21:56:13.221736 master-0 kubenswrapper[9136]: I1203 21:56:13.221516 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nkjnl" 
event={"ID":"bcbec7ef-0b98-4346-8c6b-c5fa37e90286","Type":"ContainerDied","Data":"968964eedf27b3e40df0428a52df5b822f50c33a2dfb85064c918086871bd63f"} Dec 03 21:56:13.222909 master-0 kubenswrapper[9136]: I1203 21:56:13.222854 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" event={"ID":"2d592f19-c7b9-4b29-9ca2-848572067908","Type":"ContainerStarted","Data":"3cbd73f86ab3e16fe0a3db2c1ea58704181c81848bd40dec7b7bf4aa0ba51d20"} Dec 03 21:56:13.763036 master-0 kubenswrapper[9136]: I1203 21:56:13.762936 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:13.763036 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:13.763036 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:13.763036 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:13.763036 master-0 kubenswrapper[9136]: I1203 21:56:13.763023 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:14.762869 master-0 kubenswrapper[9136]: I1203 21:56:14.762820 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:14.762869 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:14.762869 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:14.762869 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:14.763711 master-0 kubenswrapper[9136]: I1203 21:56:14.762913 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:15.251695 master-0 kubenswrapper[9136]: I1203 21:56:15.251533 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" event={"ID":"d0be52f3-b318-4630-b4da-f3c4a57d5818","Type":"ContainerStarted","Data":"f1482d5f4bb3f863223d3c1f39dfac93df83cebe632c6172626992074ac1550f"} Dec 03 21:56:15.254213 master-0 kubenswrapper[9136]: I1203 21:56:15.254068 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" event={"ID":"2d592f19-c7b9-4b29-9ca2-848572067908","Type":"ContainerStarted","Data":"6d7511987f3642a10070d20d4ebab992768fe2f2e22f2707edf0c010fd0fe7be"} Dec 03 21:56:15.254213 master-0 kubenswrapper[9136]: I1203 21:56:15.254111 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" event={"ID":"2d592f19-c7b9-4b29-9ca2-848572067908","Type":"ContainerStarted","Data":"2a1fede73882dac67b0e7a20906693c0f01761835f1406754d2b8647193848a9"} Dec 03 21:56:15.254213 master-0 kubenswrapper[9136]: I1203 21:56:15.254138 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" 
event={"ID":"2d592f19-c7b9-4b29-9ca2-848572067908","Type":"ContainerStarted","Data":"1d2e9e2c68e7699383990d4a0589c0c97de3de3dd52845d2b571236084269a32"} Dec 03 21:56:15.256343 master-0 kubenswrapper[9136]: I1203 21:56:15.256317 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nkjnl" event={"ID":"bcbec7ef-0b98-4346-8c6b-c5fa37e90286","Type":"ContainerStarted","Data":"20df288f5244d62324ed90f602cc2e943e644c0217b4ac80a7c99de81d881579"} Dec 03 21:56:15.256343 master-0 kubenswrapper[9136]: I1203 21:56:15.256343 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nkjnl" event={"ID":"bcbec7ef-0b98-4346-8c6b-c5fa37e90286","Type":"ContainerStarted","Data":"e332ebe370672b77a4d996848813df601876781d94a5b95353e02e813bf12265"} Dec 03 21:56:15.273359 master-0 kubenswrapper[9136]: I1203 21:56:15.273279 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" podStartSLOduration=5.919669567 podStartE2EDuration="8.273258505s" podCreationTimestamp="2025-12-03 21:56:07 +0000 UTC" firstStartedPulling="2025-12-03 21:56:12.16630279 +0000 UTC m=+378.441479212" lastFinishedPulling="2025-12-03 21:56:14.519891748 +0000 UTC m=+380.795068150" observedRunningTime="2025-12-03 21:56:15.267950012 +0000 UTC m=+381.543126404" watchObservedRunningTime="2025-12-03 21:56:15.273258505 +0000 UTC m=+381.548434897" Dec 03 21:56:15.321384 master-0 kubenswrapper[9136]: I1203 21:56:15.321094 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" podStartSLOduration=6.307599169 podStartE2EDuration="8.321066947s" podCreationTimestamp="2025-12-03 21:56:07 +0000 UTC" firstStartedPulling="2025-12-03 21:56:12.505273372 +0000 UTC m=+378.780449754" lastFinishedPulling="2025-12-03 21:56:14.51874114 +0000 UTC m=+380.793917532" observedRunningTime="2025-12-03 21:56:15.295722249 +0000 UTC m=+381.570898651" watchObservedRunningTime="2025-12-03 21:56:15.321066947 +0000 UTC m=+381.596243399" Dec 03 21:56:15.327251 master-0 kubenswrapper[9136]: I1203 21:56:15.327044 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nkjnl" podStartSLOduration=3.257522111 podStartE2EDuration="8.32698653s" podCreationTimestamp="2025-12-03 21:56:07 +0000 UTC" firstStartedPulling="2025-12-03 21:56:07.495984019 +0000 UTC m=+373.771160401" lastFinishedPulling="2025-12-03 21:56:12.565448438 +0000 UTC m=+378.840624820" observedRunningTime="2025-12-03 21:56:15.318441071 +0000 UTC m=+381.593617493" watchObservedRunningTime="2025-12-03 21:56:15.32698653 +0000 UTC m=+381.602163002" Dec 03 21:56:15.506473 master-0 kubenswrapper[9136]: I1203 21:56:15.506399 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-b9f5dccb6-4h4jv"] Dec 03 21:56:15.507481 master-0 kubenswrapper[9136]: I1203 21:56:15.507439 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.510640 master-0 kubenswrapper[9136]: I1203 21:56:15.510594 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 03 21:56:15.511372 master-0 kubenswrapper[9136]: I1203 21:56:15.511326 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 03 21:56:15.511731 master-0 kubenswrapper[9136]: I1203 21:56:15.511688 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-gfs59jbdhk2g" Dec 03 21:56:15.512574 master-0 kubenswrapper[9136]: I1203 21:56:15.512368 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 03 21:56:15.513320 master-0 kubenswrapper[9136]: I1203 21:56:15.513286 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-lrw2p" Dec 03 21:56:15.517819 master-0 kubenswrapper[9136]: I1203 21:56:15.517606 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 03 21:56:15.525417 master-0 kubenswrapper[9136]: I1203 21:56:15.525382 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b9f5dccb6-4h4jv"] Dec 03 21:56:15.626472 master-0 kubenswrapper[9136]: I1203 21:56:15.626410 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.626472 master-0 kubenswrapper[9136]: I1203 21:56:15.626476 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.626799 master-0 kubenswrapper[9136]: I1203 21:56:15.626536 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/40f8e70d-5f98-47f1-afa8-ea67242981fc-audit-log\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.626799 master-0 kubenswrapper[9136]: I1203 21:56:15.626566 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.626799 master-0 kubenswrapper[9136]: I1203 21:56:15.626633 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npkww\" (UniqueName: \"kubernetes.io/projected/40f8e70d-5f98-47f1-afa8-ea67242981fc-kube-api-access-npkww\") pod 
\"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.626799 master-0 kubenswrapper[9136]: I1203 21:56:15.626666 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.626799 master-0 kubenswrapper[9136]: I1203 21:56:15.626701 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.728210 master-0 kubenswrapper[9136]: I1203 21:56:15.727659 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npkww\" (UniqueName: \"kubernetes.io/projected/40f8e70d-5f98-47f1-afa8-ea67242981fc-kube-api-access-npkww\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.728210 master-0 kubenswrapper[9136]: I1203 21:56:15.727710 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.728210 master-0 kubenswrapper[9136]: I1203 21:56:15.727736 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.728210 master-0 kubenswrapper[9136]: I1203 21:56:15.727752 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.728210 master-0 kubenswrapper[9136]: I1203 21:56:15.727792 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.728210 master-0 kubenswrapper[9136]: I1203 21:56:15.727836 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/40f8e70d-5f98-47f1-afa8-ea67242981fc-audit-log\") pod 
\"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.728210 master-0 kubenswrapper[9136]: I1203 21:56:15.727860 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.728753 master-0 kubenswrapper[9136]: I1203 21:56:15.728504 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/40f8e70d-5f98-47f1-afa8-ea67242981fc-audit-log\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.728753 master-0 kubenswrapper[9136]: I1203 21:56:15.728618 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.729043 master-0 kubenswrapper[9136]: I1203 21:56:15.729006 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.730979 master-0 kubenswrapper[9136]: I1203 21:56:15.730951 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.731292 master-0 kubenswrapper[9136]: I1203 21:56:15.731255 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.732743 master-0 kubenswrapper[9136]: I1203 21:56:15.732677 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.762612 master-0 kubenswrapper[9136]: I1203 21:56:15.762531 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:15.762612 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:15.762612 
master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:15.762612 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:15.762997 master-0 kubenswrapper[9136]: I1203 21:56:15.762615 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:15.775305 master-0 kubenswrapper[9136]: I1203 21:56:15.775256 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npkww\" (UniqueName: \"kubernetes.io/projected/40f8e70d-5f98-47f1-afa8-ea67242981fc-kube-api-access-npkww\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:15.827242 master-0 kubenswrapper[9136]: I1203 21:56:15.827176 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:16.288510 master-0 kubenswrapper[9136]: I1203 21:56:16.288367 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-b9f5dccb6-4h4jv"] Dec 03 21:56:16.300442 master-0 kubenswrapper[9136]: W1203 21:56:16.300400 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f8e70d_5f98_47f1_afa8_ea67242981fc.slice/crio-01563ec1e9dc14e44ec2552850bf0825026fe0a03259d8a4f99d746be494b365 WatchSource:0}: Error finding container 01563ec1e9dc14e44ec2552850bf0825026fe0a03259d8a4f99d746be494b365: Status 404 returned error can't find the container with id 01563ec1e9dc14e44ec2552850bf0825026fe0a03259d8a4f99d746be494b365 Dec 03 21:56:16.763586 master-0 kubenswrapper[9136]: I1203 21:56:16.763496 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:16.763586 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:16.763586 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:16.763586 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:16.763586 master-0 kubenswrapper[9136]: I1203 21:56:16.763555 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:17.273001 master-0 kubenswrapper[9136]: I1203 21:56:17.272939 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" event={"ID":"40f8e70d-5f98-47f1-afa8-ea67242981fc","Type":"ContainerStarted","Data":"01563ec1e9dc14e44ec2552850bf0825026fe0a03259d8a4f99d746be494b365"} Dec 03 21:56:17.763246 master-0 kubenswrapper[9136]: I1203 21:56:17.763160 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:17.763246 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:17.763246 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 
21:56:17.763246 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:17.763246 master-0 kubenswrapper[9136]: I1203 21:56:17.763238 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:18.282410 master-0 kubenswrapper[9136]: I1203 21:56:18.282217 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" event={"ID":"40f8e70d-5f98-47f1-afa8-ea67242981fc","Type":"ContainerStarted","Data":"eac3faec501ffdc007c07d93a9508e47b671bbce7cc0a7b3a4970c2ac98f0e4b"} Dec 03 21:56:18.311647 master-0 kubenswrapper[9136]: I1203 21:56:18.311425 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" podStartSLOduration=1.6835174099999999 podStartE2EDuration="3.311405423s" podCreationTimestamp="2025-12-03 21:56:15 +0000 UTC" firstStartedPulling="2025-12-03 21:56:16.304650075 +0000 UTC m=+382.579826457" lastFinishedPulling="2025-12-03 21:56:17.932538058 +0000 UTC m=+384.207714470" observedRunningTime="2025-12-03 21:56:18.306204583 +0000 UTC m=+384.581381015" watchObservedRunningTime="2025-12-03 21:56:18.311405423 +0000 UTC m=+384.586581835" Dec 03 21:56:18.764004 master-0 kubenswrapper[9136]: I1203 21:56:18.763875 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:18.764004 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:18.764004 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:18.764004 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:18.764486 master-0 kubenswrapper[9136]: I1203 21:56:18.764005 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:19.763957 master-0 kubenswrapper[9136]: I1203 21:56:19.763866 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:19.763957 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:19.763957 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:19.763957 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:19.763957 master-0 kubenswrapper[9136]: I1203 21:56:19.763958 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:20.764196 master-0 kubenswrapper[9136]: I1203 21:56:20.764086 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:20.764196 master-0 kubenswrapper[9136]: [-]has-synced failed: 
reason withheld Dec 03 21:56:20.764196 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:20.764196 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:20.765219 master-0 kubenswrapper[9136]: I1203 21:56:20.764214 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:21.763682 master-0 kubenswrapper[9136]: I1203 21:56:21.763566 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:21.763682 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:21.763682 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:21.763682 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:21.764188 master-0 kubenswrapper[9136]: I1203 21:56:21.763683 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:22.764085 master-0 kubenswrapper[9136]: I1203 21:56:22.763972 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:22.764085 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:22.764085 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:22.764085 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:22.765107 master-0 kubenswrapper[9136]: I1203 21:56:22.764090 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:23.763411 master-0 kubenswrapper[9136]: I1203 21:56:23.763339 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:23.763411 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:23.763411 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:23.763411 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:23.763969 master-0 kubenswrapper[9136]: I1203 21:56:23.763428 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:24.006573 master-0 kubenswrapper[9136]: I1203 21:56:24.006466 9136 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podef057c6e-7d96-4db8-ab3c-8e81d6f29df7"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podef057c6e-7d96-4db8-ab3c-8e81d6f29df7] : Timed out while waiting for systemd to remove 
kubepods-burstable-podef057c6e_7d96_4db8_ab3c_8e81d6f29df7.slice" Dec 03 21:56:24.763428 master-0 kubenswrapper[9136]: I1203 21:56:24.763354 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:24.763428 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:24.763428 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:24.763428 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:24.763809 master-0 kubenswrapper[9136]: I1203 21:56:24.763437 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:25.763397 master-0 kubenswrapper[9136]: I1203 21:56:25.763299 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:25.763397 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:25.763397 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:25.763397 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:25.764425 master-0 kubenswrapper[9136]: I1203 21:56:25.763431 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:26.763603 master-0 kubenswrapper[9136]: I1203 21:56:26.763522 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:26.763603 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:26.763603 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:26.763603 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:26.763603 master-0 kubenswrapper[9136]: I1203 21:56:26.763595 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:27.763868 master-0 kubenswrapper[9136]: I1203 21:56:27.763679 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:27.763868 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:27.763868 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:27.763868 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:27.763868 master-0 kubenswrapper[9136]: I1203 21:56:27.763830 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:28.763598 master-0 kubenswrapper[9136]: I1203 21:56:28.763519 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:28.763598 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:28.763598 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:28.763598 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:28.763598 master-0 kubenswrapper[9136]: I1203 21:56:28.763592 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:29.763225 master-0 kubenswrapper[9136]: I1203 21:56:29.763124 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:29.763225 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:29.763225 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:29.763225 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:29.763750 master-0 kubenswrapper[9136]: I1203 21:56:29.763235 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:30.763430 master-0 kubenswrapper[9136]: I1203 21:56:30.763328 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:30.763430 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:30.763430 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:30.763430 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:30.763430 master-0 kubenswrapper[9136]: I1203 21:56:30.763421 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:31.763494 master-0 kubenswrapper[9136]: I1203 21:56:31.762999 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:31.763494 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:31.763494 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:31.763494 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:31.763494 master-0 kubenswrapper[9136]: I1203 21:56:31.763098 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:32.763609 master-0 kubenswrapper[9136]: I1203 21:56:32.763525 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:32.763609 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:32.763609 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:32.763609 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:32.764448 master-0 kubenswrapper[9136]: I1203 21:56:32.763625 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:33.764081 master-0 kubenswrapper[9136]: I1203 21:56:33.764018 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:33.764081 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:33.764081 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:33.764081 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:33.764081 master-0 kubenswrapper[9136]: I1203 21:56:33.764087 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:34.762878 master-0 kubenswrapper[9136]: I1203 21:56:34.762790 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:34.762878 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:34.762878 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:34.762878 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:34.762878 master-0 kubenswrapper[9136]: I1203 21:56:34.762861 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:35.763347 master-0 kubenswrapper[9136]: I1203 21:56:35.763263 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:35.763347 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:35.763347 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:35.763347 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:35.763929 master-0 kubenswrapper[9136]: I1203 21:56:35.763384 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:35.827442 master-0 kubenswrapper[9136]: I1203 21:56:35.827378 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:35.827442 master-0 kubenswrapper[9136]: I1203 21:56:35.827440 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:36.763906 master-0 kubenswrapper[9136]: I1203 21:56:36.763760 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:36.763906 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:36.763906 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:36.763906 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:36.763906 master-0 kubenswrapper[9136]: I1203 21:56:36.763894 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:37.762581 master-0 kubenswrapper[9136]: I1203 21:56:37.762511 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:37.762581 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:37.762581 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:37.762581 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:37.762581 master-0 kubenswrapper[9136]: I1203 21:56:37.762577 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:38.762816 master-0 kubenswrapper[9136]: I1203 21:56:38.762699 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:38.762816 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:38.762816 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:38.762816 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:38.764007 master-0 kubenswrapper[9136]: I1203 21:56:38.762835 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:39.763655 master-0 kubenswrapper[9136]: I1203 21:56:39.763529 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:39.763655 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld 
Dec 03 21:56:39.763655 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:39.763655 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:39.764670 master-0 kubenswrapper[9136]: I1203 21:56:39.763650 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:40.762042 master-0 kubenswrapper[9136]: I1203 21:56:40.761921 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:40.762042 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:40.762042 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:40.762042 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:40.762512 master-0 kubenswrapper[9136]: I1203 21:56:40.762049 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:41.764163 master-0 kubenswrapper[9136]: I1203 21:56:41.764036 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:41.764163 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:41.764163 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:41.764163 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:41.765288 master-0 kubenswrapper[9136]: I1203 21:56:41.764178 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:42.764556 master-0 kubenswrapper[9136]: I1203 21:56:42.764460 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:42.764556 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:42.764556 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:42.764556 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:42.765233 master-0 kubenswrapper[9136]: I1203 21:56:42.764569 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:43.763253 master-0 kubenswrapper[9136]: I1203 21:56:43.763160 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:43.763253 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 
21:56:43.763253 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:43.763253 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:43.763253 master-0 kubenswrapper[9136]: I1203 21:56:43.763252 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:44.764264 master-0 kubenswrapper[9136]: I1203 21:56:44.764125 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:44.764264 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:44.764264 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:44.764264 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:44.764264 master-0 kubenswrapper[9136]: I1203 21:56:44.764246 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:45.763170 master-0 kubenswrapper[9136]: I1203 21:56:45.763080 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:45.763170 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:45.763170 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:45.763170 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:45.763170 master-0 kubenswrapper[9136]: I1203 21:56:45.763157 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:46.764430 master-0 kubenswrapper[9136]: I1203 21:56:46.764325 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:46.764430 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:46.764430 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:46.764430 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:46.765498 master-0 kubenswrapper[9136]: I1203 21:56:46.764440 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:47.763221 master-0 kubenswrapper[9136]: I1203 21:56:47.763135 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:47.763221 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 
21:56:47.763221 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:47.763221 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:47.763577 master-0 kubenswrapper[9136]: I1203 21:56:47.763225 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:48.763752 master-0 kubenswrapper[9136]: I1203 21:56:48.763621 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:48.763752 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:48.763752 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:48.763752 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:48.763752 master-0 kubenswrapper[9136]: I1203 21:56:48.763721 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:49.764432 master-0 kubenswrapper[9136]: I1203 21:56:49.764308 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:49.764432 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:49.764432 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:49.764432 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:49.764432 master-0 kubenswrapper[9136]: I1203 21:56:49.764413 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:50.763564 master-0 kubenswrapper[9136]: I1203 21:56:50.763446 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:50.763564 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:50.763564 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:50.763564 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:50.763564 master-0 kubenswrapper[9136]: I1203 21:56:50.763539 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:51.763806 master-0 kubenswrapper[9136]: I1203 21:56:51.763667 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:51.763806 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 
21:56:51.763806 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:51.763806 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:51.764953 master-0 kubenswrapper[9136]: I1203 21:56:51.763841 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:52.764865 master-0 kubenswrapper[9136]: I1203 21:56:52.764729 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:52.764865 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:52.764865 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:52.764865 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:52.765589 master-0 kubenswrapper[9136]: I1203 21:56:52.764885 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:53.763586 master-0 kubenswrapper[9136]: I1203 21:56:53.763396 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:53.763586 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:53.763586 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:53.763586 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:53.763586 master-0 kubenswrapper[9136]: I1203 21:56:53.763518 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:54.214420 master-0 kubenswrapper[9136]: I1203 21:56:54.214345 9136 scope.go:117] "RemoveContainer" containerID="7db16df4a826110868fed85df42888a8aa70e542af4ac113bfb0d52af03cbab5" Dec 03 21:56:54.763578 master-0 kubenswrapper[9136]: I1203 21:56:54.763486 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:54.763578 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:54.763578 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:54.763578 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:54.763578 master-0 kubenswrapper[9136]: I1203 21:56:54.763575 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:55.763810 master-0 kubenswrapper[9136]: I1203 21:56:55.763656 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:55.763810 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:55.763810 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:55.763810 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:55.765168 master-0 kubenswrapper[9136]: I1203 21:56:55.763818 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:55.838105 master-0 kubenswrapper[9136]: I1203 21:56:55.837988 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:55.846187 master-0 kubenswrapper[9136]: I1203 21:56:55.846108 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 21:56:56.279800 master-0 kubenswrapper[9136]: I1203 21:56:56.279704 9136 patch_prober.go:28] interesting pod/machine-config-daemon-j9wwr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 21:56:56.279800 master-0 kubenswrapper[9136]: I1203 21:56:56.279790 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" podUID="bd18a700-53b2-430c-a34f-dbb6331cfbe5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 21:56:56.763321 master-0 kubenswrapper[9136]: I1203 21:56:56.763220 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:56.763321 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:56.763321 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:56.763321 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:56.763321 master-0 kubenswrapper[9136]: I1203 21:56:56.763314 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:57.763431 master-0 kubenswrapper[9136]: I1203 21:56:57.763357 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:57.763431 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:57.763431 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:57.763431 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:57.764536 master-0 kubenswrapper[9136]: I1203 21:56:57.763457 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 03 21:56:58.763755 master-0 kubenswrapper[9136]: I1203 21:56:58.763649 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:58.763755 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:58.763755 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:58.763755 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:58.763755 master-0 kubenswrapper[9136]: I1203 21:56:58.763744 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:56:59.763490 master-0 kubenswrapper[9136]: I1203 21:56:59.763380 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:56:59.763490 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:56:59.763490 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:56:59.763490 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:56:59.764569 master-0 kubenswrapper[9136]: I1203 21:56:59.763479 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:00.763406 master-0 kubenswrapper[9136]: I1203 21:57:00.763281 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:00.763406 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:00.763406 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:00.763406 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:00.763406 master-0 kubenswrapper[9136]: I1203 21:57:00.763373 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:01.763963 master-0 kubenswrapper[9136]: I1203 21:57:01.763862 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:01.763963 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:01.763963 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:01.763963 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:01.765188 master-0 kubenswrapper[9136]: I1203 21:57:01.763962 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 03 21:57:02.764074 master-0 kubenswrapper[9136]: I1203 21:57:02.763978 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:02.764074 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:02.764074 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:02.764074 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:02.765291 master-0 kubenswrapper[9136]: I1203 21:57:02.764089 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:03.763077 master-0 kubenswrapper[9136]: I1203 21:57:03.763022 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:03.763077 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:03.763077 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:03.763077 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:03.763418 master-0 kubenswrapper[9136]: I1203 21:57:03.763093 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:04.763511 master-0 kubenswrapper[9136]: I1203 21:57:04.763394 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:04.763511 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:04.763511 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:04.763511 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:04.764571 master-0 kubenswrapper[9136]: I1203 21:57:04.763516 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:05.763748 master-0 kubenswrapper[9136]: I1203 21:57:05.763637 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:05.763748 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:05.763748 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:05.763748 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:05.763748 master-0 kubenswrapper[9136]: I1203 21:57:05.763749 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 03 21:57:06.763804 master-0 kubenswrapper[9136]: I1203 21:57:06.763627 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:06.763804 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:06.763804 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:06.763804 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:06.763804 master-0 kubenswrapper[9136]: I1203 21:57:06.763732 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:07.763378 master-0 kubenswrapper[9136]: I1203 21:57:07.763282 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:07.763378 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:07.763378 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:07.763378 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:07.763897 master-0 kubenswrapper[9136]: I1203 21:57:07.763380 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:08.765031 master-0 kubenswrapper[9136]: I1203 21:57:08.764956 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:08.765031 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:08.765031 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:08.765031 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:08.766189 master-0 kubenswrapper[9136]: I1203 21:57:08.766141 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:09.763874 master-0 kubenswrapper[9136]: I1203 21:57:09.763802 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:09.763874 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:09.763874 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:09.763874 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:09.764361 master-0 kubenswrapper[9136]: I1203 21:57:09.763883 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 03 21:57:10.763806 master-0 kubenswrapper[9136]: I1203 21:57:10.763680 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:10.763806 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:10.763806 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:10.763806 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:10.764709 master-0 kubenswrapper[9136]: I1203 21:57:10.763825 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:11.764037 master-0 kubenswrapper[9136]: I1203 21:57:11.763890 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:11.764037 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:11.764037 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:11.764037 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:11.765193 master-0 kubenswrapper[9136]: I1203 21:57:11.764039 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:12.764081 master-0 kubenswrapper[9136]: I1203 21:57:12.763992 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:12.764081 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:12.764081 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:12.764081 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:12.764081 master-0 kubenswrapper[9136]: I1203 21:57:12.764075 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:13.763169 master-0 kubenswrapper[9136]: I1203 21:57:13.763082 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:13.763169 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:13.763169 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:13.763169 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:13.763884 master-0 kubenswrapper[9136]: I1203 21:57:13.763193 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 03 21:57:14.764517 master-0 kubenswrapper[9136]: I1203 21:57:14.764404 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:14.764517 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:14.764517 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:14.764517 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:14.764517 master-0 kubenswrapper[9136]: I1203 21:57:14.764489 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:15.763861 master-0 kubenswrapper[9136]: I1203 21:57:15.763725 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:15.763861 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:15.763861 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:15.763861 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:15.764323 master-0 kubenswrapper[9136]: I1203 21:57:15.763867 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:16.764198 master-0 kubenswrapper[9136]: I1203 21:57:16.764093 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:16.764198 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:16.764198 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:16.764198 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:16.764198 master-0 kubenswrapper[9136]: I1203 21:57:16.764180 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:17.763614 master-0 kubenswrapper[9136]: I1203 21:57:17.763499 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:17.763614 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:17.763614 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:17.763614 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:17.763614 master-0 kubenswrapper[9136]: I1203 21:57:17.763604 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 03 21:57:18.763895 master-0 kubenswrapper[9136]: I1203 21:57:18.763843 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:18.763895 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:18.763895 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:18.763895 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:18.764623 master-0 kubenswrapper[9136]: I1203 21:57:18.764594 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:19.764648 master-0 kubenswrapper[9136]: I1203 21:57:19.764523 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:19.764648 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:19.764648 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:19.764648 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:19.766243 master-0 kubenswrapper[9136]: I1203 21:57:19.764645 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:20.763958 master-0 kubenswrapper[9136]: I1203 21:57:20.763838 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:20.763958 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:20.763958 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:20.763958 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:20.764409 master-0 kubenswrapper[9136]: I1203 21:57:20.763974 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:21.763934 master-0 kubenswrapper[9136]: I1203 21:57:21.763859 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:21.763934 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:21.763934 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:21.763934 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:21.764967 master-0 kubenswrapper[9136]: I1203 21:57:21.763996 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 03 21:57:22.765392 master-0 kubenswrapper[9136]: I1203 21:57:22.765300 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:22.765392 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:22.765392 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:22.765392 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:22.766444 master-0 kubenswrapper[9136]: I1203 21:57:22.765410 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:23.763417 master-0 kubenswrapper[9136]: I1203 21:57:23.763321 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:23.763417 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:23.763417 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:23.763417 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:23.763417 master-0 kubenswrapper[9136]: I1203 21:57:23.763407 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:24.763358 master-0 kubenswrapper[9136]: I1203 21:57:24.763285 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:24.763358 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:24.763358 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:24.763358 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:24.763741 master-0 kubenswrapper[9136]: I1203 21:57:24.763370 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:25.763249 master-0 kubenswrapper[9136]: I1203 21:57:25.763105 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:25.763249 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:25.763249 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:25.763249 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:25.763249 master-0 kubenswrapper[9136]: I1203 21:57:25.763207 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Dec 03 21:57:26.280165 master-0 kubenswrapper[9136]: I1203 21:57:26.280046 9136 patch_prober.go:28] interesting pod/machine-config-daemon-j9wwr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 21:57:26.280497 master-0 kubenswrapper[9136]: I1203 21:57:26.280177 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" podUID="bd18a700-53b2-430c-a34f-dbb6331cfbe5" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 21:57:26.762836 master-0 kubenswrapper[9136]: I1203 21:57:26.762740 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:26.762836 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:26.762836 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:26.762836 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:26.763198 master-0 kubenswrapper[9136]: I1203 21:57:26.762859 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:27.425260 master-0 kubenswrapper[9136]: I1203 21:57:27.424583 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qsfnw"] Dec 03 21:57:27.426625 master-0 kubenswrapper[9136]: I1203 21:57:27.426136 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 21:57:27.432550 master-0 kubenswrapper[9136]: I1203 21:57:27.432175 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 21:57:27.433116 master-0 kubenswrapper[9136]: I1203 21:57:27.433003 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 21:57:27.433720 master-0 kubenswrapper[9136]: I1203 21:57:27.433506 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 21:57:27.433981 master-0 kubenswrapper[9136]: I1203 21:57:27.433914 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-5qkbc" Dec 03 21:57:27.446892 master-0 kubenswrapper[9136]: I1203 21:57:27.443638 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qsfnw"] Dec 03 21:57:27.559325 master-0 kubenswrapper[9136]: I1203 21:57:27.559193 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2xtm\" (UniqueName: \"kubernetes.io/projected/7c8ec36d-9179-40ab-a448-440b4501b3e0-kube-api-access-t2xtm\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 21:57:27.559628 master-0 kubenswrapper[9136]: I1203 21:57:27.559458 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 21:57:27.660831 master-0 kubenswrapper[9136]: I1203 21:57:27.660671 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 21:57:27.661198 master-0 kubenswrapper[9136]: I1203 21:57:27.660983 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2xtm\" (UniqueName: \"kubernetes.io/projected/7c8ec36d-9179-40ab-a448-440b4501b3e0-kube-api-access-t2xtm\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 21:57:27.661198 master-0 kubenswrapper[9136]: E1203 21:57:27.661008 9136 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Dec 03 21:57:27.661198 master-0 kubenswrapper[9136]: E1203 21:57:27.661142 9136 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert podName:7c8ec36d-9179-40ab-a448-440b4501b3e0 nodeName:}" failed. No retries permitted until 2025-12-03 21:57:28.161098596 +0000 UTC m=+454.436275018 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert") pod "ingress-canary-qsfnw" (UID: "7c8ec36d-9179-40ab-a448-440b4501b3e0") : secret "canary-serving-cert" not found Dec 03 21:57:27.698891 master-0 kubenswrapper[9136]: I1203 21:57:27.698736 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2xtm\" (UniqueName: \"kubernetes.io/projected/7c8ec36d-9179-40ab-a448-440b4501b3e0-kube-api-access-t2xtm\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 21:57:27.764319 master-0 kubenswrapper[9136]: I1203 21:57:27.764211 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:27.764319 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:27.764319 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:27.764319 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:27.764746 master-0 kubenswrapper[9136]: I1203 21:57:27.764317 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:27.866116 master-0 kubenswrapper[9136]: I1203 21:57:27.866043 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/2.log" Dec 03 21:57:27.867135 master-0 kubenswrapper[9136]: I1203 21:57:27.867088 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/1.log" Dec 03 21:57:27.867945 master-0 kubenswrapper[9136]: I1203 21:57:27.867897 9136 generic.go:334] "Generic (PLEG): container finished" podID="0869de9b-6f5b-4c31-81ad-02a9c8888193" containerID="ee25dcf2b655ead4c6f49b566e41979c025f7cde0a2815dc7389e10f4849f7af" exitCode=1 Dec 03 21:57:27.868061 master-0 kubenswrapper[9136]: I1203 21:57:27.867958 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerDied","Data":"ee25dcf2b655ead4c6f49b566e41979c025f7cde0a2815dc7389e10f4849f7af"} Dec 03 21:57:27.868061 master-0 kubenswrapper[9136]: I1203 21:57:27.868024 9136 scope.go:117] "RemoveContainer" containerID="6e8e854c9ec7a5043d35c55ccb3f34d12a4db391501f26bb9ce132cf680165af" Dec 03 21:57:27.868961 master-0 kubenswrapper[9136]: I1203 21:57:27.868898 9136 scope.go:117] "RemoveContainer" containerID="ee25dcf2b655ead4c6f49b566e41979c025f7cde0a2815dc7389e10f4849f7af" Dec 03 21:57:27.869399 master-0 kubenswrapper[9136]: E1203 21:57:27.869337 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 
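
The MountVolume.SetUp failure above is the kubelet waiting for the canary-serving-cert secret to be published; the failed mount is re-queued after the 500 ms back-off shown in durationBeforeRetry (the delay grows on repeated failures), and the 21:57:28 entries that follow show the retry succeeding once the secret exists. A minimal client-go sketch (not kubelet code) that watches for the same secret from outside the node; the kubeconfig handling and polling interval are assumptions for illustration only:

    package main

    import (
        "context"
        "fmt"
        "os"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumes a reachable API server and a kubeconfig in $KUBECONFIG.
        cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
        if err != nil {
            panic(err)
        }
        client, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // Poll for the secret backing the "cert" volume, mirroring the way the
        // kubelet keeps retrying the mount until the secret can be resolved.
        for {
            _, err := client.CoreV1().Secrets("openshift-ingress-canary").
                Get(context.TODO(), "canary-serving-cert", metav1.GetOptions{})
            if err == nil {
                fmt.Println("canary-serving-cert exists; the cert volume can be mounted")
                return
            }
            fmt.Println("waiting for secret:", err)
            time.Sleep(500 * time.Millisecond)
        }
    }
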
21:57:28.168295 master-0 kubenswrapper[9136]: I1203 21:57:28.168239 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 21:57:28.173978 master-0 kubenswrapper[9136]: I1203 21:57:28.173904 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 21:57:28.367544 master-0 kubenswrapper[9136]: I1203 21:57:28.367492 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 21:57:28.764061 master-0 kubenswrapper[9136]: I1203 21:57:28.763806 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:28.764061 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:28.764061 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:28.764061 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:28.764061 master-0 kubenswrapper[9136]: I1203 21:57:28.763927 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:28.840529 master-0 kubenswrapper[9136]: I1203 21:57:28.840426 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qsfnw"] Dec 03 21:57:28.850636 master-0 kubenswrapper[9136]: W1203 21:57:28.849892 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c8ec36d_9179_40ab_a448_440b4501b3e0.slice/crio-e81517a49f1ce4beaf68d43f0efeea95ff0f8821517abab2c4989ee77cd41248 WatchSource:0}: Error finding container e81517a49f1ce4beaf68d43f0efeea95ff0f8821517abab2c4989ee77cd41248: Status 404 returned error can't find the container with id e81517a49f1ce4beaf68d43f0efeea95ff0f8821517abab2c4989ee77cd41248 Dec 03 21:57:28.882585 master-0 kubenswrapper[9136]: I1203 21:57:28.882116 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/2.log" Dec 03 21:57:28.907795 master-0 kubenswrapper[9136]: I1203 21:57:28.907613 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qsfnw" event={"ID":"7c8ec36d-9179-40ab-a448-440b4501b3e0","Type":"ContainerStarted","Data":"e81517a49f1ce4beaf68d43f0efeea95ff0f8821517abab2c4989ee77cd41248"} Dec 03 21:57:29.763246 master-0 kubenswrapper[9136]: I1203 21:57:29.763138 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:29.763246 master-0 kubenswrapper[9136]: [-]has-synced failed: 
reason withheld Dec 03 21:57:29.763246 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:29.763246 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:29.763246 master-0 kubenswrapper[9136]: I1203 21:57:29.763225 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:29.920232 master-0 kubenswrapper[9136]: I1203 21:57:29.920122 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qsfnw" event={"ID":"7c8ec36d-9179-40ab-a448-440b4501b3e0","Type":"ContainerStarted","Data":"18a5455ce1b182ff79f3a212014fd50ecf182821367e322a4cc2b8a1865badc2"} Dec 03 21:57:29.950220 master-0 kubenswrapper[9136]: I1203 21:57:29.950068 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qsfnw" podStartSLOduration=2.95003796 podStartE2EDuration="2.95003796s" podCreationTimestamp="2025-12-03 21:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:57:29.943725012 +0000 UTC m=+456.218901454" watchObservedRunningTime="2025-12-03 21:57:29.95003796 +0000 UTC m=+456.225214372" Dec 03 21:57:30.762656 master-0 kubenswrapper[9136]: I1203 21:57:30.762573 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:30.762656 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:30.762656 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:30.762656 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:30.763118 master-0 kubenswrapper[9136]: I1203 21:57:30.762666 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:31.763258 master-0 kubenswrapper[9136]: I1203 21:57:31.763152 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:31.763258 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:31.763258 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:31.763258 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:31.764490 master-0 kubenswrapper[9136]: I1203 21:57:31.763279 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:32.763967 master-0 kubenswrapper[9136]: I1203 21:57:32.763852 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:32.763967 master-0 kubenswrapper[9136]: 
[-]has-synced failed: reason withheld Dec 03 21:57:32.763967 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:32.763967 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:32.763967 master-0 kubenswrapper[9136]: I1203 21:57:32.763933 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:33.763005 master-0 kubenswrapper[9136]: I1203 21:57:33.762912 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:33.763005 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:33.763005 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:33.763005 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:33.763376 master-0 kubenswrapper[9136]: I1203 21:57:33.763008 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:34.764256 master-0 kubenswrapper[9136]: I1203 21:57:34.764129 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:34.764256 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:34.764256 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:34.764256 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:34.764256 master-0 kubenswrapper[9136]: I1203 21:57:34.764238 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:35.764006 master-0 kubenswrapper[9136]: I1203 21:57:35.763915 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:35.764006 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:35.764006 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:35.764006 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:35.764006 master-0 kubenswrapper[9136]: I1203 21:57:35.764007 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:36.764646 master-0 kubenswrapper[9136]: I1203 21:57:36.764523 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:36.764646 master-0 kubenswrapper[9136]: 
[-]has-synced failed: reason withheld Dec 03 21:57:36.764646 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:36.764646 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:36.764646 master-0 kubenswrapper[9136]: I1203 21:57:36.764627 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:37.764535 master-0 kubenswrapper[9136]: I1203 21:57:37.764441 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:37.764535 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:37.764535 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:37.764535 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:37.766824 master-0 kubenswrapper[9136]: I1203 21:57:37.766734 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:38.763363 master-0 kubenswrapper[9136]: I1203 21:57:38.763232 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:38.763363 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:38.763363 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:38.763363 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:38.763363 master-0 kubenswrapper[9136]: I1203 21:57:38.763341 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:38.908907 master-0 kubenswrapper[9136]: I1203 21:57:38.908738 9136 scope.go:117] "RemoveContainer" containerID="ee25dcf2b655ead4c6f49b566e41979c025f7cde0a2815dc7389e10f4849f7af" Dec 03 21:57:38.909861 master-0 kubenswrapper[9136]: E1203 21:57:38.909348 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 21:57:39.764421 master-0 kubenswrapper[9136]: I1203 21:57:39.764324 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:39.764421 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:39.764421 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:39.764421 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:39.764928 
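
The pod_startup_latency_tracker entry at 21:57:29.950 a few entries above reports podStartSLOduration=2.95003796s for the canary pod; that figure lines up exactly with watchObservedRunningTime (21:57:29.95003796) minus podCreationTimestamp (21:57:27). A small stand-alone check of that arithmetic, using the two timestamps copied from that entry:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps copied from the pod_startup_latency_tracker entry above.
        const layout = "2006-01-02 15:04:05 -0700 MST"
        created, err := time.Parse(layout, "2025-12-03 21:57:27 +0000 UTC")
        if err != nil {
            panic(err)
        }
        observed, err := time.Parse(layout, "2025-12-03 21:57:29.95003796 +0000 UTC")
        if err != nil {
            panic(err)
        }
        // Prints 2.95003796s, matching the logged podStartSLOduration.
        fmt.Println(observed.Sub(created))
    }
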
master-0 kubenswrapper[9136]: I1203 21:57:39.764431 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:40.763557 master-0 kubenswrapper[9136]: I1203 21:57:40.763439 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:40.763557 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:40.763557 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:40.763557 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:40.763557 master-0 kubenswrapper[9136]: I1203 21:57:40.763511 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:41.763549 master-0 kubenswrapper[9136]: I1203 21:57:41.763458 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:41.763549 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:41.763549 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:41.763549 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:41.764862 master-0 kubenswrapper[9136]: I1203 21:57:41.763583 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:42.762754 master-0 kubenswrapper[9136]: I1203 21:57:42.762663 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:42.762754 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:42.762754 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:42.762754 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:42.763264 master-0 kubenswrapper[9136]: I1203 21:57:42.762800 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:43.763559 master-0 kubenswrapper[9136]: I1203 21:57:43.763440 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:43.763559 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:43.763559 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:43.763559 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:43.764421 master-0 
kubenswrapper[9136]: I1203 21:57:43.763585 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:44.763815 master-0 kubenswrapper[9136]: I1203 21:57:44.763635 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:44.763815 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:44.763815 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:44.763815 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:44.765093 master-0 kubenswrapper[9136]: I1203 21:57:44.764353 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:45.763034 master-0 kubenswrapper[9136]: I1203 21:57:45.762963 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:45.763034 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:45.763034 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:45.763034 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:45.763737 master-0 kubenswrapper[9136]: I1203 21:57:45.763684 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:46.763144 master-0 kubenswrapper[9136]: I1203 21:57:46.763096 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:46.763144 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:46.763144 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:46.763144 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:46.763984 master-0 kubenswrapper[9136]: I1203 21:57:46.763920 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:47.763878 master-0 kubenswrapper[9136]: I1203 21:57:47.763760 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:47.763878 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:47.763878 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:47.763878 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:47.764555 master-0 
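
The entries repeating every second here are the kubelet's HTTP startup probe against the router. An HTTP probe is a plain GET: anything outside the 200-399 status range (or a transport error such as the "connection refused" seen for the machine-config-daemon liveness probe above) counts as a failure, and patch_prober logs the first part of the response body, which is where the [-]backend-http / [-]has-synced / [+]process-running breakdown comes from while the router still answers 500. A self-contained sketch of that check; the URL is a placeholder, since the probe's real port and path are not shown in these entries:

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    // probe issues one GET and classifies the result the way an HTTP probe does:
    // a transport error or a status outside 200-399 is a failure.
    func probe(url string) error {
        client := &http.Client{Timeout: 2 * time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return err
        }
        defer resp.Body.Close()
        // Keep only the start of the body, like the "start-of-body=" field above.
        body, _ := io.ReadAll(io.LimitReader(resp.Body, 256))
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("HTTP probe failed with statuscode: %d, start-of-body=%s",
                resp.StatusCode, body)
        }
        return nil
    }

    func main() {
        // Placeholder target for illustration only.
        if err := probe("http://127.0.0.1:1936/healthz"); err != nil {
            fmt.Println("Probe failed:", err)
        } else {
            fmt.Println("Probe succeeded")
        }
    }
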
kubenswrapper[9136]: I1203 21:57:47.763897 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:48.764049 master-0 kubenswrapper[9136]: I1203 21:57:48.763973 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:48.764049 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:48.764049 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:48.764049 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:48.765124 master-0 kubenswrapper[9136]: I1203 21:57:48.765041 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:49.763563 master-0 kubenswrapper[9136]: I1203 21:57:49.763450 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:49.763563 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:49.763563 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:49.763563 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:49.764850 master-0 kubenswrapper[9136]: I1203 21:57:49.763568 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:50.763443 master-0 kubenswrapper[9136]: I1203 21:57:50.763348 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:50.763443 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:50.763443 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:50.763443 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:50.763443 master-0 kubenswrapper[9136]: I1203 21:57:50.763422 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:51.763354 master-0 kubenswrapper[9136]: I1203 21:57:51.763204 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:51.763354 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:51.763354 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:51.763354 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:51.764476 master-0 
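
The CrashLoopBackOff messages for the ingress-operator pod earlier in this window ("back-off 20s restarting failed container") come from the kubelet's per-container restart back-off, which by default starts at 10 s and doubles on each crash up to a 5-minute cap; the restart goes through in the 21:57:53-54 entries just below, once the 20 s delay has elapsed. A tiny sketch of that delay schedule, using the documented defaults rather than values read from this log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Kubelet-style restart back-off: start at 10s, double per crash,
        // cap at 5 minutes (documented CrashLoopBackOff defaults).
        const (
            initialDelay = 10 * time.Second
            maxDelay     = 5 * time.Minute
        )
        delay := initialDelay
        for restart := 1; restart <= 8; restart++ {
            // The "back-off 20s" in the log corresponds to restart #2.
            fmt.Printf("restart #%d: wait %v\n", restart, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
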
kubenswrapper[9136]: I1203 21:57:51.763448 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:52.763606 master-0 kubenswrapper[9136]: I1203 21:57:52.763500 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:52.763606 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:52.763606 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:52.763606 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:52.763606 master-0 kubenswrapper[9136]: I1203 21:57:52.763598 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:53.763372 master-0 kubenswrapper[9136]: I1203 21:57:53.763229 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:53.763372 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:53.763372 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:53.763372 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:53.763372 master-0 kubenswrapper[9136]: I1203 21:57:53.763331 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:53.910227 master-0 kubenswrapper[9136]: I1203 21:57:53.910159 9136 scope.go:117] "RemoveContainer" containerID="ee25dcf2b655ead4c6f49b566e41979c025f7cde0a2815dc7389e10f4849f7af" Dec 03 21:57:54.152797 master-0 kubenswrapper[9136]: I1203 21:57:54.152700 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/2.log" Dec 03 21:57:54.153346 master-0 kubenswrapper[9136]: I1203 21:57:54.153287 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerStarted","Data":"70aab6c0f99a200adb05bf0466bae07bea09c3ff8ddc008c24e8eb7d9ae3808d"} Dec 03 21:57:54.763861 master-0 kubenswrapper[9136]: I1203 21:57:54.763729 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:54.763861 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:54.763861 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:54.763861 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:54.764993 master-0 kubenswrapper[9136]: I1203 21:57:54.763890 9136 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:55.764031 master-0 kubenswrapper[9136]: I1203 21:57:55.763913 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:55.764031 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:55.764031 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:55.764031 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:55.765329 master-0 kubenswrapper[9136]: I1203 21:57:55.764058 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:55.924059 master-0 kubenswrapper[9136]: E1203 21:57:55.923977 9136 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/349c4959cc7ef2e778bda8ac811d0aed0a0cde34a8e4c5331e4546cf73ab1f7f/diff" to get inode usage: stat /var/lib/containers/storage/overlay/349c4959cc7ef2e778bda8ac811d0aed0a0cde34a8e4c5331e4546cf73ab1f7f/diff: no such file or directory, extraDiskErr: Dec 03 21:57:56.763303 master-0 kubenswrapper[9136]: I1203 21:57:56.763196 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:56.763303 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:56.763303 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:56.763303 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:56.763849 master-0 kubenswrapper[9136]: I1203 21:57:56.763303 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:57.764066 master-0 kubenswrapper[9136]: I1203 21:57:57.763933 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:57:57.764066 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:57:57.764066 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:57:57.764066 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:57:57.765181 master-0 kubenswrapper[9136]: I1203 21:57:57.764069 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:57:57.765181 master-0 kubenswrapper[9136]: I1203 21:57:57.764185 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:57:57.765328 master-0 
kubenswrapper[9136]: I1203 21:57:57.765178 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"7ecf6575752194e376e6db6b9227fb236567480b2e2793db9b6ea4ab3a895c8f"} pod="openshift-ingress/router-default-54f97f57-xq6ch" containerMessage="Container router failed startup probe, will be restarted" Dec 03 21:57:57.765328 master-0 kubenswrapper[9136]: I1203 21:57:57.765248 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" containerID="cri-o://7ecf6575752194e376e6db6b9227fb236567480b2e2793db9b6ea4ab3a895c8f" gracePeriod=3600 Dec 03 21:58:44.564897 master-0 kubenswrapper[9136]: I1203 21:58:44.564741 9136 generic.go:334] "Generic (PLEG): container finished" podID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerID="7ecf6575752194e376e6db6b9227fb236567480b2e2793db9b6ea4ab3a895c8f" exitCode=0 Dec 03 21:58:44.565893 master-0 kubenswrapper[9136]: I1203 21:58:44.564882 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerDied","Data":"7ecf6575752194e376e6db6b9227fb236567480b2e2793db9b6ea4ab3a895c8f"} Dec 03 21:58:44.565893 master-0 kubenswrapper[9136]: I1203 21:58:44.565002 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerStarted","Data":"81bfd9fda486fff0e9f3b1dee5719d6b24e5410ceadededcc88dc65916bf639e"} Dec 03 21:58:44.760245 master-0 kubenswrapper[9136]: I1203 21:58:44.760153 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:58:44.760245 master-0 kubenswrapper[9136]: I1203 21:58:44.760238 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 21:58:44.764134 master-0 kubenswrapper[9136]: I1203 21:58:44.764034 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:44.764134 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:44.764134 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:44.764134 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:44.764448 master-0 kubenswrapper[9136]: I1203 21:58:44.764185 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:45.764156 master-0 kubenswrapper[9136]: I1203 21:58:45.764065 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:45.764156 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:45.764156 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:45.764156 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:58:45.765203 master-0 kubenswrapper[9136]: I1203 21:58:45.764162 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:46.763730 master-0 kubenswrapper[9136]: I1203 21:58:46.763617 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:46.763730 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:46.763730 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:46.763730 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:46.763730 master-0 kubenswrapper[9136]: I1203 21:58:46.763725 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:47.763399 master-0 kubenswrapper[9136]: I1203 21:58:47.763322 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:47.763399 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:47.763399 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:47.763399 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:47.763792 master-0 kubenswrapper[9136]: I1203 21:58:47.763408 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:48.763707 master-0 kubenswrapper[9136]: I1203 21:58:48.763604 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:48.763707 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:48.763707 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:48.763707 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:48.764706 master-0 kubenswrapper[9136]: I1203 21:58:48.763714 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:49.763325 master-0 kubenswrapper[9136]: I1203 21:58:49.763229 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:49.763325 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:49.763325 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:49.763325 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:58:49.763325 master-0 kubenswrapper[9136]: I1203 21:58:49.763314 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:50.763155 master-0 kubenswrapper[9136]: I1203 21:58:50.763040 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:50.763155 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:50.763155 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:50.763155 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:50.763821 master-0 kubenswrapper[9136]: I1203 21:58:50.763153 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:51.763517 master-0 kubenswrapper[9136]: I1203 21:58:51.763412 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:51.763517 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:51.763517 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:51.763517 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:51.764486 master-0 kubenswrapper[9136]: I1203 21:58:51.763535 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:52.763982 master-0 kubenswrapper[9136]: I1203 21:58:52.763884 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:52.763982 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:52.763982 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:52.763982 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:52.763982 master-0 kubenswrapper[9136]: I1203 21:58:52.763983 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:53.764068 master-0 kubenswrapper[9136]: I1203 21:58:53.763848 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:53.764068 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:53.764068 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:53.764068 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:58:53.764068 master-0 kubenswrapper[9136]: I1203 21:58:53.764014 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:54.763346 master-0 kubenswrapper[9136]: I1203 21:58:54.763232 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:54.763346 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:54.763346 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:54.763346 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:54.763995 master-0 kubenswrapper[9136]: I1203 21:58:54.763370 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:55.763579 master-0 kubenswrapper[9136]: I1203 21:58:55.763458 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:55.763579 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:55.763579 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:55.763579 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:55.763579 master-0 kubenswrapper[9136]: I1203 21:58:55.763566 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:56.763664 master-0 kubenswrapper[9136]: I1203 21:58:56.763598 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:56.763664 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:56.763664 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:56.763664 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:56.764558 master-0 kubenswrapper[9136]: I1203 21:58:56.764507 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:57.762904 master-0 kubenswrapper[9136]: I1203 21:58:57.762837 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:57.762904 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:57.762904 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:57.762904 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:58:57.763207 master-0 kubenswrapper[9136]: I1203 21:58:57.762919 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:58.763247 master-0 kubenswrapper[9136]: I1203 21:58:58.763139 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:58.763247 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:58.763247 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:58.763247 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:58.763247 master-0 kubenswrapper[9136]: I1203 21:58:58.763241 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:58:59.763553 master-0 kubenswrapper[9136]: I1203 21:58:59.763433 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:58:59.763553 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:58:59.763553 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:58:59.763553 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:58:59.763553 master-0 kubenswrapper[9136]: I1203 21:58:59.763528 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:00.763221 master-0 kubenswrapper[9136]: I1203 21:59:00.763137 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:00.763221 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:00.763221 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:00.763221 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:00.763221 master-0 kubenswrapper[9136]: I1203 21:59:00.763206 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:01.763573 master-0 kubenswrapper[9136]: I1203 21:59:01.763457 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:01.763573 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:01.763573 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:01.763573 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:01.764895 master-0 kubenswrapper[9136]: I1203 21:59:01.763572 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:02.763334 master-0 kubenswrapper[9136]: I1203 21:59:02.763206 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:02.763334 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:02.763334 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:02.763334 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:02.763334 master-0 kubenswrapper[9136]: I1203 21:59:02.763301 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:03.763795 master-0 kubenswrapper[9136]: I1203 21:59:03.763689 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:03.763795 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:03.763795 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:03.763795 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:03.764552 master-0 kubenswrapper[9136]: I1203 21:59:03.763803 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:04.762745 master-0 kubenswrapper[9136]: I1203 21:59:04.762650 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:04.762745 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:04.762745 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:04.762745 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:04.762745 master-0 kubenswrapper[9136]: I1203 21:59:04.762737 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:05.763344 master-0 kubenswrapper[9136]: I1203 21:59:05.763231 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:05.763344 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:05.763344 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:05.763344 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:05.763344 master-0 kubenswrapper[9136]: I1203 21:59:05.763317 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:06.763628 master-0 kubenswrapper[9136]: I1203 21:59:06.763541 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:06.763628 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:06.763628 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:06.763628 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:06.764247 master-0 kubenswrapper[9136]: I1203 21:59:06.763656 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:07.763153 master-0 kubenswrapper[9136]: I1203 21:59:07.763080 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:07.763153 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:07.763153 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:07.763153 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:07.764432 master-0 kubenswrapper[9136]: I1203 21:59:07.764009 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:08.763596 master-0 kubenswrapper[9136]: I1203 21:59:08.763406 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:08.763596 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:08.763596 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:08.763596 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:08.763596 master-0 kubenswrapper[9136]: I1203 21:59:08.763531 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:09.763402 master-0 kubenswrapper[9136]: I1203 21:59:09.763319 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:09.763402 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:09.763402 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:09.763402 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:09.763757 master-0 kubenswrapper[9136]: I1203 21:59:09.763417 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:10.762910 master-0 kubenswrapper[9136]: I1203 21:59:10.762826 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:10.762910 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:10.762910 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:10.762910 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:10.763875 master-0 kubenswrapper[9136]: I1203 21:59:10.762937 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:11.763896 master-0 kubenswrapper[9136]: I1203 21:59:11.763805 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:11.763896 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:11.763896 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:11.763896 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:11.764590 master-0 kubenswrapper[9136]: I1203 21:59:11.764494 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:12.764021 master-0 kubenswrapper[9136]: I1203 21:59:12.763946 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:12.764021 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:12.764021 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:12.764021 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:12.764713 master-0 kubenswrapper[9136]: I1203 21:59:12.764054 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:13.763577 master-0 kubenswrapper[9136]: I1203 21:59:13.763494 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:13.763577 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:13.763577 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:13.763577 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:13.763577 master-0 kubenswrapper[9136]: I1203 21:59:13.763566 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:14.764100 master-0 kubenswrapper[9136]: I1203 21:59:14.763997 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:14.764100 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:14.764100 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:14.764100 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:14.765093 master-0 kubenswrapper[9136]: I1203 21:59:14.764112 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:15.762558 master-0 kubenswrapper[9136]: I1203 21:59:15.762486 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:15.762558 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:15.762558 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:15.762558 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:15.762558 master-0 kubenswrapper[9136]: I1203 21:59:15.762561 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:16.763387 master-0 kubenswrapper[9136]: I1203 21:59:16.763241 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:16.763387 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:16.763387 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:16.763387 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:16.763387 master-0 kubenswrapper[9136]: I1203 21:59:16.763336 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:17.762732 master-0 kubenswrapper[9136]: I1203 21:59:17.762639 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:17.762732 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:17.762732 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:17.762732 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:17.762732 master-0 kubenswrapper[9136]: I1203 21:59:17.762710 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:18.763744 master-0 kubenswrapper[9136]: I1203 21:59:18.763665 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:18.763744 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:18.763744 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:18.763744 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:18.764441 master-0 kubenswrapper[9136]: I1203 21:59:18.763786 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:19.763261 master-0 kubenswrapper[9136]: I1203 21:59:19.763177 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:19.763261 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:19.763261 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:19.763261 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:19.763261 master-0 kubenswrapper[9136]: I1203 21:59:19.763264 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:20.763924 master-0 kubenswrapper[9136]: I1203 21:59:20.763741 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:20.763924 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:20.763924 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:20.763924 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:20.764601 master-0 kubenswrapper[9136]: I1203 21:59:20.763958 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:21.764298 master-0 kubenswrapper[9136]: I1203 21:59:21.764190 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:21.764298 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:21.764298 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:21.764298 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:21.765199 master-0 kubenswrapper[9136]: I1203 21:59:21.764310 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:22.763717 master-0 kubenswrapper[9136]: I1203 21:59:22.763558 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:22.763717 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:22.763717 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:22.763717 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:22.763717 master-0 kubenswrapper[9136]: I1203 21:59:22.763694 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:23.763120 master-0 kubenswrapper[9136]: I1203 21:59:23.763014 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:23.763120 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:23.763120 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:23.763120 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:23.763120 master-0 kubenswrapper[9136]: I1203 21:59:23.763101 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:24.763691 master-0 kubenswrapper[9136]: I1203 21:59:24.763605 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:24.763691 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:24.763691 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:24.763691 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:24.764994 master-0 kubenswrapper[9136]: I1203 21:59:24.763708 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:25.763462 master-0 kubenswrapper[9136]: I1203 21:59:25.763338 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:25.763462 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:25.763462 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:25.763462 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:25.763462 master-0 kubenswrapper[9136]: I1203 21:59:25.763447 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:26.763493 master-0 kubenswrapper[9136]: I1203 21:59:26.763389 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:26.763493 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:26.763493 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:26.763493 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:26.763493 master-0 kubenswrapper[9136]: I1203 21:59:26.763486 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:27.764691 master-0 kubenswrapper[9136]: I1203 21:59:27.764633 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:27.764691 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:27.764691 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:27.764691 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:27.765295 master-0 kubenswrapper[9136]: I1203 21:59:27.764716 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:28.762740 master-0 kubenswrapper[9136]: I1203 21:59:28.762643 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:28.762740 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:28.762740 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:28.762740 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:28.763209 master-0 kubenswrapper[9136]: I1203 21:59:28.762757 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:29.763947 master-0 kubenswrapper[9136]: I1203 21:59:29.763820 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:29.763947 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:29.763947 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:29.763947 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:29.765079 master-0 kubenswrapper[9136]: I1203 21:59:29.763953 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:30.764170 master-0 kubenswrapper[9136]: I1203 21:59:30.764098 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:30.764170 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:30.764170 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:30.764170 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:30.764170 master-0 kubenswrapper[9136]: I1203 21:59:30.764174 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:31.764307 master-0 kubenswrapper[9136]: I1203 21:59:31.764232 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:31.764307 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:31.764307 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:31.764307 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:31.765707 master-0 kubenswrapper[9136]: I1203 21:59:31.764318 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:32.762946 master-0 kubenswrapper[9136]: I1203 21:59:32.762900 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:32.762946 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:32.762946 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:32.762946 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:32.763324 master-0 kubenswrapper[9136]: I1203 21:59:32.763300 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:33.763345 master-0 kubenswrapper[9136]: I1203 21:59:33.763264 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:33.763345 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:33.763345 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:33.763345 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:33.764121 master-0 kubenswrapper[9136]: I1203 21:59:33.763359 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:34.762449 master-0 kubenswrapper[9136]: I1203 21:59:34.762359 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:34.762449 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:34.762449 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:34.762449 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:34.762449 master-0 kubenswrapper[9136]: I1203 21:59:34.762432 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:35.764766 master-0 kubenswrapper[9136]: I1203 21:59:35.764634 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:35.764766 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:35.764766 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:35.764766 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:35.764766 master-0 kubenswrapper[9136]: I1203 21:59:35.764805 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:36.763134 master-0 kubenswrapper[9136]: I1203 21:59:36.763069 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:36.763134 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:36.763134 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:36.763134 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:36.763653 master-0 kubenswrapper[9136]: I1203 21:59:36.763613 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:37.764452 master-0 kubenswrapper[9136]: I1203 21:59:37.764341 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:37.764452 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:37.764452 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:37.764452 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:37.765954 master-0 kubenswrapper[9136]: I1203 21:59:37.764480 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:38.764413 master-0 kubenswrapper[9136]: I1203 21:59:38.764301 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:38.764413 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:38.764413 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:38.764413 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:38.766183 master-0 kubenswrapper[9136]: I1203 21:59:38.764423 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:39.764485 master-0 kubenswrapper[9136]: I1203 21:59:39.764370 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:39.764485 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:39.764485 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:39.764485 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:39.765649 master-0 kubenswrapper[9136]: I1203 21:59:39.764497 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:40.763504 master-0 kubenswrapper[9136]: I1203 21:59:40.763438 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:40.763504 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:40.763504 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:40.763504 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:40.764233 master-0 kubenswrapper[9136]: I1203 21:59:40.764151 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:41.763598 master-0 kubenswrapper[9136]: I1203 21:59:41.763501 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:41.763598 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:41.763598 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:41.763598 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:41.763598 master-0 kubenswrapper[9136]: I1203 21:59:41.763568 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:42.763211 master-0 kubenswrapper[9136]: I1203 21:59:42.763140 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:42.763211 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:42.763211 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:42.763211 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:42.763677 master-0 kubenswrapper[9136]: I1203 21:59:42.763233 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:43.764308 master-0 kubenswrapper[9136]: I1203 21:59:43.764220 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:43.764308 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:43.764308 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:43.764308 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:43.765583 master-0 kubenswrapper[9136]: I1203 21:59:43.764323 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:44.765396 master-0 kubenswrapper[9136]: I1203 21:59:44.765298 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:44.765396 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:44.765396 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:44.765396 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:44.766369 master-0 kubenswrapper[9136]: I1203 21:59:44.765412 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:45.764042 master-0 kubenswrapper[9136]: I1203 21:59:45.763946 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:45.764042 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:45.764042 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:45.764042 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:45.764594 master-0 kubenswrapper[9136]: I1203 21:59:45.764047 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:46.763435 master-0 kubenswrapper[9136]: I1203 21:59:46.763353 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:46.763435 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:46.763435 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:46.763435 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:46.764577 master-0 kubenswrapper[9136]: I1203 21:59:46.763453 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:47.763251 master-0 kubenswrapper[9136]: I1203 21:59:47.763151 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:47.763251 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:47.763251 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:47.763251 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:47.763251 master-0 kubenswrapper[9136]: I1203 21:59:47.763228 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:48.762299 master-0 kubenswrapper[9136]: I1203 21:59:48.762230 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:48.762299 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:48.762299 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:48.762299 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:48.762660 master-0 kubenswrapper[9136]: I1203 21:59:48.762309 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:49.762892 master-0 kubenswrapper[9136]: I1203 21:59:49.762802 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:49.762892 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:49.762892 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:49.762892 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
21:59:49.762892 master-0 kubenswrapper[9136]: I1203 21:59:49.762889 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:50.763807 master-0 kubenswrapper[9136]: I1203 21:59:50.763694 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:50.763807 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:50.763807 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:50.763807 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:50.764606 master-0 kubenswrapper[9136]: I1203 21:59:50.763830 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:51.328582 master-0 kubenswrapper[9136]: I1203 21:59:51.328493 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Dec 03 21:59:51.329332 master-0 kubenswrapper[9136]: I1203 21:59:51.329298 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.331397 master-0 kubenswrapper[9136]: I1203 21:59:51.331331 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-tdkqf" Dec 03 21:59:51.332369 master-0 kubenswrapper[9136]: I1203 21:59:51.332331 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 21:59:51.349940 master-0 kubenswrapper[9136]: I1203 21:59:51.349696 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Dec 03 21:59:51.374962 master-0 kubenswrapper[9136]: I1203 21:59:51.372188 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dada903-4b2b-450a-a55f-502ff892fd9f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.374962 master-0 kubenswrapper[9136]: I1203 21:59:51.372290 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-var-lock\") pod \"installer-2-master-0\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.374962 master-0 kubenswrapper[9136]: I1203 21:59:51.372352 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.473756 master-0 kubenswrapper[9136]: I1203 21:59:51.473621 9136 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.473756 master-0 kubenswrapper[9136]: I1203 21:59:51.473756 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.474258 master-0 kubenswrapper[9136]: I1203 21:59:51.473874 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dada903-4b2b-450a-a55f-502ff892fd9f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.474258 master-0 kubenswrapper[9136]: I1203 21:59:51.473941 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-var-lock\") pod \"installer-2-master-0\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.474258 master-0 kubenswrapper[9136]: I1203 21:59:51.474106 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-var-lock\") pod \"installer-2-master-0\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.491536 master-0 kubenswrapper[9136]: I1203 21:59:51.491456 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dada903-4b2b-450a-a55f-502ff892fd9f-kube-api-access\") pod \"installer-2-master-0\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.672002 master-0 kubenswrapper[9136]: I1203 21:59:51.671895 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 21:59:51.765057 master-0 kubenswrapper[9136]: I1203 21:59:51.764966 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:51.765057 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:51.765057 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:51.765057 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:51.765853 master-0 kubenswrapper[9136]: I1203 21:59:51.765060 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:52.139318 master-0 kubenswrapper[9136]: I1203 21:59:52.138255 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Dec 03 21:59:52.705778 master-0 kubenswrapper[9136]: I1203 21:59:52.705629 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj"] Dec 03 21:59:52.705988 master-0 kubenswrapper[9136]: I1203 21:59:52.705863 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" containerID="cri-o://9f4678d3801f7b92e605525d79efc364684c9662011dd7ae5dd7a458afa02c37" gracePeriod=30 Dec 03 21:59:52.739668 master-0 kubenswrapper[9136]: I1203 21:59:52.739597 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87"] Dec 03 21:59:52.739976 master-0 kubenswrapper[9136]: I1203 21:59:52.739933 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" podUID="47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" containerName="route-controller-manager" containerID="cri-o://2d9a8a18ee922fc6f32018ce01fd96ead1e3d6f161acb32757967568167037fa" gracePeriod=30 Dec 03 21:59:52.762396 master-0 kubenswrapper[9136]: I1203 21:59:52.762323 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:52.762396 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:52.762396 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:52.762396 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:52.762809 master-0 kubenswrapper[9136]: I1203 21:59:52.762407 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:52.887575 master-0 kubenswrapper[9136]: I1203 21:59:52.887524 9136 patch_prober.go:28] interesting pod/route-controller-manager-75678b97b8-qqn87 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.128.0.45:8443/healthz\": dial tcp 10.128.0.45:8443: connect: connection refused" start-of-body= Dec 03 21:59:52.888065 master-0 kubenswrapper[9136]: I1203 21:59:52.887597 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" podUID="47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.45:8443/healthz\": dial tcp 10.128.0.45:8443: connect: connection refused" Dec 03 21:59:52.893608 master-0 kubenswrapper[9136]: I1203 21:59:52.893574 9136 patch_prober.go:28] interesting pod/controller-manager-6b48b87d7b-m7hgj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" start-of-body= Dec 03 21:59:52.893719 master-0 kubenswrapper[9136]: I1203 21:59:52.893618 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.46:8443/healthz\": dial tcp 10.128.0.46:8443: connect: connection refused" Dec 03 21:59:53.112185 master-0 kubenswrapper[9136]: I1203 21:59:53.112102 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"5dada903-4b2b-450a-a55f-502ff892fd9f","Type":"ContainerStarted","Data":"19cd297510bc844f8288365a1588d651ec674ded0636669c86f19116e03ce004"} Dec 03 21:59:53.112185 master-0 kubenswrapper[9136]: I1203 21:59:53.112169 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"5dada903-4b2b-450a-a55f-502ff892fd9f","Type":"ContainerStarted","Data":"93441462469b8b27bc3d666672828b80b42a37ea80417a77f840571890747f10"} Dec 03 21:59:53.114886 master-0 kubenswrapper[9136]: I1203 21:59:53.114840 9136 generic.go:334] "Generic (PLEG): container finished" podID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerID="9f4678d3801f7b92e605525d79efc364684c9662011dd7ae5dd7a458afa02c37" exitCode=0 Dec 03 21:59:53.115069 master-0 kubenswrapper[9136]: I1203 21:59:53.114898 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" event={"ID":"89b8bdab-89ba-4a7d-a464-71a2f240e3c7","Type":"ContainerDied","Data":"9f4678d3801f7b92e605525d79efc364684c9662011dd7ae5dd7a458afa02c37"} Dec 03 21:59:53.115139 master-0 kubenswrapper[9136]: I1203 21:59:53.115093 9136 scope.go:117] "RemoveContainer" containerID="3bfa8deaf08a2a1e0e9189e6bd019c4a1fa728ac1cad25a37ff4255d9c02c3f0" Dec 03 21:59:53.117049 master-0 kubenswrapper[9136]: I1203 21:59:53.117022 9136 generic.go:334] "Generic (PLEG): container finished" podID="47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" containerID="2d9a8a18ee922fc6f32018ce01fd96ead1e3d6f161acb32757967568167037fa" exitCode=0 Dec 03 21:59:53.117110 master-0 kubenswrapper[9136]: I1203 21:59:53.117055 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" event={"ID":"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb","Type":"ContainerDied","Data":"2d9a8a18ee922fc6f32018ce01fd96ead1e3d6f161acb32757967568167037fa"} Dec 03 21:59:53.232711 master-0 kubenswrapper[9136]: I1203 21:59:53.232618 
9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=2.232596331 podStartE2EDuration="2.232596331s" podCreationTimestamp="2025-12-03 21:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:59:53.231401143 +0000 UTC m=+599.506577535" watchObservedRunningTime="2025-12-03 21:59:53.232596331 +0000 UTC m=+599.507772723" Dec 03 21:59:53.253573 master-0 kubenswrapper[9136]: I1203 21:59:53.252789 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:59:53.266112 master-0 kubenswrapper[9136]: I1203 21:59:53.261217 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:59:53.399869 master-0 kubenswrapper[9136]: I1203 21:59:53.399755 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8mw4\" (UniqueName: \"kubernetes.io/projected/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-kube-api-access-h8mw4\") pod \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " Dec 03 21:59:53.400123 master-0 kubenswrapper[9136]: I1203 21:59:53.399918 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-client-ca\") pod \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " Dec 03 21:59:53.400123 master-0 kubenswrapper[9136]: I1203 21:59:53.399951 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz545\" (UniqueName: \"kubernetes.io/projected/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-kube-api-access-dz545\") pod \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " Dec 03 21:59:53.400123 master-0 kubenswrapper[9136]: I1203 21:59:53.400049 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-serving-cert\") pod \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " Dec 03 21:59:53.400123 master-0 kubenswrapper[9136]: I1203 21:59:53.400084 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-config\") pod \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " Dec 03 21:59:53.400123 master-0 kubenswrapper[9136]: I1203 21:59:53.400130 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-config\") pod \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " Dec 03 21:59:53.400361 master-0 kubenswrapper[9136]: I1203 21:59:53.400162 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-serving-cert\") pod \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " Dec 03 21:59:53.400361 master-0 
kubenswrapper[9136]: I1203 21:59:53.400192 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-proxy-ca-bundles\") pod \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\" (UID: \"89b8bdab-89ba-4a7d-a464-71a2f240e3c7\") " Dec 03 21:59:53.400361 master-0 kubenswrapper[9136]: I1203 21:59:53.400221 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-client-ca\") pod \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\" (UID: \"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb\") " Dec 03 21:59:53.400493 master-0 kubenswrapper[9136]: I1203 21:59:53.400431 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "89b8bdab-89ba-4a7d-a464-71a2f240e3c7" (UID: "89b8bdab-89ba-4a7d-a464-71a2f240e3c7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:59:53.400639 master-0 kubenswrapper[9136]: I1203 21:59:53.400588 9136 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:59:53.400915 master-0 kubenswrapper[9136]: I1203 21:59:53.400864 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-config" (OuterVolumeSpecName: "config") pod "47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" (UID: "47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:59:53.401118 master-0 kubenswrapper[9136]: I1203 21:59:53.401071 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "89b8bdab-89ba-4a7d-a464-71a2f240e3c7" (UID: "89b8bdab-89ba-4a7d-a464-71a2f240e3c7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:59:53.401288 master-0 kubenswrapper[9136]: I1203 21:59:53.401220 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-config" (OuterVolumeSpecName: "config") pod "89b8bdab-89ba-4a7d-a464-71a2f240e3c7" (UID: "89b8bdab-89ba-4a7d-a464-71a2f240e3c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:59:53.401288 master-0 kubenswrapper[9136]: I1203 21:59:53.401263 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-client-ca" (OuterVolumeSpecName: "client-ca") pod "47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" (UID: "47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 21:59:53.403205 master-0 kubenswrapper[9136]: I1203 21:59:53.403072 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-kube-api-access-h8mw4" (OuterVolumeSpecName: "kube-api-access-h8mw4") pod "89b8bdab-89ba-4a7d-a464-71a2f240e3c7" (UID: "89b8bdab-89ba-4a7d-a464-71a2f240e3c7"). 
InnerVolumeSpecName "kube-api-access-h8mw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:59:53.403445 master-0 kubenswrapper[9136]: I1203 21:59:53.403390 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "89b8bdab-89ba-4a7d-a464-71a2f240e3c7" (UID: "89b8bdab-89ba-4a7d-a464-71a2f240e3c7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:59:53.403879 master-0 kubenswrapper[9136]: I1203 21:59:53.403843 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-kube-api-access-dz545" (OuterVolumeSpecName: "kube-api-access-dz545") pod "47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" (UID: "47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb"). InnerVolumeSpecName "kube-api-access-dz545". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 21:59:53.404586 master-0 kubenswrapper[9136]: I1203 21:59:53.404399 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" (UID: "47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 21:59:53.502452 master-0 kubenswrapper[9136]: I1203 21:59:53.502381 9136 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:59:53.502452 master-0 kubenswrapper[9136]: I1203 21:59:53.502448 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:59:53.502452 master-0 kubenswrapper[9136]: I1203 21:59:53.502467 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-config\") on node \"master-0\" DevicePath \"\"" Dec 03 21:59:53.502452 master-0 kubenswrapper[9136]: I1203 21:59:53.502485 9136 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 21:59:53.502452 master-0 kubenswrapper[9136]: I1203 21:59:53.502502 9136 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 03 21:59:53.503465 master-0 kubenswrapper[9136]: I1203 21:59:53.502522 9136 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 21:59:53.503465 master-0 kubenswrapper[9136]: I1203 21:59:53.502544 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8mw4\" (UniqueName: \"kubernetes.io/projected/89b8bdab-89ba-4a7d-a464-71a2f240e3c7-kube-api-access-h8mw4\") on node \"master-0\" DevicePath \"\"" Dec 03 21:59:53.503465 master-0 kubenswrapper[9136]: I1203 21:59:53.502563 9136 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dz545\" (UniqueName: \"kubernetes.io/projected/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb-kube-api-access-dz545\") on node \"master-0\" DevicePath \"\"" Dec 03 21:59:53.763950 master-0 kubenswrapper[9136]: I1203 21:59:53.763644 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:53.763950 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:53.763950 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:53.763950 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:53.763950 master-0 kubenswrapper[9136]: I1203 21:59:53.763733 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:54.132282 master-0 kubenswrapper[9136]: I1203 21:59:54.132185 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" event={"ID":"89b8bdab-89ba-4a7d-a464-71a2f240e3c7","Type":"ContainerDied","Data":"d1f5e826f7ae6ce38586c70ff633062c9e9f839723da64e9fa179afd4ce387ea"} Dec 03 21:59:54.132282 master-0 kubenswrapper[9136]: I1203 21:59:54.132230 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj" Dec 03 21:59:54.132282 master-0 kubenswrapper[9136]: I1203 21:59:54.132278 9136 scope.go:117] "RemoveContainer" containerID="9f4678d3801f7b92e605525d79efc364684c9662011dd7ae5dd7a458afa02c37" Dec 03 21:59:54.137204 master-0 kubenswrapper[9136]: I1203 21:59:54.137148 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" event={"ID":"47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb","Type":"ContainerDied","Data":"587880584f977d63d726e554f2ecb6f2fb8174c49c96366cee3cd4ee073e231a"} Dec 03 21:59:54.137288 master-0 kubenswrapper[9136]: I1203 21:59:54.137208 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87" Dec 03 21:59:54.158848 master-0 kubenswrapper[9136]: I1203 21:59:54.158752 9136 scope.go:117] "RemoveContainer" containerID="2d9a8a18ee922fc6f32018ce01fd96ead1e3d6f161acb32757967568167037fa" Dec 03 21:59:54.158848 master-0 kubenswrapper[9136]: I1203 21:59:54.158813 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj"] Dec 03 21:59:54.162435 master-0 kubenswrapper[9136]: I1203 21:59:54.162378 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b48b87d7b-m7hgj"] Dec 03 21:59:54.174309 master-0 kubenswrapper[9136]: I1203 21:59:54.174254 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87"] Dec 03 21:59:54.177420 master-0 kubenswrapper[9136]: I1203 21:59:54.177360 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75678b97b8-qqn87"] Dec 03 21:59:54.516562 master-0 kubenswrapper[9136]: I1203 21:59:54.516373 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77778bd57c-xdhvs"] Dec 03 21:59:54.516972 master-0 kubenswrapper[9136]: E1203 21:59:54.516790 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" Dec 03 21:59:54.516972 master-0 kubenswrapper[9136]: I1203 21:59:54.516809 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" Dec 03 21:59:54.516972 master-0 kubenswrapper[9136]: E1203 21:59:54.516847 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" Dec 03 21:59:54.516972 master-0 kubenswrapper[9136]: I1203 21:59:54.516854 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" Dec 03 21:59:54.516972 master-0 kubenswrapper[9136]: E1203 21:59:54.516866 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" containerName="route-controller-manager" Dec 03 21:59:54.516972 master-0 kubenswrapper[9136]: I1203 21:59:54.516873 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" containerName="route-controller-manager" Dec 03 21:59:54.517634 master-0 kubenswrapper[9136]: I1203 21:59:54.516999 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" Dec 03 21:59:54.517634 master-0 kubenswrapper[9136]: I1203 21:59:54.517018 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" containerName="route-controller-manager" Dec 03 21:59:54.517634 master-0 kubenswrapper[9136]: I1203 21:59:54.517597 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.519830 master-0 kubenswrapper[9136]: I1203 21:59:54.519788 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 21:59:54.527597 master-0 kubenswrapper[9136]: I1203 21:59:54.519959 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-dhpt2" Dec 03 21:59:54.527597 master-0 kubenswrapper[9136]: I1203 21:59:54.520440 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 21:59:54.527597 master-0 kubenswrapper[9136]: I1203 21:59:54.520457 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 21:59:54.527597 master-0 kubenswrapper[9136]: I1203 21:59:54.523381 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc"] Dec 03 21:59:54.527597 master-0 kubenswrapper[9136]: I1203 21:59:54.524092 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" containerName="controller-manager" Dec 03 21:59:54.527597 master-0 kubenswrapper[9136]: I1203 21:59:54.524961 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.527597 master-0 kubenswrapper[9136]: I1203 21:59:54.527091 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 21:59:54.527597 master-0 kubenswrapper[9136]: I1203 21:59:54.527275 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 21:59:54.527597 master-0 kubenswrapper[9136]: I1203 21:59:54.527574 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-vbccl" Dec 03 21:59:54.529339 master-0 kubenswrapper[9136]: I1203 21:59:54.528571 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 21:59:54.529339 master-0 kubenswrapper[9136]: I1203 21:59:54.528640 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 21:59:54.529339 master-0 kubenswrapper[9136]: I1203 21:59:54.528876 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 21:59:54.529339 master-0 kubenswrapper[9136]: I1203 21:59:54.529092 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 21:59:54.531046 master-0 kubenswrapper[9136]: I1203 21:59:54.531001 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 21:59:54.532948 master-0 kubenswrapper[9136]: I1203 21:59:54.532886 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 21:59:54.533609 master-0 kubenswrapper[9136]: I1203 21:59:54.533558 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77778bd57c-xdhvs"] 
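
[Editor's note] The probe messages that dominate this stretch of the log — the router's startup probe failing with "HTTP probe failed with statuscode: 500" plus the healthz sub-check detail ([-]backend-http, [-]has-synced, [+]process-running), and the "connect: connection refused" readiness failures once the old controller-manager containers are killed — are all emitted by the kubelet's HTTP prober (prober.go / patch_prober.go). The snippet below is only a minimal, self-contained sketch of that mechanism, not the kubelet's actual prober code: it issues a GET against a health endpoint, treats transport errors and non-2xx/3xx statuses as failures, and keeps the start of the response body the way the "start-of-body=" field does. The target URL, timeout, and retry count are made-up placeholders for illustration.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

// probeOnce mirrors the shape of a kubelet-style HTTP probe: any transport
// error (e.g. "dial tcp ...: connect: connection refused") or a status code
// outside 200-399 counts as a failure, and only the beginning of the response
// body is kept for logging, similar to the "start-of-body=" field in the log.
func probeOnce(url string) (ok bool, detail string) {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return false, err.Error()
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(io.LimitReader(resp.Body, 256)) // keep only the start of the body
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return false, fmt.Sprintf("HTTP probe failed with statuscode: %d, start-of-body=%s",
			resp.StatusCode, body)
	}
	return true, string(body)
}

func main() {
	// Placeholder endpoint standing in for a pod's health port; the real probe
	// target and period come from the pod's startupProbe/readinessProbe spec.
	const target = "http://127.0.0.1:1936/healthz"
	for i := 0; i < 3; i++ { // retry once per period, as a startup probe would
		if ok, detail := probeOnce(target); ok {
			fmt.Println("Probe succeeded")
			return
		} else {
			fmt.Printf("Probe failed: %s\n", detail)
		}
		time.Sleep(1 * time.Second)
	}
}
```

The two failure modes in the log correspond to the two branches above: the router is running but its healthz aggregator reports 500 while sub-checks are still failing (startup probe keeps retrying each period until the pod either passes or exhausts its failure threshold), whereas the deleted controller-manager pods have no listener at all, so their readiness probes fail at the dial step with "connection refused" and the pods are simply marked unready until they are removed.
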
Dec 03 21:59:54.536414 master-0 kubenswrapper[9136]: I1203 21:59:54.536358 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc"] Dec 03 21:59:54.617810 master-0 kubenswrapper[9136]: I1203 21:59:54.617446 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.617810 master-0 kubenswrapper[9136]: I1203 21:59:54.617549 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.617810 master-0 kubenswrapper[9136]: I1203 21:59:54.617588 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt8l5\" (UniqueName: \"kubernetes.io/projected/f6498ac1-7d07-4a5f-a968-d8bda72d1002-kube-api-access-bt8l5\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.617810 master-0 kubenswrapper[9136]: I1203 21:59:54.617631 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.617810 master-0 kubenswrapper[9136]: I1203 21:59:54.617660 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.719407 master-0 kubenswrapper[9136]: I1203 21:59:54.719332 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.719646 master-0 kubenswrapper[9136]: I1203 21:59:54.719415 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.719646 master-0 kubenswrapper[9136]: I1203 21:59:54.719453 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.719646 master-0 kubenswrapper[9136]: I1203 21:59:54.719487 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.719646 master-0 kubenswrapper[9136]: I1203 21:59:54.719518 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt8l5\" (UniqueName: \"kubernetes.io/projected/f6498ac1-7d07-4a5f-a968-d8bda72d1002-kube-api-access-bt8l5\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.719646 master-0 kubenswrapper[9136]: I1203 21:59:54.719547 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.719646 master-0 kubenswrapper[9136]: I1203 21:59:54.719578 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.719646 master-0 kubenswrapper[9136]: I1203 21:59:54.719603 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.719646 master-0 kubenswrapper[9136]: I1203 21:59:54.719635 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tqg\" (UniqueName: \"kubernetes.io/projected/64856d96-023f-46db-819c-02f1adea5aab-kube-api-access-h2tqg\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.720492 master-0 kubenswrapper[9136]: I1203 21:59:54.720467 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.720972 master-0 kubenswrapper[9136]: I1203 21:59:54.720911 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.721648 master-0 kubenswrapper[9136]: I1203 21:59:54.721593 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.724939 master-0 kubenswrapper[9136]: I1203 21:59:54.724899 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.743138 master-0 kubenswrapper[9136]: I1203 21:59:54.743030 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt8l5\" (UniqueName: \"kubernetes.io/projected/f6498ac1-7d07-4a5f-a968-d8bda72d1002-kube-api-access-bt8l5\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.763303 master-0 kubenswrapper[9136]: I1203 21:59:54.763235 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:54.763303 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:54.763303 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:54.763303 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:54.763501 master-0 kubenswrapper[9136]: I1203 21:59:54.763343 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:54.821274 master-0 kubenswrapper[9136]: I1203 21:59:54.821167 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.821274 master-0 kubenswrapper[9136]: I1203 21:59:54.821254 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.821588 master-0 kubenswrapper[9136]: I1203 21:59:54.821297 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca\") pod 
\"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.821588 master-0 kubenswrapper[9136]: I1203 21:59:54.821337 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tqg\" (UniqueName: \"kubernetes.io/projected/64856d96-023f-46db-819c-02f1adea5aab-kube-api-access-h2tqg\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.822434 master-0 kubenswrapper[9136]: I1203 21:59:54.822378 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.822628 master-0 kubenswrapper[9136]: I1203 21:59:54.822586 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.826240 master-0 kubenswrapper[9136]: I1203 21:59:54.826190 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.837309 master-0 kubenswrapper[9136]: I1203 21:59:54.837254 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tqg\" (UniqueName: \"kubernetes.io/projected/64856d96-023f-46db-819c-02f1adea5aab-kube-api-access-h2tqg\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:54.898651 master-0 kubenswrapper[9136]: I1203 21:59:54.898593 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:54.933457 master-0 kubenswrapper[9136]: I1203 21:59:54.931143 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:55.155080 master-0 kubenswrapper[9136]: I1203 21:59:55.155009 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/3.log" Dec 03 21:59:55.156614 master-0 kubenswrapper[9136]: I1203 21:59:55.156591 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/2.log" Dec 03 21:59:55.157159 master-0 kubenswrapper[9136]: I1203 21:59:55.157122 9136 generic.go:334] "Generic (PLEG): container finished" podID="0869de9b-6f5b-4c31-81ad-02a9c8888193" containerID="70aab6c0f99a200adb05bf0466bae07bea09c3ff8ddc008c24e8eb7d9ae3808d" exitCode=1 Dec 03 21:59:55.157330 master-0 kubenswrapper[9136]: I1203 21:59:55.157303 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerDied","Data":"70aab6c0f99a200adb05bf0466bae07bea09c3ff8ddc008c24e8eb7d9ae3808d"} Dec 03 21:59:55.157483 master-0 kubenswrapper[9136]: I1203 21:59:55.157460 9136 scope.go:117] "RemoveContainer" containerID="ee25dcf2b655ead4c6f49b566e41979c025f7cde0a2815dc7389e10f4849f7af" Dec 03 21:59:55.159264 master-0 kubenswrapper[9136]: I1203 21:59:55.159222 9136 scope.go:117] "RemoveContainer" containerID="70aab6c0f99a200adb05bf0466bae07bea09c3ff8ddc008c24e8eb7d9ae3808d" Dec 03 21:59:55.161823 master-0 kubenswrapper[9136]: E1203 21:59:55.160669 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 21:59:55.346937 master-0 kubenswrapper[9136]: I1203 21:59:55.346886 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77778bd57c-xdhvs"] Dec 03 21:59:55.432229 master-0 kubenswrapper[9136]: I1203 21:59:55.432163 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc"] Dec 03 21:59:55.437463 master-0 kubenswrapper[9136]: W1203 21:59:55.437408 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64856d96_023f_46db_819c_02f1adea5aab.slice/crio-ab4d3d9867608a716d1a07ef53b79868ef1de9f69ebee20baa3f154072a5867f WatchSource:0}: Error finding container ab4d3d9867608a716d1a07ef53b79868ef1de9f69ebee20baa3f154072a5867f: Status 404 returned error can't find the container with id ab4d3d9867608a716d1a07ef53b79868ef1de9f69ebee20baa3f154072a5867f Dec 03 21:59:55.763683 master-0 kubenswrapper[9136]: I1203 21:59:55.763588 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:55.763683 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:55.763683 master-0 kubenswrapper[9136]: 
[+]process-running ok Dec 03 21:59:55.763683 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:55.763683 master-0 kubenswrapper[9136]: I1203 21:59:55.763670 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:55.924079 master-0 kubenswrapper[9136]: I1203 21:59:55.923995 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb" path="/var/lib/kubelet/pods/47422ce3-d83f-4dfb-9bf2-e4fd303fe1cb/volumes" Dec 03 21:59:55.925189 master-0 kubenswrapper[9136]: I1203 21:59:55.925141 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b8bdab-89ba-4a7d-a464-71a2f240e3c7" path="/var/lib/kubelet/pods/89b8bdab-89ba-4a7d-a464-71a2f240e3c7/volumes" Dec 03 21:59:56.167754 master-0 kubenswrapper[9136]: I1203 21:59:56.167668 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" event={"ID":"f6498ac1-7d07-4a5f-a968-d8bda72d1002","Type":"ContainerStarted","Data":"c83609fd10c7378baa8493e0ab13c737c56d73cf8e97ee62fc24dbfcfaebb463"} Dec 03 21:59:56.168332 master-0 kubenswrapper[9136]: I1203 21:59:56.167761 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" event={"ID":"f6498ac1-7d07-4a5f-a968-d8bda72d1002","Type":"ContainerStarted","Data":"86b3974531c12f65b34f84b67c60273ab31a4569e9157b7dd04d59eef5e8591d"} Dec 03 21:59:56.168332 master-0 kubenswrapper[9136]: I1203 21:59:56.167880 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:56.170444 master-0 kubenswrapper[9136]: I1203 21:59:56.170398 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/3.log" Dec 03 21:59:56.173614 master-0 kubenswrapper[9136]: I1203 21:59:56.173565 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" event={"ID":"64856d96-023f-46db-819c-02f1adea5aab","Type":"ContainerStarted","Data":"5af03b1fc8b9d3242cc113182968101569dfe657f33d984d37c023bd205c8309"} Dec 03 21:59:56.173688 master-0 kubenswrapper[9136]: I1203 21:59:56.173620 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" event={"ID":"64856d96-023f-46db-819c-02f1adea5aab","Type":"ContainerStarted","Data":"ab4d3d9867608a716d1a07ef53b79868ef1de9f69ebee20baa3f154072a5867f"} Dec 03 21:59:56.174028 master-0 kubenswrapper[9136]: I1203 21:59:56.173983 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 21:59:56.176160 master-0 kubenswrapper[9136]: I1203 21:59:56.176095 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 21:59:56.179711 master-0 kubenswrapper[9136]: I1203 21:59:56.179650 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 
21:59:56.193526 master-0 kubenswrapper[9136]: I1203 21:59:56.193445 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" podStartSLOduration=4.193419997 podStartE2EDuration="4.193419997s" podCreationTimestamp="2025-12-03 21:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:59:56.191233489 +0000 UTC m=+602.466409911" watchObservedRunningTime="2025-12-03 21:59:56.193419997 +0000 UTC m=+602.468596389" Dec 03 21:59:56.260881 master-0 kubenswrapper[9136]: I1203 21:59:56.260813 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" podStartSLOduration=4.260792305 podStartE2EDuration="4.260792305s" podCreationTimestamp="2025-12-03 21:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:59:56.258739691 +0000 UTC m=+602.533916113" watchObservedRunningTime="2025-12-03 21:59:56.260792305 +0000 UTC m=+602.535968717" Dec 03 21:59:56.762170 master-0 kubenswrapper[9136]: I1203 21:59:56.762112 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:56.762170 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:56.762170 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:56.762170 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:56.762170 master-0 kubenswrapper[9136]: I1203 21:59:56.762171 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:57.462668 master-0 kubenswrapper[9136]: I1203 21:59:57.462603 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nrcql"] Dec 03 21:59:57.463394 master-0 kubenswrapper[9136]: I1203 21:59:57.463374 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.465640 master-0 kubenswrapper[9136]: I1203 21:59:57.465599 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Dec 03 21:59:57.466125 master-0 kubenswrapper[9136]: I1203 21:59:57.466097 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-5mjhs" Dec 03 21:59:57.570642 master-0 kubenswrapper[9136]: I1203 21:59:57.570581 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/519e403c-28ab-4750-8143-34c74bf526ce-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.570915 master-0 kubenswrapper[9136]: I1203 21:59:57.570661 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/519e403c-28ab-4750-8143-34c74bf526ce-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.570915 master-0 kubenswrapper[9136]: I1203 21:59:57.570696 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6kt9\" (UniqueName: \"kubernetes.io/projected/519e403c-28ab-4750-8143-34c74bf526ce-kube-api-access-v6kt9\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.570915 master-0 kubenswrapper[9136]: I1203 21:59:57.570731 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/519e403c-28ab-4750-8143-34c74bf526ce-ready\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.671981 master-0 kubenswrapper[9136]: I1203 21:59:57.671893 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/519e403c-28ab-4750-8143-34c74bf526ce-ready\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.672279 master-0 kubenswrapper[9136]: I1203 21:59:57.672018 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/519e403c-28ab-4750-8143-34c74bf526ce-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.672279 master-0 kubenswrapper[9136]: I1203 21:59:57.672131 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/519e403c-28ab-4750-8143-34c74bf526ce-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.672279 master-0 kubenswrapper[9136]: I1203 21:59:57.672194 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-v6kt9\" (UniqueName: \"kubernetes.io/projected/519e403c-28ab-4750-8143-34c74bf526ce-kube-api-access-v6kt9\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.672388 master-0 kubenswrapper[9136]: I1203 21:59:57.672336 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/519e403c-28ab-4750-8143-34c74bf526ce-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.672476 master-0 kubenswrapper[9136]: I1203 21:59:57.672427 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/519e403c-28ab-4750-8143-34c74bf526ce-ready\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.672907 master-0 kubenswrapper[9136]: I1203 21:59:57.672866 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/519e403c-28ab-4750-8143-34c74bf526ce-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.687678 master-0 kubenswrapper[9136]: I1203 21:59:57.687618 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6kt9\" (UniqueName: \"kubernetes.io/projected/519e403c-28ab-4750-8143-34c74bf526ce-kube-api-access-v6kt9\") pod \"cni-sysctl-allowlist-ds-nrcql\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.763204 master-0 kubenswrapper[9136]: I1203 21:59:57.763025 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:57.763204 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:57.763204 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:57.763204 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:57.763204 master-0 kubenswrapper[9136]: I1203 21:59:57.763160 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:57.781934 master-0 kubenswrapper[9136]: I1203 21:59:57.781866 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 21:59:57.807677 master-0 kubenswrapper[9136]: W1203 21:59:57.807590 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519e403c_28ab_4750_8143_34c74bf526ce.slice/crio-1644d583ba002a0e82a19318277118e077966cb952f62ba634455a674e38f171 WatchSource:0}: Error finding container 1644d583ba002a0e82a19318277118e077966cb952f62ba634455a674e38f171: Status 404 returned error can't find the container with id 1644d583ba002a0e82a19318277118e077966cb952f62ba634455a674e38f171 Dec 03 21:59:58.208287 master-0 kubenswrapper[9136]: I1203 21:59:58.208218 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" event={"ID":"519e403c-28ab-4750-8143-34c74bf526ce","Type":"ContainerStarted","Data":"8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d"} Dec 03 21:59:58.208629 master-0 kubenswrapper[9136]: I1203 21:59:58.208304 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" event={"ID":"519e403c-28ab-4750-8143-34c74bf526ce","Type":"ContainerStarted","Data":"1644d583ba002a0e82a19318277118e077966cb952f62ba634455a674e38f171"} Dec 03 21:59:58.233436 master-0 kubenswrapper[9136]: I1203 21:59:58.233304 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" podStartSLOduration=1.233278813 podStartE2EDuration="1.233278813s" podCreationTimestamp="2025-12-03 21:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 21:59:58.227487893 +0000 UTC m=+604.502664305" watchObservedRunningTime="2025-12-03 21:59:58.233278813 +0000 UTC m=+604.508455225" Dec 03 21:59:58.762312 master-0 kubenswrapper[9136]: I1203 21:59:58.762215 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:58.762312 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:58.762312 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:58.762312 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:58.763120 master-0 kubenswrapper[9136]: I1203 21:59:58.762333 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 21:59:59.763154 master-0 kubenswrapper[9136]: I1203 21:59:59.763050 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 21:59:59.763154 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 21:59:59.763154 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 21:59:59.763154 master-0 kubenswrapper[9136]: healthz check failed Dec 03 21:59:59.763958 master-0 kubenswrapper[9136]: I1203 21:59:59.763185 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" 
podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:00.171660 master-0 kubenswrapper[9136]: I1203 22:00:00.171582 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m"] Dec 03 22:00:00.172629 master-0 kubenswrapper[9136]: I1203 22:00:00.172605 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.176591 master-0 kubenswrapper[9136]: I1203 22:00:00.176529 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m"] Dec 03 22:00:00.176866 master-0 kubenswrapper[9136]: I1203 22:00:00.176763 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-nwpql" Dec 03 22:00:00.178601 master-0 kubenswrapper[9136]: I1203 22:00:00.178558 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 22:00:00.322697 master-0 kubenswrapper[9136]: I1203 22:00:00.322635 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea59f59-8970-4eea-994d-9763792ee704-config-volume\") pod \"collect-profiles-29413320-ndk8m\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.322697 master-0 kubenswrapper[9136]: I1203 22:00:00.322704 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25d22\" (UniqueName: \"kubernetes.io/projected/4ea59f59-8970-4eea-994d-9763792ee704-kube-api-access-25d22\") pod \"collect-profiles-29413320-ndk8m\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.322973 master-0 kubenswrapper[9136]: I1203 22:00:00.322851 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea59f59-8970-4eea-994d-9763792ee704-secret-volume\") pod \"collect-profiles-29413320-ndk8m\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.424355 master-0 kubenswrapper[9136]: I1203 22:00:00.424217 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25d22\" (UniqueName: \"kubernetes.io/projected/4ea59f59-8970-4eea-994d-9763792ee704-kube-api-access-25d22\") pod \"collect-profiles-29413320-ndk8m\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.424355 master-0 kubenswrapper[9136]: I1203 22:00:00.424354 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea59f59-8970-4eea-994d-9763792ee704-secret-volume\") pod \"collect-profiles-29413320-ndk8m\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.424601 master-0 kubenswrapper[9136]: I1203 22:00:00.424393 9136 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea59f59-8970-4eea-994d-9763792ee704-config-volume\") pod \"collect-profiles-29413320-ndk8m\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.425533 master-0 kubenswrapper[9136]: I1203 22:00:00.425501 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea59f59-8970-4eea-994d-9763792ee704-config-volume\") pod \"collect-profiles-29413320-ndk8m\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.428868 master-0 kubenswrapper[9136]: I1203 22:00:00.428813 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea59f59-8970-4eea-994d-9763792ee704-secret-volume\") pod \"collect-profiles-29413320-ndk8m\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.452299 master-0 kubenswrapper[9136]: I1203 22:00:00.452232 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25d22\" (UniqueName: \"kubernetes.io/projected/4ea59f59-8970-4eea-994d-9763792ee704-kube-api-access-25d22\") pod \"collect-profiles-29413320-ndk8m\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.552748 master-0 kubenswrapper[9136]: I1203 22:00:00.552667 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:00.762422 master-0 kubenswrapper[9136]: I1203 22:00:00.762342 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:00.762422 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:00.762422 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:00.762422 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:00.762422 master-0 kubenswrapper[9136]: I1203 22:00:00.762420 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:00.964016 master-0 kubenswrapper[9136]: I1203 22:00:00.963950 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m"] Dec 03 22:00:00.971866 master-0 kubenswrapper[9136]: W1203 22:00:00.968891 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ea59f59_8970_4eea_994d_9763792ee704.slice/crio-bbf521b32f3f037a9fbdca13b211c55de4be4b2287d66d4fa6d98333ce221072 WatchSource:0}: Error finding container bbf521b32f3f037a9fbdca13b211c55de4be4b2287d66d4fa6d98333ce221072: Status 404 returned error can't find the container with id bbf521b32f3f037a9fbdca13b211c55de4be4b2287d66d4fa6d98333ce221072 Dec 03 22:00:01.247285 master-0 kubenswrapper[9136]: I1203 22:00:01.247121 
9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" event={"ID":"4ea59f59-8970-4eea-994d-9763792ee704","Type":"ContainerStarted","Data":"bbf521b32f3f037a9fbdca13b211c55de4be4b2287d66d4fa6d98333ce221072"} Dec 03 22:00:01.764206 master-0 kubenswrapper[9136]: I1203 22:00:01.764096 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:01.764206 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:01.764206 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:01.764206 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:01.764686 master-0 kubenswrapper[9136]: I1203 22:00:01.764205 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:02.254029 master-0 kubenswrapper[9136]: I1203 22:00:02.253960 9136 generic.go:334] "Generic (PLEG): container finished" podID="4ea59f59-8970-4eea-994d-9763792ee704" containerID="e80985ff027f59006e6b5b28f1d88811cb0ea27e878e3be4ff3dc70b657dbfd8" exitCode=0 Dec 03 22:00:02.254029 master-0 kubenswrapper[9136]: I1203 22:00:02.254019 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" event={"ID":"4ea59f59-8970-4eea-994d-9763792ee704","Type":"ContainerDied","Data":"e80985ff027f59006e6b5b28f1d88811cb0ea27e878e3be4ff3dc70b657dbfd8"} Dec 03 22:00:02.763763 master-0 kubenswrapper[9136]: I1203 22:00:02.763658 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:02.763763 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:02.763763 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:02.763763 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:02.764352 master-0 kubenswrapper[9136]: I1203 22:00:02.763759 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:03.460170 master-0 kubenswrapper[9136]: I1203 22:00:03.460077 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Dec 03 22:00:03.461605 master-0 kubenswrapper[9136]: I1203 22:00:03.461549 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.464933 master-0 kubenswrapper[9136]: I1203 22:00:03.464889 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-dzb9w" Dec 03 22:00:03.465053 master-0 kubenswrapper[9136]: I1203 22:00:03.465001 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Dec 03 22:00:03.478132 master-0 kubenswrapper[9136]: I1203 22:00:03.478081 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Dec 03 22:00:03.573162 master-0 kubenswrapper[9136]: I1203 22:00:03.573072 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kube-api-access\") pod \"installer-2-master-0\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.573162 master-0 kubenswrapper[9136]: I1203 22:00:03.573165 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-var-lock\") pod \"installer-2-master-0\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.573614 master-0 kubenswrapper[9136]: I1203 22:00:03.573506 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.577526 master-0 kubenswrapper[9136]: I1203 22:00:03.577486 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:03.674255 master-0 kubenswrapper[9136]: I1203 22:00:03.674184 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea59f59-8970-4eea-994d-9763792ee704-secret-volume\") pod \"4ea59f59-8970-4eea-994d-9763792ee704\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " Dec 03 22:00:03.674255 master-0 kubenswrapper[9136]: I1203 22:00:03.674247 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25d22\" (UniqueName: \"kubernetes.io/projected/4ea59f59-8970-4eea-994d-9763792ee704-kube-api-access-25d22\") pod \"4ea59f59-8970-4eea-994d-9763792ee704\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " Dec 03 22:00:03.674608 master-0 kubenswrapper[9136]: I1203 22:00:03.674306 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea59f59-8970-4eea-994d-9763792ee704-config-volume\") pod \"4ea59f59-8970-4eea-994d-9763792ee704\" (UID: \"4ea59f59-8970-4eea-994d-9763792ee704\") " Dec 03 22:00:03.674608 master-0 kubenswrapper[9136]: I1203 22:00:03.674439 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.674608 master-0 kubenswrapper[9136]: I1203 22:00:03.674464 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kube-api-access\") pod \"installer-2-master-0\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.674608 master-0 kubenswrapper[9136]: I1203 22:00:03.674492 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-var-lock\") pod \"installer-2-master-0\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.674608 master-0 kubenswrapper[9136]: I1203 22:00:03.674605 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-var-lock\") pod \"installer-2-master-0\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.675321 master-0 kubenswrapper[9136]: I1203 22:00:03.675266 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ea59f59-8970-4eea-994d-9763792ee704-config-volume" (OuterVolumeSpecName: "config-volume") pod "4ea59f59-8970-4eea-994d-9763792ee704" (UID: "4ea59f59-8970-4eea-994d-9763792ee704"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:00:03.675686 master-0 kubenswrapper[9136]: I1203 22:00:03.675610 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.678711 master-0 kubenswrapper[9136]: I1203 22:00:03.678653 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ea59f59-8970-4eea-994d-9763792ee704-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4ea59f59-8970-4eea-994d-9763792ee704" (UID: "4ea59f59-8970-4eea-994d-9763792ee704"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:00:03.679578 master-0 kubenswrapper[9136]: I1203 22:00:03.679526 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea59f59-8970-4eea-994d-9763792ee704-kube-api-access-25d22" (OuterVolumeSpecName: "kube-api-access-25d22") pod "4ea59f59-8970-4eea-994d-9763792ee704" (UID: "4ea59f59-8970-4eea-994d-9763792ee704"). InnerVolumeSpecName "kube-api-access-25d22". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:00:03.694968 master-0 kubenswrapper[9136]: I1203 22:00:03.694914 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kube-api-access\") pod \"installer-2-master-0\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:03.762196 master-0 kubenswrapper[9136]: I1203 22:00:03.762122 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:03.762196 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:03.762196 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:03.762196 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:03.762196 master-0 kubenswrapper[9136]: I1203 22:00:03.762186 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:03.776939 master-0 kubenswrapper[9136]: I1203 22:00:03.776882 9136 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4ea59f59-8970-4eea-994d-9763792ee704-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:03.777182 master-0 kubenswrapper[9136]: I1203 22:00:03.777154 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25d22\" (UniqueName: \"kubernetes.io/projected/4ea59f59-8970-4eea-994d-9763792ee704-kube-api-access-25d22\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:03.777456 master-0 kubenswrapper[9136]: I1203 22:00:03.777432 9136 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ea59f59-8970-4eea-994d-9763792ee704-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:03.788462 master-0 kubenswrapper[9136]: I1203 
22:00:03.788403 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:04.191898 master-0 kubenswrapper[9136]: I1203 22:00:04.191805 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Dec 03 22:00:04.202153 master-0 kubenswrapper[9136]: W1203 22:00:04.202082 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod70e52a8c_7f9e_47fa_85ca_41f90dcb9747.slice/crio-25f4b9a89514daa309f37b636ce10a58970922ba06493924e79533d702a051d8 WatchSource:0}: Error finding container 25f4b9a89514daa309f37b636ce10a58970922ba06493924e79533d702a051d8: Status 404 returned error can't find the container with id 25f4b9a89514daa309f37b636ce10a58970922ba06493924e79533d702a051d8 Dec 03 22:00:04.270503 master-0 kubenswrapper[9136]: I1203 22:00:04.270405 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"70e52a8c-7f9e-47fa-85ca-41f90dcb9747","Type":"ContainerStarted","Data":"25f4b9a89514daa309f37b636ce10a58970922ba06493924e79533d702a051d8"} Dec 03 22:00:04.272751 master-0 kubenswrapper[9136]: I1203 22:00:04.272705 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" event={"ID":"4ea59f59-8970-4eea-994d-9763792ee704","Type":"ContainerDied","Data":"bbf521b32f3f037a9fbdca13b211c55de4be4b2287d66d4fa6d98333ce221072"} Dec 03 22:00:04.272751 master-0 kubenswrapper[9136]: I1203 22:00:04.272742 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbf521b32f3f037a9fbdca13b211c55de4be4b2287d66d4fa6d98333ce221072" Dec 03 22:00:04.272913 master-0 kubenswrapper[9136]: I1203 22:00:04.272830 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:00:04.763474 master-0 kubenswrapper[9136]: I1203 22:00:04.763411 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:04.763474 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:04.763474 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:04.763474 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:04.764121 master-0 kubenswrapper[9136]: I1203 22:00:04.763488 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:05.281275 master-0 kubenswrapper[9136]: I1203 22:00:05.281189 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"70e52a8c-7f9e-47fa-85ca-41f90dcb9747","Type":"ContainerStarted","Data":"888690db481ab6164f73ea7eec62997e93d90f2abdb333711d2e3c24534b02e9"} Dec 03 22:00:05.763644 master-0 kubenswrapper[9136]: I1203 22:00:05.763542 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:05.763644 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:05.763644 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:05.763644 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:05.764339 master-0 kubenswrapper[9136]: I1203 22:00:05.763682 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:06.021308 master-0 kubenswrapper[9136]: I1203 22:00:06.021097 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=3.021068847 podStartE2EDuration="3.021068847s" podCreationTimestamp="2025-12-03 22:00:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:00:05.305397278 +0000 UTC m=+611.580573680" watchObservedRunningTime="2025-12-03 22:00:06.021068847 +0000 UTC m=+612.296245259" Dec 03 22:00:06.021682 master-0 kubenswrapper[9136]: I1203 22:00:06.021508 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Dec 03 22:00:06.021903 master-0 kubenswrapper[9136]: E1203 22:00:06.021861 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea59f59-8970-4eea-994d-9763792ee704" containerName="collect-profiles" Dec 03 22:00:06.021903 master-0 kubenswrapper[9136]: I1203 22:00:06.021898 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea59f59-8970-4eea-994d-9763792ee704" containerName="collect-profiles" Dec 03 22:00:06.022801 master-0 kubenswrapper[9136]: I1203 22:00:06.022203 9136 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ea59f59-8970-4eea-994d-9763792ee704" containerName="collect-profiles" Dec 03 22:00:06.023414 master-0 kubenswrapper[9136]: I1203 22:00:06.023372 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.027250 master-0 kubenswrapper[9136]: I1203 22:00:06.027196 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-m76lw" Dec 03 22:00:06.028039 master-0 kubenswrapper[9136]: I1203 22:00:06.027310 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 03 22:00:06.032301 master-0 kubenswrapper[9136]: I1203 22:00:06.032212 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Dec 03 22:00:06.116892 master-0 kubenswrapper[9136]: I1203 22:00:06.116809 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.116892 master-0 kubenswrapper[9136]: I1203 22:00:06.116879 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.117226 master-0 kubenswrapper[9136]: I1203 22:00:06.116926 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25602d69-3aec-487d-8d62-c2c21f27e2b7-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.218021 master-0 kubenswrapper[9136]: I1203 22:00:06.217938 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.218021 master-0 kubenswrapper[9136]: I1203 22:00:06.218022 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.218304 master-0 kubenswrapper[9136]: I1203 22:00:06.218071 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25602d69-3aec-487d-8d62-c2c21f27e2b7-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.218304 master-0 kubenswrapper[9136]: I1203 22:00:06.218128 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.218304 master-0 kubenswrapper[9136]: I1203 22:00:06.218205 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.238229 master-0 kubenswrapper[9136]: I1203 22:00:06.238196 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25602d69-3aec-487d-8d62-c2c21f27e2b7-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.379456 master-0 kubenswrapper[9136]: I1203 22:00:06.379359 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:06.772889 master-0 kubenswrapper[9136]: I1203 22:00:06.772009 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:06.772889 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:06.772889 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:06.772889 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:06.772889 master-0 kubenswrapper[9136]: I1203 22:00:06.772084 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:06.806011 master-0 kubenswrapper[9136]: I1203 22:00:06.805949 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Dec 03 22:00:07.299981 master-0 kubenswrapper[9136]: I1203 22:00:07.299873 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"25602d69-3aec-487d-8d62-c2c21f27e2b7","Type":"ContainerStarted","Data":"8bd5db9bea83f9fc4e7c948b8e5ecf2d5c83b59963dd6e82975811d819eaa07a"} Dec 03 22:00:07.299981 master-0 kubenswrapper[9136]: I1203 22:00:07.299973 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"25602d69-3aec-487d-8d62-c2c21f27e2b7","Type":"ContainerStarted","Data":"8804c8caf822254253605f55cc71b86af43d373e50704b9313038bda7b60d32d"} Dec 03 22:00:07.763146 master-0 kubenswrapper[9136]: I1203 22:00:07.763041 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:07.763146 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:07.763146 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:07.763146 master-0 
kubenswrapper[9136]: healthz check failed Dec 03 22:00:07.763146 master-0 kubenswrapper[9136]: I1203 22:00:07.763137 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:07.782854 master-0 kubenswrapper[9136]: I1203 22:00:07.782789 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 22:00:07.816062 master-0 kubenswrapper[9136]: I1203 22:00:07.816017 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 22:00:07.837266 master-0 kubenswrapper[9136]: I1203 22:00:07.837126 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" podStartSLOduration=1.837094596 podStartE2EDuration="1.837094596s" podCreationTimestamp="2025-12-03 22:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:00:07.323963383 +0000 UTC m=+613.599139785" watchObservedRunningTime="2025-12-03 22:00:07.837094596 +0000 UTC m=+614.112271018" Dec 03 22:00:07.974050 master-0 kubenswrapper[9136]: I1203 22:00:07.973950 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-5bdcc987c4-5cs48"] Dec 03 22:00:07.974992 master-0 kubenswrapper[9136]: I1203 22:00:07.974957 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:00:07.976755 master-0 kubenswrapper[9136]: I1203 22:00:07.976667 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-hf9lh" Dec 03 22:00:07.979216 master-0 kubenswrapper[9136]: I1203 22:00:07.979153 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5bdcc987c4-5cs48"] Dec 03 22:00:08.150737 master-0 kubenswrapper[9136]: I1203 22:00:08.150658 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:00:08.151106 master-0 kubenswrapper[9136]: I1203 22:00:08.150801 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9q7k\" (UniqueName: \"kubernetes.io/projected/452012bf-eae1-4e69-9ba1-034309e9f2c8-kube-api-access-b9q7k\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:00:08.252402 master-0 kubenswrapper[9136]: I1203 22:00:08.252320 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9q7k\" (UniqueName: \"kubernetes.io/projected/452012bf-eae1-4e69-9ba1-034309e9f2c8-kube-api-access-b9q7k\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 
22:00:08.252728 master-0 kubenswrapper[9136]: I1203 22:00:08.252488 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:00:08.256720 master-0 kubenswrapper[9136]: I1203 22:00:08.256633 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:00:08.270876 master-0 kubenswrapper[9136]: I1203 22:00:08.270792 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9q7k\" (UniqueName: \"kubernetes.io/projected/452012bf-eae1-4e69-9ba1-034309e9f2c8-kube-api-access-b9q7k\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:00:08.310349 master-0 kubenswrapper[9136]: I1203 22:00:08.310246 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:00:08.472030 master-0 kubenswrapper[9136]: I1203 22:00:08.471966 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nrcql"] Dec 03 22:00:08.638277 master-0 kubenswrapper[9136]: I1203 22:00:08.638194 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 03 22:00:08.639263 master-0 kubenswrapper[9136]: I1203 22:00:08.639241 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.642270 master-0 kubenswrapper[9136]: I1203 22:00:08.642027 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 22:00:08.643420 master-0 kubenswrapper[9136]: I1203 22:00:08.642621 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-wkdbt" Dec 03 22:00:08.654475 master-0 kubenswrapper[9136]: I1203 22:00:08.654419 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 03 22:00:08.758261 master-0 kubenswrapper[9136]: I1203 22:00:08.758199 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-5bdcc987c4-5cs48"] Dec 03 22:00:08.758423 master-0 kubenswrapper[9136]: I1203 22:00:08.758403 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.758475 master-0 kubenswrapper[9136]: I1203 22:00:08.758462 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-var-lock\") pod \"installer-2-master-0\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.758523 master-0 kubenswrapper[9136]: I1203 22:00:08.758499 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d60f02e-1803-461e-9606-667d91fcae14-kube-api-access\") pod \"installer-2-master-0\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.762216 master-0 kubenswrapper[9136]: I1203 22:00:08.762165 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:08.762216 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:08.762216 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:08.762216 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:08.762397 master-0 kubenswrapper[9136]: I1203 22:00:08.762229 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:08.763678 master-0 kubenswrapper[9136]: W1203 22:00:08.763631 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod452012bf_eae1_4e69_9ba1_034309e9f2c8.slice/crio-01271153cce146a7d369c7e83af60def6d65f55d525d13eb3b7d571a58f0b254 WatchSource:0}: Error finding container 01271153cce146a7d369c7e83af60def6d65f55d525d13eb3b7d571a58f0b254: Status 404 returned error can't find the container with id 01271153cce146a7d369c7e83af60def6d65f55d525d13eb3b7d571a58f0b254 Dec 03 22:00:08.859614 
master-0 kubenswrapper[9136]: I1203 22:00:08.859511 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.860287 master-0 kubenswrapper[9136]: I1203 22:00:08.859758 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-var-lock\") pod \"installer-2-master-0\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.860287 master-0 kubenswrapper[9136]: I1203 22:00:08.859851 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d60f02e-1803-461e-9606-667d91fcae14-kube-api-access\") pod \"installer-2-master-0\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.860827 master-0 kubenswrapper[9136]: I1203 22:00:08.860743 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-var-lock\") pod \"installer-2-master-0\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.861153 master-0 kubenswrapper[9136]: I1203 22:00:08.860761 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.886054 master-0 kubenswrapper[9136]: I1203 22:00:08.885871 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d60f02e-1803-461e-9606-667d91fcae14-kube-api-access\") pod \"installer-2-master-0\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:08.963968 master-0 kubenswrapper[9136]: I1203 22:00:08.963890 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:09.320280 master-0 kubenswrapper[9136]: I1203 22:00:09.320058 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" event={"ID":"452012bf-eae1-4e69-9ba1-034309e9f2c8","Type":"ContainerStarted","Data":"f9212175a2585131970bc08c0d0527846ea259043c2108c6526b1e6b2e585b06"} Dec 03 22:00:09.320280 master-0 kubenswrapper[9136]: I1203 22:00:09.320134 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" event={"ID":"452012bf-eae1-4e69-9ba1-034309e9f2c8","Type":"ContainerStarted","Data":"30b3f666fdeda026ea14ba441523b7a092126d3671fc875c414f09a1d3f41001"} Dec 03 22:00:09.320280 master-0 kubenswrapper[9136]: I1203 22:00:09.320162 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" event={"ID":"452012bf-eae1-4e69-9ba1-034309e9f2c8","Type":"ContainerStarted","Data":"01271153cce146a7d369c7e83af60def6d65f55d525d13eb3b7d571a58f0b254"} Dec 03 22:00:09.320819 master-0 kubenswrapper[9136]: I1203 22:00:09.320330 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" podUID="519e403c-28ab-4750-8143-34c74bf526ce" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" gracePeriod=30 Dec 03 22:00:09.339107 master-0 kubenswrapper[9136]: I1203 22:00:09.339009 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" podStartSLOduration=2.33898762 podStartE2EDuration="2.33898762s" podCreationTimestamp="2025-12-03 22:00:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:00:09.337872605 +0000 UTC m=+615.613048997" watchObservedRunningTime="2025-12-03 22:00:09.33898762 +0000 UTC m=+615.614164022" Dec 03 22:00:09.387844 master-0 kubenswrapper[9136]: I1203 22:00:09.387750 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj"] Dec 03 22:00:09.388090 master-0 kubenswrapper[9136]: I1203 22:00:09.388003 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" podUID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerName="multus-admission-controller" containerID="cri-o://b43199be9a4f7a0f84f14243db251c1ad19549fc366171d1c2e7a7eabf31f5ce" gracePeriod=30 Dec 03 22:00:09.388377 master-0 kubenswrapper[9136]: I1203 22:00:09.388350 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" podUID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerName="kube-rbac-proxy" containerID="cri-o://064594605d70a71f5b8c396507bbd62023c90012ba2c394b075d83b5e3f0a671" gracePeriod=30 Dec 03 22:00:09.433989 master-0 kubenswrapper[9136]: W1203 22:00:09.433937 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8d60f02e_1803_461e_9606_667d91fcae14.slice/crio-672f302ded736b1d33cea7a7bcc4fbbc3d883497b6e842feb770ecf241c92532 WatchSource:0}: Error finding container 672f302ded736b1d33cea7a7bcc4fbbc3d883497b6e842feb770ecf241c92532: Status 404 returned error 
can't find the container with id 672f302ded736b1d33cea7a7bcc4fbbc3d883497b6e842feb770ecf241c92532 Dec 03 22:00:09.434938 master-0 kubenswrapper[9136]: I1203 22:00:09.434906 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Dec 03 22:00:09.763870 master-0 kubenswrapper[9136]: I1203 22:00:09.763402 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:09.763870 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:09.763870 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:09.763870 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:09.763870 master-0 kubenswrapper[9136]: I1203 22:00:09.763507 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:09.907460 master-0 kubenswrapper[9136]: I1203 22:00:09.907374 9136 scope.go:117] "RemoveContainer" containerID="70aab6c0f99a200adb05bf0466bae07bea09c3ff8ddc008c24e8eb7d9ae3808d" Dec 03 22:00:09.908177 master-0 kubenswrapper[9136]: E1203 22:00:09.907598 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:00:10.334620 master-0 kubenswrapper[9136]: I1203 22:00:10.334504 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"8d60f02e-1803-461e-9606-667d91fcae14","Type":"ContainerStarted","Data":"a31f9ca2a3872e9cae07acaea514d81ec85d80124c14419e3e1663f38e942380"} Dec 03 22:00:10.334620 master-0 kubenswrapper[9136]: I1203 22:00:10.334589 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"8d60f02e-1803-461e-9606-667d91fcae14","Type":"ContainerStarted","Data":"672f302ded736b1d33cea7a7bcc4fbbc3d883497b6e842feb770ecf241c92532"} Dec 03 22:00:10.342968 master-0 kubenswrapper[9136]: I1203 22:00:10.341017 9136 generic.go:334] "Generic (PLEG): container finished" podID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerID="064594605d70a71f5b8c396507bbd62023c90012ba2c394b075d83b5e3f0a671" exitCode=0 Dec 03 22:00:10.342968 master-0 kubenswrapper[9136]: I1203 22:00:10.341127 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" event={"ID":"134c10ef-9f37-4a77-8e8b-4f8326bc8f40","Type":"ContainerDied","Data":"064594605d70a71f5b8c396507bbd62023c90012ba2c394b075d83b5e3f0a671"} Dec 03 22:00:10.431872 master-0 kubenswrapper[9136]: I1203 22:00:10.430561 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.430528137 podStartE2EDuration="2.430528137s" podCreationTimestamp="2025-12-03 22:00:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-03 22:00:10.412280421 +0000 UTC m=+616.687456843" watchObservedRunningTime="2025-12-03 22:00:10.430528137 +0000 UTC m=+616.705704539" Dec 03 22:00:10.762960 master-0 kubenswrapper[9136]: I1203 22:00:10.762845 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:10.762960 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:10.762960 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:10.762960 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:10.763471 master-0 kubenswrapper[9136]: I1203 22:00:10.762971 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:11.762933 master-0 kubenswrapper[9136]: I1203 22:00:11.762812 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:11.762933 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:11.762933 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:11.762933 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:11.762933 master-0 kubenswrapper[9136]: I1203 22:00:11.762925 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:12.763715 master-0 kubenswrapper[9136]: I1203 22:00:12.763631 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:12.763715 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:12.763715 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:12.763715 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:12.764949 master-0 kubenswrapper[9136]: I1203 22:00:12.763735 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:13.763934 master-0 kubenswrapper[9136]: I1203 22:00:13.763815 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:13.763934 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:13.763934 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:13.763934 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:13.764625 master-0 kubenswrapper[9136]: I1203 22:00:13.763939 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:14.768593 master-0 kubenswrapper[9136]: I1203 22:00:14.768496 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:14.768593 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:14.768593 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:14.768593 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:14.768593 master-0 kubenswrapper[9136]: I1203 22:00:14.768597 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:15.763317 master-0 kubenswrapper[9136]: I1203 22:00:15.763205 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:15.763317 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:15.763317 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:15.763317 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:15.763317 master-0 kubenswrapper[9136]: I1203 22:00:15.763302 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:16.762724 master-0 kubenswrapper[9136]: I1203 22:00:16.762673 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:16.762724 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:16.762724 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:16.762724 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:16.763560 master-0 kubenswrapper[9136]: I1203 22:00:16.763522 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:17.764364 master-0 kubenswrapper[9136]: I1203 22:00:17.764281 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:17.764364 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:17.764364 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:17.764364 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:17.765466 master-0 kubenswrapper[9136]: I1203 22:00:17.764368 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:17.784949 master-0 kubenswrapper[9136]: E1203 22:00:17.784880 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:00:17.787663 master-0 kubenswrapper[9136]: E1203 22:00:17.787550 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:00:17.789943 master-0 kubenswrapper[9136]: E1203 22:00:17.789824 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:00:17.789943 master-0 kubenswrapper[9136]: E1203 22:00:17.789917 9136 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" podUID="519e403c-28ab-4750-8143-34c74bf526ce" containerName="kube-multus-additional-cni-plugins" Dec 03 22:00:18.762681 master-0 kubenswrapper[9136]: I1203 22:00:18.762577 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:18.762681 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:18.762681 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:18.762681 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:18.762681 master-0 kubenswrapper[9136]: I1203 22:00:18.762665 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:19.763887 master-0 kubenswrapper[9136]: I1203 22:00:19.763756 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:19.763887 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:19.763887 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:19.763887 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:19.763887 master-0 kubenswrapper[9136]: I1203 22:00:19.763880 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Dec 03 22:00:20.764743 master-0 kubenswrapper[9136]: I1203 22:00:20.764606 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:20.764743 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:20.764743 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:20.764743 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:20.764743 master-0 kubenswrapper[9136]: I1203 22:00:20.764727 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:20.908931 master-0 kubenswrapper[9136]: I1203 22:00:20.908848 9136 scope.go:117] "RemoveContainer" containerID="70aab6c0f99a200adb05bf0466bae07bea09c3ff8ddc008c24e8eb7d9ae3808d" Dec 03 22:00:20.909313 master-0 kubenswrapper[9136]: E1203 22:00:20.909256 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:00:21.764386 master-0 kubenswrapper[9136]: I1203 22:00:21.764274 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:21.764386 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:21.764386 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:21.764386 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:21.764386 master-0 kubenswrapper[9136]: I1203 22:00:21.764373 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:22.764354 master-0 kubenswrapper[9136]: I1203 22:00:22.764267 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:22.764354 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:22.764354 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:22.764354 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:22.764354 master-0 kubenswrapper[9136]: I1203 22:00:22.764357 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:23.763377 master-0 kubenswrapper[9136]: I1203 22:00:23.763285 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:23.763377 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:23.763377 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:23.763377 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:23.763377 master-0 kubenswrapper[9136]: I1203 22:00:23.763374 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:24.763022 master-0 kubenswrapper[9136]: I1203 22:00:24.762946 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:24.763022 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:24.763022 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:24.763022 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:24.763328 master-0 kubenswrapper[9136]: I1203 22:00:24.763029 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:25.250363 master-0 kubenswrapper[9136]: I1203 22:00:25.250280 9136 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 03 22:00:25.251227 master-0 kubenswrapper[9136]: I1203 22:00:25.250653 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" containerID="cri-o://f211b15e8d62153f4deaa1bf7dfc87de0781805cce6c2aabbe56ed6f61fa1aa7" gracePeriod=30 Dec 03 22:00:25.251227 master-0 kubenswrapper[9136]: I1203 22:00:25.250763 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" containerID="cri-o://7077c81bca08e96587337a4956b3fcf6545b925a4a38770d09ecbdc14b1ceaa4" gracePeriod=30 Dec 03 22:00:25.252045 master-0 kubenswrapper[9136]: I1203 22:00:25.251615 9136 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:00:25.252045 master-0 kubenswrapper[9136]: E1203 22:00:25.252008 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252045 master-0 kubenswrapper[9136]: I1203 22:00:25.252026 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252045 master-0 kubenswrapper[9136]: E1203 22:00:25.252041 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252045 master-0 kubenswrapper[9136]: I1203 22:00:25.252053 9136 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: E1203 22:00:25.252078 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: I1203 22:00:25.252090 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: E1203 22:00:25.252115 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: I1203 22:00:25.252124 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: E1203 22:00:25.252139 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: I1203 22:00:25.252148 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: I1203 22:00:25.252311 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: I1203 22:00:25.252328 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="cluster-policy-controller" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: I1203 22:00:25.252346 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.252462 master-0 kubenswrapper[9136]: I1203 22:00:25.252367 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.253321 master-0 kubenswrapper[9136]: E1203 22:00:25.252546 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.253321 master-0 kubenswrapper[9136]: I1203 22:00:25.252563 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.253321 master-0 kubenswrapper[9136]: I1203 22:00:25.252748 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.253321 master-0 kubenswrapper[9136]: I1203 22:00:25.252795 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bce50c457ac1f4721bc81a570dd238a" containerName="kube-controller-manager" Dec 03 22:00:25.254216 master-0 kubenswrapper[9136]: I1203 22:00:25.254167 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:25.344366 master-0 kubenswrapper[9136]: I1203 22:00:25.344302 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:00:25.436147 master-0 kubenswrapper[9136]: I1203 22:00:25.436090 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"05dd6e8e0dea56089da96190349dd4c1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:25.436147 master-0 kubenswrapper[9136]: I1203 22:00:25.436151 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"05dd6e8e0dea56089da96190349dd4c1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:25.449675 master-0 kubenswrapper[9136]: I1203 22:00:25.449588 9136 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="7077c81bca08e96587337a4956b3fcf6545b925a4a38770d09ecbdc14b1ceaa4" exitCode=0 Dec 03 22:00:25.449675 master-0 kubenswrapper[9136]: I1203 22:00:25.449652 9136 generic.go:334] "Generic (PLEG): container finished" podID="7bce50c457ac1f4721bc81a570dd238a" containerID="f211b15e8d62153f4deaa1bf7dfc87de0781805cce6c2aabbe56ed6f61fa1aa7" exitCode=0 Dec 03 22:00:25.449981 master-0 kubenswrapper[9136]: I1203 22:00:25.449710 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d7935b96cb72a5574553c21dccf400a98363b1a22ca7819f889a07cdcea7c8a" Dec 03 22:00:25.449981 master-0 kubenswrapper[9136]: I1203 22:00:25.449736 9136 scope.go:117] "RemoveContainer" containerID="fdc23b60da9292f04f004b2100cec48c57024e9040275eb930fc44673f8ac8bd" Dec 03 22:00:25.450438 master-0 kubenswrapper[9136]: I1203 22:00:25.450387 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 22:00:25.500540 master-0 kubenswrapper[9136]: I1203 22:00:25.500393 9136 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="51fb2b64-d8f4-4d38-8b63-901550d7786c" Dec 03 22:00:25.537381 master-0 kubenswrapper[9136]: I1203 22:00:25.537319 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"05dd6e8e0dea56089da96190349dd4c1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:25.537381 master-0 kubenswrapper[9136]: I1203 22:00:25.537363 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"05dd6e8e0dea56089da96190349dd4c1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:25.537662 master-0 kubenswrapper[9136]: I1203 22:00:25.537441 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"05dd6e8e0dea56089da96190349dd4c1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:25.537662 master-0 kubenswrapper[9136]: I1203 22:00:25.537478 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"05dd6e8e0dea56089da96190349dd4c1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:25.635586 master-0 kubenswrapper[9136]: I1203 22:00:25.635520 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638168 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638283 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs" (OuterVolumeSpecName: "logs") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638333 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638375 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638417 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config" (OuterVolumeSpecName: "config") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638439 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638445 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets" (OuterVolumeSpecName: "secrets") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638461 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638505 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") pod \"7bce50c457ac1f4721bc81a570dd238a\" (UID: \"7bce50c457ac1f4721bc81a570dd238a\") " Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.638569 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "7bce50c457ac1f4721bc81a570dd238a" (UID: "7bce50c457ac1f4721bc81a570dd238a"). InnerVolumeSpecName "etc-kubernetes-cloud". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.639005 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.639022 9136 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-secrets\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.639037 9136 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.639048 9136 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:25.639253 master-0 kubenswrapper[9136]: I1203 22:00:25.639062 9136 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/7bce50c457ac1f4721bc81a570dd238a-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:25.660107 master-0 kubenswrapper[9136]: W1203 22:00:25.660033 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05dd6e8e0dea56089da96190349dd4c1.slice/crio-cdf20ff99c26562f4b242d4f427025d07982e61e3333d9eba9727037ba6c8316 WatchSource:0}: Error finding container cdf20ff99c26562f4b242d4f427025d07982e61e3333d9eba9727037ba6c8316: Status 404 returned error can't find the container with id cdf20ff99c26562f4b242d4f427025d07982e61e3333d9eba9727037ba6c8316 Dec 03 22:00:25.763252 master-0 kubenswrapper[9136]: I1203 22:00:25.762959 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:25.763252 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:25.763252 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:25.763252 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:25.763252 master-0 kubenswrapper[9136]: I1203 22:00:25.763035 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:25.922331 master-0 kubenswrapper[9136]: I1203 22:00:25.922257 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bce50c457ac1f4721bc81a570dd238a" path="/var/lib/kubelet/pods/7bce50c457ac1f4721bc81a570dd238a/volumes" Dec 03 22:00:25.923131 master-0 kubenswrapper[9136]: I1203 22:00:25.923083 9136 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Dec 03 22:00:26.036692 master-0 kubenswrapper[9136]: I1203 22:00:26.036635 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 03 22:00:26.036923 master-0 kubenswrapper[9136]: 
I1203 22:00:26.036692 9136 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="51fb2b64-d8f4-4d38-8b63-901550d7786c" Dec 03 22:00:26.039286 master-0 kubenswrapper[9136]: I1203 22:00:26.039237 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Dec 03 22:00:26.039349 master-0 kubenswrapper[9136]: I1203 22:00:26.039280 9136 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="51fb2b64-d8f4-4d38-8b63-901550d7786c" Dec 03 22:00:26.461488 master-0 kubenswrapper[9136]: I1203 22:00:26.461423 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerStarted","Data":"8a613b3a7f370a81b82a9c80ed68357904c42d57cdeafb6362cea4bc2248eaba"} Dec 03 22:00:26.461488 master-0 kubenswrapper[9136]: I1203 22:00:26.461480 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerStarted","Data":"ddeb5f184ad031c33b8f3f52c83dc7bc7558153b3f92fd847bf08cf2b2c45bec"} Dec 03 22:00:26.461488 master-0 kubenswrapper[9136]: I1203 22:00:26.461492 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerStarted","Data":"cdf20ff99c26562f4b242d4f427025d07982e61e3333d9eba9727037ba6c8316"} Dec 03 22:00:26.466512 master-0 kubenswrapper[9136]: I1203 22:00:26.466357 9136 generic.go:334] "Generic (PLEG): container finished" podID="5dada903-4b2b-450a-a55f-502ff892fd9f" containerID="19cd297510bc844f8288365a1588d651ec674ded0636669c86f19116e03ce004" exitCode=0 Dec 03 22:00:26.466512 master-0 kubenswrapper[9136]: I1203 22:00:26.466451 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"5dada903-4b2b-450a-a55f-502ff892fd9f","Type":"ContainerDied","Data":"19cd297510bc844f8288365a1588d651ec674ded0636669c86f19116e03ce004"} Dec 03 22:00:26.470040 master-0 kubenswrapper[9136]: I1203 22:00:26.469982 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Dec 03 22:00:26.771313 master-0 kubenswrapper[9136]: I1203 22:00:26.771279 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:26.771313 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:26.771313 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:26.771313 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:26.771564 master-0 kubenswrapper[9136]: I1203 22:00:26.771533 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:27.483112 master-0 kubenswrapper[9136]: I1203 22:00:27.483021 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerStarted","Data":"fbbfa9bd2a1be205a1331c5dee6338e477cf874c4552e42ed29c6c6059d3ca04"} Dec 03 22:00:27.483112 master-0 kubenswrapper[9136]: I1203 22:00:27.483111 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerStarted","Data":"5ea02cb72627330e0d9cfb5b5ff03c2263f176b244172690291c3611f9a855f2"} Dec 03 22:00:27.534693 master-0 kubenswrapper[9136]: I1203 22:00:27.534582 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.534560378 podStartE2EDuration="2.534560378s" podCreationTimestamp="2025-12-03 22:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:00:27.529978473 +0000 UTC m=+633.805154885" watchObservedRunningTime="2025-12-03 22:00:27.534560378 +0000 UTC m=+633.809736770" Dec 03 22:00:27.763793 master-0 kubenswrapper[9136]: I1203 22:00:27.763598 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:27.763793 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:27.763793 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:27.763793 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:27.763793 master-0 kubenswrapper[9136]: I1203 22:00:27.763677 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:27.790838 master-0 kubenswrapper[9136]: E1203 22:00:27.790727 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" cmd=["/bin/bash","-c","test -f 
/ready/ready"] Dec 03 22:00:27.794067 master-0 kubenswrapper[9136]: E1203 22:00:27.793981 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:00:27.795342 master-0 kubenswrapper[9136]: E1203 22:00:27.795283 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:00:27.795342 master-0 kubenswrapper[9136]: E1203 22:00:27.795336 9136 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" podUID="519e403c-28ab-4750-8143-34c74bf526ce" containerName="kube-multus-additional-cni-plugins" Dec 03 22:00:27.995162 master-0 kubenswrapper[9136]: I1203 22:00:27.995102 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 22:00:28.075000 master-0 kubenswrapper[9136]: I1203 22:00:28.074847 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-kubelet-dir\") pod \"5dada903-4b2b-450a-a55f-502ff892fd9f\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " Dec 03 22:00:28.075000 master-0 kubenswrapper[9136]: I1203 22:00:28.074990 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-var-lock\") pod \"5dada903-4b2b-450a-a55f-502ff892fd9f\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " Dec 03 22:00:28.075242 master-0 kubenswrapper[9136]: I1203 22:00:28.075040 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5dada903-4b2b-450a-a55f-502ff892fd9f" (UID: "5dada903-4b2b-450a-a55f-502ff892fd9f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:28.075242 master-0 kubenswrapper[9136]: I1203 22:00:28.075100 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dada903-4b2b-450a-a55f-502ff892fd9f-kube-api-access\") pod \"5dada903-4b2b-450a-a55f-502ff892fd9f\" (UID: \"5dada903-4b2b-450a-a55f-502ff892fd9f\") " Dec 03 22:00:28.075332 master-0 kubenswrapper[9136]: I1203 22:00:28.075225 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-var-lock" (OuterVolumeSpecName: "var-lock") pod "5dada903-4b2b-450a-a55f-502ff892fd9f" (UID: "5dada903-4b2b-450a-a55f-502ff892fd9f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:28.075692 master-0 kubenswrapper[9136]: I1203 22:00:28.075652 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:28.075746 master-0 kubenswrapper[9136]: I1203 22:00:28.075689 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5dada903-4b2b-450a-a55f-502ff892fd9f-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:28.079063 master-0 kubenswrapper[9136]: I1203 22:00:28.079013 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dada903-4b2b-450a-a55f-502ff892fd9f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5dada903-4b2b-450a-a55f-502ff892fd9f" (UID: "5dada903-4b2b-450a-a55f-502ff892fd9f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:00:28.178154 master-0 kubenswrapper[9136]: I1203 22:00:28.178015 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5dada903-4b2b-450a-a55f-502ff892fd9f-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:28.494265 master-0 kubenswrapper[9136]: I1203 22:00:28.494194 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"5dada903-4b2b-450a-a55f-502ff892fd9f","Type":"ContainerDied","Data":"93441462469b8b27bc3d666672828b80b42a37ea80417a77f840571890747f10"} Dec 03 22:00:28.494265 master-0 kubenswrapper[9136]: I1203 22:00:28.494249 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 22:00:28.494885 master-0 kubenswrapper[9136]: I1203 22:00:28.494261 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93441462469b8b27bc3d666672828b80b42a37ea80417a77f840571890747f10" Dec 03 22:00:28.762579 master-0 kubenswrapper[9136]: I1203 22:00:28.762423 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:28.762579 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:28.762579 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:28.762579 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:28.762579 master-0 kubenswrapper[9136]: I1203 22:00:28.762491 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:29.763257 master-0 kubenswrapper[9136]: I1203 22:00:29.763143 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:29.763257 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:29.763257 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:29.763257 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:29.764365 master-0 kubenswrapper[9136]: I1203 22:00:29.763255 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:30.763674 master-0 kubenswrapper[9136]: I1203 22:00:30.763584 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:30.763674 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:30.763674 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:30.763674 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:30.764619 master-0 kubenswrapper[9136]: I1203 22:00:30.763682 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:31.767544 master-0 kubenswrapper[9136]: I1203 22:00:31.767455 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:31.767544 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:31.767544 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:31.767544 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
22:00:31.768268 master-0 kubenswrapper[9136]: I1203 22:00:31.767576 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:32.763359 master-0 kubenswrapper[9136]: I1203 22:00:32.763261 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:32.763359 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:32.763359 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:32.763359 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:32.763851 master-0 kubenswrapper[9136]: I1203 22:00:32.763370 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:33.763182 master-0 kubenswrapper[9136]: I1203 22:00:33.763098 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:33.763182 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:33.763182 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:33.763182 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:33.763924 master-0 kubenswrapper[9136]: I1203 22:00:33.763193 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:33.914178 master-0 kubenswrapper[9136]: I1203 22:00:33.914101 9136 scope.go:117] "RemoveContainer" containerID="70aab6c0f99a200adb05bf0466bae07bea09c3ff8ddc008c24e8eb7d9ae3808d" Dec 03 22:00:33.914664 master-0 kubenswrapper[9136]: E1203 22:00:33.914596 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:00:34.763493 master-0 kubenswrapper[9136]: I1203 22:00:34.763368 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:34.763493 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:34.763493 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:34.763493 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:34.763493 master-0 kubenswrapper[9136]: I1203 22:00:34.763464 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" 
podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:35.636231 master-0 kubenswrapper[9136]: I1203 22:00:35.636177 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:35.636600 master-0 kubenswrapper[9136]: I1203 22:00:35.636577 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:35.636738 master-0 kubenswrapper[9136]: I1203 22:00:35.636719 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:35.637205 master-0 kubenswrapper[9136]: I1203 22:00:35.637089 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:35.643053 master-0 kubenswrapper[9136]: I1203 22:00:35.642998 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:35.643390 master-0 kubenswrapper[9136]: I1203 22:00:35.643359 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:35.647194 master-0 kubenswrapper[9136]: I1203 22:00:35.647132 9136 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"] Dec 03 22:00:35.647706 master-0 kubenswrapper[9136]: I1203 22:00:35.647617 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcdctl" containerID="cri-o://c0dc443f1527bf49aa993d4373b075ba2cac81ea231c3894c047c835a26b4c9f" gracePeriod=30 Dec 03 22:00:35.647923 master-0 kubenswrapper[9136]: I1203 22:00:35.647676 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-rev" containerID="cri-o://2cc1c5605242df3437ae984a9e1e4d7f667ba732f16f6113d93fc241030ae6be" gracePeriod=30 Dec 03 22:00:35.647923 master-0 kubenswrapper[9136]: I1203 22:00:35.647737 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-metrics" containerID="cri-o://03a6c4cf49b5dfdb537d453011bfa5fccbeecd91abc60d2409d33529fbb25f01" gracePeriod=30 Dec 03 22:00:35.647923 master-0 kubenswrapper[9136]: I1203 22:00:35.647703 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-readyz" containerID="cri-o://26f2c19baa42c22f852104341244a584ce87ef1be47ee995c3a898bbf27ea037" gracePeriod=30 Dec 03 22:00:35.648237 master-0 kubenswrapper[9136]: I1203 22:00:35.647878 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd" containerID="cri-o://e1b125fe53f3c1e5501ca76af42c64d1123972595dfade5a4e4e7d7f0045d4fa" gracePeriod=30 Dec 03 22:00:35.656857 master-0 kubenswrapper[9136]: I1203 22:00:35.656795 9136 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-etcd/etcd-master-0"] Dec 03 22:00:35.658555 master-0 kubenswrapper[9136]: E1203 22:00:35.658528 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcdctl" Dec 03 22:00:35.658671 master-0 kubenswrapper[9136]: I1203 22:00:35.658657 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcdctl" Dec 03 22:00:35.658789 master-0 kubenswrapper[9136]: E1203 22:00:35.658758 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dada903-4b2b-450a-a55f-502ff892fd9f" containerName="installer" Dec 03 22:00:35.658886 master-0 kubenswrapper[9136]: I1203 22:00:35.658872 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dada903-4b2b-450a-a55f-502ff892fd9f" containerName="installer" Dec 03 22:00:35.658987 master-0 kubenswrapper[9136]: E1203 22:00:35.658973 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="setup" Dec 03 22:00:35.659086 master-0 kubenswrapper[9136]: I1203 22:00:35.659070 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="setup" Dec 03 22:00:35.659191 master-0 kubenswrapper[9136]: E1203 22:00:35.659177 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-rev" Dec 03 22:00:35.659271 master-0 kubenswrapper[9136]: I1203 22:00:35.659260 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-rev" Dec 03 22:00:35.659354 master-0 kubenswrapper[9136]: E1203 22:00:35.659342 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-metrics" Dec 03 22:00:35.659431 master-0 kubenswrapper[9136]: I1203 22:00:35.659420 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-metrics" Dec 03 22:00:35.659503 master-0 kubenswrapper[9136]: E1203 22:00:35.659491 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-readyz" Dec 03 22:00:35.659557 master-0 kubenswrapper[9136]: I1203 22:00:35.659549 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-readyz" Dec 03 22:00:35.659615 master-0 kubenswrapper[9136]: E1203 22:00:35.659606 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd" Dec 03 22:00:35.659670 master-0 kubenswrapper[9136]: I1203 22:00:35.659662 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd" Dec 03 22:00:35.659728 master-0 kubenswrapper[9136]: E1203 22:00:35.659719 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-resources-copy" Dec 03 22:00:35.659813 master-0 kubenswrapper[9136]: I1203 22:00:35.659799 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-resources-copy" Dec 03 22:00:35.659893 master-0 kubenswrapper[9136]: E1203 22:00:35.659881 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-ensure-env-vars" Dec 03 22:00:35.660155 master-0 kubenswrapper[9136]: I1203 22:00:35.660143 
9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-ensure-env-vars" Dec 03 22:00:35.660405 master-0 kubenswrapper[9136]: I1203 22:00:35.660390 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcdctl" Dec 03 22:00:35.660997 master-0 kubenswrapper[9136]: I1203 22:00:35.660980 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dada903-4b2b-450a-a55f-502ff892fd9f" containerName="installer" Dec 03 22:00:35.661082 master-0 kubenswrapper[9136]: I1203 22:00:35.661070 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd" Dec 03 22:00:35.661161 master-0 kubenswrapper[9136]: I1203 22:00:35.661148 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-readyz" Dec 03 22:00:35.661264 master-0 kubenswrapper[9136]: I1203 22:00:35.661252 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-rev" Dec 03 22:00:35.661341 master-0 kubenswrapper[9136]: I1203 22:00:35.661330 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf07eb54db570834b7c9a90b6b07403" containerName="etcd-metrics" Dec 03 22:00:35.763913 master-0 kubenswrapper[9136]: I1203 22:00:35.763817 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:35.763913 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:35.763913 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:35.763913 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:35.764583 master-0 kubenswrapper[9136]: I1203 22:00:35.763939 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:35.798108 master-0 kubenswrapper[9136]: I1203 22:00:35.798045 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.798194 master-0 kubenswrapper[9136]: I1203 22:00:35.798143 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.798435 master-0 kubenswrapper[9136]: I1203 22:00:35.798372 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.798594 master-0 kubenswrapper[9136]: I1203 22:00:35.798562 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.798652 master-0 kubenswrapper[9136]: I1203 22:00:35.798631 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.798905 master-0 kubenswrapper[9136]: I1203 22:00:35.798857 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901129 master-0 kubenswrapper[9136]: I1203 22:00:35.900640 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901129 master-0 kubenswrapper[9136]: I1203 22:00:35.900809 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901129 master-0 kubenswrapper[9136]: I1203 22:00:35.901023 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901129 master-0 kubenswrapper[9136]: I1203 22:00:35.901119 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901608 master-0 kubenswrapper[9136]: I1203 22:00:35.901140 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901608 master-0 kubenswrapper[9136]: I1203 22:00:35.901188 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901608 master-0 kubenswrapper[9136]: I1203 22:00:35.901244 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901608 master-0 kubenswrapper[9136]: I1203 22:00:35.901250 9136 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901608 master-0 kubenswrapper[9136]: I1203 22:00:35.901317 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901608 master-0 kubenswrapper[9136]: I1203 22:00:35.901420 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901608 master-0 kubenswrapper[9136]: I1203 22:00:35.901424 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:35.901608 master-0 kubenswrapper[9136]: I1203 22:00:35.901602 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:00:36.561070 master-0 kubenswrapper[9136]: I1203 22:00:36.560999 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd-rev/0.log" Dec 03 22:00:36.562798 master-0 kubenswrapper[9136]: I1203 22:00:36.562741 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd-metrics/0.log" Dec 03 22:00:36.566363 master-0 kubenswrapper[9136]: I1203 22:00:36.566288 9136 generic.go:334] "Generic (PLEG): container finished" podID="ebf07eb54db570834b7c9a90b6b07403" containerID="2cc1c5605242df3437ae984a9e1e4d7f667ba732f16f6113d93fc241030ae6be" exitCode=2 Dec 03 22:00:36.566363 master-0 kubenswrapper[9136]: I1203 22:00:36.566333 9136 generic.go:334] "Generic (PLEG): container finished" podID="ebf07eb54db570834b7c9a90b6b07403" containerID="26f2c19baa42c22f852104341244a584ce87ef1be47ee995c3a898bbf27ea037" exitCode=0 Dec 03 22:00:36.566363 master-0 kubenswrapper[9136]: I1203 22:00:36.566343 9136 generic.go:334] "Generic (PLEG): container finished" podID="ebf07eb54db570834b7c9a90b6b07403" containerID="03a6c4cf49b5dfdb537d453011bfa5fccbeecd91abc60d2409d33529fbb25f01" exitCode=2 Dec 03 22:00:36.573517 master-0 kubenswrapper[9136]: I1203 22:00:36.573443 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:36.573755 master-0 kubenswrapper[9136]: I1203 22:00:36.573711 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:00:36.763272 master-0 kubenswrapper[9136]: I1203 22:00:36.763191 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:36.763272 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:36.763272 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:36.763272 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:36.763575 master-0 kubenswrapper[9136]: I1203 22:00:36.763297 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:37.763405 master-0 kubenswrapper[9136]: I1203 22:00:37.763293 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:37.763405 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:37.763405 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:37.763405 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:37.763405 master-0 kubenswrapper[9136]: I1203 22:00:37.763404 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:37.785823 master-0 kubenswrapper[9136]: E1203 22:00:37.785731 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:00:37.787883 master-0 kubenswrapper[9136]: E1203 22:00:37.787749 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:00:37.790174 master-0 kubenswrapper[9136]: E1203 22:00:37.790110 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:00:37.790381 master-0 kubenswrapper[9136]: E1203 22:00:37.790181 9136 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" podUID="519e403c-28ab-4750-8143-34c74bf526ce" containerName="kube-multus-additional-cni-plugins" Dec 03 22:00:38.764139 master-0 kubenswrapper[9136]: I1203 22:00:38.764059 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:00:38.764139 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:38.764139 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:38.764139 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:38.764139 master-0 kubenswrapper[9136]: I1203 22:00:38.764143 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:39.500706 master-0 kubenswrapper[9136]: I1203 22:00:39.500649 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nrcql_519e403c-28ab-4750-8143-34c74bf526ce/kube-multus-additional-cni-plugins/0.log" Dec 03 22:00:39.500975 master-0 kubenswrapper[9136]: I1203 22:00:39.500722 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 22:00:39.590977 master-0 kubenswrapper[9136]: I1203 22:00:39.590723 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/519e403c-28ab-4750-8143-34c74bf526ce-ready\") pod \"519e403c-28ab-4750-8143-34c74bf526ce\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " Dec 03 22:00:39.590977 master-0 kubenswrapper[9136]: I1203 22:00:39.590919 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/519e403c-28ab-4750-8143-34c74bf526ce-tuning-conf-dir\") pod \"519e403c-28ab-4750-8143-34c74bf526ce\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " Dec 03 22:00:39.591431 master-0 kubenswrapper[9136]: I1203 22:00:39.591039 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6kt9\" (UniqueName: \"kubernetes.io/projected/519e403c-28ab-4750-8143-34c74bf526ce-kube-api-access-v6kt9\") pod \"519e403c-28ab-4750-8143-34c74bf526ce\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " Dec 03 22:00:39.591431 master-0 kubenswrapper[9136]: I1203 22:00:39.591044 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/519e403c-28ab-4750-8143-34c74bf526ce-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "519e403c-28ab-4750-8143-34c74bf526ce" (UID: "519e403c-28ab-4750-8143-34c74bf526ce"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:39.591431 master-0 kubenswrapper[9136]: I1203 22:00:39.591117 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/519e403c-28ab-4750-8143-34c74bf526ce-cni-sysctl-allowlist\") pod \"519e403c-28ab-4750-8143-34c74bf526ce\" (UID: \"519e403c-28ab-4750-8143-34c74bf526ce\") " Dec 03 22:00:39.591431 master-0 kubenswrapper[9136]: I1203 22:00:39.591364 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/519e403c-28ab-4750-8143-34c74bf526ce-ready" (OuterVolumeSpecName: "ready") pod "519e403c-28ab-4750-8143-34c74bf526ce" (UID: "519e403c-28ab-4750-8143-34c74bf526ce"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:00:39.591989 master-0 kubenswrapper[9136]: I1203 22:00:39.591930 9136 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/519e403c-28ab-4750-8143-34c74bf526ce-ready\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:39.591989 master-0 kubenswrapper[9136]: I1203 22:00:39.591971 9136 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/519e403c-28ab-4750-8143-34c74bf526ce-tuning-conf-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:39.592220 master-0 kubenswrapper[9136]: I1203 22:00:39.592172 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/519e403c-28ab-4750-8143-34c74bf526ce-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "519e403c-28ab-4750-8143-34c74bf526ce" (UID: "519e403c-28ab-4750-8143-34c74bf526ce"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:00:39.593842 master-0 kubenswrapper[9136]: I1203 22:00:39.593738 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nrcql_519e403c-28ab-4750-8143-34c74bf526ce/kube-multus-additional-cni-plugins/0.log" Dec 03 22:00:39.593990 master-0 kubenswrapper[9136]: I1203 22:00:39.593847 9136 generic.go:334] "Generic (PLEG): container finished" podID="519e403c-28ab-4750-8143-34c74bf526ce" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" exitCode=137 Dec 03 22:00:39.594117 master-0 kubenswrapper[9136]: I1203 22:00:39.594003 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" Dec 03 22:00:39.594117 master-0 kubenswrapper[9136]: I1203 22:00:39.594076 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" event={"ID":"519e403c-28ab-4750-8143-34c74bf526ce","Type":"ContainerDied","Data":"8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d"} Dec 03 22:00:39.594341 master-0 kubenswrapper[9136]: I1203 22:00:39.594174 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" event={"ID":"519e403c-28ab-4750-8143-34c74bf526ce","Type":"ContainerDied","Data":"1644d583ba002a0e82a19318277118e077966cb952f62ba634455a674e38f171"} Dec 03 22:00:39.594341 master-0 kubenswrapper[9136]: I1203 22:00:39.594214 9136 scope.go:117] "RemoveContainer" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" Dec 03 22:00:39.596240 master-0 kubenswrapper[9136]: I1203 22:00:39.596155 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/519e403c-28ab-4750-8143-34c74bf526ce-kube-api-access-v6kt9" (OuterVolumeSpecName: "kube-api-access-v6kt9") pod "519e403c-28ab-4750-8143-34c74bf526ce" (UID: "519e403c-28ab-4750-8143-34c74bf526ce"). InnerVolumeSpecName "kube-api-access-v6kt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:00:39.598414 master-0 kubenswrapper[9136]: I1203 22:00:39.598364 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-78ddcf56f9-6b8qj_134c10ef-9f37-4a77-8e8b-4f8326bc8f40/multus-admission-controller/0.log" Dec 03 22:00:39.598549 master-0 kubenswrapper[9136]: I1203 22:00:39.598423 9136 generic.go:334] "Generic (PLEG): container finished" podID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerID="b43199be9a4f7a0f84f14243db251c1ad19549fc366171d1c2e7a7eabf31f5ce" exitCode=137 Dec 03 22:00:39.598549 master-0 kubenswrapper[9136]: I1203 22:00:39.598471 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" event={"ID":"134c10ef-9f37-4a77-8e8b-4f8326bc8f40","Type":"ContainerDied","Data":"b43199be9a4f7a0f84f14243db251c1ad19549fc366171d1c2e7a7eabf31f5ce"} Dec 03 22:00:39.628060 master-0 kubenswrapper[9136]: I1203 22:00:39.628024 9136 scope.go:117] "RemoveContainer" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" Dec 03 22:00:39.628985 master-0 kubenswrapper[9136]: E1203 22:00:39.628901 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d\": container with ID starting with 8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d not found: ID does not exist" containerID="8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d" Dec 03 22:00:39.629113 master-0 kubenswrapper[9136]: I1203 22:00:39.628990 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d"} err="failed to get container status \"8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d\": rpc error: code = NotFound desc = could not find container \"8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d\": container with ID starting with 8d0299020d4a31d7cbb4db8df361031cce0d531bf89b483a0c4cbe11ed93403d not found: ID does not exist" Dec 03 22:00:39.693211 master-0 kubenswrapper[9136]: I1203 22:00:39.693121 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6kt9\" (UniqueName: \"kubernetes.io/projected/519e403c-28ab-4750-8143-34c74bf526ce-kube-api-access-v6kt9\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:39.693211 master-0 kubenswrapper[9136]: I1203 22:00:39.693184 9136 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/519e403c-28ab-4750-8143-34c74bf526ce-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:39.763415 master-0 kubenswrapper[9136]: I1203 22:00:39.763309 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:39.763415 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:39.763415 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:39.763415 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:39.763955 master-0 kubenswrapper[9136]: I1203 22:00:39.763425 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:40.352204 master-0 kubenswrapper[9136]: I1203 22:00:40.352139 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-78ddcf56f9-6b8qj_134c10ef-9f37-4a77-8e8b-4f8326bc8f40/multus-admission-controller/0.log" Dec 03 22:00:40.352691 master-0 kubenswrapper[9136]: I1203 22:00:40.352258 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 22:00:40.504238 master-0 kubenswrapper[9136]: I1203 22:00:40.504160 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkwjb\" (UniqueName: \"kubernetes.io/projected/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-kube-api-access-fkwjb\") pod \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " Dec 03 22:00:40.504526 master-0 kubenswrapper[9136]: I1203 22:00:40.504288 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") pod \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\" (UID: \"134c10ef-9f37-4a77-8e8b-4f8326bc8f40\") " Dec 03 22:00:40.508098 master-0 kubenswrapper[9136]: I1203 22:00:40.508045 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-kube-api-access-fkwjb" (OuterVolumeSpecName: "kube-api-access-fkwjb") pod "134c10ef-9f37-4a77-8e8b-4f8326bc8f40" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40"). InnerVolumeSpecName "kube-api-access-fkwjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:00:40.508230 master-0 kubenswrapper[9136]: I1203 22:00:40.508182 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "134c10ef-9f37-4a77-8e8b-4f8326bc8f40" (UID: "134c10ef-9f37-4a77-8e8b-4f8326bc8f40"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:00:40.606321 master-0 kubenswrapper[9136]: I1203 22:00:40.606187 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkwjb\" (UniqueName: \"kubernetes.io/projected/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-kube-api-access-fkwjb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:40.606321 master-0 kubenswrapper[9136]: I1203 22:00:40.606225 9136 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/134c10ef-9f37-4a77-8e8b-4f8326bc8f40-webhook-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:40.610958 master-0 kubenswrapper[9136]: I1203 22:00:40.610920 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-78ddcf56f9-6b8qj_134c10ef-9f37-4a77-8e8b-4f8326bc8f40/multus-admission-controller/0.log" Dec 03 22:00:40.611055 master-0 kubenswrapper[9136]: I1203 22:00:40.610968 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" event={"ID":"134c10ef-9f37-4a77-8e8b-4f8326bc8f40","Type":"ContainerDied","Data":"f6c57485135ac402ba25cbe0ce41f1bfe52dbe6ef45a2663904b0cf72595d893"} Dec 03 22:00:40.611055 master-0 kubenswrapper[9136]: I1203 22:00:40.611008 9136 scope.go:117] "RemoveContainer" containerID="064594605d70a71f5b8c396507bbd62023c90012ba2c394b075d83b5e3f0a671" Dec 03 22:00:40.611436 master-0 kubenswrapper[9136]: I1203 22:00:40.611053 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj" Dec 03 22:00:40.632876 master-0 kubenswrapper[9136]: I1203 22:00:40.632690 9136 scope.go:117] "RemoveContainer" containerID="b43199be9a4f7a0f84f14243db251c1ad19549fc366171d1c2e7a7eabf31f5ce" Dec 03 22:00:40.764004 master-0 kubenswrapper[9136]: I1203 22:00:40.763924 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:40.764004 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:40.764004 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:40.764004 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:40.764373 master-0 kubenswrapper[9136]: I1203 22:00:40.764030 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:41.764496 master-0 kubenswrapper[9136]: I1203 22:00:41.764401 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:41.764496 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:41.764496 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:41.764496 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:41.764496 master-0 kubenswrapper[9136]: I1203 22:00:41.764491 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:42.764247 master-0 kubenswrapper[9136]: I1203 22:00:42.764129 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:42.764247 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:42.764247 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:42.764247 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:42.765417 master-0 kubenswrapper[9136]: I1203 22:00:42.764241 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:43.764161 master-0 kubenswrapper[9136]: I1203 22:00:43.764053 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:00:43.764161 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:00:43.764161 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:00:43.764161 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:00:43.764610 master-0 kubenswrapper[9136]: I1203 22:00:43.764171 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:00:43.764610 master-0 kubenswrapper[9136]: I1203 22:00:43.764253 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:00:43.765367 master-0 kubenswrapper[9136]: I1203 22:00:43.765235 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"81bfd9fda486fff0e9f3b1dee5719d6b24e5410ceadededcc88dc65916bf639e"} pod="openshift-ingress/router-default-54f97f57-xq6ch" containerMessage="Container router failed startup probe, will be restarted" Dec 03 22:00:43.765367 master-0 kubenswrapper[9136]: I1203 22:00:43.765298 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" containerID="cri-o://81bfd9fda486fff0e9f3b1dee5719d6b24e5410ceadededcc88dc65916bf639e" gracePeriod=3600 Dec 03 22:00:47.909490 master-0 kubenswrapper[9136]: I1203 22:00:47.909394 9136 scope.go:117] "RemoveContainer" containerID="70aab6c0f99a200adb05bf0466bae07bea09c3ff8ddc008c24e8eb7d9ae3808d" Dec 03 22:00:49.706919 master-0 kubenswrapper[9136]: I1203 22:00:49.706854 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/3.log" Dec 03 22:00:49.708668 master-0 kubenswrapper[9136]: I1203 22:00:49.708576 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" 
event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerStarted","Data":"fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f"} Dec 03 22:00:50.721602 master-0 kubenswrapper[9136]: I1203 22:00:50.721510 9136 generic.go:334] "Generic (PLEG): container finished" podID="70e52a8c-7f9e-47fa-85ca-41f90dcb9747" containerID="888690db481ab6164f73ea7eec62997e93d90f2abdb333711d2e3c24534b02e9" exitCode=0 Dec 03 22:00:50.721602 master-0 kubenswrapper[9136]: I1203 22:00:50.721574 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"70e52a8c-7f9e-47fa-85ca-41f90dcb9747","Type":"ContainerDied","Data":"888690db481ab6164f73ea7eec62997e93d90f2abdb333711d2e3c24534b02e9"} Dec 03 22:00:51.732804 master-0 kubenswrapper[9136]: I1203 22:00:51.732703 9136 generic.go:334] "Generic (PLEG): container finished" podID="d78739a7694769882b7e47ea5ac08a10" containerID="f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26" exitCode=1 Dec 03 22:00:51.733500 master-0 kubenswrapper[9136]: I1203 22:00:51.732846 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerDied","Data":"f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26"} Dec 03 22:00:51.733500 master-0 kubenswrapper[9136]: I1203 22:00:51.732965 9136 scope.go:117] "RemoveContainer" containerID="a5db70d8c04a9fdadcb37e94b566026a87f2170f5e692e5d3ce48ba377b79800" Dec 03 22:00:51.734207 master-0 kubenswrapper[9136]: I1203 22:00:51.733917 9136 scope.go:117] "RemoveContainer" containerID="f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26" Dec 03 22:00:51.734491 master-0 kubenswrapper[9136]: E1203 22:00:51.734429 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(d78739a7694769882b7e47ea5ac08a10)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="d78739a7694769882b7e47ea5ac08a10" Dec 03 22:00:52.205382 master-0 kubenswrapper[9136]: E1203 22:00:52.205242 9136 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:00:52.227491 master-0 kubenswrapper[9136]: I1203 22:00:52.227434 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:52.300156 master-0 kubenswrapper[9136]: I1203 22:00:52.300018 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kubelet-dir\") pod \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " Dec 03 22:00:52.300156 master-0 kubenswrapper[9136]: I1203 22:00:52.300150 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kube-api-access\") pod \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " Dec 03 22:00:52.300544 master-0 kubenswrapper[9136]: I1203 22:00:52.300172 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "70e52a8c-7f9e-47fa-85ca-41f90dcb9747" (UID: "70e52a8c-7f9e-47fa-85ca-41f90dcb9747"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:52.300544 master-0 kubenswrapper[9136]: I1203 22:00:52.300221 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-var-lock\") pod \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\" (UID: \"70e52a8c-7f9e-47fa-85ca-41f90dcb9747\") " Dec 03 22:00:52.300544 master-0 kubenswrapper[9136]: I1203 22:00:52.300275 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-var-lock" (OuterVolumeSpecName: "var-lock") pod "70e52a8c-7f9e-47fa-85ca-41f90dcb9747" (UID: "70e52a8c-7f9e-47fa-85ca-41f90dcb9747"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:52.300720 master-0 kubenswrapper[9136]: I1203 22:00:52.300661 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:52.300720 master-0 kubenswrapper[9136]: I1203 22:00:52.300691 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:52.304019 master-0 kubenswrapper[9136]: I1203 22:00:52.303919 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "70e52a8c-7f9e-47fa-85ca-41f90dcb9747" (UID: "70e52a8c-7f9e-47fa-85ca-41f90dcb9747"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:00:52.402304 master-0 kubenswrapper[9136]: I1203 22:00:52.402218 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/70e52a8c-7f9e-47fa-85ca-41f90dcb9747-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:52.743048 master-0 kubenswrapper[9136]: I1203 22:00:52.742872 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"70e52a8c-7f9e-47fa-85ca-41f90dcb9747","Type":"ContainerDied","Data":"25f4b9a89514daa309f37b636ce10a58970922ba06493924e79533d702a051d8"} Dec 03 22:00:52.744087 master-0 kubenswrapper[9136]: I1203 22:00:52.743063 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f4b9a89514daa309f37b636ce10a58970922ba06493924e79533d702a051d8" Dec 03 22:00:52.744087 master-0 kubenswrapper[9136]: I1203 22:00:52.743359 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 03 22:00:52.749665 master-0 kubenswrapper[9136]: I1203 22:00:52.749599 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-1-master-0_25602d69-3aec-487d-8d62-c2c21f27e2b7/installer/0.log" Dec 03 22:00:52.749878 master-0 kubenswrapper[9136]: I1203 22:00:52.749693 9136 generic.go:334] "Generic (PLEG): container finished" podID="25602d69-3aec-487d-8d62-c2c21f27e2b7" containerID="8bd5db9bea83f9fc4e7c948b8e5ecf2d5c83b59963dd6e82975811d819eaa07a" exitCode=1 Dec 03 22:00:52.749878 master-0 kubenswrapper[9136]: I1203 22:00:52.749748 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"25602d69-3aec-487d-8d62-c2c21f27e2b7","Type":"ContainerDied","Data":"8bd5db9bea83f9fc4e7c948b8e5ecf2d5c83b59963dd6e82975811d819eaa07a"} Dec 03 22:00:54.182402 master-0 kubenswrapper[9136]: I1203 22:00:54.182333 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-1-master-0_25602d69-3aec-487d-8d62-c2c21f27e2b7/installer/0.log" Dec 03 22:00:54.182402 master-0 kubenswrapper[9136]: I1203 22:00:54.182416 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:54.329634 master-0 kubenswrapper[9136]: I1203 22:00:54.329557 9136 scope.go:117] "RemoveContainer" containerID="7077c81bca08e96587337a4956b3fcf6545b925a4a38770d09ecbdc14b1ceaa4" Dec 03 22:00:54.331619 master-0 kubenswrapper[9136]: I1203 22:00:54.331552 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-var-lock\") pod \"25602d69-3aec-487d-8d62-c2c21f27e2b7\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " Dec 03 22:00:54.331742 master-0 kubenswrapper[9136]: I1203 22:00:54.331658 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-var-lock" (OuterVolumeSpecName: "var-lock") pod "25602d69-3aec-487d-8d62-c2c21f27e2b7" (UID: "25602d69-3aec-487d-8d62-c2c21f27e2b7"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:54.331860 master-0 kubenswrapper[9136]: I1203 22:00:54.331800 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25602d69-3aec-487d-8d62-c2c21f27e2b7-kube-api-access\") pod \"25602d69-3aec-487d-8d62-c2c21f27e2b7\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " Dec 03 22:00:54.331941 master-0 kubenswrapper[9136]: I1203 22:00:54.331917 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-kubelet-dir\") pod \"25602d69-3aec-487d-8d62-c2c21f27e2b7\" (UID: \"25602d69-3aec-487d-8d62-c2c21f27e2b7\") " Dec 03 22:00:54.332484 master-0 kubenswrapper[9136]: I1203 22:00:54.332431 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:54.332848 master-0 kubenswrapper[9136]: I1203 22:00:54.332742 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "25602d69-3aec-487d-8d62-c2c21f27e2b7" (UID: "25602d69-3aec-487d-8d62-c2c21f27e2b7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:54.335879 master-0 kubenswrapper[9136]: I1203 22:00:54.335800 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25602d69-3aec-487d-8d62-c2c21f27e2b7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "25602d69-3aec-487d-8d62-c2c21f27e2b7" (UID: "25602d69-3aec-487d-8d62-c2c21f27e2b7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:00:54.387473 master-0 kubenswrapper[9136]: I1203 22:00:54.387423 9136 scope.go:117] "RemoveContainer" containerID="f211b15e8d62153f4deaa1bf7dfc87de0781805cce6c2aabbe56ed6f61fa1aa7" Dec 03 22:00:54.434509 master-0 kubenswrapper[9136]: I1203 22:00:54.434443 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25602d69-3aec-487d-8d62-c2c21f27e2b7-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:54.434509 master-0 kubenswrapper[9136]: I1203 22:00:54.434498 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25602d69-3aec-487d-8d62-c2c21f27e2b7-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:54.768399 master-0 kubenswrapper[9136]: I1203 22:00:54.768332 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-1-master-0_25602d69-3aec-487d-8d62-c2c21f27e2b7/installer/0.log" Dec 03 22:00:54.768649 master-0 kubenswrapper[9136]: I1203 22:00:54.768425 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"25602d69-3aec-487d-8d62-c2c21f27e2b7","Type":"ContainerDied","Data":"8804c8caf822254253605f55cc71b86af43d373e50704b9313038bda7b60d32d"} Dec 03 22:00:54.768649 master-0 kubenswrapper[9136]: I1203 22:00:54.768484 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8804c8caf822254253605f55cc71b86af43d373e50704b9313038bda7b60d32d" Dec 03 22:00:54.768649 master-0 kubenswrapper[9136]: I1203 22:00:54.768528 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:00:55.779854 master-0 kubenswrapper[9136]: I1203 22:00:55.779795 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_8d60f02e-1803-461e-9606-667d91fcae14/installer/0.log" Dec 03 22:00:55.780473 master-0 kubenswrapper[9136]: I1203 22:00:55.779874 9136 generic.go:334] "Generic (PLEG): container finished" podID="8d60f02e-1803-461e-9606-667d91fcae14" containerID="a31f9ca2a3872e9cae07acaea514d81ec85d80124c14419e3e1663f38e942380" exitCode=1 Dec 03 22:00:55.780473 master-0 kubenswrapper[9136]: I1203 22:00:55.779927 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"8d60f02e-1803-461e-9606-667d91fcae14","Type":"ContainerDied","Data":"a31f9ca2a3872e9cae07acaea514d81ec85d80124c14419e3e1663f38e942380"} Dec 03 22:00:57.153634 master-0 kubenswrapper[9136]: I1203 22:00:57.153569 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_8d60f02e-1803-461e-9606-667d91fcae14/installer/0.log" Dec 03 22:00:57.154478 master-0 kubenswrapper[9136]: I1203 22:00:57.153676 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:00:57.276669 master-0 kubenswrapper[9136]: I1203 22:00:57.276583 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-var-lock\") pod \"8d60f02e-1803-461e-9606-667d91fcae14\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " Dec 03 22:00:57.276945 master-0 kubenswrapper[9136]: I1203 22:00:57.276718 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d60f02e-1803-461e-9606-667d91fcae14-kube-api-access\") pod \"8d60f02e-1803-461e-9606-667d91fcae14\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " Dec 03 22:00:57.276945 master-0 kubenswrapper[9136]: I1203 22:00:57.276750 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-kubelet-dir\") pod \"8d60f02e-1803-461e-9606-667d91fcae14\" (UID: \"8d60f02e-1803-461e-9606-667d91fcae14\") " Dec 03 22:00:57.277150 master-0 kubenswrapper[9136]: I1203 22:00:57.277104 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-var-lock" (OuterVolumeSpecName: "var-lock") pod "8d60f02e-1803-461e-9606-667d91fcae14" (UID: "8d60f02e-1803-461e-9606-667d91fcae14"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:57.277262 master-0 kubenswrapper[9136]: I1203 22:00:57.277226 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8d60f02e-1803-461e-9606-667d91fcae14" (UID: "8d60f02e-1803-461e-9606-667d91fcae14"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:00:57.281475 master-0 kubenswrapper[9136]: I1203 22:00:57.281409 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d60f02e-1803-461e-9606-667d91fcae14-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8d60f02e-1803-461e-9606-667d91fcae14" (UID: "8d60f02e-1803-461e-9606-667d91fcae14"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:00:57.378822 master-0 kubenswrapper[9136]: I1203 22:00:57.378630 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d60f02e-1803-461e-9606-667d91fcae14-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:57.378822 master-0 kubenswrapper[9136]: I1203 22:00:57.378704 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:57.378822 master-0 kubenswrapper[9136]: I1203 22:00:57.378724 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8d60f02e-1803-461e-9606-667d91fcae14-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:00:57.800206 master-0 kubenswrapper[9136]: I1203 22:00:57.800011 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_8d60f02e-1803-461e-9606-667d91fcae14/installer/0.log" Dec 03 22:00:57.800206 master-0 kubenswrapper[9136]: I1203 22:00:57.800111 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"8d60f02e-1803-461e-9606-667d91fcae14","Type":"ContainerDied","Data":"672f302ded736b1d33cea7a7bcc4fbbc3d883497b6e842feb770ecf241c92532"} Dec 03 22:00:57.800206 master-0 kubenswrapper[9136]: I1203 22:00:57.800158 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="672f302ded736b1d33cea7a7bcc4fbbc3d883497b6e842feb770ecf241c92532" Dec 03 22:00:57.800616 master-0 kubenswrapper[9136]: I1203 22:00:57.800239 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:01:02.206664 master-0 kubenswrapper[9136]: E1203 22:01:02.206569 9136 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:01:05.872159 master-0 kubenswrapper[9136]: I1203 22:01:05.872083 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd-rev/0.log" Dec 03 22:01:05.873736 master-0 kubenswrapper[9136]: I1203 22:01:05.873688 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd-metrics/0.log" Dec 03 22:01:05.874728 master-0 kubenswrapper[9136]: I1203 22:01:05.874687 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd/0.log" Dec 03 22:01:05.875543 master-0 kubenswrapper[9136]: I1203 22:01:05.875494 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcdctl/0.log" Dec 03 22:01:05.876600 master-0 kubenswrapper[9136]: I1203 22:01:05.876553 9136 generic.go:334] "Generic (PLEG): container finished" podID="ebf07eb54db570834b7c9a90b6b07403" containerID="e1b125fe53f3c1e5501ca76af42c64d1123972595dfade5a4e4e7d7f0045d4fa" exitCode=137 Dec 03 22:01:05.876600 master-0 kubenswrapper[9136]: I1203 22:01:05.876586 9136 generic.go:334] "Generic (PLEG): container finished" podID="ebf07eb54db570834b7c9a90b6b07403" containerID="c0dc443f1527bf49aa993d4373b075ba2cac81ea231c3894c047c835a26b4c9f" exitCode=137 Dec 03 22:01:06.243438 master-0 kubenswrapper[9136]: I1203 22:01:06.243353 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd-rev/0.log" Dec 03 22:01:06.244596 master-0 kubenswrapper[9136]: I1203 22:01:06.244518 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd-metrics/0.log" Dec 03 22:01:06.245810 master-0 kubenswrapper[9136]: I1203 22:01:06.245708 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd/0.log" Dec 03 22:01:06.246421 master-0 kubenswrapper[9136]: I1203 22:01:06.246375 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcdctl/0.log" Dec 03 22:01:06.248522 master-0 kubenswrapper[9136]: I1203 22:01:06.248459 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 03 22:01:06.320548 master-0 kubenswrapper[9136]: I1203 22:01:06.320455 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-data-dir\") pod \"ebf07eb54db570834b7c9a90b6b07403\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " Dec 03 22:01:06.320812 master-0 kubenswrapper[9136]: I1203 22:01:06.320566 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-data-dir" (OuterVolumeSpecName: "data-dir") pod "ebf07eb54db570834b7c9a90b6b07403" (UID: "ebf07eb54db570834b7c9a90b6b07403"). 
InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:01:06.320812 master-0 kubenswrapper[9136]: I1203 22:01:06.320614 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-log-dir\") pod \"ebf07eb54db570834b7c9a90b6b07403\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " Dec 03 22:01:06.320812 master-0 kubenswrapper[9136]: I1203 22:01:06.320654 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-cert-dir\") pod \"ebf07eb54db570834b7c9a90b6b07403\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " Dec 03 22:01:06.320812 master-0 kubenswrapper[9136]: I1203 22:01:06.320716 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-log-dir" (OuterVolumeSpecName: "log-dir") pod "ebf07eb54db570834b7c9a90b6b07403" (UID: "ebf07eb54db570834b7c9a90b6b07403"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:01:06.320812 master-0 kubenswrapper[9136]: I1203 22:01:06.320722 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-resource-dir\") pod \"ebf07eb54db570834b7c9a90b6b07403\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " Dec 03 22:01:06.321115 master-0 kubenswrapper[9136]: I1203 22:01:06.320832 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-static-pod-dir\") pod \"ebf07eb54db570834b7c9a90b6b07403\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " Dec 03 22:01:06.321115 master-0 kubenswrapper[9136]: I1203 22:01:06.320916 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-usr-local-bin\") pod \"ebf07eb54db570834b7c9a90b6b07403\" (UID: \"ebf07eb54db570834b7c9a90b6b07403\") " Dec 03 22:01:06.321115 master-0 kubenswrapper[9136]: I1203 22:01:06.320795 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "ebf07eb54db570834b7c9a90b6b07403" (UID: "ebf07eb54db570834b7c9a90b6b07403"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:01:06.321115 master-0 kubenswrapper[9136]: I1203 22:01:06.320827 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "ebf07eb54db570834b7c9a90b6b07403" (UID: "ebf07eb54db570834b7c9a90b6b07403"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:01:06.321115 master-0 kubenswrapper[9136]: I1203 22:01:06.320875 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "ebf07eb54db570834b7c9a90b6b07403" (UID: "ebf07eb54db570834b7c9a90b6b07403"). InnerVolumeSpecName "static-pod-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:01:06.321115 master-0 kubenswrapper[9136]: I1203 22:01:06.321055 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "ebf07eb54db570834b7c9a90b6b07403" (UID: "ebf07eb54db570834b7c9a90b6b07403"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:01:06.321353 master-0 kubenswrapper[9136]: I1203 22:01:06.321319 9136 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Dec 03 22:01:06.321353 master-0 kubenswrapper[9136]: I1203 22:01:06.321345 9136 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-data-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:01:06.321438 master-0 kubenswrapper[9136]: I1203 22:01:06.321362 9136 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-log-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:01:06.321438 master-0 kubenswrapper[9136]: I1203 22:01:06.321380 9136 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:01:06.321438 master-0 kubenswrapper[9136]: I1203 22:01:06.321399 9136 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:01:06.321438 master-0 kubenswrapper[9136]: I1203 22:01:06.321415 9136 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/ebf07eb54db570834b7c9a90b6b07403-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:01:06.888852 master-0 kubenswrapper[9136]: I1203 22:01:06.888733 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd-rev/0.log" Dec 03 22:01:06.891421 master-0 kubenswrapper[9136]: I1203 22:01:06.891349 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd-metrics/0.log" Dec 03 22:01:06.892846 master-0 kubenswrapper[9136]: I1203 22:01:06.892728 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcd/0.log" Dec 03 22:01:06.893701 master-0 kubenswrapper[9136]: I1203 22:01:06.893622 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_ebf07eb54db570834b7c9a90b6b07403/etcdctl/0.log" Dec 03 22:01:06.895608 master-0 kubenswrapper[9136]: I1203 22:01:06.895520 9136 scope.go:117] "RemoveContainer" containerID="2cc1c5605242df3437ae984a9e1e4d7f667ba732f16f6113d93fc241030ae6be" Dec 03 22:01:06.895751 master-0 kubenswrapper[9136]: I1203 22:01:06.895611 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 03 22:01:06.907522 master-0 kubenswrapper[9136]: I1203 22:01:06.907450 9136 scope.go:117] "RemoveContainer" containerID="f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26" Dec 03 22:01:06.923479 master-0 kubenswrapper[9136]: I1203 22:01:06.923013 9136 scope.go:117] "RemoveContainer" containerID="26f2c19baa42c22f852104341244a584ce87ef1be47ee995c3a898bbf27ea037" Dec 03 22:01:06.958665 master-0 kubenswrapper[9136]: I1203 22:01:06.958603 9136 scope.go:117] "RemoveContainer" containerID="03a6c4cf49b5dfdb537d453011bfa5fccbeecd91abc60d2409d33529fbb25f01" Dec 03 22:01:07.007993 master-0 kubenswrapper[9136]: I1203 22:01:07.007871 9136 scope.go:117] "RemoveContainer" containerID="e1b125fe53f3c1e5501ca76af42c64d1123972595dfade5a4e4e7d7f0045d4fa" Dec 03 22:01:07.037871 master-0 kubenswrapper[9136]: I1203 22:01:07.037793 9136 scope.go:117] "RemoveContainer" containerID="c0dc443f1527bf49aa993d4373b075ba2cac81ea231c3894c047c835a26b4c9f" Dec 03 22:01:07.065213 master-0 kubenswrapper[9136]: I1203 22:01:07.065073 9136 scope.go:117] "RemoveContainer" containerID="f5cb9e42c952aa7425c595831eb5aaa25eadc2bda2a857d2db27d386e674a3ef" Dec 03 22:01:07.093158 master-0 kubenswrapper[9136]: I1203 22:01:07.093050 9136 scope.go:117] "RemoveContainer" containerID="8c01bfaede47fa025c4081862076a0aa6db537aee1b97a2732cbc64498e3dbda" Dec 03 22:01:07.117076 master-0 kubenswrapper[9136]: I1203 22:01:07.117010 9136 scope.go:117] "RemoveContainer" containerID="b8ff15b93fc508f3e0b23a351d0dc90be313d555a284b59cf318e88e1e041e14" Dec 03 22:01:07.905712 master-0 kubenswrapper[9136]: I1203 22:01:07.905600 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"d78739a7694769882b7e47ea5ac08a10","Type":"ContainerStarted","Data":"c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac"} Dec 03 22:01:07.920594 master-0 kubenswrapper[9136]: I1203 22:01:07.920484 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf07eb54db570834b7c9a90b6b07403" path="/var/lib/kubelet/pods/ebf07eb54db570834b7c9a90b6b07403/volumes" Dec 03 22:01:09.654725 master-0 kubenswrapper[9136]: I1203 22:01:09.654539 9136 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dbea974b-42c4-49f6-98dd-1512d74ac16c\\\"},\\\"status\\\":{\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a613b3a7f370a81b82a9c80ed68357904c42d57cdeafb6362cea4bc2248eaba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2e254a7fb8a2643817718cfdb54bc819e86eb84232f6e2456548c55c5efb09d2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ddeb5f184ad031c33b8f3f52c83dc7bc7558153b3f92fd847bf08cf2b2c45bec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ea02cb72627330e0d9cfb5b5ff03c2263f176b244172690291c3611f9a855f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbbfa9bd2a1be205a1331c5dee6338e477cf874c4552e42ed29c6c6059d3ca04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:58ed827ee19ac91b6f860d307797b24b8aec02e671605388c4afe4fa19ddfc36\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T22:00:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}]}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-master-0\": Timeout: request did not complete within requested timeout - 
context deadline exceeded" Dec 03 22:01:11.793904 master-0 kubenswrapper[9136]: E1203 22:01:11.793658 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{cni-sysctl-allowlist-ds-nrcql.187dd389cc42090a openshift-multus 11949 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-multus,Name:cni-sysctl-allowlist-ds-nrcql,UID:519e403c-28ab-4750-8143-34c74bf526ce,APIVersion:v1,ResourceVersion:11678,FieldPath:spec.containers{kube-multus-additional-cni-plugins},},Reason:Unhealthy,Message:Readiness probe errored: rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 22:00:17 +0000 UTC,LastTimestamp:2025-12-03 22:00:37.790262909 +0000 UTC m=+644.065439321,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 22:01:12.208444 master-0 kubenswrapper[9136]: E1203 22:01:12.208362 9136 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:01:13.907039 master-0 kubenswrapper[9136]: I1203 22:01:13.906932 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 03 22:01:13.938157 master-0 kubenswrapper[9136]: I1203 22:01:13.938054 9136 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:01:13.938157 master-0 kubenswrapper[9136]: I1203 22:01:13.938132 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:01:22.209361 master-0 kubenswrapper[9136]: E1203 22:01:22.209254 9136 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:01:27.056035 master-0 kubenswrapper[9136]: I1203 22:01:27.055915 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r24k4_1a0f647a-0260-4737-8ae2-cc90d01d33d1/approver/1.log" Dec 03 22:01:27.056960 master-0 kubenswrapper[9136]: I1203 22:01:27.056743 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r24k4_1a0f647a-0260-4737-8ae2-cc90d01d33d1/approver/0.log" Dec 03 22:01:27.057364 master-0 kubenswrapper[9136]: I1203 22:01:27.057305 9136 generic.go:334] "Generic (PLEG): container finished" podID="1a0f647a-0260-4737-8ae2-cc90d01d33d1" containerID="afb5b53ec6fc0fe7cc9eca316bae9563bccc04673dbe4d97d1079ce470fa5a03" exitCode=1 Dec 03 22:01:27.057462 master-0 kubenswrapper[9136]: I1203 22:01:27.057350 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r24k4" event={"ID":"1a0f647a-0260-4737-8ae2-cc90d01d33d1","Type":"ContainerDied","Data":"afb5b53ec6fc0fe7cc9eca316bae9563bccc04673dbe4d97d1079ce470fa5a03"} Dec 03 
22:01:27.057462 master-0 kubenswrapper[9136]: I1203 22:01:27.057445 9136 scope.go:117] "RemoveContainer" containerID="9a23336ef4cbe48c00b2e6685616a19fe24f57f37d3cf40c5d7d0d8dc909c159" Dec 03 22:01:27.058180 master-0 kubenswrapper[9136]: I1203 22:01:27.058137 9136 scope.go:117] "RemoveContainer" containerID="afb5b53ec6fc0fe7cc9eca316bae9563bccc04673dbe4d97d1079ce470fa5a03" Dec 03 22:01:27.058434 master-0 kubenswrapper[9136]: E1203 22:01:27.058360 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"approver\" with CrashLoopBackOff: \"back-off 10s restarting failed container=approver pod=network-node-identity-r24k4_openshift-network-node-identity(1a0f647a-0260-4737-8ae2-cc90d01d33d1)\"" pod="openshift-network-node-identity/network-node-identity-r24k4" podUID="1a0f647a-0260-4737-8ae2-cc90d01d33d1" Dec 03 22:01:28.066907 master-0 kubenswrapper[9136]: I1203 22:01:28.066762 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r24k4_1a0f647a-0260-4737-8ae2-cc90d01d33d1/approver/1.log" Dec 03 22:01:30.087666 master-0 kubenswrapper[9136]: I1203 22:01:30.087251 9136 generic.go:334] "Generic (PLEG): container finished" podID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerID="81bfd9fda486fff0e9f3b1dee5719d6b24e5410ceadededcc88dc65916bf639e" exitCode=0 Dec 03 22:01:30.087666 master-0 kubenswrapper[9136]: I1203 22:01:30.087315 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerDied","Data":"81bfd9fda486fff0e9f3b1dee5719d6b24e5410ceadededcc88dc65916bf639e"} Dec 03 22:01:30.087666 master-0 kubenswrapper[9136]: I1203 22:01:30.087422 9136 scope.go:117] "RemoveContainer" containerID="7ecf6575752194e376e6db6b9227fb236567480b2e2793db9b6ea4ab3a895c8f" Dec 03 22:01:31.098220 master-0 kubenswrapper[9136]: I1203 22:01:31.098137 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerStarted","Data":"78424a8a743f1d7a594eeae8a1082a555251b4e9cc2031fa13e2e6be36e4505e"} Dec 03 22:01:31.761257 master-0 kubenswrapper[9136]: I1203 22:01:31.761102 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:01:31.765077 master-0 kubenswrapper[9136]: I1203 22:01:31.764998 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:31.765077 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:31.765077 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:31.765077 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:31.765422 master-0 kubenswrapper[9136]: I1203 22:01:31.765091 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:32.210764 master-0 kubenswrapper[9136]: E1203 22:01:32.210333 9136 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:01:32.210764 master-0 kubenswrapper[9136]: I1203 22:01:32.210407 9136 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 22:01:32.763916 master-0 kubenswrapper[9136]: I1203 22:01:32.763848 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:32.763916 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:32.763916 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:32.763916 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:32.764382 master-0 kubenswrapper[9136]: I1203 22:01:32.763945 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:33.764453 master-0 kubenswrapper[9136]: I1203 22:01:33.764173 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:33.764453 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:33.764453 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:33.764453 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:33.765859 master-0 kubenswrapper[9136]: I1203 22:01:33.764464 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:34.760564 master-0 kubenswrapper[9136]: I1203 22:01:34.760471 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:01:34.762920 master-0 kubenswrapper[9136]: I1203 22:01:34.762797 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:34.762920 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:34.762920 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:34.762920 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:34.762920 master-0 kubenswrapper[9136]: I1203 22:01:34.762893 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:35.766668 master-0 kubenswrapper[9136]: I1203 22:01:35.766577 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:35.766668 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:35.766668 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:35.766668 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:35.767587 master-0 kubenswrapper[9136]: I1203 22:01:35.766691 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:36.764170 master-0 kubenswrapper[9136]: I1203 22:01:36.764079 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:36.764170 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:36.764170 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:36.764170 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:36.765059 master-0 kubenswrapper[9136]: I1203 22:01:36.764197 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:37.764183 master-0 kubenswrapper[9136]: I1203 22:01:37.764089 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:37.764183 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:37.764183 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:37.764183 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:37.764183 master-0 kubenswrapper[9136]: I1203 22:01:37.764175 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:38.763744 master-0 kubenswrapper[9136]: I1203 22:01:38.763654 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:38.763744 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:38.763744 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:38.763744 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:38.764229 master-0 kubenswrapper[9136]: I1203 22:01:38.763757 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:39.764140 master-0 kubenswrapper[9136]: I1203 22:01:39.764053 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:39.764140 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:39.764140 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:39.764140 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:39.765638 master-0 kubenswrapper[9136]: I1203 22:01:39.764156 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:40.763933 master-0 kubenswrapper[9136]: I1203 22:01:40.763827 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:40.763933 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:40.763933 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:40.763933 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:40.763933 master-0 kubenswrapper[9136]: I1203 22:01:40.763919 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:41.764742 master-0 kubenswrapper[9136]: I1203 22:01:41.764615 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:41.764742 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:41.764742 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:41.764742 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:41.765819 master-0 kubenswrapper[9136]: I1203 22:01:41.764827 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:42.211762 master-0 kubenswrapper[9136]: E1203 22:01:42.211640 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Dec 03 22:01:42.764325 master-0 kubenswrapper[9136]: I1203 22:01:42.764216 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:42.764325 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:42.764325 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:42.764325 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:42.764823 master-0 kubenswrapper[9136]: I1203 22:01:42.764347 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:42.908196 master-0 kubenswrapper[9136]: I1203 22:01:42.908100 9136 scope.go:117] "RemoveContainer" containerID="afb5b53ec6fc0fe7cc9eca316bae9563bccc04673dbe4d97d1079ce470fa5a03" Dec 03 22:01:43.190030 master-0 kubenswrapper[9136]: I1203 22:01:43.189946 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r24k4_1a0f647a-0260-4737-8ae2-cc90d01d33d1/approver/1.log" Dec 03 22:01:43.190700 master-0 kubenswrapper[9136]: I1203 22:01:43.190572 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-r24k4" event={"ID":"1a0f647a-0260-4737-8ae2-cc90d01d33d1","Type":"ContainerStarted","Data":"0e16ca8fad683735bf826ad061856fd51a9141ed4ab4d23c0adca35cf39bb0cc"} Dec 03 22:01:43.764906 master-0 kubenswrapper[9136]: I1203 22:01:43.764815 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:43.764906 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:43.764906 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:43.764906 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:43.766177 master-0 kubenswrapper[9136]: I1203 22:01:43.764923 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:44.763206 master-0 kubenswrapper[9136]: I1203 22:01:44.763102 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:44.763206 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:44.763206 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:44.763206 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:44.763662 master-0 kubenswrapper[9136]: I1203 22:01:44.763236 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:45.208742 master-0 kubenswrapper[9136]: I1203 22:01:45.208667 9136 generic.go:334] "Generic (PLEG): container finished" podID="578b2d03-b8b3-4c75-adde-73899c472ad7" containerID="3445ebe854f42ae0ad53b657352b66491efe14d790f93fa83d0efffb68d3546d" exitCode=0 Dec 03 22:01:45.208742 master-0 kubenswrapper[9136]: I1203 22:01:45.208725 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" event={"ID":"578b2d03-b8b3-4c75-adde-73899c472ad7","Type":"ContainerDied","Data":"3445ebe854f42ae0ad53b657352b66491efe14d790f93fa83d0efffb68d3546d"} Dec 03 22:01:45.208742 master-0 kubenswrapper[9136]: I1203 22:01:45.208786 9136 scope.go:117] "RemoveContainer" containerID="69c66e8c33554080901dde3e7449b542159057d7baa15426e32c9d8a63162fa3" Dec 03 22:01:45.209923 master-0 kubenswrapper[9136]: I1203 22:01:45.209689 9136 scope.go:117] 
"RemoveContainer" containerID="3445ebe854f42ae0ad53b657352b66491efe14d790f93fa83d0efffb68d3546d" Dec 03 22:01:45.210617 master-0 kubenswrapper[9136]: E1203 22:01:45.210061 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=insights-operator pod=insights-operator-59d99f9b7b-x4tfh_openshift-insights(578b2d03-b8b3-4c75-adde-73899c472ad7)\"" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" podUID="578b2d03-b8b3-4c75-adde-73899c472ad7" Dec 03 22:01:45.762059 master-0 kubenswrapper[9136]: I1203 22:01:45.762009 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:45.762059 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:45.762059 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:45.762059 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:45.762401 master-0 kubenswrapper[9136]: I1203 22:01:45.762072 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:45.797272 master-0 kubenswrapper[9136]: E1203 22:01:45.797139 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event=< Dec 03 22:01:45.797272 master-0 kubenswrapper[9136]: &Event{ObjectMeta:{router-default-54f97f57-xq6ch.187dd34d7d2463d5 openshift-ingress 10796 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-ingress,Name:router-default-54f97f57-xq6ch,UID:698e6d87-1a58-493c-8b69-d22c89d26ac5,APIVersion:v1,ResourceVersion:10260,FieldPath:spec.containers{router},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Dec 03 22:01:45.797272 master-0 kubenswrapper[9136]: body: [-]backend-http failed: reason withheld Dec 03 22:01:45.797272 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:45.797272 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:45.797272 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:45.797272 master-0 kubenswrapper[9136]: Dec 03 22:01:45.797272 master-0 kubenswrapper[9136]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:55:58 +0000 UTC,LastTimestamp:2025-12-03 22:00:38.764114365 +0000 UTC m=+645.039290777,Count:235,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Dec 03 22:01:45.797272 master-0 kubenswrapper[9136]: > Dec 03 22:01:46.764280 master-0 kubenswrapper[9136]: I1203 22:01:46.764185 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:46.764280 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:46.764280 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:46.764280 master-0 
kubenswrapper[9136]: healthz check failed Dec 03 22:01:46.765407 master-0 kubenswrapper[9136]: I1203 22:01:46.764348 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:47.764406 master-0 kubenswrapper[9136]: I1203 22:01:47.764286 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:47.764406 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:47.764406 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:47.764406 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:47.764406 master-0 kubenswrapper[9136]: I1203 22:01:47.764397 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:47.941850 master-0 kubenswrapper[9136]: E1203 22:01:47.941675 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 03 22:01:47.942645 master-0 kubenswrapper[9136]: I1203 22:01:47.942589 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Dec 03 22:01:47.974025 master-0 kubenswrapper[9136]: W1203 22:01:47.973925 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd8b778e190b1975a0a8fad534da6dd.slice/crio-963f097bb66f564ca2cfab23207306eb31cee167ee206a1a05491adca1aa31a9 WatchSource:0}: Error finding container 963f097bb66f564ca2cfab23207306eb31cee167ee206a1a05491adca1aa31a9: Status 404 returned error can't find the container with id 963f097bb66f564ca2cfab23207306eb31cee167ee206a1a05491adca1aa31a9 Dec 03 22:01:48.262822 master-0 kubenswrapper[9136]: I1203 22:01:48.262729 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"963f097bb66f564ca2cfab23207306eb31cee167ee206a1a05491adca1aa31a9"} Dec 03 22:01:48.762916 master-0 kubenswrapper[9136]: I1203 22:01:48.762802 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:48.762916 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:48.762916 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:48.762916 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:48.762916 master-0 kubenswrapper[9136]: I1203 22:01:48.762908 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:49.273312 master-0 kubenswrapper[9136]: I1203 22:01:49.273237 9136 generic.go:334] "Generic 
(PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="909ec871125d6c5f469945361701b23d979d4f7f33b16129f5238c7a2207ec30" exitCode=0 Dec 03 22:01:49.274951 master-0 kubenswrapper[9136]: I1203 22:01:49.273333 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"909ec871125d6c5f469945361701b23d979d4f7f33b16129f5238c7a2207ec30"} Dec 03 22:01:49.274951 master-0 kubenswrapper[9136]: I1203 22:01:49.273803 9136 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:01:49.274951 master-0 kubenswrapper[9136]: I1203 22:01:49.273855 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:01:49.763794 master-0 kubenswrapper[9136]: I1203 22:01:49.763647 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:49.763794 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:49.763794 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:49.763794 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:49.763794 master-0 kubenswrapper[9136]: I1203 22:01:49.763745 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:50.764036 master-0 kubenswrapper[9136]: I1203 22:01:50.763921 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:50.764036 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:50.764036 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:50.764036 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:50.765297 master-0 kubenswrapper[9136]: I1203 22:01:50.764037 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:51.763926 master-0 kubenswrapper[9136]: I1203 22:01:51.763837 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:51.763926 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:51.763926 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:51.763926 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:51.764959 master-0 kubenswrapper[9136]: I1203 22:01:51.763936 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:01:52.413326 master-0 kubenswrapper[9136]: E1203 22:01:52.413208 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Dec 03 22:01:52.764085 master-0 kubenswrapper[9136]: I1203 22:01:52.763862 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:52.764085 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:52.764085 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:52.764085 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:52.764085 master-0 kubenswrapper[9136]: I1203 22:01:52.764006 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:53.763312 master-0 kubenswrapper[9136]: I1203 22:01:53.763179 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:53.763312 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:53.763312 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:53.763312 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:53.763928 master-0 kubenswrapper[9136]: I1203 22:01:53.763315 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:54.764353 master-0 kubenswrapper[9136]: I1203 22:01:54.764239 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:54.764353 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:54.764353 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:54.764353 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:54.765472 master-0 kubenswrapper[9136]: I1203 22:01:54.764390 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:55.764716 master-0 kubenswrapper[9136]: I1203 22:01:55.764615 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:55.764716 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:55.764716 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:55.764716 master-0 
kubenswrapper[9136]: healthz check failed Dec 03 22:01:55.765866 master-0 kubenswrapper[9136]: I1203 22:01:55.764816 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:56.763956 master-0 kubenswrapper[9136]: I1203 22:01:56.763862 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:56.763956 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:56.763956 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:56.763956 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:56.763956 master-0 kubenswrapper[9136]: I1203 22:01:56.763963 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:57.764443 master-0 kubenswrapper[9136]: I1203 22:01:57.764332 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:57.764443 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:57.764443 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:57.764443 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:57.764443 master-0 kubenswrapper[9136]: I1203 22:01:57.764434 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:58.764387 master-0 kubenswrapper[9136]: I1203 22:01:58.764275 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:58.764387 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:58.764387 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:58.764387 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:01:58.764387 master-0 kubenswrapper[9136]: I1203 22:01:58.764371 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:59.763112 master-0 kubenswrapper[9136]: I1203 22:01:59.763041 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:01:59.763112 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:01:59.763112 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:01:59.763112 master-0 
kubenswrapper[9136]: healthz check failed Dec 03 22:01:59.763455 master-0 kubenswrapper[9136]: I1203 22:01:59.763119 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:01:59.908282 master-0 kubenswrapper[9136]: I1203 22:01:59.908206 9136 scope.go:117] "RemoveContainer" containerID="3445ebe854f42ae0ad53b657352b66491efe14d790f93fa83d0efffb68d3546d" Dec 03 22:02:00.365978 master-0 kubenswrapper[9136]: I1203 22:02:00.365908 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" event={"ID":"578b2d03-b8b3-4c75-adde-73899c472ad7","Type":"ContainerStarted","Data":"02ed463d54004078354e1438bfc70bbfd194f0ccfe125a9ef4ad2826ac2a833a"} Dec 03 22:02:00.764528 master-0 kubenswrapper[9136]: I1203 22:02:00.764361 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:00.764528 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:00.764528 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:00.764528 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:00.764528 master-0 kubenswrapper[9136]: I1203 22:02:00.764480 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:01.764606 master-0 kubenswrapper[9136]: I1203 22:02:01.764519 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:01.764606 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:01.764606 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:01.764606 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:01.765924 master-0 kubenswrapper[9136]: I1203 22:02:01.764620 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:02.764126 master-0 kubenswrapper[9136]: I1203 22:02:02.763996 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:02.764126 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:02.764126 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:02.764126 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:02.764126 master-0 kubenswrapper[9136]: I1203 22:02:02.764080 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Dec 03 22:02:02.814118 master-0 kubenswrapper[9136]: E1203 22:02:02.813976 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Dec 03 22:02:03.763954 master-0 kubenswrapper[9136]: I1203 22:02:03.763820 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:03.763954 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:03.763954 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:03.763954 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:03.763954 master-0 kubenswrapper[9136]: I1203 22:02:03.763923 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:04.763223 master-0 kubenswrapper[9136]: I1203 22:02:04.763077 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:04.763223 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:04.763223 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:04.763223 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:04.763223 master-0 kubenswrapper[9136]: I1203 22:02:04.763246 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:05.763548 master-0 kubenswrapper[9136]: I1203 22:02:05.763454 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:05.763548 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:05.763548 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:05.763548 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:05.764604 master-0 kubenswrapper[9136]: I1203 22:02:05.763559 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:06.763938 master-0 kubenswrapper[9136]: I1203 22:02:06.763868 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:06.763938 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:06.763938 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:06.763938 
master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:06.764939 master-0 kubenswrapper[9136]: I1203 22:02:06.763946 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:07.763891 master-0 kubenswrapper[9136]: I1203 22:02:07.763755 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:07.763891 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:07.763891 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:07.763891 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:07.763891 master-0 kubenswrapper[9136]: I1203 22:02:07.763873 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:08.763305 master-0 kubenswrapper[9136]: I1203 22:02:08.763223 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:08.763305 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:08.763305 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:08.763305 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:08.763305 master-0 kubenswrapper[9136]: I1203 22:02:08.763305 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:09.656926 master-0 kubenswrapper[9136]: I1203 22:02:09.656754 9136 status_manager.go:851] "Failed to get status for pod" podUID="519e403c-28ab-4750-8143-34c74bf526ce" pod="openshift-multus/cni-sysctl-allowlist-ds-nrcql" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods cni-sysctl-allowlist-ds-nrcql)" Dec 03 22:02:09.763924 master-0 kubenswrapper[9136]: I1203 22:02:09.763856 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:09.763924 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:09.763924 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:09.763924 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:09.764466 master-0 kubenswrapper[9136]: I1203 22:02:09.764428 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:10.763234 master-0 kubenswrapper[9136]: I1203 22:02:10.763129 9136 patch_prober.go:28] interesting 
pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:10.763234 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:10.763234 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:10.763234 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:10.763234 master-0 kubenswrapper[9136]: I1203 22:02:10.763224 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:11.764218 master-0 kubenswrapper[9136]: I1203 22:02:11.764120 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:11.764218 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:11.764218 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:11.764218 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:11.765253 master-0 kubenswrapper[9136]: I1203 22:02:11.764224 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:12.763530 master-0 kubenswrapper[9136]: I1203 22:02:12.763420 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:12.763530 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:12.763530 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:12.763530 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:12.764079 master-0 kubenswrapper[9136]: I1203 22:02:12.763539 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:13.615526 master-0 kubenswrapper[9136]: E1203 22:02:13.615382 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Dec 03 22:02:13.764380 master-0 kubenswrapper[9136]: I1203 22:02:13.764293 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:13.764380 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:13.764380 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:13.764380 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:13.764380 master-0 kubenswrapper[9136]: I1203 
22:02:13.764359 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:14.763723 master-0 kubenswrapper[9136]: I1203 22:02:14.763643 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:14.763723 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:14.763723 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:14.763723 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:14.764495 master-0 kubenswrapper[9136]: I1203 22:02:14.763757 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:15.763357 master-0 kubenswrapper[9136]: I1203 22:02:15.763300 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:15.763357 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:15.763357 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:15.763357 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:15.763748 master-0 kubenswrapper[9136]: I1203 22:02:15.763380 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:16.764043 master-0 kubenswrapper[9136]: I1203 22:02:16.763985 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:16.764043 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:16.764043 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:16.764043 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:16.765047 master-0 kubenswrapper[9136]: I1203 22:02:16.764847 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:17.763908 master-0 kubenswrapper[9136]: I1203 22:02:17.763814 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:17.763908 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:17.763908 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:17.763908 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:17.763908 master-0 kubenswrapper[9136]: I1203 
22:02:17.763904 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:18.762322 master-0 kubenswrapper[9136]: I1203 22:02:18.762230 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:18.762322 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:18.762322 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:18.762322 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:18.762818 master-0 kubenswrapper[9136]: I1203 22:02:18.762333 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:19.762647 master-0 kubenswrapper[9136]: I1203 22:02:19.762573 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:19.762647 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:19.762647 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:19.762647 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:19.763342 master-0 kubenswrapper[9136]: I1203 22:02:19.762683 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:19.802007 master-0 kubenswrapper[9136]: E1203 22:02:19.801760 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dd391b37f3e72 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:BackOff,Message:Back-off restarting failed container kube-scheduler in pod bootstrap-kube-scheduler-master-0_kube-system(d78739a7694769882b7e47ea5ac08a10),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 22:00:51.734281842 +0000 UTC m=+658.009458274,LastTimestamp:2025-12-03 22:00:51.734281842 +0000 UTC m=+658.009458274,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 22:02:20.763890 master-0 kubenswrapper[9136]: I1203 22:02:20.763830 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:20.763890 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 
22:02:20.763890 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:20.763890 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:20.765221 master-0 kubenswrapper[9136]: I1203 22:02:20.765171 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:21.763539 master-0 kubenswrapper[9136]: I1203 22:02:21.763438 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:21.763539 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:21.763539 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:21.763539 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:21.763539 master-0 kubenswrapper[9136]: I1203 22:02:21.763527 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:22.764689 master-0 kubenswrapper[9136]: I1203 22:02:22.764602 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:22.764689 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:22.764689 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:22.764689 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:22.765806 master-0 kubenswrapper[9136]: I1203 22:02:22.764698 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:23.276397 master-0 kubenswrapper[9136]: E1203 22:02:23.276309 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 03 22:02:23.764356 master-0 kubenswrapper[9136]: I1203 22:02:23.764111 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:23.764356 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:23.764356 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:23.764356 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:23.764356 master-0 kubenswrapper[9136]: I1203 22:02:23.764211 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:24.555871 master-0 kubenswrapper[9136]: I1203 22:02:24.555748 9136 generic.go:334] "Generic (PLEG): container finished" 
podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="919b8027f8f33ff55f195fb5965cea0049cac7097943867a77eb3961b87644ee" exitCode=0 Dec 03 22:02:24.555871 master-0 kubenswrapper[9136]: I1203 22:02:24.555809 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"919b8027f8f33ff55f195fb5965cea0049cac7097943867a77eb3961b87644ee"} Dec 03 22:02:24.556351 master-0 kubenswrapper[9136]: I1203 22:02:24.556296 9136 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:02:24.556351 master-0 kubenswrapper[9136]: I1203 22:02:24.556350 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:02:24.763159 master-0 kubenswrapper[9136]: I1203 22:02:24.763041 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:24.763159 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:24.763159 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:24.763159 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:24.763159 master-0 kubenswrapper[9136]: I1203 22:02:24.763158 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:25.217419 master-0 kubenswrapper[9136]: E1203 22:02:25.217343 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 03 22:02:25.764054 master-0 kubenswrapper[9136]: I1203 22:02:25.763969 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:25.764054 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:25.764054 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:25.764054 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:25.764430 master-0 kubenswrapper[9136]: I1203 22:02:25.764085 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:26.763354 master-0 kubenswrapper[9136]: I1203 22:02:26.763270 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:26.763354 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:26.763354 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:26.763354 master-0 kubenswrapper[9136]: healthz check 
failed Dec 03 22:02:26.764113 master-0 kubenswrapper[9136]: I1203 22:02:26.763366 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:27.764167 master-0 kubenswrapper[9136]: I1203 22:02:27.764083 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:27.764167 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:27.764167 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:27.764167 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:27.764810 master-0 kubenswrapper[9136]: I1203 22:02:27.764177 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:28.585357 master-0 kubenswrapper[9136]: I1203 22:02:28.585300 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/3.log" Dec 03 22:02:28.586048 master-0 kubenswrapper[9136]: I1203 22:02:28.585987 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/2.log" Dec 03 22:02:28.586152 master-0 kubenswrapper[9136]: I1203 22:02:28.586104 9136 generic.go:334] "Generic (PLEG): container finished" podID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" containerID="e0793e31f5fb489b6b15bff62f7f4a34ffd9e051976a5af997f6a70e4540749e" exitCode=1 Dec 03 22:02:28.586220 master-0 kubenswrapper[9136]: I1203 22:02:28.586173 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerDied","Data":"e0793e31f5fb489b6b15bff62f7f4a34ffd9e051976a5af997f6a70e4540749e"} Dec 03 22:02:28.586294 master-0 kubenswrapper[9136]: I1203 22:02:28.586260 9136 scope.go:117] "RemoveContainer" containerID="a05a4eebdeab07c94caa9823ffeece53541343f1149e78fde7e1048cdda7e1b6" Dec 03 22:02:28.587130 master-0 kubenswrapper[9136]: I1203 22:02:28.587080 9136 scope.go:117] "RemoveContainer" containerID="e0793e31f5fb489b6b15bff62f7f4a34ffd9e051976a5af997f6a70e4540749e" Dec 03 22:02:28.587886 master-0 kubenswrapper[9136]: E1203 22:02:28.587454 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:02:28.764339 master-0 kubenswrapper[9136]: I1203 22:02:28.764126 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:28.764339 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:28.764339 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:28.764339 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:28.764339 master-0 kubenswrapper[9136]: I1203 22:02:28.764246 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:29.598791 master-0 kubenswrapper[9136]: I1203 22:02:29.598685 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/3.log" Dec 03 22:02:29.763621 master-0 kubenswrapper[9136]: I1203 22:02:29.763531 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:29.763621 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:29.763621 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:29.763621 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:29.763621 master-0 kubenswrapper[9136]: I1203 22:02:29.763626 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:30.608934 master-0 kubenswrapper[9136]: I1203 22:02:30.608876 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/1.log" Dec 03 22:02:30.610216 master-0 kubenswrapper[9136]: I1203 22:02:30.610196 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/0.log" Dec 03 22:02:30.610710 master-0 kubenswrapper[9136]: I1203 22:02:30.610662 9136 generic.go:334] "Generic (PLEG): container finished" podID="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" containerID="9902d273d5ff7cb502b4442eacaaf41f2951f19d1c2e446bfccd73ed96a23e7a" exitCode=1 Dec 03 22:02:30.610809 master-0 kubenswrapper[9136]: I1203 22:02:30.610711 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" event={"ID":"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6","Type":"ContainerDied","Data":"9902d273d5ff7cb502b4442eacaaf41f2951f19d1c2e446bfccd73ed96a23e7a"} Dec 03 22:02:30.610809 master-0 kubenswrapper[9136]: I1203 22:02:30.610752 9136 scope.go:117] "RemoveContainer" containerID="0f6a85e4a73afc226173f2b4c67fa571e667d3c81985c4b5d23669be9018152c" Dec 03 22:02:30.611345 master-0 kubenswrapper[9136]: I1203 22:02:30.611292 9136 scope.go:117] "RemoveContainer" containerID="9902d273d5ff7cb502b4442eacaaf41f2951f19d1c2e446bfccd73ed96a23e7a" Dec 03 22:02:30.611527 master-0 kubenswrapper[9136]: E1203 22:02:30.611498 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=manager pod=catalogd-controller-manager-754cfd84-bnstw_openshift-catalogd(9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6)\"" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" podUID="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" Dec 03 22:02:30.763250 master-0 kubenswrapper[9136]: I1203 22:02:30.763157 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:30.763250 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:30.763250 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:30.763250 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:30.763831 master-0 kubenswrapper[9136]: I1203 22:02:30.763273 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:31.617956 master-0 kubenswrapper[9136]: I1203 22:02:31.617879 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/1.log" Dec 03 22:02:31.763528 master-0 kubenswrapper[9136]: I1203 22:02:31.763415 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:31.763528 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:31.763528 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:31.763528 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:31.763528 master-0 kubenswrapper[9136]: I1203 22:02:31.763519 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:32.368953 master-0 kubenswrapper[9136]: I1203 22:02:32.368846 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:02:32.369976 master-0 kubenswrapper[9136]: I1203 22:02:32.369920 9136 scope.go:117] "RemoveContainer" containerID="9902d273d5ff7cb502b4442eacaaf41f2951f19d1c2e446bfccd73ed96a23e7a" Dec 03 22:02:32.370398 master-0 kubenswrapper[9136]: E1203 22:02:32.370332 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=catalogd-controller-manager-754cfd84-bnstw_openshift-catalogd(9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6)\"" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" podUID="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" Dec 03 22:02:32.763975 master-0 kubenswrapper[9136]: I1203 22:02:32.763811 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:32.763975 master-0 kubenswrapper[9136]: [-]has-synced failed: reason 
withheld Dec 03 22:02:32.763975 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:32.763975 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:32.763975 master-0 kubenswrapper[9136]: I1203 22:02:32.763918 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:33.764446 master-0 kubenswrapper[9136]: I1203 22:02:33.764328 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:33.764446 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:33.764446 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:33.764446 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:33.765453 master-0 kubenswrapper[9136]: I1203 22:02:33.764453 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:34.646920 master-0 kubenswrapper[9136]: I1203 22:02:34.646761 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/config-sync-controllers/0.log" Dec 03 22:02:34.647731 master-0 kubenswrapper[9136]: I1203 22:02:34.647658 9136 generic.go:334] "Generic (PLEG): container finished" podID="def52ba3-77c1-4e0c-8a0d-44ff4d677607" containerID="19a9bc6e0f1f8798f87d4882e3c42bf8b1af00134db1a21061f75b7ece3744fd" exitCode=1 Dec 03 22:02:34.647731 master-0 kubenswrapper[9136]: I1203 22:02:34.647724 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" event={"ID":"def52ba3-77c1-4e0c-8a0d-44ff4d677607","Type":"ContainerDied","Data":"19a9bc6e0f1f8798f87d4882e3c42bf8b1af00134db1a21061f75b7ece3744fd"} Dec 03 22:02:34.648850 master-0 kubenswrapper[9136]: I1203 22:02:34.648798 9136 scope.go:117] "RemoveContainer" containerID="19a9bc6e0f1f8798f87d4882e3c42bf8b1af00134db1a21061f75b7ece3744fd" Dec 03 22:02:34.764239 master-0 kubenswrapper[9136]: I1203 22:02:34.764165 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:34.764239 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:34.764239 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:34.764239 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:34.765013 master-0 kubenswrapper[9136]: I1203 22:02:34.764237 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:35.659049 master-0 kubenswrapper[9136]: I1203 22:02:35.658971 9136 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/config-sync-controllers/0.log" Dec 03 22:02:35.659857 master-0 kubenswrapper[9136]: I1203 22:02:35.659800 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" event={"ID":"def52ba3-77c1-4e0c-8a0d-44ff4d677607","Type":"ContainerStarted","Data":"75db246c4b8a8b8a25c05292a8bbaa915186c922b09c9d2ab2042d68ea4720d7"} Dec 03 22:02:35.764204 master-0 kubenswrapper[9136]: I1203 22:02:35.764073 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:35.764204 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:35.764204 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:35.764204 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:35.764204 master-0 kubenswrapper[9136]: I1203 22:02:35.764183 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:36.671711 master-0 kubenswrapper[9136]: I1203 22:02:36.671604 9136 generic.go:334] "Generic (PLEG): container finished" podID="a4399d20-f9a6-4ab1-86be-e2845394eaba" containerID="8d502fe6a4b80a8d86ec823cb141773d9b14b3a8b0f0a224e12fcddbb0204483" exitCode=0 Dec 03 22:02:36.671711 master-0 kubenswrapper[9136]: I1203 22:02:36.671679 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" event={"ID":"a4399d20-f9a6-4ab1-86be-e2845394eaba","Type":"ContainerDied","Data":"8d502fe6a4b80a8d86ec823cb141773d9b14b3a8b0f0a224e12fcddbb0204483"} Dec 03 22:02:36.672163 master-0 kubenswrapper[9136]: I1203 22:02:36.671739 9136 scope.go:117] "RemoveContainer" containerID="82722f9afea36ecc1e8162d281159bb72f29817202c15f48a80157dddaa3525e" Dec 03 22:02:36.672554 master-0 kubenswrapper[9136]: I1203 22:02:36.672495 9136 scope.go:117] "RemoveContainer" containerID="8d502fe6a4b80a8d86ec823cb141773d9b14b3a8b0f0a224e12fcddbb0204483" Dec 03 22:02:36.673014 master-0 kubenswrapper[9136]: E1203 22:02:36.672850 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-7d67745bb7-4jd6d_openshift-marketplace(a4399d20-f9a6-4ab1-86be-e2845394eaba)\"" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" podUID="a4399d20-f9a6-4ab1-86be-e2845394eaba" Dec 03 22:02:36.763257 master-0 kubenswrapper[9136]: I1203 22:02:36.763149 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:36.763257 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:36.763257 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:36.763257 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:36.763810 
master-0 kubenswrapper[9136]: I1203 22:02:36.763277 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:36.949509 master-0 kubenswrapper[9136]: I1203 22:02:36.949334 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:02:36.949509 master-0 kubenswrapper[9136]: I1203 22:02:36.949423 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:02:37.684064 master-0 kubenswrapper[9136]: I1203 22:02:37.684001 9136 scope.go:117] "RemoveContainer" containerID="8d502fe6a4b80a8d86ec823cb141773d9b14b3a8b0f0a224e12fcddbb0204483" Dec 03 22:02:37.684518 master-0 kubenswrapper[9136]: E1203 22:02:37.684463 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-7d67745bb7-4jd6d_openshift-marketplace(a4399d20-f9a6-4ab1-86be-e2845394eaba)\"" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" podUID="a4399d20-f9a6-4ab1-86be-e2845394eaba" Dec 03 22:02:37.687823 master-0 kubenswrapper[9136]: I1203 22:02:37.687641 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/config-sync-controllers/0.log" Dec 03 22:02:37.688554 master-0 kubenswrapper[9136]: I1203 22:02:37.688500 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/cluster-cloud-controller-manager/0.log" Dec 03 22:02:37.688690 master-0 kubenswrapper[9136]: I1203 22:02:37.688562 9136 generic.go:334] "Generic (PLEG): container finished" podID="def52ba3-77c1-4e0c-8a0d-44ff4d677607" containerID="e063c475fe48713685c7da56849e11be35ce10c982e2192d9447df0278644182" exitCode=1 Dec 03 22:02:37.688690 master-0 kubenswrapper[9136]: I1203 22:02:37.688603 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" event={"ID":"def52ba3-77c1-4e0c-8a0d-44ff4d677607","Type":"ContainerDied","Data":"e063c475fe48713685c7da56849e11be35ce10c982e2192d9447df0278644182"} Dec 03 22:02:37.689274 master-0 kubenswrapper[9136]: I1203 22:02:37.689241 9136 scope.go:117] "RemoveContainer" containerID="e063c475fe48713685c7da56849e11be35ce10c982e2192d9447df0278644182" Dec 03 22:02:37.763510 master-0 kubenswrapper[9136]: I1203 22:02:37.763419 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:37.763510 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:37.763510 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:37.763510 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:37.764109 master-0 kubenswrapper[9136]: I1203 22:02:37.763531 9136 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:38.418905 master-0 kubenswrapper[9136]: E1203 22:02:38.418753 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 03 22:02:38.699151 master-0 kubenswrapper[9136]: I1203 22:02:38.698899 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/config-sync-controllers/0.log" Dec 03 22:02:38.699997 master-0 kubenswrapper[9136]: I1203 22:02:38.699938 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/cluster-cloud-controller-manager/0.log" Dec 03 22:02:38.700087 master-0 kubenswrapper[9136]: I1203 22:02:38.700050 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" event={"ID":"def52ba3-77c1-4e0c-8a0d-44ff4d677607","Type":"ContainerStarted","Data":"dfbcaddcf7ce4a4fc83310c92b35d12f9eedd21238faebcbf2aced549cf822b9"} Dec 03 22:02:38.763101 master-0 kubenswrapper[9136]: I1203 22:02:38.763021 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:38.763101 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:38.763101 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:38.763101 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:38.763389 master-0 kubenswrapper[9136]: I1203 22:02:38.763108 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:39.763428 master-0 kubenswrapper[9136]: I1203 22:02:39.763296 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:39.763428 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:39.763428 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:39.763428 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:39.763428 master-0 kubenswrapper[9136]: I1203 22:02:39.763408 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:40.763662 master-0 kubenswrapper[9136]: I1203 22:02:40.763562 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:40.763662 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:40.763662 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:40.763662 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:40.765172 master-0 kubenswrapper[9136]: I1203 22:02:40.765059 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:41.727595 master-0 kubenswrapper[9136]: I1203 22:02:41.727520 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-kz8nk_da949cf7-ab12-43ff-8e45-da1c2fd46e20/manager/1.log" Dec 03 22:02:41.729106 master-0 kubenswrapper[9136]: I1203 22:02:41.729056 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-kz8nk_da949cf7-ab12-43ff-8e45-da1c2fd46e20/manager/0.log" Dec 03 22:02:41.729245 master-0 kubenswrapper[9136]: I1203 22:02:41.729103 9136 generic.go:334] "Generic (PLEG): container finished" podID="da949cf7-ab12-43ff-8e45-da1c2fd46e20" containerID="b0ed7d6d004009e716a466752b32cae3db21211bdd49340bde3ae4a9d649b34e" exitCode=1 Dec 03 22:02:41.729245 master-0 kubenswrapper[9136]: I1203 22:02:41.729151 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" event={"ID":"da949cf7-ab12-43ff-8e45-da1c2fd46e20","Type":"ContainerDied","Data":"b0ed7d6d004009e716a466752b32cae3db21211bdd49340bde3ae4a9d649b34e"} Dec 03 22:02:41.729245 master-0 kubenswrapper[9136]: I1203 22:02:41.729187 9136 scope.go:117] "RemoveContainer" containerID="5ae5503bf1205bc663cc1204fe09557a6737e824df9d6d84a86d54a184a45a47" Dec 03 22:02:41.730178 master-0 kubenswrapper[9136]: I1203 22:02:41.730074 9136 scope.go:117] "RemoveContainer" containerID="b0ed7d6d004009e716a466752b32cae3db21211bdd49340bde3ae4a9d649b34e" Dec 03 22:02:41.730535 master-0 kubenswrapper[9136]: E1203 22:02:41.730478 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=operator-controller-controller-manager-5f78c89466-kz8nk_openshift-operator-controller(da949cf7-ab12-43ff-8e45-da1c2fd46e20)\"" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" podUID="da949cf7-ab12-43ff-8e45-da1c2fd46e20" Dec 03 22:02:41.763193 master-0 kubenswrapper[9136]: I1203 22:02:41.763123 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:41.763193 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:41.763193 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:41.763193 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:41.763599 master-0 kubenswrapper[9136]: I1203 22:02:41.763215 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:41.907941 master-0 kubenswrapper[9136]: I1203 22:02:41.907837 9136 scope.go:117] "RemoveContainer" containerID="e0793e31f5fb489b6b15bff62f7f4a34ffd9e051976a5af997f6a70e4540749e" Dec 03 22:02:41.908732 master-0 kubenswrapper[9136]: E1203 22:02:41.908184 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:02:42.369045 master-0 kubenswrapper[9136]: I1203 22:02:42.368925 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:02:42.369978 master-0 kubenswrapper[9136]: I1203 22:02:42.369937 9136 scope.go:117] "RemoveContainer" containerID="9902d273d5ff7cb502b4442eacaaf41f2951f19d1c2e446bfccd73ed96a23e7a" Dec 03 22:02:42.740580 master-0 kubenswrapper[9136]: I1203 22:02:42.740502 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-kz8nk_da949cf7-ab12-43ff-8e45-da1c2fd46e20/manager/1.log" Dec 03 22:02:42.752109 master-0 kubenswrapper[9136]: I1203 22:02:42.752045 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/1.log" Dec 03 22:02:42.764022 master-0 kubenswrapper[9136]: I1203 22:02:42.763931 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:42.764022 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:42.764022 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:42.764022 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:42.764022 master-0 kubenswrapper[9136]: I1203 22:02:42.764019 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:43.763171 master-0 kubenswrapper[9136]: I1203 22:02:43.763093 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:43.763171 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:43.763171 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:43.763171 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:43.764151 master-0 kubenswrapper[9136]: I1203 22:02:43.763186 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:43.764224 master-0 kubenswrapper[9136]: I1203 22:02:43.764186 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/1.log" Dec 03 22:02:43.764859 master-0 kubenswrapper[9136]: I1203 22:02:43.764796 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" event={"ID":"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6","Type":"ContainerStarted","Data":"70d822c488994edd094331f49273f2e8f1461ab267cf95b3c24ba97b255a3f6c"} Dec 03 22:02:43.765161 master-0 kubenswrapper[9136]: I1203 22:02:43.765113 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:02:44.764175 master-0 kubenswrapper[9136]: I1203 22:02:44.764076 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:44.764175 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:44.764175 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:44.764175 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:44.765257 master-0 kubenswrapper[9136]: I1203 22:02:44.764198 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:44.915252 master-0 kubenswrapper[9136]: I1203 22:02:44.915153 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:02:44.916189 master-0 kubenswrapper[9136]: I1203 22:02:44.916110 9136 scope.go:117] "RemoveContainer" containerID="b0ed7d6d004009e716a466752b32cae3db21211bdd49340bde3ae4a9d649b34e" Dec 03 22:02:44.916529 master-0 kubenswrapper[9136]: E1203 22:02:44.916476 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=operator-controller-controller-manager-5f78c89466-kz8nk_openshift-operator-controller(da949cf7-ab12-43ff-8e45-da1c2fd46e20)\"" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" podUID="da949cf7-ab12-43ff-8e45-da1c2fd46e20" Dec 03 22:02:45.763515 master-0 kubenswrapper[9136]: I1203 22:02:45.763430 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:45.763515 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:45.763515 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:45.763515 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:45.763854 master-0 kubenswrapper[9136]: I1203 22:02:45.763541 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:46.764424 master-0 kubenswrapper[9136]: I1203 22:02:46.764341 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:46.764424 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:46.764424 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:46.764424 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:46.765710 master-0 kubenswrapper[9136]: I1203 22:02:46.764432 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:47.763844 master-0 kubenswrapper[9136]: I1203 22:02:47.763737 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:47.763844 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:47.763844 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:47.763844 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:47.764320 master-0 kubenswrapper[9136]: I1203 22:02:47.763846 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:48.764277 master-0 kubenswrapper[9136]: I1203 22:02:48.764181 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:48.764277 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:48.764277 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:48.764277 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:48.765010 master-0 kubenswrapper[9136]: I1203 22:02:48.764305 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:49.764639 master-0 kubenswrapper[9136]: I1203 22:02:49.764558 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:49.764639 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:49.764639 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:49.764639 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:49.765726 master-0 kubenswrapper[9136]: I1203 22:02:49.764653 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:49.820500 master-0 kubenswrapper[9136]: I1203 22:02:49.820407 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/4.log" Dec 03 22:02:49.821692 master-0 kubenswrapper[9136]: I1203 22:02:49.821635 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/3.log" Dec 03 22:02:49.822545 master-0 kubenswrapper[9136]: I1203 22:02:49.822492 9136 generic.go:334] "Generic (PLEG): container finished" podID="0869de9b-6f5b-4c31-81ad-02a9c8888193" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" exitCode=1 Dec 03 22:02:49.822758 master-0 kubenswrapper[9136]: I1203 22:02:49.822542 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerDied","Data":"fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f"} Dec 03 22:02:49.823012 master-0 kubenswrapper[9136]: I1203 22:02:49.822986 9136 scope.go:117] "RemoveContainer" containerID="70aab6c0f99a200adb05bf0466bae07bea09c3ff8ddc008c24e8eb7d9ae3808d" Dec 03 22:02:49.823822 master-0 kubenswrapper[9136]: I1203 22:02:49.823735 9136 scope.go:117] "RemoveContainer" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" Dec 03 22:02:49.824240 master-0 kubenswrapper[9136]: E1203 22:02:49.824203 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:02:50.763827 master-0 kubenswrapper[9136]: I1203 22:02:50.763707 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:50.763827 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:50.763827 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:50.763827 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:50.764291 master-0 kubenswrapper[9136]: I1203 22:02:50.763859 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:50.834550 master-0 kubenswrapper[9136]: I1203 22:02:50.834460 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/4.log" Dec 03 22:02:50.907905 master-0 kubenswrapper[9136]: I1203 22:02:50.907760 9136 scope.go:117] "RemoveContainer" containerID="8d502fe6a4b80a8d86ec823cb141773d9b14b3a8b0f0a224e12fcddbb0204483" Dec 03 22:02:51.763644 master-0 kubenswrapper[9136]: I1203 22:02:51.763521 9136 patch_prober.go:28] interesting 
pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:51.763644 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:51.763644 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:51.763644 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:51.763644 master-0 kubenswrapper[9136]: I1203 22:02:51.763632 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:51.847938 master-0 kubenswrapper[9136]: I1203 22:02:51.847831 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" event={"ID":"a4399d20-f9a6-4ab1-86be-e2845394eaba","Type":"ContainerStarted","Data":"1f34be941b3fc23e5df25dd00be7228b904591ae9bbedc66623d73e0baeab2fe"} Dec 03 22:02:51.849022 master-0 kubenswrapper[9136]: I1203 22:02:51.848385 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:02:51.853403 master-0 kubenswrapper[9136]: I1203 22:02:51.853321 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:02:52.372982 master-0 kubenswrapper[9136]: I1203 22:02:52.372882 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:02:52.763194 master-0 kubenswrapper[9136]: I1203 22:02:52.763002 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:52.763194 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:52.763194 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:52.763194 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:52.763194 master-0 kubenswrapper[9136]: I1203 22:02:52.763101 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:53.763904 master-0 kubenswrapper[9136]: I1203 22:02:53.763736 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:53.763904 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:53.763904 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:53.763904 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:53.764907 master-0 kubenswrapper[9136]: I1203 22:02:53.763902 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:02:53.805905 master-0 kubenswrapper[9136]: E1203 22:02:53.805686 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dd30b277303dd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d0bb91faa6e9f82b589a6535665e51517abe4a1b2eb5d0b3a36b36df6a5330a0\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:51:13.859052509 +0000 UTC m=+80.134228931,LastTimestamp:2025-12-03 22:01:06.90959899 +0000 UTC m=+673.184775402,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 22:02:54.763600 master-0 kubenswrapper[9136]: I1203 22:02:54.763502 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:54.763600 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:54.763600 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:54.763600 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:54.763600 master-0 kubenswrapper[9136]: I1203 22:02:54.763580 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:54.820035 master-0 kubenswrapper[9136]: E1203 22:02:54.819947 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 22:02:54.915621 master-0 kubenswrapper[9136]: I1203 22:02:54.915565 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:02:54.916864 master-0 kubenswrapper[9136]: I1203 22:02:54.916847 9136 scope.go:117] "RemoveContainer" containerID="b0ed7d6d004009e716a466752b32cae3db21211bdd49340bde3ae4a9d649b34e" Dec 03 22:02:55.764338 master-0 kubenswrapper[9136]: I1203 22:02:55.764226 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:55.764338 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:55.764338 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:55.764338 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:55.765396 master-0 kubenswrapper[9136]: I1203 22:02:55.764334 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" 
podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:55.883763 master-0 kubenswrapper[9136]: I1203 22:02:55.883681 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-kz8nk_da949cf7-ab12-43ff-8e45-da1c2fd46e20/manager/1.log" Dec 03 22:02:55.885859 master-0 kubenswrapper[9136]: I1203 22:02:55.884441 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" event={"ID":"da949cf7-ab12-43ff-8e45-da1c2fd46e20","Type":"ContainerStarted","Data":"2bf44f57e639b84148478a6b8b46ef8c208137297e49d31d8000cda106459c88"} Dec 03 22:02:55.885859 master-0 kubenswrapper[9136]: I1203 22:02:55.884890 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:02:56.763747 master-0 kubenswrapper[9136]: I1203 22:02:56.763630 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:56.763747 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:56.763747 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:56.763747 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:56.763747 master-0 kubenswrapper[9136]: I1203 22:02:56.763748 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:56.907452 master-0 kubenswrapper[9136]: I1203 22:02:56.907407 9136 scope.go:117] "RemoveContainer" containerID="e0793e31f5fb489b6b15bff62f7f4a34ffd9e051976a5af997f6a70e4540749e" Dec 03 22:02:56.908280 master-0 kubenswrapper[9136]: E1203 22:02:56.907608 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:02:57.764126 master-0 kubenswrapper[9136]: I1203 22:02:57.763995 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:57.764126 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:57.764126 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:57.764126 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:57.764126 master-0 kubenswrapper[9136]: I1203 22:02:57.764111 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:57.904934 master-0 
kubenswrapper[9136]: I1203 22:02:57.904821 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm_82055cfc-b4ce-4a00-a51d-141059947693/etcd-operator/1.log" Dec 03 22:02:57.904934 master-0 kubenswrapper[9136]: I1203 22:02:57.904912 9136 generic.go:334] "Generic (PLEG): container finished" podID="82055cfc-b4ce-4a00-a51d-141059947693" containerID="bcd8c2f49452655817240e666a6a44f7aa1bd99557f7056f662406437b17e8fd" exitCode=0 Dec 03 22:02:57.905397 master-0 kubenswrapper[9136]: I1203 22:02:57.904989 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerDied","Data":"bcd8c2f49452655817240e666a6a44f7aa1bd99557f7056f662406437b17e8fd"} Dec 03 22:02:57.905397 master-0 kubenswrapper[9136]: I1203 22:02:57.905116 9136 scope.go:117] "RemoveContainer" containerID="14ce5aa72512f80cbae852a84578e1ef46771d28aa2cb1798dd058e96128786a" Dec 03 22:02:57.905941 master-0 kubenswrapper[9136]: I1203 22:02:57.905881 9136 scope.go:117] "RemoveContainer" containerID="bcd8c2f49452655817240e666a6a44f7aa1bd99557f7056f662406437b17e8fd" Dec 03 22:02:57.906299 master-0 kubenswrapper[9136]: E1203 22:02:57.906244 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator(82055cfc-b4ce-4a00-a51d-141059947693)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" Dec 03 22:02:58.559542 master-0 kubenswrapper[9136]: E1203 22:02:58.559451 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 03 22:02:58.764342 master-0 kubenswrapper[9136]: I1203 22:02:58.763991 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:58.764342 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:58.764342 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:58.764342 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:58.764342 master-0 kubenswrapper[9136]: I1203 22:02:58.764078 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:58.917953 master-0 kubenswrapper[9136]: I1203 22:02:58.917914 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-cb84b9cdf-wkcnd_01b80ad5-7d7c-4ecd-90b0-2913d4559b5f/machine-approver-controller/0.log" Dec 03 22:02:58.918689 master-0 kubenswrapper[9136]: I1203 22:02:58.918631 9136 generic.go:334] "Generic (PLEG): container finished" podID="01b80ad5-7d7c-4ecd-90b0-2913d4559b5f" containerID="627c6aedb1aa983e6e6dc6c1e7265386ba60120ffcae5ff232be95dd9b9911e5" exitCode=255 Dec 03 22:02:58.918858 master-0 kubenswrapper[9136]: I1203 22:02:58.918688 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" event={"ID":"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f","Type":"ContainerDied","Data":"627c6aedb1aa983e6e6dc6c1e7265386ba60120ffcae5ff232be95dd9b9911e5"} Dec 03 22:02:58.919099 master-0 kubenswrapper[9136]: I1203 22:02:58.919071 9136 scope.go:117] "RemoveContainer" containerID="627c6aedb1aa983e6e6dc6c1e7265386ba60120ffcae5ff232be95dd9b9911e5" Dec 03 22:02:58.926672 master-0 kubenswrapper[9136]: I1203 22:02:58.926577 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"ff61a256b05f04a3b13b90f6bdc5da43d7f265eb788e86c58f6473565825003c"} Dec 03 22:02:58.927152 master-0 kubenswrapper[9136]: I1203 22:02:58.927087 9136 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:02:58.927152 master-0 kubenswrapper[9136]: I1203 22:02:58.927133 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:02:59.764984 master-0 kubenswrapper[9136]: I1203 22:02:59.764882 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:02:59.764984 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:02:59.764984 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:02:59.764984 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:02:59.764984 master-0 kubenswrapper[9136]: I1203 22:02:59.764979 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:02:59.939329 master-0 kubenswrapper[9136]: I1203 22:02:59.939256 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-cb84b9cdf-wkcnd_01b80ad5-7d7c-4ecd-90b0-2913d4559b5f/machine-approver-controller/0.log" Dec 03 22:02:59.940008 master-0 kubenswrapper[9136]: I1203 22:02:59.939922 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" event={"ID":"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f","Type":"ContainerStarted","Data":"a0ad19c41abc80323b5e15f46acd1505e37378ef0fe5ad50bb495338908fbcc1"} Dec 03 22:02:59.943380 master-0 kubenswrapper[9136]: I1203 22:02:59.943307 9136 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="ff61a256b05f04a3b13b90f6bdc5da43d7f265eb788e86c58f6473565825003c" exitCode=0 Dec 03 22:02:59.943512 master-0 kubenswrapper[9136]: I1203 22:02:59.943381 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"ff61a256b05f04a3b13b90f6bdc5da43d7f265eb788e86c58f6473565825003c"} Dec 03 22:03:00.682986 master-0 kubenswrapper[9136]: E1203 22:03:00.682906 9136 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8da5d44_680e_4169_abc6_607bdc37a64d.slice/crio-conmon-2e9dd252ba63a20079bc2c561a2c3f98ba9d8e4bd546eae6aff13854e70071f7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8da5d44_680e_4169_abc6_607bdc37a64d.slice/crio-2e9dd252ba63a20079bc2c561a2c3f98ba9d8e4bd546eae6aff13854e70071f7.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:03:00.763234 master-0 kubenswrapper[9136]: I1203 22:03:00.763127 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:00.763234 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:00.763234 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:00.763234 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:00.763536 master-0 kubenswrapper[9136]: I1203 22:03:00.763294 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:00.955911 master-0 kubenswrapper[9136]: I1203 22:03:00.955830 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh_c8da5d44-680e-4169-abc6-607bdc37a64d/cluster-olm-operator/1.log" Dec 03 22:03:00.957247 master-0 kubenswrapper[9136]: I1203 22:03:00.957193 9136 generic.go:334] "Generic (PLEG): container finished" podID="c8da5d44-680e-4169-abc6-607bdc37a64d" containerID="2e9dd252ba63a20079bc2c561a2c3f98ba9d8e4bd546eae6aff13854e70071f7" exitCode=0 Dec 03 22:03:00.957355 master-0 kubenswrapper[9136]: I1203 22:03:00.957300 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerDied","Data":"2e9dd252ba63a20079bc2c561a2c3f98ba9d8e4bd546eae6aff13854e70071f7"} Dec 03 22:03:00.957428 master-0 kubenswrapper[9136]: I1203 22:03:00.957397 9136 scope.go:117] "RemoveContainer" containerID="d544b409fd7217f0540c9141cdb2568edb09ef4b3389d770b5af4b42fbce8988" Dec 03 22:03:00.958299 master-0 kubenswrapper[9136]: I1203 22:03:00.958261 9136 scope.go:117] "RemoveContainer" containerID="2e9dd252ba63a20079bc2c561a2c3f98ba9d8e4bd546eae6aff13854e70071f7" Dec 03 22:03:00.960012 master-0 kubenswrapper[9136]: E1203 22:03:00.959927 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator(c8da5d44-680e-4169-abc6-607bdc37a64d)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" podUID="c8da5d44-680e-4169-abc6-607bdc37a64d" Dec 03 22:03:00.960884 master-0 kubenswrapper[9136]: I1203 22:03:00.960844 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh_fdfbaebe-d655-4c1e-a039-08802c5c35c5/kube-controller-manager-operator/1.log" Dec 03 22:03:00.960939 master-0 kubenswrapper[9136]: I1203 22:03:00.960916 9136 
generic.go:334] "Generic (PLEG): container finished" podID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" containerID="5170b74c433fdae3d3f431b680472bc1c20377225f200538e8e7e7d355669e04" exitCode=0 Dec 03 22:03:00.960973 master-0 kubenswrapper[9136]: I1203 22:03:00.960956 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerDied","Data":"5170b74c433fdae3d3f431b680472bc1c20377225f200538e8e7e7d355669e04"} Dec 03 22:03:00.961628 master-0 kubenswrapper[9136]: I1203 22:03:00.961589 9136 scope.go:117] "RemoveContainer" containerID="5170b74c433fdae3d3f431b680472bc1c20377225f200538e8e7e7d355669e04" Dec 03 22:03:00.961985 master-0 kubenswrapper[9136]: E1203 22:03:00.961943 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator(fdfbaebe-d655-4c1e-a039-08802c5c35c5)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" podUID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" Dec 03 22:03:01.012352 master-0 kubenswrapper[9136]: I1203 22:03:01.012290 9136 scope.go:117] "RemoveContainer" containerID="ac757c18056258c4d53cfe54ce6d097e1c1edb7723c2d58dc025a8fed4ae755c" Dec 03 22:03:01.763760 master-0 kubenswrapper[9136]: I1203 22:03:01.763587 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:01.763760 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:01.763760 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:01.763760 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:01.763760 master-0 kubenswrapper[9136]: I1203 22:03:01.763691 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:01.908071 master-0 kubenswrapper[9136]: I1203 22:03:01.907974 9136 scope.go:117] "RemoveContainer" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" Dec 03 22:03:01.908475 master-0 kubenswrapper[9136]: E1203 22:03:01.908416 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:03:01.977312 master-0 kubenswrapper[9136]: I1203 22:03:01.977224 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-hv5z8_6e96335e-1866-41c8-b128-b95e783a9be4/cluster-storage-operator/0.log" Dec 03 22:03:01.977312 master-0 kubenswrapper[9136]: I1203 22:03:01.977316 9136 generic.go:334] "Generic (PLEG): container finished" 
podID="6e96335e-1866-41c8-b128-b95e783a9be4" containerID="575a6b466c21bed63405fda68f3a17896616983e54bbfaf59e5919bc2381d15d" exitCode=0 Dec 03 22:03:01.978401 master-0 kubenswrapper[9136]: I1203 22:03:01.977375 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" event={"ID":"6e96335e-1866-41c8-b128-b95e783a9be4","Type":"ContainerDied","Data":"575a6b466c21bed63405fda68f3a17896616983e54bbfaf59e5919bc2381d15d"} Dec 03 22:03:01.978401 master-0 kubenswrapper[9136]: I1203 22:03:01.977497 9136 scope.go:117] "RemoveContainer" containerID="2e3a2cbe5d1d0f700f443f5a9e3ce8b40d65f0655e3c96ce54030b35eb469f7b" Dec 03 22:03:01.979282 master-0 kubenswrapper[9136]: I1203 22:03:01.979201 9136 scope.go:117] "RemoveContainer" containerID="575a6b466c21bed63405fda68f3a17896616983e54bbfaf59e5919bc2381d15d" Dec 03 22:03:02.763965 master-0 kubenswrapper[9136]: I1203 22:03:02.763752 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:02.763965 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:02.763965 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:02.763965 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:02.763965 master-0 kubenswrapper[9136]: I1203 22:03:02.763891 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:02.992816 master-0 kubenswrapper[9136]: I1203 22:03:02.992735 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-6mvwr_c1ee4db7-f2d3-4064-a189-f66fd0a021eb/kube-scheduler-operator-container/1.log" Dec 03 22:03:02.994060 master-0 kubenswrapper[9136]: I1203 22:03:02.992836 9136 generic.go:334] "Generic (PLEG): container finished" podID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" containerID="4d0e6309b7fa9a40d508e6cabe45feb51aab58957078aefd318e41e113083569" exitCode=0 Dec 03 22:03:02.994060 master-0 kubenswrapper[9136]: I1203 22:03:02.992912 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerDied","Data":"4d0e6309b7fa9a40d508e6cabe45feb51aab58957078aefd318e41e113083569"} Dec 03 22:03:02.994060 master-0 kubenswrapper[9136]: I1203 22:03:02.992955 9136 scope.go:117] "RemoveContainer" containerID="b5c51e812810828f3868a9b23aec61a5698eeb487f28ca92e49346f8d56c307a" Dec 03 22:03:02.994060 master-0 kubenswrapper[9136]: I1203 22:03:02.993556 9136 scope.go:117] "RemoveContainer" containerID="4d0e6309b7fa9a40d508e6cabe45feb51aab58957078aefd318e41e113083569" Dec 03 22:03:02.994060 master-0 kubenswrapper[9136]: E1203 22:03:02.993955 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-6mvwr_openshift-kube-scheduler-operator(c1ee4db7-f2d3-4064-a189-f66fd0a021eb)\"" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" podUID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" Dec 03 22:03:02.997499 master-0 kubenswrapper[9136]: I1203 22:03:02.997455 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" event={"ID":"6e96335e-1866-41c8-b128-b95e783a9be4","Type":"ContainerStarted","Data":"16ea4a93de4863ed15876a3040be9a6845e308ffe81f55c8cf6208378385e0da"} Dec 03 22:03:03.764079 master-0 kubenswrapper[9136]: I1203 22:03:03.763983 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:03.764079 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:03.764079 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:03.764079 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:03.764620 master-0 kubenswrapper[9136]: I1203 22:03:03.764101 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:04.092302 master-0 kubenswrapper[9136]: I1203 22:03:04.092050 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:03:04.093460 master-0 kubenswrapper[9136]: I1203 22:03:04.093401 9136 scope.go:117] "RemoveContainer" containerID="bcd8c2f49452655817240e666a6a44f7aa1bd99557f7056f662406437b17e8fd" Dec 03 22:03:04.094047 master-0 kubenswrapper[9136]: E1203 22:03:04.093974 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator(82055cfc-b4ce-4a00-a51d-141059947693)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" Dec 03 22:03:04.764541 master-0 kubenswrapper[9136]: I1203 22:03:04.764429 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:04.764541 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:04.764541 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:04.764541 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:04.765098 master-0 kubenswrapper[9136]: I1203 22:03:04.764557 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:04.919668 master-0 kubenswrapper[9136]: I1203 22:03:04.919587 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:03:05.763958 master-0 kubenswrapper[9136]: I1203 22:03:05.763796 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:05.763958 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:05.763958 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:05.763958 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:05.763958 master-0 kubenswrapper[9136]: I1203 22:03:05.763888 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:06.037676 master-0 kubenswrapper[9136]: I1203 22:03:06.037488 9136 generic.go:334] "Generic (PLEG): container finished" podID="08432be8-0086-48d2-a93d-7a474e96749d" containerID="d1bdd8b92f1b53e27dbdc370a3e552cba7eb4014b85281c160413adf1ac3135b" exitCode=0 Dec 03 22:03:06.037676 master-0 kubenswrapper[9136]: I1203 22:03:06.037616 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" event={"ID":"08432be8-0086-48d2-a93d-7a474e96749d","Type":"ContainerDied","Data":"d1bdd8b92f1b53e27dbdc370a3e552cba7eb4014b85281c160413adf1ac3135b"} Dec 03 22:03:06.037676 master-0 kubenswrapper[9136]: I1203 22:03:06.037674 9136 scope.go:117] "RemoveContainer" containerID="ffe4f10866cf1bc36713ebdb04a86f2cd5ff92ba34f253339cffa02ccd5a5e66" Dec 03 22:03:06.038554 master-0 kubenswrapper[9136]: I1203 22:03:06.038486 9136 scope.go:117] "RemoveContainer" containerID="d1bdd8b92f1b53e27dbdc370a3e552cba7eb4014b85281c160413adf1ac3135b" Dec 03 22:03:06.041598 master-0 kubenswrapper[9136]: I1203 22:03:06.041541 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/1.log" Dec 03 22:03:06.041692 master-0 kubenswrapper[9136]: I1203 22:03:06.041626 9136 generic.go:334] "Generic (PLEG): container finished" podID="6976b503-87da-48fc-b097-d1b315fbee3f" containerID="bfbf4959ede0c62e98f1d09bfdef8ec6864f4ae84b25850b8bd4a7f61685eda6" exitCode=0 Dec 03 22:03:06.041803 master-0 kubenswrapper[9136]: I1203 22:03:06.041739 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerDied","Data":"bfbf4959ede0c62e98f1d09bfdef8ec6864f4ae84b25850b8bd4a7f61685eda6"} Dec 03 22:03:06.042719 master-0 kubenswrapper[9136]: I1203 22:03:06.042614 9136 scope.go:117] "RemoveContainer" containerID="bfbf4959ede0c62e98f1d09bfdef8ec6864f4ae84b25850b8bd4a7f61685eda6" Dec 03 22:03:06.043132 master-0 kubenswrapper[9136]: E1203 22:03:06.043047 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator(6976b503-87da-48fc-b097-d1b315fbee3f)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" podUID="6976b503-87da-48fc-b097-d1b315fbee3f" Dec 03 22:03:06.046376 master-0 kubenswrapper[9136]: I1203 
22:03:06.046314 9136 generic.go:334] "Generic (PLEG): container finished" podID="29ac4a9d-1228-49c7-9051-338e7dc98a38" containerID="5c6cff0db5f54508702a0378b5a2fcfe25e7d02bb8251b1a5dc1e5b0d7ac5a5a" exitCode=0 Dec 03 22:03:06.046376 master-0 kubenswrapper[9136]: I1203 22:03:06.046370 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" event={"ID":"29ac4a9d-1228-49c7-9051-338e7dc98a38","Type":"ContainerDied","Data":"5c6cff0db5f54508702a0378b5a2fcfe25e7d02bb8251b1a5dc1e5b0d7ac5a5a"} Dec 03 22:03:06.047116 master-0 kubenswrapper[9136]: I1203 22:03:06.047062 9136 scope.go:117] "RemoveContainer" containerID="5c6cff0db5f54508702a0378b5a2fcfe25e7d02bb8251b1a5dc1e5b0d7ac5a5a" Dec 03 22:03:06.047426 master-0 kubenswrapper[9136]: E1203 22:03:06.047377 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-cluster-manager pod=ovnkube-control-plane-f9f7f4946-8qg8w_openshift-ovn-kubernetes(29ac4a9d-1228-49c7-9051-338e7dc98a38)\"" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" podUID="29ac4a9d-1228-49c7-9051-338e7dc98a38" Dec 03 22:03:06.092379 master-0 kubenswrapper[9136]: I1203 22:03:06.092320 9136 scope.go:117] "RemoveContainer" containerID="ed377a768f3ae617a354166679d4faec90c18a1d576ddc666702319bbce47318" Dec 03 22:03:06.174157 master-0 kubenswrapper[9136]: I1203 22:03:06.174102 9136 scope.go:117] "RemoveContainer" containerID="efd056e2ae598223549439d740b1aee6a580ddac5417acae41c47f9e3c130bb0" Dec 03 22:03:06.763809 master-0 kubenswrapper[9136]: I1203 22:03:06.763686 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:06.763809 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:06.763809 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:06.763809 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:06.763809 master-0 kubenswrapper[9136]: I1203 22:03:06.763807 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:07.059542 master-0 kubenswrapper[9136]: I1203 22:03:07.059337 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" event={"ID":"08432be8-0086-48d2-a93d-7a474e96749d","Type":"ContainerStarted","Data":"53a60280703adaa150a5f659528627f228a9f7e4e33c25a05a6a83bbcae2a4c4"} Dec 03 22:03:07.763919 master-0 kubenswrapper[9136]: I1203 22:03:07.763824 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:07.763919 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:07.763919 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:07.763919 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:07.763919 master-0 kubenswrapper[9136]: I1203 22:03:07.763914 9136 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:08.075757 master-0 kubenswrapper[9136]: I1203 22:03:08.075531 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-7f88444875-kb5rx_858384f3-5741-4e67-8669-2eb2b2dcaf7f/cluster-autoscaler-operator/0.log" Dec 03 22:03:08.076242 master-0 kubenswrapper[9136]: I1203 22:03:08.076165 9136 generic.go:334] "Generic (PLEG): container finished" podID="858384f3-5741-4e67-8669-2eb2b2dcaf7f" containerID="4c2adc08380436f319ebce0bff4387679c426ba1840c8e9241539270a64e7dab" exitCode=255 Dec 03 22:03:08.076367 master-0 kubenswrapper[9136]: I1203 22:03:08.076231 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" event={"ID":"858384f3-5741-4e67-8669-2eb2b2dcaf7f","Type":"ContainerDied","Data":"4c2adc08380436f319ebce0bff4387679c426ba1840c8e9241539270a64e7dab"} Dec 03 22:03:08.077400 master-0 kubenswrapper[9136]: I1203 22:03:08.077341 9136 scope.go:117] "RemoveContainer" containerID="4c2adc08380436f319ebce0bff4387679c426ba1840c8e9241539270a64e7dab" Dec 03 22:03:08.763817 master-0 kubenswrapper[9136]: I1203 22:03:08.763736 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:08.763817 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:08.763817 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:08.763817 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:08.764998 master-0 kubenswrapper[9136]: I1203 22:03:08.763830 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:08.908760 master-0 kubenswrapper[9136]: I1203 22:03:08.908680 9136 scope.go:117] "RemoveContainer" containerID="e0793e31f5fb489b6b15bff62f7f4a34ffd9e051976a5af997f6a70e4540749e" Dec 03 22:03:09.086799 master-0 kubenswrapper[9136]: I1203 22:03:09.086719 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-7f88444875-kb5rx_858384f3-5741-4e67-8669-2eb2b2dcaf7f/cluster-autoscaler-operator/0.log" Dec 03 22:03:09.087356 master-0 kubenswrapper[9136]: I1203 22:03:09.087304 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" event={"ID":"858384f3-5741-4e67-8669-2eb2b2dcaf7f","Type":"ContainerStarted","Data":"44294f2d0cc666cec446bb1aa070412b1c4a0018771207158378af4732aa03ea"} Dec 03 22:03:09.658954 master-0 kubenswrapper[9136]: I1203 22:03:09.658876 9136 status_manager.go:851] "Failed to get status for pod" podUID="25602d69-3aec-487d-8d62-c2c21f27e2b7" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-4-retry-1-master-0)" Dec 03 22:03:09.763717 master-0 kubenswrapper[9136]: I1203 22:03:09.763629 9136 patch_prober.go:28] interesting 
pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:09.763717 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:09.763717 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:09.763717 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:09.764342 master-0 kubenswrapper[9136]: I1203 22:03:09.763737 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:10.103146 master-0 kubenswrapper[9136]: I1203 22:03:10.103079 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-vcd7b_50076985-bbaa-4bcf-9d1a-cc25bed016a7/kube-storage-version-migrator-operator/1.log" Dec 03 22:03:10.104026 master-0 kubenswrapper[9136]: I1203 22:03:10.103172 9136 generic.go:334] "Generic (PLEG): container finished" podID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" containerID="48ff7474453eab80959871a93dccaeb97ff6c697edec9c60c6932a4cf955d08f" exitCode=0 Dec 03 22:03:10.104026 master-0 kubenswrapper[9136]: I1203 22:03:10.103277 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerDied","Data":"48ff7474453eab80959871a93dccaeb97ff6c697edec9c60c6932a4cf955d08f"} Dec 03 22:03:10.104026 master-0 kubenswrapper[9136]: I1203 22:03:10.103375 9136 scope.go:117] "RemoveContainer" containerID="8820c33df939aa1daa4c748a1b14d13d60d4ea6314f7234562766dc5cbdcadf8" Dec 03 22:03:10.104250 master-0 kubenswrapper[9136]: I1203 22:03:10.104029 9136 scope.go:117] "RemoveContainer" containerID="48ff7474453eab80959871a93dccaeb97ff6c697edec9c60c6932a4cf955d08f" Dec 03 22:03:10.104424 master-0 kubenswrapper[9136]: E1203 22:03:10.104366 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-vcd7b_openshift-kube-storage-version-migrator-operator(50076985-bbaa-4bcf-9d1a-cc25bed016a7)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" podUID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" Dec 03 22:03:10.106169 master-0 kubenswrapper[9136]: I1203 22:03:10.106119 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/3.log" Dec 03 22:03:10.106266 master-0 kubenswrapper[9136]: I1203 22:03:10.106210 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerStarted","Data":"416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252"} Dec 03 22:03:10.763068 master-0 kubenswrapper[9136]: I1203 22:03:10.762959 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:10.763068 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:10.763068 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:10.763068 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:10.763420 master-0 kubenswrapper[9136]: I1203 22:03:10.763075 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:11.121137 master-0 kubenswrapper[9136]: I1203 22:03:11.121086 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/1.log" Dec 03 22:03:11.122030 master-0 kubenswrapper[9136]: I1203 22:03:11.121940 9136 generic.go:334] "Generic (PLEG): container finished" podID="892d5611-debf-402f-abc5-3f99aa080159" containerID="ced847df1e024d8eb90b3c6eada46b7c6dd7f30888b68484b877e825feca3e2e" exitCode=0 Dec 03 22:03:11.122121 master-0 kubenswrapper[9136]: I1203 22:03:11.122031 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerDied","Data":"ced847df1e024d8eb90b3c6eada46b7c6dd7f30888b68484b877e825feca3e2e"} Dec 03 22:03:11.122121 master-0 kubenswrapper[9136]: I1203 22:03:11.122102 9136 scope.go:117] "RemoveContainer" containerID="db4f2041dbd606d645ef97beb6cb4387e780516338543969beb7168c5691d4f3" Dec 03 22:03:11.122910 master-0 kubenswrapper[9136]: I1203 22:03:11.122852 9136 scope.go:117] "RemoveContainer" containerID="ced847df1e024d8eb90b3c6eada46b7c6dd7f30888b68484b877e825feca3e2e" Dec 03 22:03:11.123353 master-0 kubenswrapper[9136]: E1203 22:03:11.123301 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=network-operator pod=network-operator-6cbf58c977-zk7jw_openshift-network-operator(892d5611-debf-402f-abc5-3f99aa080159)\"" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" podUID="892d5611-debf-402f-abc5-3f99aa080159" Dec 03 22:03:11.763627 master-0 kubenswrapper[9136]: I1203 22:03:11.763514 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:11.763627 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:11.763627 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:11.763627 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:11.763627 master-0 kubenswrapper[9136]: I1203 22:03:11.763618 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:11.821815 master-0 kubenswrapper[9136]: E1203 22:03:11.821676 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 22:03:11.908212 master-0 kubenswrapper[9136]: I1203 22:03:11.908096 9136 scope.go:117] "RemoveContainer" containerID="2e9dd252ba63a20079bc2c561a2c3f98ba9d8e4bd546eae6aff13854e70071f7" Dec 03 22:03:11.908540 master-0 kubenswrapper[9136]: E1203 22:03:11.908491 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator(c8da5d44-680e-4169-abc6-607bdc37a64d)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" podUID="c8da5d44-680e-4169-abc6-607bdc37a64d" Dec 03 22:03:12.763614 master-0 kubenswrapper[9136]: I1203 22:03:12.763504 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:12.763614 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:12.763614 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:12.763614 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:12.763614 master-0 kubenswrapper[9136]: I1203 22:03:12.763613 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:12.908719 master-0 kubenswrapper[9136]: I1203 22:03:12.908617 9136 scope.go:117] "RemoveContainer" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" Dec 03 22:03:12.909189 master-0 kubenswrapper[9136]: E1203 22:03:12.909105 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:03:13.763237 master-0 kubenswrapper[9136]: I1203 22:03:13.763184 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:13.763237 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:13.763237 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:13.763237 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:13.763461 master-0 kubenswrapper[9136]: I1203 22:03:13.763282 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:13.908998 master-0 kubenswrapper[9136]: I1203 22:03:13.908916 9136 scope.go:117] "RemoveContainer" 
containerID="5170b74c433fdae3d3f431b680472bc1c20377225f200538e8e7e7d355669e04" Dec 03 22:03:13.909999 master-0 kubenswrapper[9136]: E1203 22:03:13.909210 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator(fdfbaebe-d655-4c1e-a039-08802c5c35c5)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" podUID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" Dec 03 22:03:14.152356 master-0 kubenswrapper[9136]: I1203 22:03:14.152262 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/0.log" Dec 03 22:03:14.152931 master-0 kubenswrapper[9136]: I1203 22:03:14.152865 9136 generic.go:334] "Generic (PLEG): container finished" podID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerID="1b9a68b725c78d3ab9d3fdd39dac94fca8569e0861e77bcec964ab3b7762206f" exitCode=0 Dec 03 22:03:14.153043 master-0 kubenswrapper[9136]: I1203 22:03:14.153000 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerDied","Data":"1b9a68b725c78d3ab9d3fdd39dac94fca8569e0861e77bcec964ab3b7762206f"} Dec 03 22:03:14.153117 master-0 kubenswrapper[9136]: I1203 22:03:14.153081 9136 scope.go:117] "RemoveContainer" containerID="3870fd945d6202d34fa9215ae4b7aebad43b7e8cbdd1682883399a063c91b9d2" Dec 03 22:03:14.153986 master-0 kubenswrapper[9136]: I1203 22:03:14.153936 9136 scope.go:117] "RemoveContainer" containerID="1b9a68b725c78d3ab9d3fdd39dac94fca8569e0861e77bcec964ab3b7762206f" Dec 03 22:03:14.154460 master-0 kubenswrapper[9136]: E1203 22:03:14.154398 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:03:14.158706 master-0 kubenswrapper[9136]: I1203 22:03:14.158638 9136 generic.go:334] "Generic (PLEG): container finished" podID="f59094ec-47dd-4547-ad41-b15a7933f461" containerID="e2c788e798e45c9b371c7db0d83b84623d2d9cabb25ab2e95a2d107b202c0add" exitCode=0 Dec 03 22:03:14.158855 master-0 kubenswrapper[9136]: I1203 22:03:14.158760 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" event={"ID":"f59094ec-47dd-4547-ad41-b15a7933f461","Type":"ContainerDied","Data":"e2c788e798e45c9b371c7db0d83b84623d2d9cabb25ab2e95a2d107b202c0add"} Dec 03 22:03:14.159910 master-0 kubenswrapper[9136]: I1203 22:03:14.159830 9136 scope.go:117] "RemoveContainer" containerID="e2c788e798e45c9b371c7db0d83b84623d2d9cabb25ab2e95a2d107b202c0add" Dec 03 22:03:14.161688 master-0 kubenswrapper[9136]: I1203 22:03:14.161628 9136 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/3.log" Dec 03 22:03:14.162368 master-0 kubenswrapper[9136]: I1203 22:03:14.162323 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/2.log" Dec 03 22:03:14.163028 master-0 kubenswrapper[9136]: I1203 22:03:14.162960 9136 generic.go:334] "Generic (PLEG): container finished" podID="fa9b5917-d4f3-4372-a200-45b57412f92f" containerID="2c8f1554a8a69b7abb2ac2fb40ac35726b3a427f042bc6ab03c716500f2824ff" exitCode=1 Dec 03 22:03:14.163207 master-0 kubenswrapper[9136]: I1203 22:03:14.163111 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerDied","Data":"2c8f1554a8a69b7abb2ac2fb40ac35726b3a427f042bc6ab03c716500f2824ff"} Dec 03 22:03:14.164946 master-0 kubenswrapper[9136]: I1203 22:03:14.164812 9136 scope.go:117] "RemoveContainer" containerID="2c8f1554a8a69b7abb2ac2fb40ac35726b3a427f042bc6ab03c716500f2824ff" Dec 03 22:03:14.165490 master-0 kubenswrapper[9136]: E1203 22:03:14.165427 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5fdc576499-q9tf6_openshift-machine-api(fa9b5917-d4f3-4372-a200-45b57412f92f)\"" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" podUID="fa9b5917-d4f3-4372-a200-45b57412f92f" Dec 03 22:03:14.167280 master-0 kubenswrapper[9136]: I1203 22:03:14.167227 9136 generic.go:334] "Generic (PLEG): container finished" podID="05dd6e8e0dea56089da96190349dd4c1" containerID="8a613b3a7f370a81b82a9c80ed68357904c42d57cdeafb6362cea4bc2248eaba" exitCode=0 Dec 03 22:03:14.167342 master-0 kubenswrapper[9136]: I1203 22:03:14.167293 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerDied","Data":"8a613b3a7f370a81b82a9c80ed68357904c42d57cdeafb6362cea4bc2248eaba"} Dec 03 22:03:14.168097 master-0 kubenswrapper[9136]: I1203 22:03:14.168020 9136 scope.go:117] "RemoveContainer" containerID="8a613b3a7f370a81b82a9c80ed68357904c42d57cdeafb6362cea4bc2248eaba" Dec 03 22:03:14.215232 master-0 kubenswrapper[9136]: I1203 22:03:14.215173 9136 scope.go:117] "RemoveContainer" containerID="f8f51cc4e951397e53befcebae88ea2052970977c20019cfe472b3c9dcc46778" Dec 03 22:03:14.272236 master-0 kubenswrapper[9136]: I1203 22:03:14.272195 9136 scope.go:117] "RemoveContainer" containerID="2cc2d1edcdc07582d326dd4cbd41cc55661ae846989d317ed59433e25fa1cc16" Dec 03 22:03:14.763204 master-0 kubenswrapper[9136]: I1203 22:03:14.763117 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:14.763204 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:14.763204 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:14.763204 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:14.763859 master-0 
kubenswrapper[9136]: I1203 22:03:14.763214 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:14.811702 master-0 kubenswrapper[9136]: E1203 22:03:14.811630 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:03:04Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:03:04Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:03:04Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T22:03:04Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:03:15.178355 master-0 kubenswrapper[9136]: I1203 22:03:15.178284 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" event={"ID":"f59094ec-47dd-4547-ad41-b15a7933f461","Type":"ContainerStarted","Data":"694dfb3a5405c23ea842bfdfc58ff9ae7e4ed9b528fa489e10e2cc964b4f43d4"} Dec 03 22:03:15.181140 master-0 kubenswrapper[9136]: I1203 22:03:15.181058 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/3.log" Dec 03 22:03:15.184662 master-0 kubenswrapper[9136]: I1203 22:03:15.184622 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerStarted","Data":"eda227b3339e8d7fa804198da85eae80f678788107f4e300a3d0d50334b6969f"} Dec 03 22:03:15.483208 master-0 kubenswrapper[9136]: I1203 22:03:15.482983 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:03:15.483208 master-0 kubenswrapper[9136]: I1203 22:03:15.483079 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:03:15.484153 master-0 kubenswrapper[9136]: I1203 22:03:15.484103 9136 scope.go:117] "RemoveContainer" containerID="1b9a68b725c78d3ab9d3fdd39dac94fca8569e0861e77bcec964ab3b7762206f" Dec 03 22:03:15.484602 master-0 kubenswrapper[9136]: E1203 22:03:15.484535 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-config-operator 
pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:03:15.636612 master-0 kubenswrapper[9136]: I1203 22:03:15.636532 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:03:15.636612 master-0 kubenswrapper[9136]: I1203 22:03:15.636578 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:03:15.763994 master-0 kubenswrapper[9136]: I1203 22:03:15.763820 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:15.763994 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:15.763994 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:15.763994 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:15.763994 master-0 kubenswrapper[9136]: I1203 22:03:15.763927 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:16.195951 master-0 kubenswrapper[9136]: I1203 22:03:16.195901 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-jlq49_ba624ed0-32cc-4c87-81a5-708a8a8a7f88/control-plane-machine-set-operator/1.log" Dec 03 22:03:16.197877 master-0 kubenswrapper[9136]: I1203 22:03:16.197818 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-jlq49_ba624ed0-32cc-4c87-81a5-708a8a8a7f88/control-plane-machine-set-operator/0.log" Dec 03 22:03:16.198015 master-0 kubenswrapper[9136]: I1203 22:03:16.197897 9136 generic.go:334] "Generic (PLEG): container finished" podID="ba624ed0-32cc-4c87-81a5-708a8a8a7f88" containerID="3be403144fdd17b31c23c14e8eecf3d61113ddb620955d19d68fd287cf34269c" exitCode=1 Dec 03 22:03:16.198104 master-0 kubenswrapper[9136]: I1203 22:03:16.198016 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" event={"ID":"ba624ed0-32cc-4c87-81a5-708a8a8a7f88","Type":"ContainerDied","Data":"3be403144fdd17b31c23c14e8eecf3d61113ddb620955d19d68fd287cf34269c"} Dec 03 22:03:16.198104 master-0 kubenswrapper[9136]: I1203 22:03:16.198067 9136 scope.go:117] "RemoveContainer" containerID="6f91acf82d50566dd54057e9b1be20f0d89f3f616993406ed044d52651283dc2" Dec 03 22:03:16.199025 master-0 kubenswrapper[9136]: I1203 22:03:16.198973 9136 scope.go:117] "RemoveContainer" containerID="3be403144fdd17b31c23c14e8eecf3d61113ddb620955d19d68fd287cf34269c" Dec 03 22:03:16.199507 master-0 kubenswrapper[9136]: E1203 22:03:16.199444 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"control-plane-machine-set-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=control-plane-machine-set-operator 
pod=control-plane-machine-set-operator-66f4cc99d4-jlq49_openshift-machine-api(ba624ed0-32cc-4c87-81a5-708a8a8a7f88)\"" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" podUID="ba624ed0-32cc-4c87-81a5-708a8a8a7f88" Dec 03 22:03:16.202303 master-0 kubenswrapper[9136]: I1203 22:03:16.202218 9136 generic.go:334] "Generic (PLEG): container finished" podID="5f088999-ec66-402e-9634-8c762206d6b4" containerID="257bfb23352e3c96e97fc399573f91ccd44a49ae756eac88f5f3eb6b84b4bf2e" exitCode=0 Dec 03 22:03:16.202303 master-0 kubenswrapper[9136]: I1203 22:03:16.202264 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" event={"ID":"5f088999-ec66-402e-9634-8c762206d6b4","Type":"ContainerDied","Data":"257bfb23352e3c96e97fc399573f91ccd44a49ae756eac88f5f3eb6b84b4bf2e"} Dec 03 22:03:16.203108 master-0 kubenswrapper[9136]: I1203 22:03:16.203056 9136 scope.go:117] "RemoveContainer" containerID="257bfb23352e3c96e97fc399573f91ccd44a49ae756eac88f5f3eb6b84b4bf2e" Dec 03 22:03:16.284652 master-0 kubenswrapper[9136]: I1203 22:03:16.284548 9136 scope.go:117] "RemoveContainer" containerID="7abe60a608c2243a0171688d4d7d094c30c03b0afcc0cdcafa8f846a9926432b" Dec 03 22:03:16.763506 master-0 kubenswrapper[9136]: I1203 22:03:16.763410 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:16.763506 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:16.763506 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:16.763506 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:16.763506 master-0 kubenswrapper[9136]: I1203 22:03:16.763486 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:17.213404 master-0 kubenswrapper[9136]: I1203 22:03:17.213318 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-jlq49_ba624ed0-32cc-4c87-81a5-708a8a8a7f88/control-plane-machine-set-operator/1.log" Dec 03 22:03:17.215656 master-0 kubenswrapper[9136]: I1203 22:03:17.215577 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" event={"ID":"5f088999-ec66-402e-9634-8c762206d6b4","Type":"ContainerStarted","Data":"33a7bb5f5e679144cd91e9f7be121f1b4387ca8c698a44bc1d39de4b8dcc3580"} Dec 03 22:03:17.218948 master-0 kubenswrapper[9136]: I1203 22:03:17.218905 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-fqnsm_785612fc-3f78-4f1a-bc83-7afe5d3b8056/authentication-operator/1.log" Dec 03 22:03:17.219098 master-0 kubenswrapper[9136]: I1203 22:03:17.218954 9136 generic.go:334] "Generic (PLEG): container finished" podID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerID="4529f099d568ac74685af2719c7672b1e1cfe8cd2cc0c1cccee0c35e98d451a0" exitCode=0 Dec 03 22:03:17.219098 master-0 kubenswrapper[9136]: I1203 22:03:17.218981 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" 
event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerDied","Data":"4529f099d568ac74685af2719c7672b1e1cfe8cd2cc0c1cccee0c35e98d451a0"} Dec 03 22:03:17.219098 master-0 kubenswrapper[9136]: I1203 22:03:17.219010 9136 scope.go:117] "RemoveContainer" containerID="0e38d42aee36ec27dd6129e81cdc5f8517f478bd68d3edd07a620c1ebb8b1c14" Dec 03 22:03:17.219712 master-0 kubenswrapper[9136]: I1203 22:03:17.219653 9136 scope.go:117] "RemoveContainer" containerID="4529f099d568ac74685af2719c7672b1e1cfe8cd2cc0c1cccee0c35e98d451a0" Dec 03 22:03:17.220012 master-0 kubenswrapper[9136]: E1203 22:03:17.219956 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-fqnsm_openshift-authentication-operator(785612fc-3f78-4f1a-bc83-7afe5d3b8056)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" Dec 03 22:03:17.763988 master-0 kubenswrapper[9136]: I1203 22:03:17.763906 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:17.763988 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:17.763988 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:17.763988 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:17.763988 master-0 kubenswrapper[9136]: I1203 22:03:17.763974 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:17.908440 master-0 kubenswrapper[9136]: I1203 22:03:17.908325 9136 scope.go:117] "RemoveContainer" containerID="4d0e6309b7fa9a40d508e6cabe45feb51aab58957078aefd318e41e113083569" Dec 03 22:03:17.908440 master-0 kubenswrapper[9136]: I1203 22:03:17.908454 9136 scope.go:117] "RemoveContainer" containerID="bcd8c2f49452655817240e666a6a44f7aa1bd99557f7056f662406437b17e8fd" Dec 03 22:03:17.908911 master-0 kubenswrapper[9136]: E1203 22:03:17.908739 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-6mvwr_openshift-kube-scheduler-operator(c1ee4db7-f2d3-4064-a189-f66fd0a021eb)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" podUID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" Dec 03 22:03:18.636765 master-0 kubenswrapper[9136]: I1203 22:03:18.636656 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:03:18.636765 master-0 kubenswrapper[9136]: I1203 22:03:18.636726 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:03:18.762708 master-0 kubenswrapper[9136]: I1203 22:03:18.762582 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:18.762708 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:18.762708 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:18.762708 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:18.763266 master-0 kubenswrapper[9136]: I1203 22:03:18.762707 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:18.908219 master-0 kubenswrapper[9136]: I1203 22:03:18.907998 9136 scope.go:117] "RemoveContainer" containerID="5c6cff0db5f54508702a0378b5a2fcfe25e7d02bb8251b1a5dc1e5b0d7ac5a5a" Dec 03 22:03:19.239252 master-0 kubenswrapper[9136]: I1203 22:03:19.239085 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerStarted","Data":"857211a86edae02d3593eaf3bb4ea9872abdabe69c74ef883ca26b0a2970ff26"} Dec 03 22:03:19.240835 master-0 kubenswrapper[9136]: I1203 22:03:19.240791 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" event={"ID":"29ac4a9d-1228-49c7-9051-338e7dc98a38","Type":"ContainerStarted","Data":"c444189a0424d0a53fe5110681d4c8b94f759666d1ffa6a8fb88ba3067c68598"} Dec 03 22:03:19.764487 master-0 kubenswrapper[9136]: I1203 22:03:19.764383 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:19.764487 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:19.764487 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:19.764487 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:19.765619 master-0 kubenswrapper[9136]: I1203 22:03:19.764504 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:19.908310 master-0 kubenswrapper[9136]: I1203 22:03:19.908216 9136 scope.go:117] "RemoveContainer" containerID="bfbf4959ede0c62e98f1d09bfdef8ec6864f4ae84b25850b8bd4a7f61685eda6" Dec 03 22:03:19.908697 master-0 kubenswrapper[9136]: E1203 22:03:19.908487 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=openshift-controller-manager-operator 
pod=openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator(6976b503-87da-48fc-b097-d1b315fbee3f)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" podUID="6976b503-87da-48fc-b097-d1b315fbee3f" Dec 03 22:03:20.251022 master-0 kubenswrapper[9136]: I1203 22:03:20.250972 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-7486ff55f-w9xk2_8b56c318-09b7-47f0-a7bf-32eb96e836ca/machine-api-operator/0.log" Dec 03 22:03:20.252105 master-0 kubenswrapper[9136]: I1203 22:03:20.252022 9136 generic.go:334] "Generic (PLEG): container finished" podID="8b56c318-09b7-47f0-a7bf-32eb96e836ca" containerID="dc9a686d307f5dc3dced989067ef4ced2ce1f8af42bcaa948a56153b4e7018c7" exitCode=255 Dec 03 22:03:20.252242 master-0 kubenswrapper[9136]: I1203 22:03:20.252079 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" event={"ID":"8b56c318-09b7-47f0-a7bf-32eb96e836ca","Type":"ContainerDied","Data":"dc9a686d307f5dc3dced989067ef4ced2ce1f8af42bcaa948a56153b4e7018c7"} Dec 03 22:03:20.252996 master-0 kubenswrapper[9136]: I1203 22:03:20.252946 9136 scope.go:117] "RemoveContainer" containerID="dc9a686d307f5dc3dced989067ef4ced2ce1f8af42bcaa948a56153b4e7018c7" Dec 03 22:03:20.762958 master-0 kubenswrapper[9136]: I1203 22:03:20.762870 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:20.762958 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:20.762958 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:20.762958 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:20.763495 master-0 kubenswrapper[9136]: I1203 22:03:20.762970 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:21.263738 master-0 kubenswrapper[9136]: I1203 22:03:21.263669 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-7486ff55f-w9xk2_8b56c318-09b7-47f0-a7bf-32eb96e836ca/machine-api-operator/0.log" Dec 03 22:03:21.264738 master-0 kubenswrapper[9136]: I1203 22:03:21.264295 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" event={"ID":"8b56c318-09b7-47f0-a7bf-32eb96e836ca","Type":"ContainerStarted","Data":"922b36f6b2eb470f1f0c1cd09a0aedbfcde6e3651bf4bf15e6b06fe469d73100"} Dec 03 22:03:21.763715 master-0 kubenswrapper[9136]: I1203 22:03:21.763610 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:21.763715 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:21.763715 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:21.763715 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:21.763715 master-0 kubenswrapper[9136]: I1203 22:03:21.763702 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:22.763375 master-0 kubenswrapper[9136]: I1203 22:03:22.763263 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:22.763375 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:22.763375 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:22.763375 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:22.764430 master-0 kubenswrapper[9136]: I1203 22:03:22.763387 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:22.908484 master-0 kubenswrapper[9136]: I1203 22:03:22.908390 9136 scope.go:117] "RemoveContainer" containerID="2e9dd252ba63a20079bc2c561a2c3f98ba9d8e4bd546eae6aff13854e70071f7" Dec 03 22:03:23.284980 master-0 kubenswrapper[9136]: I1203 22:03:23.284839 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerStarted","Data":"a88cd4d204a69ad50a5388826a51d98da82a35275f10e3130a3f06ed4ccdea64"} Dec 03 22:03:23.763718 master-0 kubenswrapper[9136]: I1203 22:03:23.763635 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:23.763718 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:23.763718 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:23.763718 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:23.764855 master-0 kubenswrapper[9136]: I1203 22:03:23.763754 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:23.908913 master-0 kubenswrapper[9136]: I1203 22:03:23.908867 9136 scope.go:117] "RemoveContainer" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" Dec 03 22:03:23.909340 master-0 kubenswrapper[9136]: I1203 22:03:23.909302 9136 scope.go:117] "RemoveContainer" containerID="ced847df1e024d8eb90b3c6eada46b7c6dd7f30888b68484b877e825feca3e2e" Dec 03 22:03:23.909465 master-0 kubenswrapper[9136]: I1203 22:03:23.909356 9136 scope.go:117] "RemoveContainer" containerID="48ff7474453eab80959871a93dccaeb97ff6c697edec9c60c6932a4cf955d08f" Dec 03 22:03:23.909720 master-0 kubenswrapper[9136]: E1203 22:03:23.909671 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-vcd7b_openshift-kube-storage-version-migrator-operator(50076985-bbaa-4bcf-9d1a-cc25bed016a7)\"" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" podUID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" Dec 03 22:03:23.909858 master-0 kubenswrapper[9136]: E1203 22:03:23.909734 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=network-operator pod=network-operator-6cbf58c977-zk7jw_openshift-network-operator(892d5611-debf-402f-abc5-3f99aa080159)\"" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" podUID="892d5611-debf-402f-abc5-3f99aa080159" Dec 03 22:03:23.909983 master-0 kubenswrapper[9136]: E1203 22:03:23.909948 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:03:24.763631 master-0 kubenswrapper[9136]: I1203 22:03:24.763468 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:24.763631 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:24.763631 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:24.763631 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:24.763631 master-0 kubenswrapper[9136]: I1203 22:03:24.763574 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:24.813131 master-0 kubenswrapper[9136]: E1203 22:03:24.813048 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:03:24.908412 master-0 kubenswrapper[9136]: I1203 22:03:24.908320 9136 scope.go:117] "RemoveContainer" containerID="5170b74c433fdae3d3f431b680472bc1c20377225f200538e8e7e7d355669e04" Dec 03 22:03:25.300949 master-0 kubenswrapper[9136]: I1203 22:03:25.300877 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerStarted","Data":"964b789870e41d10c8d066861125f17066ab2d12be1bd7a27c3c121f0ff84160"} Dec 03 22:03:25.763575 master-0 kubenswrapper[9136]: I1203 22:03:25.763479 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:25.763575 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:25.763575 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:25.763575 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:25.764031 master-0 
kubenswrapper[9136]: I1203 22:03:25.763598 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:25.908203 master-0 kubenswrapper[9136]: I1203 22:03:25.908081 9136 scope.go:117] "RemoveContainer" containerID="2c8f1554a8a69b7abb2ac2fb40ac35726b3a427f042bc6ab03c716500f2824ff" Dec 03 22:03:25.909335 master-0 kubenswrapper[9136]: E1203 22:03:25.908397 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5fdc576499-q9tf6_openshift-machine-api(fa9b5917-d4f3-4372-a200-45b57412f92f)\"" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" podUID="fa9b5917-d4f3-4372-a200-45b57412f92f" Dec 03 22:03:26.463056 master-0 kubenswrapper[9136]: I1203 22:03:26.462957 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:03:26.463830 master-0 kubenswrapper[9136]: I1203 22:03:26.463749 9136 scope.go:117] "RemoveContainer" containerID="4529f099d568ac74685af2719c7672b1e1cfe8cd2cc0c1cccee0c35e98d451a0" Dec 03 22:03:26.464188 master-0 kubenswrapper[9136]: E1203 22:03:26.464134 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=authentication-operator pod=authentication-operator-7479ffdf48-fqnsm_openshift-authentication-operator(785612fc-3f78-4f1a-bc83-7afe5d3b8056)\"" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" Dec 03 22:03:26.763726 master-0 kubenswrapper[9136]: I1203 22:03:26.763526 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:26.763726 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:26.763726 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:26.763726 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:26.763726 master-0 kubenswrapper[9136]: I1203 22:03:26.763603 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:27.763542 master-0 kubenswrapper[9136]: I1203 22:03:27.763467 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:27.763542 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:27.763542 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:27.763542 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:27.764154 master-0 kubenswrapper[9136]: I1203 22:03:27.763565 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:27.809271 master-0 kubenswrapper[9136]: E1203 22:03:27.809127 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dd30d5ad8be2a kube-system 9768 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:51:23 +0000 UTC,LastTimestamp:2025-12-03 22:01:07.239442268 +0000 UTC m=+673.514618680,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 22:03:28.637013 master-0 kubenswrapper[9136]: I1203 22:03:28.636947 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:03:28.637354 master-0 kubenswrapper[9136]: I1203 22:03:28.637326 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:03:28.763618 master-0 kubenswrapper[9136]: I1203 22:03:28.763560 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:28.763618 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:28.763618 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:28.763618 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:28.764573 master-0 kubenswrapper[9136]: I1203 22:03:28.764531 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:28.823601 master-0 kubenswrapper[9136]: E1203 22:03:28.823513 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 22:03:28.907379 master-0 kubenswrapper[9136]: I1203 22:03:28.907250 9136 scope.go:117] "RemoveContainer" containerID="4d0e6309b7fa9a40d508e6cabe45feb51aab58957078aefd318e41e113083569" Dec 03 22:03:28.908482 master-0 kubenswrapper[9136]: 
I1203 22:03:28.908054 9136 scope.go:117] "RemoveContainer" containerID="1b9a68b725c78d3ab9d3fdd39dac94fca8569e0861e77bcec964ab3b7762206f" Dec 03 22:03:29.764212 master-0 kubenswrapper[9136]: I1203 22:03:29.764102 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:29.764212 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:29.764212 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:29.764212 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:29.764212 master-0 kubenswrapper[9136]: I1203 22:03:29.764206 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:30.347923 master-0 kubenswrapper[9136]: I1203 22:03:30.347853 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerStarted","Data":"8ef60a565e77b47e29c85d100f4aaafe8ce0754e92c6f0f4b921e8ac07f1fea6"} Dec 03 22:03:30.351245 master-0 kubenswrapper[9136]: I1203 22:03:30.351191 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerStarted","Data":"54401a05125658819e717b9db70358e8392bf6552b9c5e8d042cbe435340c757"} Dec 03 22:03:30.351550 master-0 kubenswrapper[9136]: I1203 22:03:30.351487 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:03:30.763681 master-0 kubenswrapper[9136]: I1203 22:03:30.763598 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:03:30.763681 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:03:30.763681 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:03:30.763681 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:03:30.764242 master-0 kubenswrapper[9136]: I1203 22:03:30.763706 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:03:30.764242 master-0 kubenswrapper[9136]: I1203 22:03:30.763837 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:03:30.764997 master-0 kubenswrapper[9136]: I1203 22:03:30.764560 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"78424a8a743f1d7a594eeae8a1082a555251b4e9cc2031fa13e2e6be36e4505e"} pod="openshift-ingress/router-default-54f97f57-xq6ch" containerMessage="Container router failed startup probe, will be restarted" Dec 03 22:03:30.764997 master-0 kubenswrapper[9136]: I1203 22:03:30.764613 9136 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" containerID="cri-o://78424a8a743f1d7a594eeae8a1082a555251b4e9cc2031fa13e2e6be36e4505e" gracePeriod=3600 Dec 03 22:03:30.907678 master-0 kubenswrapper[9136]: I1203 22:03:30.907575 9136 scope.go:117] "RemoveContainer" containerID="3be403144fdd17b31c23c14e8eecf3d61113ddb620955d19d68fd287cf34269c" Dec 03 22:03:31.362257 master-0 kubenswrapper[9136]: I1203 22:03:31.362118 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-jlq49_ba624ed0-32cc-4c87-81a5-708a8a8a7f88/control-plane-machine-set-operator/1.log" Dec 03 22:03:31.362257 master-0 kubenswrapper[9136]: I1203 22:03:31.362220 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" event={"ID":"ba624ed0-32cc-4c87-81a5-708a8a8a7f88","Type":"ContainerStarted","Data":"efcde3ef0d07aa86baeb551a9ded33bf1b89c953b5cec837ce67e23d4ad2724c"} Dec 03 22:03:32.908487 master-0 kubenswrapper[9136]: I1203 22:03:32.908398 9136 scope.go:117] "RemoveContainer" containerID="bfbf4959ede0c62e98f1d09bfdef8ec6864f4ae84b25850b8bd4a7f61685eda6" Dec 03 22:03:32.930481 master-0 kubenswrapper[9136]: E1203 22:03:32.930416 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 03 22:03:33.385445 master-0 kubenswrapper[9136]: I1203 22:03:33.385372 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"82d2da1a02234717b8dd48730b93c0192a4b42b7f7379e1fc9653baa00df6c93"} Dec 03 22:03:33.390142 master-0 kubenswrapper[9136]: I1203 22:03:33.390064 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerStarted","Data":"3ef5d22510c8de2958749a344d7b815a72b238bc55095fec3564929b0d79773a"} Dec 03 22:03:33.484646 master-0 kubenswrapper[9136]: I1203 22:03:33.484548 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:33.484954 master-0 kubenswrapper[9136]: I1203 22:03:33.484568 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:33.484954 master-0 kubenswrapper[9136]: I1203 22:03:33.484675 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:33.484954 master-0 kubenswrapper[9136]: I1203 22:03:33.484801 
9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:34.409656 master-0 kubenswrapper[9136]: I1203 22:03:34.409599 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"b7cb5e73f7418263a2c8c42bb52ae4c4bb2f0b2e8a5241c7e6ae5e58057ecc31"} Dec 03 22:03:34.409656 master-0 kubenswrapper[9136]: I1203 22:03:34.409654 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"bfdc7006170b024bd74362f283ec12b682ef294c515e62ba9157cea3e95b4c90"} Dec 03 22:03:34.813660 master-0 kubenswrapper[9136]: E1203 22:03:34.813576 9136 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:03:35.425166 master-0 kubenswrapper[9136]: I1203 22:03:35.424975 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"cf0b0e4e1879f4685f9869a658ff0f8605dc42c65afb63586ce51981115ba251"} Dec 03 22:03:35.425166 master-0 kubenswrapper[9136]: I1203 22:03:35.425137 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"02f0436e29c41d2aa03592d5036e113bab231655e9781237864df2d13f97fd4c"} Dec 03 22:03:35.426237 master-0 kubenswrapper[9136]: I1203 22:03:35.425312 9136 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:03:35.426237 master-0 kubenswrapper[9136]: I1203 22:03:35.425340 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:03:36.483730 master-0 kubenswrapper[9136]: I1203 22:03:36.483654 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:36.485096 master-0 kubenswrapper[9136]: I1203 22:03:36.483722 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:36.485096 master-0 kubenswrapper[9136]: I1203 22:03:36.483881 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 
22:03:36.485096 master-0 kubenswrapper[9136]: I1203 22:03:36.483740 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:36.908495 master-0 kubenswrapper[9136]: I1203 22:03:36.908388 9136 scope.go:117] "RemoveContainer" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" Dec 03 22:03:36.908896 master-0 kubenswrapper[9136]: E1203 22:03:36.908838 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:03:37.908067 master-0 kubenswrapper[9136]: I1203 22:03:37.907971 9136 scope.go:117] "RemoveContainer" containerID="48ff7474453eab80959871a93dccaeb97ff6c697edec9c60c6932a4cf955d08f" Dec 03 22:03:37.908067 master-0 kubenswrapper[9136]: I1203 22:03:37.908042 9136 scope.go:117] "RemoveContainer" containerID="ced847df1e024d8eb90b3c6eada46b7c6dd7f30888b68484b877e825feca3e2e" Dec 03 22:03:37.943907 master-0 kubenswrapper[9136]: I1203 22:03:37.943208 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 03 22:03:37.943907 master-0 kubenswrapper[9136]: I1203 22:03:37.943260 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 03 22:03:38.453650 master-0 kubenswrapper[9136]: I1203 22:03:38.453472 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerStarted","Data":"2dd13e4ab397a03d92f03a68ff67d5ce97bced0fc31e3d5a401542285c8badd1"} Dec 03 22:03:38.456490 master-0 kubenswrapper[9136]: I1203 22:03:38.456443 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerStarted","Data":"a0d8b8091c75c8d084f0ab3c649dec1f42114979d34c4d430f5908359e9d65f4"} Dec 03 22:03:38.637331 master-0 kubenswrapper[9136]: I1203 22:03:38.637204 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:03:38.637702 master-0 kubenswrapper[9136]: I1203 22:03:38.637378 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:03:38.637702 master-0 kubenswrapper[9136]: I1203 
22:03:38.637440 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:03:38.638303 master-0 kubenswrapper[9136]: I1203 22:03:38.638248 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"eda227b3339e8d7fa804198da85eae80f678788107f4e300a3d0d50334b6969f"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Dec 03 22:03:38.638416 master-0 kubenswrapper[9136]: I1203 22:03:38.638387 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" containerID="cri-o://eda227b3339e8d7fa804198da85eae80f678788107f4e300a3d0d50334b6969f" gracePeriod=30 Dec 03 22:03:39.467568 master-0 kubenswrapper[9136]: I1203 22:03:39.467493 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/4.log" Dec 03 22:03:39.468428 master-0 kubenswrapper[9136]: I1203 22:03:39.468030 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/3.log" Dec 03 22:03:39.468428 master-0 kubenswrapper[9136]: I1203 22:03:39.468091 9136 generic.go:334] "Generic (PLEG): container finished" podID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" exitCode=1 Dec 03 22:03:39.468428 master-0 kubenswrapper[9136]: I1203 22:03:39.468154 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerDied","Data":"416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252"} Dec 03 22:03:39.468428 master-0 kubenswrapper[9136]: I1203 22:03:39.468251 9136 scope.go:117] "RemoveContainer" containerID="e0793e31f5fb489b6b15bff62f7f4a34ffd9e051976a5af997f6a70e4540749e" Dec 03 22:03:39.469176 master-0 kubenswrapper[9136]: I1203 22:03:39.469111 9136 scope.go:117] "RemoveContainer" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" Dec 03 22:03:39.469553 master-0 kubenswrapper[9136]: E1203 22:03:39.469495 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:03:39.471764 master-0 kubenswrapper[9136]: I1203 22:03:39.471703 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/1.log" Dec 03 22:03:39.474177 master-0 kubenswrapper[9136]: I1203 22:03:39.474108 9136 generic.go:334] "Generic (PLEG): container finished" 
podID="05dd6e8e0dea56089da96190349dd4c1" containerID="eda227b3339e8d7fa804198da85eae80f678788107f4e300a3d0d50334b6969f" exitCode=255 Dec 03 22:03:39.474322 master-0 kubenswrapper[9136]: I1203 22:03:39.474174 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerDied","Data":"eda227b3339e8d7fa804198da85eae80f678788107f4e300a3d0d50334b6969f"} Dec 03 22:03:39.474322 master-0 kubenswrapper[9136]: I1203 22:03:39.474237 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerStarted","Data":"b37f42f952beb79219161817e5d754bb07fa299dc93fdb096e0cceed27ef8d66"} Dec 03 22:03:39.482930 master-0 kubenswrapper[9136]: I1203 22:03:39.482867 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:39.482930 master-0 kubenswrapper[9136]: I1203 22:03:39.482909 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:39.483175 master-0 kubenswrapper[9136]: I1203 22:03:39.482934 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:39.483175 master-0 kubenswrapper[9136]: I1203 22:03:39.482958 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:39.483175 master-0 kubenswrapper[9136]: I1203 22:03:39.483006 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:03:39.483665 master-0 kubenswrapper[9136]: I1203 22:03:39.483609 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"54401a05125658819e717b9db70358e8392bf6552b9c5e8d042cbe435340c757"} pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Dec 03 22:03:39.483801 master-0 kubenswrapper[9136]: I1203 22:03:39.483664 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" containerID="cri-o://54401a05125658819e717b9db70358e8392bf6552b9c5e8d042cbe435340c757" gracePeriod=30 Dec 03 
22:03:39.484055 master-0 kubenswrapper[9136]: I1203 22:03:39.483994 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:39.484162 master-0 kubenswrapper[9136]: I1203 22:03:39.484077 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:39.536958 master-0 kubenswrapper[9136]: I1203 22:03:39.536862 9136 scope.go:117] "RemoveContainer" containerID="8a613b3a7f370a81b82a9c80ed68357904c42d57cdeafb6362cea4bc2248eaba" Dec 03 22:03:40.490996 master-0 kubenswrapper[9136]: I1203 22:03:40.490923 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/2.log" Dec 03 22:03:40.492531 master-0 kubenswrapper[9136]: I1203 22:03:40.492486 9136 generic.go:334] "Generic (PLEG): container finished" podID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerID="54401a05125658819e717b9db70358e8392bf6552b9c5e8d042cbe435340c757" exitCode=255 Dec 03 22:03:40.493130 master-0 kubenswrapper[9136]: I1203 22:03:40.492592 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerDied","Data":"54401a05125658819e717b9db70358e8392bf6552b9c5e8d042cbe435340c757"} Dec 03 22:03:40.493209 master-0 kubenswrapper[9136]: I1203 22:03:40.493145 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerStarted","Data":"96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb"} Dec 03 22:03:40.493209 master-0 kubenswrapper[9136]: I1203 22:03:40.493183 9136 scope.go:117] "RemoveContainer" containerID="1b9a68b725c78d3ab9d3fdd39dac94fca8569e0861e77bcec964ab3b7762206f" Dec 03 22:03:40.493511 master-0 kubenswrapper[9136]: I1203 22:03:40.493475 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:03:40.495468 master-0 kubenswrapper[9136]: I1203 22:03:40.495293 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/4.log" Dec 03 22:03:40.501097 master-0 kubenswrapper[9136]: I1203 22:03:40.501031 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/1.log" Dec 03 22:03:40.908259 master-0 kubenswrapper[9136]: I1203 22:03:40.908176 9136 scope.go:117] "RemoveContainer" containerID="4529f099d568ac74685af2719c7672b1e1cfe8cd2cc0c1cccee0c35e98d451a0" Dec 03 22:03:40.908579 master-0 kubenswrapper[9136]: I1203 22:03:40.908292 9136 scope.go:117] "RemoveContainer" 
containerID="2c8f1554a8a69b7abb2ac2fb40ac35726b3a427f042bc6ab03c716500f2824ff" Dec 03 22:03:40.908764 master-0 kubenswrapper[9136]: E1203 22:03:40.908682 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5fdc576499-q9tf6_openshift-machine-api(fa9b5917-d4f3-4372-a200-45b57412f92f)\"" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" podUID="fa9b5917-d4f3-4372-a200-45b57412f92f" Dec 03 22:03:41.515165 master-0 kubenswrapper[9136]: I1203 22:03:41.515084 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/2.log" Dec 03 22:03:41.519409 master-0 kubenswrapper[9136]: I1203 22:03:41.519318 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerStarted","Data":"3f938a79f0b6c6611be3fa1d23328a9984e35cf9a2e94086de4076ebe8bf8a36"} Dec 03 22:03:42.483631 master-0 kubenswrapper[9136]: I1203 22:03:42.483542 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:42.484189 master-0 kubenswrapper[9136]: I1203 22:03:42.483649 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:42.485233 master-0 kubenswrapper[9136]: I1203 22:03:42.485124 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:42.485361 master-0 kubenswrapper[9136]: I1203 22:03:42.485292 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:45.483324 master-0 kubenswrapper[9136]: I1203 22:03:45.483154 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:45.483324 master-0 kubenswrapper[9136]: I1203 22:03:45.483220 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" 
output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:45.483324 master-0 kubenswrapper[9136]: I1203 22:03:45.483255 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:45.484189 master-0 kubenswrapper[9136]: I1203 22:03:45.483344 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:45.636889 master-0 kubenswrapper[9136]: I1203 22:03:45.636809 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:03:45.636889 master-0 kubenswrapper[9136]: I1203 22:03:45.636887 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:03:45.825699 master-0 kubenswrapper[9136]: E1203 22:03:45.825237 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 22:03:47.973144 master-0 kubenswrapper[9136]: I1203 22:03:47.973059 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 03 22:03:48.483077 master-0 kubenswrapper[9136]: I1203 22:03:48.482978 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:48.483469 master-0 kubenswrapper[9136]: I1203 22:03:48.483091 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:48.483469 master-0 kubenswrapper[9136]: I1203 22:03:48.482985 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:48.483469 master-0 kubenswrapper[9136]: I1203 22:03:48.483161 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:03:48.483469 master-0 kubenswrapper[9136]: I1203 22:03:48.483237 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" 
podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:48.484521 master-0 kubenswrapper[9136]: I1203 22:03:48.484455 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:03:48.484652 master-0 kubenswrapper[9136]: I1203 22:03:48.484584 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:03:48.485969 master-0 kubenswrapper[9136]: I1203 22:03:48.485898 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb"} pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Dec 03 22:03:48.486059 master-0 kubenswrapper[9136]: I1203 22:03:48.486031 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" containerID="cri-o://96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb" gracePeriod=30 Dec 03 22:03:48.637099 master-0 kubenswrapper[9136]: I1203 22:03:48.636990 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:03:48.637099 master-0 kubenswrapper[9136]: I1203 22:03:48.637087 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:03:48.815581 master-0 kubenswrapper[9136]: E1203 22:03:48.815502 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:03:48.907728 master-0 kubenswrapper[9136]: I1203 22:03:48.907660 9136 scope.go:117] "RemoveContainer" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" Dec 03 22:03:48.908207 master-0 
kubenswrapper[9136]: E1203 22:03:48.908159 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:03:49.583617 master-0 kubenswrapper[9136]: I1203 22:03:49.583534 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/3.log" Dec 03 22:03:49.584677 master-0 kubenswrapper[9136]: I1203 22:03:49.584604 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/2.log" Dec 03 22:03:49.585443 master-0 kubenswrapper[9136]: I1203 22:03:49.585390 9136 generic.go:334] "Generic (PLEG): container finished" podID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerID="96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb" exitCode=255 Dec 03 22:03:49.585554 master-0 kubenswrapper[9136]: I1203 22:03:49.585467 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerDied","Data":"96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb"} Dec 03 22:03:49.585554 master-0 kubenswrapper[9136]: I1203 22:03:49.585541 9136 scope.go:117] "RemoveContainer" containerID="54401a05125658819e717b9db70358e8392bf6552b9c5e8d042cbe435340c757" Dec 03 22:03:49.586420 master-0 kubenswrapper[9136]: I1203 22:03:49.586360 9136 scope.go:117] "RemoveContainer" containerID="96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb" Dec 03 22:03:49.587126 master-0 kubenswrapper[9136]: E1203 22:03:49.586776 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:03:50.597913 master-0 kubenswrapper[9136]: I1203 22:03:50.597842 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/3.log" Dec 03 22:03:52.908175 master-0 kubenswrapper[9136]: I1203 22:03:52.908119 9136 scope.go:117] "RemoveContainer" containerID="2c8f1554a8a69b7abb2ac2fb40ac35726b3a427f042bc6ab03c716500f2824ff" Dec 03 22:03:52.909234 master-0 kubenswrapper[9136]: E1203 22:03:52.909170 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5fdc576499-q9tf6_openshift-machine-api(fa9b5917-d4f3-4372-a200-45b57412f92f)\"" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" podUID="fa9b5917-d4f3-4372-a200-45b57412f92f" Dec 03 
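
The varying delays in these CrashLoopBackOff messages ("back-off 10s", "back-off 40s", "back-off 1m20s") are points on a single schedule: the kubelet waits 10 seconds before the first restart of a crash-looping container and doubles the delay after each subsequent crash, up to a five-minute cap, resetting only after the container has run cleanly for ten minutes. The sketch below just reproduces that arithmetic for illustration; it is not kubelet code, and the 10s base, doubling, and 5m cap are the documented defaults rather than values read from this cluster. By that schedule, the pods at 40s and 1m20s have crashed roughly three and four times since their counters last reset.

    // backoff.go -- recomputes the kubelet's CrashLoopBackOff schedule to show
    // where "back-off 10s", "back-off 40s" and "back-off 1m20s" come from.
    // Illustration only; base delay, doubling and cap are the documented
    // kubelet behaviour.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 10 * time.Second   // initial back-off after the first crash
        maxDelay := 5 * time.Minute // documented upper bound
        for crash := 1; crash <= 7; crash++ {
            fmt.Printf("crash %d -> back-off %s\n", crash, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
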
22:03:52.978762 master-0 kubenswrapper[9136]: I1203 22:03:52.978625 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Dec 03 22:03:54.908131 master-0 kubenswrapper[9136]: I1203 22:03:54.908027 9136 scope.go:117] "RemoveContainer" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" Dec 03 22:03:54.909129 master-0 kubenswrapper[9136]: E1203 22:03:54.908371 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:03:58.636676 master-0 kubenswrapper[9136]: I1203 22:03:58.636516 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:03:58.636676 master-0 kubenswrapper[9136]: I1203 22:03:58.636655 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:03:59.907525 master-0 kubenswrapper[9136]: I1203 22:03:59.907446 9136 scope.go:117] "RemoveContainer" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" Dec 03 22:03:59.908858 master-0 kubenswrapper[9136]: E1203 22:03:59.907799 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:04:00.908459 master-0 kubenswrapper[9136]: I1203 22:04:00.908400 9136 scope.go:117] "RemoveContainer" containerID="96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb" Dec 03 22:04:00.909017 master-0 kubenswrapper[9136]: E1203 22:04:00.908810 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:04:01.812399 master-0 kubenswrapper[9136]: E1203 22:04:01.812089 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.187dd30db66099fd kube-system 9781 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:d78739a7694769882b7e47ea5ac08a10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 21:51:24 +0000 UTC,LastTimestamp:2025-12-03 22:01:07.252833754 +0000 UTC m=+673.528010146,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 22:04:02.826411 master-0 kubenswrapper[9136]: E1203 22:04:02.826258 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 22:04:04.096249 master-0 kubenswrapper[9136]: I1203 22:04:04.096153 9136 patch_prober.go:28] interesting pod/etcd-operator-7978bf889c-w8hsm container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.20:8443/healthz\": dial tcp 10.128.0.20:8443: connect: connection refused" start-of-body= Dec 03 22:04:04.096249 master-0 kubenswrapper[9136]: I1203 22:04:04.096245 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.20:8443/healthz\": dial tcp 10.128.0.20:8443: connect: connection refused" Dec 03 22:04:05.908682 master-0 kubenswrapper[9136]: I1203 22:04:05.908582 9136 scope.go:117] "RemoveContainer" containerID="2c8f1554a8a69b7abb2ac2fb40ac35726b3a427f042bc6ab03c716500f2824ff" Dec 03 22:04:05.909758 master-0 kubenswrapper[9136]: I1203 22:04:05.908717 9136 scope.go:117] "RemoveContainer" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" Dec 03 22:04:05.909758 master-0 kubenswrapper[9136]: E1203 22:04:05.909209 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:04:06.730239 master-0 kubenswrapper[9136]: I1203 22:04:06.730152 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/3.log" Dec 03 22:04:06.730838 master-0 kubenswrapper[9136]: I1203 22:04:06.730715 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" event={"ID":"fa9b5917-d4f3-4372-a200-45b57412f92f","Type":"ContainerStarted","Data":"0d3b6aa6ddc448fe01323d383917b0efb673a621dc7f776ca02a466310d7eceb"} Dec 03 22:04:08.637074 master-0 kubenswrapper[9136]: I1203 22:04:08.636989 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:04:08.637790 master-0 kubenswrapper[9136]: I1203 22:04:08.637103 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:04:08.637790 master-0 kubenswrapper[9136]: I1203 22:04:08.637176 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:04:08.638143 master-0 kubenswrapper[9136]: I1203 22:04:08.638091 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"b37f42f952beb79219161817e5d754bb07fa299dc93fdb096e0cceed27ef8d66"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Dec 03 22:04:08.638225 master-0 kubenswrapper[9136]: I1203 22:04:08.638200 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" containerID="cri-o://b37f42f952beb79219161817e5d754bb07fa299dc93fdb096e0cceed27ef8d66" gracePeriod=30 Dec 03 22:04:09.428320 master-0 kubenswrapper[9136]: E1203 22:04:09.428249 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 03 22:04:09.661182 master-0 kubenswrapper[9136]: I1203 22:04:09.661090 9136 status_manager.go:851] "Failed to get status for pod" podUID="8d60f02e-1803-461e-9606-667d91fcae14" pod="openshift-kube-apiserver/installer-2-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-2-master-0)" Dec 03 22:04:09.768845 master-0 kubenswrapper[9136]: I1203 22:04:09.768648 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/2.log" Dec 03 22:04:09.769627 master-0 kubenswrapper[9136]: I1203 22:04:09.769582 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/1.log" Dec 03 22:04:09.770994 master-0 kubenswrapper[9136]: I1203 22:04:09.770938 9136 generic.go:334] "Generic (PLEG): container finished" podID="05dd6e8e0dea56089da96190349dd4c1" containerID="b37f42f952beb79219161817e5d754bb07fa299dc93fdb096e0cceed27ef8d66" exitCode=255 Dec 03 22:04:09.771091 master-0 kubenswrapper[9136]: I1203 22:04:09.771040 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerDied","Data":"b37f42f952beb79219161817e5d754bb07fa299dc93fdb096e0cceed27ef8d66"} Dec 03 22:04:09.771178 master-0 
kubenswrapper[9136]: I1203 22:04:09.771136 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerStarted","Data":"9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f"} Dec 03 22:04:09.771238 master-0 kubenswrapper[9136]: I1203 22:04:09.771179 9136 scope.go:117] "RemoveContainer" containerID="eda227b3339e8d7fa804198da85eae80f678788107f4e300a3d0d50334b6969f" Dec 03 22:04:09.771517 master-0 kubenswrapper[9136]: I1203 22:04:09.771486 9136 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:04:09.771517 master-0 kubenswrapper[9136]: I1203 22:04:09.771510 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:04:10.781199 master-0 kubenswrapper[9136]: I1203 22:04:10.781122 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/2.log" Dec 03 22:04:13.908935 master-0 kubenswrapper[9136]: I1203 22:04:13.908837 9136 scope.go:117] "RemoveContainer" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" Dec 03 22:04:14.814143 master-0 kubenswrapper[9136]: I1203 22:04:14.813962 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/4.log" Dec 03 22:04:14.814410 master-0 kubenswrapper[9136]: I1203 22:04:14.814285 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerStarted","Data":"e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082"} Dec 03 22:04:14.908074 master-0 kubenswrapper[9136]: I1203 22:04:14.908020 9136 scope.go:117] "RemoveContainer" containerID="96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb" Dec 03 22:04:14.908340 master-0 kubenswrapper[9136]: E1203 22:04:14.908227 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:04:15.636960 master-0 kubenswrapper[9136]: I1203 22:04:15.636829 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:04:15.636960 master-0 kubenswrapper[9136]: I1203 22:04:15.636897 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:04:16.465636 master-0 kubenswrapper[9136]: I1203 22:04:16.465541 9136 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-fqnsm container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" start-of-body= Dec 03 
22:04:16.466087 master-0 kubenswrapper[9136]: I1203 22:04:16.465672 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" Dec 03 22:04:17.841381 master-0 kubenswrapper[9136]: I1203 22:04:17.841276 9136 generic.go:334] "Generic (PLEG): container finished" podID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerID="78424a8a743f1d7a594eeae8a1082a555251b4e9cc2031fa13e2e6be36e4505e" exitCode=0 Dec 03 22:04:17.841381 master-0 kubenswrapper[9136]: I1203 22:04:17.841356 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerDied","Data":"78424a8a743f1d7a594eeae8a1082a555251b4e9cc2031fa13e2e6be36e4505e"} Dec 03 22:04:17.842535 master-0 kubenswrapper[9136]: I1203 22:04:17.841421 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerStarted","Data":"b2f6734c8b53bbe2c29b93a23c7bd8fd22b8ba672473f7aeb1ceaade6ae54ba1"} Dec 03 22:04:17.842535 master-0 kubenswrapper[9136]: I1203 22:04:17.841488 9136 scope.go:117] "RemoveContainer" containerID="81bfd9fda486fff0e9f3b1dee5719d6b24e5410ceadededcc88dc65916bf639e" Dec 03 22:04:18.637091 master-0 kubenswrapper[9136]: I1203 22:04:18.636847 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:04:18.637091 master-0 kubenswrapper[9136]: I1203 22:04:18.636952 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:04:18.760677 master-0 kubenswrapper[9136]: I1203 22:04:18.760565 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:04:18.764292 master-0 kubenswrapper[9136]: I1203 22:04:18.764200 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:18.764292 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:18.764292 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:18.764292 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:18.764292 master-0 kubenswrapper[9136]: I1203 22:04:18.764276 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:18.908108 
master-0 kubenswrapper[9136]: I1203 22:04:18.907940 9136 scope.go:117] "RemoveContainer" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" Dec 03 22:04:18.908863 master-0 kubenswrapper[9136]: E1203 22:04:18.908295 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:04:19.764275 master-0 kubenswrapper[9136]: I1203 22:04:19.764109 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:19.764275 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:19.764275 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:19.764275 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:19.765034 master-0 kubenswrapper[9136]: I1203 22:04:19.764283 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:19.827680 master-0 kubenswrapper[9136]: E1203 22:04:19.827551 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 22:04:20.764005 master-0 kubenswrapper[9136]: I1203 22:04:20.763853 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:20.764005 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:20.764005 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:20.764005 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:20.765045 master-0 kubenswrapper[9136]: I1203 22:04:20.763998 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:21.764518 master-0 kubenswrapper[9136]: I1203 22:04:21.764385 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:21.764518 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:21.764518 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:21.764518 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:21.765559 master-0 kubenswrapper[9136]: I1203 22:04:21.764516 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:22.762684 master-0 kubenswrapper[9136]: I1203 22:04:22.762573 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:22.762684 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:22.762684 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:22.762684 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:22.763188 master-0 kubenswrapper[9136]: I1203 22:04:22.762702 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:23.763528 master-0 kubenswrapper[9136]: I1203 22:04:23.763459 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:23.763528 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:23.763528 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:23.763528 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:23.764889 master-0 kubenswrapper[9136]: I1203 22:04:23.763558 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:24.761045 master-0 kubenswrapper[9136]: I1203 22:04:24.760800 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:04:24.763437 master-0 kubenswrapper[9136]: I1203 22:04:24.763381 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:24.763437 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:24.763437 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:24.763437 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:24.763586 master-0 kubenswrapper[9136]: I1203 22:04:24.763447 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:25.763412 master-0 kubenswrapper[9136]: I1203 22:04:25.763306 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:25.763412 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:25.763412 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 
22:04:25.763412 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:25.763412 master-0 kubenswrapper[9136]: I1203 22:04:25.763398 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:25.907916 master-0 kubenswrapper[9136]: I1203 22:04:25.907856 9136 scope.go:117] "RemoveContainer" containerID="96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb" Dec 03 22:04:25.908747 master-0 kubenswrapper[9136]: E1203 22:04:25.908704 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:04:26.463499 master-0 kubenswrapper[9136]: I1203 22:04:26.463393 9136 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-fqnsm container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" start-of-body= Dec 03 22:04:26.463888 master-0 kubenswrapper[9136]: I1203 22:04:26.463506 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" Dec 03 22:04:26.763470 master-0 kubenswrapper[9136]: I1203 22:04:26.763274 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:26.763470 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:26.763470 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:26.763470 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:26.764579 master-0 kubenswrapper[9136]: I1203 22:04:26.763482 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:27.764185 master-0 kubenswrapper[9136]: I1203 22:04:27.764117 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:27.764185 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:27.764185 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:27.764185 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:27.765248 master-0 kubenswrapper[9136]: I1203 22:04:27.764203 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" 
podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:28.638118 master-0 kubenswrapper[9136]: I1203 22:04:28.638011 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:04:28.638499 master-0 kubenswrapper[9136]: I1203 22:04:28.638109 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:04:28.763811 master-0 kubenswrapper[9136]: I1203 22:04:28.763689 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:28.763811 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:28.763811 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:28.763811 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:28.765069 master-0 kubenswrapper[9136]: I1203 22:04:28.763825 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:29.763746 master-0 kubenswrapper[9136]: I1203 22:04:29.763657 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:29.763746 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:29.763746 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:29.763746 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:29.764264 master-0 kubenswrapper[9136]: I1203 22:04:29.763762 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:29.920012 master-0 kubenswrapper[9136]: I1203 22:04:29.908319 9136 scope.go:117] "RemoveContainer" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" Dec 03 22:04:29.920012 master-0 kubenswrapper[9136]: E1203 22:04:29.918729 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:04:30.763131 
master-0 kubenswrapper[9136]: I1203 22:04:30.763023 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:30.763131 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:30.763131 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:30.763131 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:30.763131 master-0 kubenswrapper[9136]: I1203 22:04:30.763108 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:31.763297 master-0 kubenswrapper[9136]: I1203 22:04:31.763219 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:31.763297 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:31.763297 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:31.763297 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:31.764397 master-0 kubenswrapper[9136]: I1203 22:04:31.763328 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:32.763653 master-0 kubenswrapper[9136]: I1203 22:04:32.763575 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:32.763653 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:32.763653 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:32.763653 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:32.764583 master-0 kubenswrapper[9136]: I1203 22:04:32.763671 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:32.984749 master-0 kubenswrapper[9136]: I1203 22:04:32.984600 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-hv5z8_6e96335e-1866-41c8-b128-b95e783a9be4/cluster-storage-operator/2.log" Dec 03 22:04:32.985436 master-0 kubenswrapper[9136]: I1203 22:04:32.985370 9136 generic.go:334] "Generic (PLEG): container finished" podID="6e96335e-1866-41c8-b128-b95e783a9be4" containerID="16ea4a93de4863ed15876a3040be9a6845e308ffe81f55c8cf6208378385e0da" exitCode=255 Dec 03 22:04:32.985511 master-0 kubenswrapper[9136]: I1203 22:04:32.985458 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" event={"ID":"6e96335e-1866-41c8-b128-b95e783a9be4","Type":"ContainerDied","Data":"16ea4a93de4863ed15876a3040be9a6845e308ffe81f55c8cf6208378385e0da"} Dec 
03 22:04:32.985567 master-0 kubenswrapper[9136]: I1203 22:04:32.985538 9136 scope.go:117] "RemoveContainer" containerID="575a6b466c21bed63405fda68f3a17896616983e54bbfaf59e5919bc2381d15d" Dec 03 22:04:32.986387 master-0 kubenswrapper[9136]: I1203 22:04:32.986347 9136 scope.go:117] "RemoveContainer" containerID="16ea4a93de4863ed15876a3040be9a6845e308ffe81f55c8cf6208378385e0da" Dec 03 22:04:32.986726 master-0 kubenswrapper[9136]: E1203 22:04:32.986691 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-storage-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-storage-operator pod=cluster-storage-operator-f84784664-hv5z8_openshift-cluster-storage-operator(6e96335e-1866-41c8-b128-b95e783a9be4)\"" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" podUID="6e96335e-1866-41c8-b128-b95e783a9be4" Dec 03 22:04:33.764112 master-0 kubenswrapper[9136]: I1203 22:04:33.764027 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:33.764112 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:33.764112 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:33.764112 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:33.765219 master-0 kubenswrapper[9136]: I1203 22:04:33.764115 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:33.995152 master-0 kubenswrapper[9136]: I1203 22:04:33.995085 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-hv5z8_6e96335e-1866-41c8-b128-b95e783a9be4/cluster-storage-operator/2.log" Dec 03 22:04:34.763883 master-0 kubenswrapper[9136]: I1203 22:04:34.763603 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:34.763883 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:34.763883 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:34.763883 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:34.763883 master-0 kubenswrapper[9136]: I1203 22:04:34.763748 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:35.763386 master-0 kubenswrapper[9136]: I1203 22:04:35.763233 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:35.763386 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:35.763386 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:35.763386 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
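
The router's startup probe output has a different shape from the operator probes above: instead of a refused connection, the endpoint answers with HTTP 500 and an itemised body in which each named check is reported as "[+]... ok" or "[-]... failed: reason withheld", followed by "healthz check failed". The endpoint keeps returning 500, and the startup probe keeps failing, until every check (here backend-http and has-synced) passes. The sketch below is a minimal aggregated health handler in the same spirit, written only to illustrate that behaviour; it is not the OpenShift router's implementation, and the port and check functions are placeholders invented for the example.

    // healthz.go -- minimal aggregated health endpoint in the spirit of the
    // probe body quoted above. Not the router's implementation: the checks
    // and the :8080 port are placeholders for the example.
    package main

    import (
        "errors"
        "fmt"
        "log"
        "net/http"
    )

    type check struct {
        name string
        run  func() error
    }

    func healthz(checks []check) http.HandlerFunc {
        return func(w http.ResponseWriter, r *http.Request) {
            body, failed := "", false
            for _, c := range checks {
                if err := c.run(); err != nil {
                    failed = true
                    body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
                } else {
                    body += fmt.Sprintf("[+]%s ok\n", c.name)
                }
            }
            if failed {
                // the probe then logs: HTTP probe failed with statuscode: 500
                w.WriteHeader(http.StatusInternalServerError)
                fmt.Fprint(w, body+"healthz check failed\n")
                return
            }
            fmt.Fprint(w, body+"ok\n")
        }
    }

    func main() {
        checks := []check{
            {name: "process-running", run: func() error { return nil }},
            {name: "has-synced", run: func() error { return errors.New("initial sync not finished") }},
        }
        http.Handle("/healthz", healthz(checks))
        log.Fatal(http.ListenAndServe(":8080", nil))
    }
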
22:04:35.763386 master-0 kubenswrapper[9136]: I1203 22:04:35.763331 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:35.815481 master-0 kubenswrapper[9136]: E1203 22:04:35.815293 9136 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{network-node-identity-r24k4.187dd399ecf97f82 openshift-network-node-identity 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-network-node-identity,Name:network-node-identity-r24k4,UID:1a0f647a-0260-4737-8ae2-cc90d01d33d1,APIVersion:v1,ResourceVersion:3241,FieldPath:spec.containers{approver},},Reason:BackOff,Message:Back-off restarting failed container approver in pod network-node-identity-r24k4_openshift-network-node-identity(1a0f647a-0260-4737-8ae2-cc90d01d33d1),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 22:01:27.05833357 +0000 UTC m=+693.333509962,LastTimestamp:2025-12-03 22:01:27.05833357 +0000 UTC m=+693.333509962,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 22:04:36.463893 master-0 kubenswrapper[9136]: I1203 22:04:36.463755 9136 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-fqnsm container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" start-of-body= Dec 03 22:04:36.463893 master-0 kubenswrapper[9136]: I1203 22:04:36.463876 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" Dec 03 22:04:36.464367 master-0 kubenswrapper[9136]: I1203 22:04:36.463931 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:04:36.464747 master-0 kubenswrapper[9136]: I1203 22:04:36.464701 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"3f938a79f0b6c6611be3fa1d23328a9984e35cf9a2e94086de4076ebe8bf8a36"} pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Dec 03 22:04:36.464834 master-0 kubenswrapper[9136]: I1203 22:04:36.464758 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerName="authentication-operator" containerID="cri-o://3f938a79f0b6c6611be3fa1d23328a9984e35cf9a2e94086de4076ebe8bf8a36" gracePeriod=30 Dec 03 22:04:36.763888 master-0 kubenswrapper[9136]: I1203 22:04:36.763688 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:36.763888 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:36.763888 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:36.763888 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:36.763888 master-0 kubenswrapper[9136]: I1203 22:04:36.763811 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:36.829091 master-0 kubenswrapper[9136]: E1203 22:04:36.828994 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 22:04:37.024376 master-0 kubenswrapper[9136]: I1203 22:04:37.024213 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-jxw8c_08432be8-0086-48d2-a93d-7a474e96749d/kube-apiserver-operator/2.log" Dec 03 22:04:37.025199 master-0 kubenswrapper[9136]: I1203 22:04:37.025137 9136 generic.go:334] "Generic (PLEG): container finished" podID="08432be8-0086-48d2-a93d-7a474e96749d" containerID="53a60280703adaa150a5f659528627f228a9f7e4e33c25a05a6a83bbcae2a4c4" exitCode=255 Dec 03 22:04:37.025261 master-0 kubenswrapper[9136]: I1203 22:04:37.025233 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" event={"ID":"08432be8-0086-48d2-a93d-7a474e96749d","Type":"ContainerDied","Data":"53a60280703adaa150a5f659528627f228a9f7e4e33c25a05a6a83bbcae2a4c4"} Dec 03 22:04:37.025316 master-0 kubenswrapper[9136]: I1203 22:04:37.025285 9136 scope.go:117] "RemoveContainer" containerID="d1bdd8b92f1b53e27dbdc370a3e552cba7eb4014b85281c160413adf1ac3135b" Dec 03 22:04:37.025954 master-0 kubenswrapper[9136]: I1203 22:04:37.025918 9136 scope.go:117] "RemoveContainer" containerID="53a60280703adaa150a5f659528627f228a9f7e4e33c25a05a6a83bbcae2a4c4" Dec 03 22:04:37.026150 master-0 kubenswrapper[9136]: E1203 22:04:37.026115 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-operator pod=kube-apiserver-operator-5b557b5f57-jxw8c_openshift-kube-apiserver-operator(08432be8-0086-48d2-a93d-7a474e96749d)\"" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" podUID="08432be8-0086-48d2-a93d-7a474e96749d" Dec 03 22:04:37.027864 master-0 kubenswrapper[9136]: I1203 22:04:37.027830 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-fqnsm_785612fc-3f78-4f1a-bc83-7afe5d3b8056/authentication-operator/3.log" Dec 03 22:04:37.028327 master-0 kubenswrapper[9136]: I1203 22:04:37.028290 9136 generic.go:334] "Generic (PLEG): container finished" podID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerID="3f938a79f0b6c6611be3fa1d23328a9984e35cf9a2e94086de4076ebe8bf8a36" exitCode=255 Dec 03 22:04:37.028368 master-0 kubenswrapper[9136]: I1203 22:04:37.028331 9136 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerDied","Data":"3f938a79f0b6c6611be3fa1d23328a9984e35cf9a2e94086de4076ebe8bf8a36"} Dec 03 22:04:37.139009 master-0 kubenswrapper[9136]: I1203 22:04:37.138874 9136 scope.go:117] "RemoveContainer" containerID="4529f099d568ac74685af2719c7672b1e1cfe8cd2cc0c1cccee0c35e98d451a0" Dec 03 22:04:37.763050 master-0 kubenswrapper[9136]: I1203 22:04:37.762972 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:37.763050 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:37.763050 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:37.763050 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:37.763050 master-0 kubenswrapper[9136]: I1203 22:04:37.763042 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:38.041069 master-0 kubenswrapper[9136]: I1203 22:04:38.040886 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-fqnsm_785612fc-3f78-4f1a-bc83-7afe5d3b8056/authentication-operator/3.log" Dec 03 22:04:38.041970 master-0 kubenswrapper[9136]: I1203 22:04:38.041070 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" event={"ID":"785612fc-3f78-4f1a-bc83-7afe5d3b8056","Type":"ContainerStarted","Data":"85b51f4e3a884c4fd9417e917f015f1e20cf5692e3a90676f2db5f37e44985c9"} Dec 03 22:04:38.044529 master-0 kubenswrapper[9136]: I1203 22:04:38.044462 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-jxw8c_08432be8-0086-48d2-a93d-7a474e96749d/kube-apiserver-operator/2.log" Dec 03 22:04:38.640546 master-0 kubenswrapper[9136]: I1203 22:04:38.640454 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:04:38.640546 master-0 kubenswrapper[9136]: I1203 22:04:38.640546 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:04:38.640999 master-0 kubenswrapper[9136]: I1203 22:04:38.640623 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:04:38.641677 master-0 kubenswrapper[9136]: I1203 22:04:38.641619 9136 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Dec 03 22:04:38.641835 master-0 kubenswrapper[9136]: I1203 22:04:38.641763 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" containerID="cri-o://9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f" gracePeriod=30 Dec 03 22:04:38.767586 master-0 kubenswrapper[9136]: I1203 22:04:38.767476 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:38.767586 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:38.767586 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:38.767586 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:38.767994 master-0 kubenswrapper[9136]: I1203 22:04:38.767610 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:38.773228 master-0 kubenswrapper[9136]: E1203 22:04:38.773153 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(05dd6e8e0dea56089da96190349dd4c1)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" Dec 03 22:04:39.057355 master-0 kubenswrapper[9136]: I1203 22:04:39.057129 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/3.log" Dec 03 22:04:39.058436 master-0 kubenswrapper[9136]: I1203 22:04:39.058167 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/2.log" Dec 03 22:04:39.059765 master-0 kubenswrapper[9136]: I1203 22:04:39.059693 9136 generic.go:334] "Generic (PLEG): container finished" podID="05dd6e8e0dea56089da96190349dd4c1" containerID="9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f" exitCode=255 Dec 03 22:04:39.059993 master-0 kubenswrapper[9136]: I1203 22:04:39.059839 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerDied","Data":"9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f"} Dec 03 22:04:39.059993 master-0 kubenswrapper[9136]: I1203 22:04:39.059921 9136 scope.go:117] "RemoveContainer" containerID="b37f42f952beb79219161817e5d754bb07fa299dc93fdb096e0cceed27ef8d66" Dec 03 22:04:39.061174 master-0 kubenswrapper[9136]: I1203 22:04:39.061089 9136 
scope.go:117] "RemoveContainer" containerID="9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f" Dec 03 22:04:39.061843 master-0 kubenswrapper[9136]: E1203 22:04:39.061700 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(05dd6e8e0dea56089da96190349dd4c1)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" Dec 03 22:04:39.763672 master-0 kubenswrapper[9136]: I1203 22:04:39.763580 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:39.763672 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:39.763672 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:39.763672 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:39.763672 master-0 kubenswrapper[9136]: I1203 22:04:39.763664 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:40.071587 master-0 kubenswrapper[9136]: I1203 22:04:40.071416 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/3.log" Dec 03 22:04:40.764322 master-0 kubenswrapper[9136]: I1203 22:04:40.764199 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:40.764322 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:40.764322 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:40.764322 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:40.764322 master-0 kubenswrapper[9136]: I1203 22:04:40.764296 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:40.908159 master-0 kubenswrapper[9136]: I1203 22:04:40.908061 9136 scope.go:117] "RemoveContainer" containerID="96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb" Dec 03 22:04:41.764182 master-0 kubenswrapper[9136]: I1203 22:04:41.764076 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:41.764182 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:41.764182 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:41.764182 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:41.765100 master-0 kubenswrapper[9136]: I1203 22:04:41.764189 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:42.094659 master-0 kubenswrapper[9136]: I1203 22:04:42.094454 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/3.log" Dec 03 22:04:42.095218 master-0 kubenswrapper[9136]: I1203 22:04:42.095165 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerStarted","Data":"f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233"} Dec 03 22:04:42.095653 master-0 kubenswrapper[9136]: I1203 22:04:42.095603 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:04:42.764549 master-0 kubenswrapper[9136]: I1203 22:04:42.764480 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:42.764549 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:42.764549 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:42.764549 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:42.765958 master-0 kubenswrapper[9136]: I1203 22:04:42.765338 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:43.763547 master-0 kubenswrapper[9136]: I1203 22:04:43.763441 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:43.763547 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:43.763547 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:43.763547 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:43.764091 master-0 kubenswrapper[9136]: I1203 22:04:43.763558 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:43.773942 master-0 kubenswrapper[9136]: E1203 22:04:43.773884 9136 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Dec 03 22:04:44.764288 master-0 kubenswrapper[9136]: I1203 22:04:44.764061 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:44.764288 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:44.764288 master-0 kubenswrapper[9136]: 
[+]process-running ok Dec 03 22:04:44.764288 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:44.764288 master-0 kubenswrapper[9136]: I1203 22:04:44.764178 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:44.908380 master-0 kubenswrapper[9136]: I1203 22:04:44.908275 9136 scope.go:117] "RemoveContainer" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" Dec 03 22:04:44.909229 master-0 kubenswrapper[9136]: E1203 22:04:44.908718 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:04:45.120822 master-0 kubenswrapper[9136]: I1203 22:04:45.120730 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-st2db_f59094ec-47dd-4547-ad41-b15a7933f461/openshift-apiserver-operator/2.log" Dec 03 22:04:45.121457 master-0 kubenswrapper[9136]: I1203 22:04:45.121389 9136 generic.go:334] "Generic (PLEG): container finished" podID="f59094ec-47dd-4547-ad41-b15a7933f461" containerID="694dfb3a5405c23ea842bfdfc58ff9ae7e4ed9b528fa489e10e2cc964b4f43d4" exitCode=255 Dec 03 22:04:45.121505 master-0 kubenswrapper[9136]: I1203 22:04:45.121478 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" event={"ID":"f59094ec-47dd-4547-ad41-b15a7933f461","Type":"ContainerDied","Data":"694dfb3a5405c23ea842bfdfc58ff9ae7e4ed9b528fa489e10e2cc964b4f43d4"} Dec 03 22:04:45.121597 master-0 kubenswrapper[9136]: I1203 22:04:45.121558 9136 scope.go:117] "RemoveContainer" containerID="e2c788e798e45c9b371c7db0d83b84623d2d9cabb25ab2e95a2d107b202c0add" Dec 03 22:04:45.122502 master-0 kubenswrapper[9136]: I1203 22:04:45.122434 9136 scope.go:117] "RemoveContainer" containerID="694dfb3a5405c23ea842bfdfc58ff9ae7e4ed9b528fa489e10e2cc964b4f43d4" Dec 03 22:04:45.123022 master-0 kubenswrapper[9136]: E1203 22:04:45.122960 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=openshift-apiserver-operator pod=openshift-apiserver-operator-667484ff5-st2db_openshift-apiserver-operator(f59094ec-47dd-4547-ad41-b15a7933f461)\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" podUID="f59094ec-47dd-4547-ad41-b15a7933f461" Dec 03 22:04:45.484987 master-0 kubenswrapper[9136]: I1203 22:04:45.484867 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:04:45.484987 master-0 kubenswrapper[9136]: I1203 22:04:45.484959 9136 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:04:45.485468 master-0 kubenswrapper[9136]: I1203 22:04:45.485046 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:04:45.485468 master-0 kubenswrapper[9136]: I1203 22:04:45.485146 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:04:45.636254 master-0 kubenswrapper[9136]: I1203 22:04:45.636137 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:04:45.637871 master-0 kubenswrapper[9136]: I1203 22:04:45.637744 9136 scope.go:117] "RemoveContainer" containerID="9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f" Dec 03 22:04:45.638337 master-0 kubenswrapper[9136]: E1203 22:04:45.638263 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(05dd6e8e0dea56089da96190349dd4c1)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" Dec 03 22:04:45.764246 master-0 kubenswrapper[9136]: I1203 22:04:45.764020 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:45.764246 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:45.764246 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:45.764246 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:45.764246 master-0 kubenswrapper[9136]: I1203 22:04:45.764137 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:45.908591 master-0 kubenswrapper[9136]: I1203 22:04:45.908484 9136 scope.go:117] "RemoveContainer" containerID="16ea4a93de4863ed15876a3040be9a6845e308ffe81f55c8cf6208378385e0da" Dec 03 22:04:46.128710 master-0 kubenswrapper[9136]: I1203 22:04:46.128628 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-st2db_f59094ec-47dd-4547-ad41-b15a7933f461/openshift-apiserver-operator/2.log" Dec 03 22:04:46.130866 master-0 kubenswrapper[9136]: I1203 22:04:46.130802 9136 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-hv5z8_6e96335e-1866-41c8-b128-b95e783a9be4/cluster-storage-operator/2.log" Dec 03 22:04:46.131016 master-0 kubenswrapper[9136]: I1203 22:04:46.130976 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" event={"ID":"6e96335e-1866-41c8-b128-b95e783a9be4","Type":"ContainerStarted","Data":"b02559d419e71fdd466c10524e1fc79b48a1977b035641dbcd6c07a396b73381"} Dec 03 22:04:46.763875 master-0 kubenswrapper[9136]: I1203 22:04:46.763740 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:46.763875 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:46.763875 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:46.763875 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:46.764347 master-0 kubenswrapper[9136]: I1203 22:04:46.763908 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:47.140700 master-0 kubenswrapper[9136]: I1203 22:04:47.140607 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-mjdfr_5f088999-ec66-402e-9634-8c762206d6b4/service-ca-operator/2.log" Dec 03 22:04:47.141656 master-0 kubenswrapper[9136]: I1203 22:04:47.141467 9136 generic.go:334] "Generic (PLEG): container finished" podID="5f088999-ec66-402e-9634-8c762206d6b4" containerID="33a7bb5f5e679144cd91e9f7be121f1b4387ca8c698a44bc1d39de4b8dcc3580" exitCode=255 Dec 03 22:04:47.141656 master-0 kubenswrapper[9136]: I1203 22:04:47.141520 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" event={"ID":"5f088999-ec66-402e-9634-8c762206d6b4","Type":"ContainerDied","Data":"33a7bb5f5e679144cd91e9f7be121f1b4387ca8c698a44bc1d39de4b8dcc3580"} Dec 03 22:04:47.141656 master-0 kubenswrapper[9136]: I1203 22:04:47.141581 9136 scope.go:117] "RemoveContainer" containerID="257bfb23352e3c96e97fc399573f91ccd44a49ae756eac88f5f3eb6b84b4bf2e" Dec 03 22:04:47.142425 master-0 kubenswrapper[9136]: I1203 22:04:47.142365 9136 scope.go:117] "RemoveContainer" containerID="33a7bb5f5e679144cd91e9f7be121f1b4387ca8c698a44bc1d39de4b8dcc3580" Dec 03 22:04:47.142930 master-0 kubenswrapper[9136]: E1203 22:04:47.142865 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=service-ca-operator pod=service-ca-operator-56f5898f45-mjdfr_openshift-service-ca-operator(5f088999-ec66-402e-9634-8c762206d6b4)\"" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" podUID="5f088999-ec66-402e-9634-8c762206d6b4" Dec 03 22:04:47.764304 master-0 kubenswrapper[9136]: I1203 22:04:47.764211 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:47.764304 master-0 
kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:47.764304 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:47.764304 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:47.764304 master-0 kubenswrapper[9136]: I1203 22:04:47.764304 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:48.154447 master-0 kubenswrapper[9136]: I1203 22:04:48.154348 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-mjdfr_5f088999-ec66-402e-9634-8c762206d6b4/service-ca-operator/2.log" Dec 03 22:04:48.483906 master-0 kubenswrapper[9136]: I1203 22:04:48.483672 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:04:48.483906 master-0 kubenswrapper[9136]: I1203 22:04:48.483675 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:04:48.483906 master-0 kubenswrapper[9136]: I1203 22:04:48.483759 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:04:48.483906 master-0 kubenswrapper[9136]: I1203 22:04:48.483841 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:04:48.763220 master-0 kubenswrapper[9136]: I1203 22:04:48.762962 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:48.763220 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:48.763220 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:48.763220 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:48.763220 master-0 kubenswrapper[9136]: I1203 22:04:48.763077 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:48.908113 master-0 kubenswrapper[9136]: I1203 22:04:48.908029 9136 scope.go:117] "RemoveContainer" containerID="53a60280703adaa150a5f659528627f228a9f7e4e33c25a05a6a83bbcae2a4c4" Dec 03 22:04:49.167328 master-0 kubenswrapper[9136]: I1203 
22:04:49.167251 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm_82055cfc-b4ce-4a00-a51d-141059947693/etcd-operator/3.log" Dec 03 22:04:49.168304 master-0 kubenswrapper[9136]: I1203 22:04:49.167926 9136 generic.go:334] "Generic (PLEG): container finished" podID="82055cfc-b4ce-4a00-a51d-141059947693" containerID="857211a86edae02d3593eaf3bb4ea9872abdabe69c74ef883ca26b0a2970ff26" exitCode=255 Dec 03 22:04:49.168304 master-0 kubenswrapper[9136]: I1203 22:04:49.167985 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerDied","Data":"857211a86edae02d3593eaf3bb4ea9872abdabe69c74ef883ca26b0a2970ff26"} Dec 03 22:04:49.168304 master-0 kubenswrapper[9136]: I1203 22:04:49.168089 9136 scope.go:117] "RemoveContainer" containerID="bcd8c2f49452655817240e666a6a44f7aa1bd99557f7056f662406437b17e8fd" Dec 03 22:04:49.168732 master-0 kubenswrapper[9136]: I1203 22:04:49.168682 9136 scope.go:117] "RemoveContainer" containerID="857211a86edae02d3593eaf3bb4ea9872abdabe69c74ef883ca26b0a2970ff26" Dec 03 22:04:49.168986 master-0 kubenswrapper[9136]: E1203 22:04:49.168936 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator(82055cfc-b4ce-4a00-a51d-141059947693)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" Dec 03 22:04:49.763797 master-0 kubenswrapper[9136]: I1203 22:04:49.763696 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:49.763797 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:49.763797 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:49.763797 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:49.764199 master-0 kubenswrapper[9136]: I1203 22:04:49.763818 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:50.179422 master-0 kubenswrapper[9136]: I1203 22:04:50.179323 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-jxw8c_08432be8-0086-48d2-a93d-7a474e96749d/kube-apiserver-operator/2.log" Dec 03 22:04:50.180359 master-0 kubenswrapper[9136]: I1203 22:04:50.179475 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" event={"ID":"08432be8-0086-48d2-a93d-7a474e96749d","Type":"ContainerStarted","Data":"e9259a87418019af7d87002e963446671956d7f4ee30af4cedc46695012d6bb3"} Dec 03 22:04:50.181592 master-0 kubenswrapper[9136]: I1203 22:04:50.181524 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm_82055cfc-b4ce-4a00-a51d-141059947693/etcd-operator/3.log" Dec 03 22:04:50.763508 master-0 kubenswrapper[9136]: I1203 22:04:50.763374 9136 patch_prober.go:28] interesting 
pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:50.763508 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:50.763508 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:50.763508 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:50.764118 master-0 kubenswrapper[9136]: I1203 22:04:50.763509 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:51.483411 master-0 kubenswrapper[9136]: I1203 22:04:51.483321 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:04:51.483411 master-0 kubenswrapper[9136]: I1203 22:04:51.483320 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:04:51.484275 master-0 kubenswrapper[9136]: I1203 22:04:51.483521 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:04:51.484275 master-0 kubenswrapper[9136]: I1203 22:04:51.483389 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:04:51.484275 master-0 kubenswrapper[9136]: I1203 22:04:51.483668 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:04:51.484884 master-0 kubenswrapper[9136]: I1203 22:04:51.484818 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233"} pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Dec 03 22:04:51.484884 master-0 kubenswrapper[9136]: I1203 22:04:51.484858 9136 patch_prober.go:28] interesting pod/openshift-config-operator-68c95b6cf5-2cs5d container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Dec 03 22:04:51.484884 master-0 kubenswrapper[9136]: I1203 22:04:51.484888 9136 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Dec 03 22:04:51.485051 master-0 kubenswrapper[9136]: I1203 22:04:51.484900 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerName="openshift-config-operator" containerID="cri-o://f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" gracePeriod=30 Dec 03 22:04:51.763681 master-0 kubenswrapper[9136]: I1203 22:04:51.763502 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:51.763681 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:51.763681 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:51.763681 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:51.763681 master-0 kubenswrapper[9136]: I1203 22:04:51.763610 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:51.910598 master-0 kubenswrapper[9136]: E1203 22:04:51.910489 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:04:52.199439 master-0 kubenswrapper[9136]: I1203 22:04:52.199372 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/4.log" Dec 03 22:04:52.200200 master-0 kubenswrapper[9136]: I1203 22:04:52.200166 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/3.log" Dec 03 22:04:52.200713 master-0 kubenswrapper[9136]: I1203 22:04:52.200667 9136 generic.go:334] "Generic (PLEG): container finished" podID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" exitCode=255 Dec 03 22:04:52.200808 master-0 kubenswrapper[9136]: I1203 22:04:52.200734 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerDied","Data":"f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233"} Dec 03 22:04:52.200868 master-0 kubenswrapper[9136]: I1203 22:04:52.200831 9136 scope.go:117] "RemoveContainer" containerID="96f6a9e9615e0e6fd6c281ba4315adb19a1c9be7b4a4d8a32df7fb7bc2fa39fb" Dec 03 22:04:52.201662 
master-0 kubenswrapper[9136]: I1203 22:04:52.201627 9136 scope.go:117] "RemoveContainer" containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" Dec 03 22:04:52.202068 master-0 kubenswrapper[9136]: E1203 22:04:52.202027 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:04:52.763160 master-0 kubenswrapper[9136]: I1203 22:04:52.763064 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:52.763160 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:52.763160 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:52.763160 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:52.763160 master-0 kubenswrapper[9136]: I1203 22:04:52.763160 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:53.213954 master-0 kubenswrapper[9136]: I1203 22:04:53.213859 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/4.log" Dec 03 22:04:53.763671 master-0 kubenswrapper[9136]: I1203 22:04:53.763587 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:53.763671 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:53.763671 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:53.763671 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:53.764814 master-0 kubenswrapper[9136]: I1203 22:04:53.763679 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:53.830885 master-0 kubenswrapper[9136]: E1203 22:04:53.830758 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 22:04:54.224995 master-0 kubenswrapper[9136]: I1203 22:04:54.224890 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh_c8da5d44-680e-4169-abc6-607bdc37a64d/cluster-olm-operator/3.log" Dec 03 22:04:54.226972 master-0 kubenswrapper[9136]: I1203 22:04:54.226915 9136 generic.go:334] "Generic (PLEG): container finished" 
podID="c8da5d44-680e-4169-abc6-607bdc37a64d" containerID="a88cd4d204a69ad50a5388826a51d98da82a35275f10e3130a3f06ed4ccdea64" exitCode=255 Dec 03 22:04:54.227087 master-0 kubenswrapper[9136]: I1203 22:04:54.226978 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerDied","Data":"a88cd4d204a69ad50a5388826a51d98da82a35275f10e3130a3f06ed4ccdea64"} Dec 03 22:04:54.227087 master-0 kubenswrapper[9136]: I1203 22:04:54.227075 9136 scope.go:117] "RemoveContainer" containerID="2e9dd252ba63a20079bc2c561a2c3f98ba9d8e4bd546eae6aff13854e70071f7" Dec 03 22:04:54.227977 master-0 kubenswrapper[9136]: I1203 22:04:54.227921 9136 scope.go:117] "RemoveContainer" containerID="a88cd4d204a69ad50a5388826a51d98da82a35275f10e3130a3f06ed4ccdea64" Dec 03 22:04:54.228385 master-0 kubenswrapper[9136]: E1203 22:04:54.228342 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator(c8da5d44-680e-4169-abc6-607bdc37a64d)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" podUID="c8da5d44-680e-4169-abc6-607bdc37a64d" Dec 03 22:04:54.763406 master-0 kubenswrapper[9136]: I1203 22:04:54.763322 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:54.763406 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:54.763406 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:54.763406 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:54.763824 master-0 kubenswrapper[9136]: I1203 22:04:54.763429 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:55.237186 master-0 kubenswrapper[9136]: I1203 22:04:55.237136 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh_c8da5d44-680e-4169-abc6-607bdc37a64d/cluster-olm-operator/3.log" Dec 03 22:04:55.764567 master-0 kubenswrapper[9136]: I1203 22:04:55.764476 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:55.764567 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:55.764567 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:55.764567 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:55.765470 master-0 kubenswrapper[9136]: I1203 22:04:55.764574 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:56.257488 master-0 kubenswrapper[9136]: I1203 22:04:56.257359 9136 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh_fdfbaebe-d655-4c1e-a039-08802c5c35c5/kube-controller-manager-operator/3.log" Dec 03 22:04:56.258358 master-0 kubenswrapper[9136]: I1203 22:04:56.258274 9136 generic.go:334] "Generic (PLEG): container finished" podID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" containerID="964b789870e41d10c8d066861125f17066ab2d12be1bd7a27c3c121f0ff84160" exitCode=255 Dec 03 22:04:56.258466 master-0 kubenswrapper[9136]: I1203 22:04:56.258366 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerDied","Data":"964b789870e41d10c8d066861125f17066ab2d12be1bd7a27c3c121f0ff84160"} Dec 03 22:04:56.258466 master-0 kubenswrapper[9136]: I1203 22:04:56.258435 9136 scope.go:117] "RemoveContainer" containerID="5170b74c433fdae3d3f431b680472bc1c20377225f200538e8e7e7d355669e04" Dec 03 22:04:56.259860 master-0 kubenswrapper[9136]: I1203 22:04:56.259763 9136 scope.go:117] "RemoveContainer" containerID="964b789870e41d10c8d066861125f17066ab2d12be1bd7a27c3c121f0ff84160" Dec 03 22:04:56.260762 master-0 kubenswrapper[9136]: E1203 22:04:56.260375 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator(fdfbaebe-d655-4c1e-a039-08802c5c35c5)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" podUID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" Dec 03 22:04:56.763966 master-0 kubenswrapper[9136]: I1203 22:04:56.763893 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:56.763966 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:56.763966 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:56.763966 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:56.764665 master-0 kubenswrapper[9136]: I1203 22:04:56.764613 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:57.268847 master-0 kubenswrapper[9136]: I1203 22:04:57.268739 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh_fdfbaebe-d655-4c1e-a039-08802c5c35c5/kube-controller-manager-operator/3.log" Dec 03 22:04:57.763413 master-0 kubenswrapper[9136]: I1203 22:04:57.763320 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:57.763413 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:57.763413 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:57.763413 master-0 kubenswrapper[9136]: healthz check 
failed Dec 03 22:04:57.763846 master-0 kubenswrapper[9136]: I1203 22:04:57.763433 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:57.908438 master-0 kubenswrapper[9136]: I1203 22:04:57.908361 9136 scope.go:117] "RemoveContainer" containerID="33a7bb5f5e679144cd91e9f7be121f1b4387ca8c698a44bc1d39de4b8dcc3580" Dec 03 22:04:57.908438 master-0 kubenswrapper[9136]: I1203 22:04:57.908430 9136 scope.go:117] "RemoveContainer" containerID="9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f" Dec 03 22:04:57.909561 master-0 kubenswrapper[9136]: E1203 22:04:57.908951 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(05dd6e8e0dea56089da96190349dd4c1)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" Dec 03 22:04:58.278183 master-0 kubenswrapper[9136]: I1203 22:04:58.278109 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-mjdfr_5f088999-ec66-402e-9634-8c762206d6b4/service-ca-operator/2.log" Dec 03 22:04:58.278499 master-0 kubenswrapper[9136]: I1203 22:04:58.278203 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" event={"ID":"5f088999-ec66-402e-9634-8c762206d6b4","Type":"ContainerStarted","Data":"ae805298bf5fdb3b75355c8ef76512d1a6a42e00dbd7f966bc1bae248219b94d"} Dec 03 22:04:58.763069 master-0 kubenswrapper[9136]: I1203 22:04:58.762995 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:58.763069 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:58.763069 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:58.763069 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:58.763692 master-0 kubenswrapper[9136]: I1203 22:04:58.763646 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:04:58.908637 master-0 kubenswrapper[9136]: I1203 22:04:58.908585 9136 scope.go:117] "RemoveContainer" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" Dec 03 22:04:58.909650 master-0 kubenswrapper[9136]: I1203 22:04:58.909591 9136 scope.go:117] "RemoveContainer" containerID="694dfb3a5405c23ea842bfdfc58ff9ae7e4ed9b528fa489e10e2cc964b4f43d4" Dec 03 22:04:58.910058 master-0 kubenswrapper[9136]: E1203 22:04:58.910014 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-86897dd478-g4ldp_openshift-cluster-storage-operator(28c42112-a09e-4b7a-b23b-c06bef69cbfb)\"" 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" podUID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" Dec 03 22:04:59.292327 master-0 kubenswrapper[9136]: I1203 22:04:59.292105 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-st2db_f59094ec-47dd-4547-ad41-b15a7933f461/openshift-apiserver-operator/2.log" Dec 03 22:04:59.292327 master-0 kubenswrapper[9136]: I1203 22:04:59.292207 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" event={"ID":"f59094ec-47dd-4547-ad41-b15a7933f461","Type":"ContainerStarted","Data":"fa732200c5e06c40381c5765ef0972cb5829c93d72d4a72a980285c9551fd050"} Dec 03 22:04:59.763457 master-0 kubenswrapper[9136]: I1203 22:04:59.763350 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:04:59.763457 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:04:59.763457 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:04:59.763457 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:04:59.764132 master-0 kubenswrapper[9136]: I1203 22:04:59.763471 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:00.303111 master-0 kubenswrapper[9136]: I1203 22:05:00.303029 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-6mvwr_c1ee4db7-f2d3-4064-a189-f66fd0a021eb/kube-scheduler-operator-container/3.log" Dec 03 22:05:00.304279 master-0 kubenswrapper[9136]: I1203 22:05:00.303692 9136 generic.go:334] "Generic (PLEG): container finished" podID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" containerID="8ef60a565e77b47e29c85d100f4aaafe8ce0754e92c6f0f4b921e8ac07f1fea6" exitCode=255 Dec 03 22:05:00.304279 master-0 kubenswrapper[9136]: I1203 22:05:00.303752 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerDied","Data":"8ef60a565e77b47e29c85d100f4aaafe8ce0754e92c6f0f4b921e8ac07f1fea6"} Dec 03 22:05:00.304279 master-0 kubenswrapper[9136]: I1203 22:05:00.303901 9136 scope.go:117] "RemoveContainer" containerID="4d0e6309b7fa9a40d508e6cabe45feb51aab58957078aefd318e41e113083569" Dec 03 22:05:00.304584 master-0 kubenswrapper[9136]: I1203 22:05:00.304556 9136 scope.go:117] "RemoveContainer" containerID="8ef60a565e77b47e29c85d100f4aaafe8ce0754e92c6f0f4b921e8ac07f1fea6" Dec 03 22:05:00.305136 master-0 kubenswrapper[9136]: E1203 22:05:00.305069 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-6mvwr_openshift-kube-scheduler-operator(c1ee4db7-f2d3-4064-a189-f66fd0a021eb)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" 
podUID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" Dec 03 22:05:00.764428 master-0 kubenswrapper[9136]: I1203 22:05:00.764346 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:00.764428 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:00.764428 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:00.764428 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:00.764970 master-0 kubenswrapper[9136]: I1203 22:05:00.764441 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:00.907720 master-0 kubenswrapper[9136]: I1203 22:05:00.907576 9136 scope.go:117] "RemoveContainer" containerID="857211a86edae02d3593eaf3bb4ea9872abdabe69c74ef883ca26b0a2970ff26" Dec 03 22:05:00.908123 master-0 kubenswrapper[9136]: E1203 22:05:00.908059 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator(82055cfc-b4ce-4a00-a51d-141059947693)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" Dec 03 22:05:01.316089 master-0 kubenswrapper[9136]: I1203 22:05:01.315997 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-6mvwr_c1ee4db7-f2d3-4064-a189-f66fd0a021eb/kube-scheduler-operator-container/3.log" Dec 03 22:05:01.763763 master-0 kubenswrapper[9136]: I1203 22:05:01.763627 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:01.763763 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:01.763763 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:01.763763 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:01.763763 master-0 kubenswrapper[9136]: I1203 22:05:01.763725 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:02.763335 master-0 kubenswrapper[9136]: I1203 22:05:02.763059 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:02.763335 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:02.763335 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:02.763335 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:02.763335 master-0 kubenswrapper[9136]: I1203 22:05:02.763160 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" 
podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:03.764077 master-0 kubenswrapper[9136]: I1203 22:05:03.763976 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:03.764077 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:03.764077 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:03.764077 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:03.765126 master-0 kubenswrapper[9136]: I1203 22:05:03.764094 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:04.093027 master-0 kubenswrapper[9136]: I1203 22:05:04.092855 9136 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:05:04.093813 master-0 kubenswrapper[9136]: I1203 22:05:04.093781 9136 scope.go:117] "RemoveContainer" containerID="857211a86edae02d3593eaf3bb4ea9872abdabe69c74ef883ca26b0a2970ff26" Dec 03 22:05:04.094048 master-0 kubenswrapper[9136]: E1203 22:05:04.094004 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator(82055cfc-b4ce-4a00-a51d-141059947693)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" Dec 03 22:05:04.342820 master-0 kubenswrapper[9136]: I1203 22:05:04.342736 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/3.log" Dec 03 22:05:04.343913 master-0 kubenswrapper[9136]: I1203 22:05:04.343804 9136 generic.go:334] "Generic (PLEG): container finished" podID="6976b503-87da-48fc-b097-d1b315fbee3f" containerID="3ef5d22510c8de2958749a344d7b815a72b238bc55095fec3564929b0d79773a" exitCode=255 Dec 03 22:05:04.343913 master-0 kubenswrapper[9136]: I1203 22:05:04.343883 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerDied","Data":"3ef5d22510c8de2958749a344d7b815a72b238bc55095fec3564929b0d79773a"} Dec 03 22:05:04.344045 master-0 kubenswrapper[9136]: I1203 22:05:04.343962 9136 scope.go:117] "RemoveContainer" containerID="bfbf4959ede0c62e98f1d09bfdef8ec6864f4ae84b25850b8bd4a7f61685eda6" Dec 03 22:05:04.344839 master-0 kubenswrapper[9136]: I1203 22:05:04.344759 9136 scope.go:117] "RemoveContainer" containerID="3ef5d22510c8de2958749a344d7b815a72b238bc55095fec3564929b0d79773a" Dec 03 22:05:04.345218 master-0 kubenswrapper[9136]: E1203 22:05:04.345154 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator(6976b503-87da-48fc-b097-d1b315fbee3f)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" podUID="6976b503-87da-48fc-b097-d1b315fbee3f" Dec 03 22:05:04.764052 master-0 kubenswrapper[9136]: I1203 22:05:04.763916 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:04.764052 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:04.764052 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:04.764052 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:04.764052 master-0 kubenswrapper[9136]: I1203 22:05:04.764007 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:04.909127 master-0 kubenswrapper[9136]: I1203 22:05:04.909038 9136 scope.go:117] "RemoveContainer" containerID="a88cd4d204a69ad50a5388826a51d98da82a35275f10e3130a3f06ed4ccdea64" Dec 03 22:05:04.909496 master-0 kubenswrapper[9136]: E1203 22:05:04.909426 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator(c8da5d44-680e-4169-abc6-607bdc37a64d)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" podUID="c8da5d44-680e-4169-abc6-607bdc37a64d" Dec 03 22:05:05.355913 master-0 kubenswrapper[9136]: I1203 22:05:05.355841 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/3.log" Dec 03 22:05:05.763870 master-0 kubenswrapper[9136]: I1203 22:05:05.763822 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:05.763870 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:05.763870 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:05.763870 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:05.764325 master-0 kubenswrapper[9136]: I1203 22:05:05.764295 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:06.764322 master-0 kubenswrapper[9136]: I1203 22:05:06.764186 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:06.764322 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 
22:05:06.764322 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:06.764322 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:06.764322 master-0 kubenswrapper[9136]: I1203 22:05:06.764326 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:07.764321 master-0 kubenswrapper[9136]: I1203 22:05:07.764202 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:07.764321 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:07.764321 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:07.764321 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:07.764321 master-0 kubenswrapper[9136]: I1203 22:05:07.764308 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:07.908548 master-0 kubenswrapper[9136]: I1203 22:05:07.908437 9136 scope.go:117] "RemoveContainer" containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" Dec 03 22:05:07.908946 master-0 kubenswrapper[9136]: E1203 22:05:07.908890 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:05:08.764116 master-0 kubenswrapper[9136]: I1203 22:05:08.764029 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:08.764116 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:08.764116 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:08.764116 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:08.764459 master-0 kubenswrapper[9136]: I1203 22:05:08.764119 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:09.390876 master-0 kubenswrapper[9136]: I1203 22:05:09.390635 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-vcd7b_50076985-bbaa-4bcf-9d1a-cc25bed016a7/kube-storage-version-migrator-operator/3.log" Dec 03 22:05:09.392297 master-0 kubenswrapper[9136]: I1203 22:05:09.391728 9136 generic.go:334] "Generic (PLEG): container finished" podID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" containerID="2dd13e4ab397a03d92f03a68ff67d5ce97bced0fc31e3d5a401542285c8badd1" exitCode=255 
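
Editor's note: the back-off durations quoted in the CrashLoopBackOff errors in the surrounding entries (10s for network-operator, 40s for the etcd/scheduler/controller-manager operators, 1m20s for openshift-config-operator) follow the kubelet's standard container restart back-off, which starts at 10s, doubles on each failed restart, and is capped at 5m. A minimal sketch of that schedule, for orientation only and not the kubelet's actual implementation:

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay models the CrashLoopBackOff schedule visible in this log:
// 10s after the first failed restart, doubling on each subsequent failure,
// capped at 5 minutes. Illustrative approximation only, not kubelet code.
func crashLoopDelay(restarts int) time.Duration {
	delay := 10 * time.Second
	for i := 1; i < restarts; i++ {
		delay *= 2
		if delay >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return delay
}

func main() {
	for r := 1; r <= 6; r++ {
		// Prints 10s, 20s, 40s, 1m20s, 2m40s, 5m0s; the 10s, 40s and 1m20s
		// values match the "back-off ... restarting failed container" entries here.
		fmt.Printf("restart %d: back-off %s\n", r, crashLoopDelay(r))
	}
}
```
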
Dec 03 22:05:09.392297 master-0 kubenswrapper[9136]: I1203 22:05:09.391879 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerDied","Data":"2dd13e4ab397a03d92f03a68ff67d5ce97bced0fc31e3d5a401542285c8badd1"} Dec 03 22:05:09.392297 master-0 kubenswrapper[9136]: I1203 22:05:09.391963 9136 scope.go:117] "RemoveContainer" containerID="48ff7474453eab80959871a93dccaeb97ff6c697edec9c60c6932a4cf955d08f" Dec 03 22:05:09.392812 master-0 kubenswrapper[9136]: I1203 22:05:09.392723 9136 scope.go:117] "RemoveContainer" containerID="2dd13e4ab397a03d92f03a68ff67d5ce97bced0fc31e3d5a401542285c8badd1" Dec 03 22:05:09.393208 master-0 kubenswrapper[9136]: E1203 22:05:09.393148 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-vcd7b_openshift-kube-storage-version-migrator-operator(50076985-bbaa-4bcf-9d1a-cc25bed016a7)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" podUID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" Dec 03 22:05:09.400036 master-0 kubenswrapper[9136]: I1203 22:05:09.399965 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/3.log" Dec 03 22:05:09.400958 master-0 kubenswrapper[9136]: I1203 22:05:09.400896 9136 generic.go:334] "Generic (PLEG): container finished" podID="892d5611-debf-402f-abc5-3f99aa080159" containerID="a0d8b8091c75c8d084f0ab3c649dec1f42114979d34c4d430f5908359e9d65f4" exitCode=255 Dec 03 22:05:09.401093 master-0 kubenswrapper[9136]: I1203 22:05:09.400962 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerDied","Data":"a0d8b8091c75c8d084f0ab3c649dec1f42114979d34c4d430f5908359e9d65f4"} Dec 03 22:05:09.401990 master-0 kubenswrapper[9136]: I1203 22:05:09.401843 9136 scope.go:117] "RemoveContainer" containerID="a0d8b8091c75c8d084f0ab3c649dec1f42114979d34c4d430f5908359e9d65f4" Dec 03 22:05:09.402264 master-0 kubenswrapper[9136]: E1203 22:05:09.402208 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=network-operator pod=network-operator-6cbf58c977-zk7jw_openshift-network-operator(892d5611-debf-402f-abc5-3f99aa080159)\"" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" podUID="892d5611-debf-402f-abc5-3f99aa080159" Dec 03 22:05:09.447383 master-0 kubenswrapper[9136]: I1203 22:05:09.447316 9136 scope.go:117] "RemoveContainer" containerID="ced847df1e024d8eb90b3c6eada46b7c6dd7f30888b68484b877e825feca3e2e" Dec 03 22:05:09.663358 master-0 kubenswrapper[9136]: I1203 22:05:09.663127 9136 status_manager.go:851] "Failed to get status for pod" podUID="70e52a8c-7f9e-47fa-85ca-41f90dcb9747" pod="openshift-etcd/installer-2-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-2-master-0)" Dec 03 22:05:09.763506 master-0 
kubenswrapper[9136]: I1203 22:05:09.763393 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:09.763506 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:09.763506 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:09.763506 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:09.764013 master-0 kubenswrapper[9136]: I1203 22:05:09.763511 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:09.908211 master-0 kubenswrapper[9136]: I1203 22:05:09.908109 9136 scope.go:117] "RemoveContainer" containerID="9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f" Dec 03 22:05:09.908632 master-0 kubenswrapper[9136]: E1203 22:05:09.908560 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(05dd6e8e0dea56089da96190349dd4c1)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" Dec 03 22:05:10.411702 master-0 kubenswrapper[9136]: I1203 22:05:10.411092 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-vcd7b_50076985-bbaa-4bcf-9d1a-cc25bed016a7/kube-storage-version-migrator-operator/3.log" Dec 03 22:05:10.413752 master-0 kubenswrapper[9136]: I1203 22:05:10.413636 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/3.log" Dec 03 22:05:10.763808 master-0 kubenswrapper[9136]: I1203 22:05:10.763555 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:10.763808 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:10.763808 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:10.763808 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:10.763808 master-0 kubenswrapper[9136]: I1203 22:05:10.763651 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:10.831975 master-0 kubenswrapper[9136]: E1203 22:05:10.831829 9136 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Dec 03 22:05:10.907574 master-0 kubenswrapper[9136]: I1203 22:05:10.907476 9136 scope.go:117] "RemoveContainer" 
containerID="964b789870e41d10c8d066861125f17066ab2d12be1bd7a27c3c121f0ff84160" Dec 03 22:05:10.907950 master-0 kubenswrapper[9136]: E1203 22:05:10.907883 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator(fdfbaebe-d655-4c1e-a039-08802c5c35c5)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" podUID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" Dec 03 22:05:11.763567 master-0 kubenswrapper[9136]: I1203 22:05:11.763460 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:11.763567 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:11.763567 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:11.763567 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:11.764331 master-0 kubenswrapper[9136]: I1203 22:05:11.763571 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:11.908269 master-0 kubenswrapper[9136]: I1203 22:05:11.908173 9136 scope.go:117] "RemoveContainer" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" Dec 03 22:05:12.430562 master-0 kubenswrapper[9136]: I1203 22:05:12.430490 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/4.log" Dec 03 22:05:12.430562 master-0 kubenswrapper[9136]: I1203 22:05:12.430563 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" event={"ID":"28c42112-a09e-4b7a-b23b-c06bef69cbfb","Type":"ContainerStarted","Data":"28dc792c4a3bf497842ebccf060d9afb1baaea1b9b12fe04f73b734d9d32ae23"} Dec 03 22:05:12.763850 master-0 kubenswrapper[9136]: I1203 22:05:12.763594 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:12.763850 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:12.763850 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:12.763850 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:12.763850 master-0 kubenswrapper[9136]: I1203 22:05:12.763703 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:12.908593 master-0 kubenswrapper[9136]: I1203 22:05:12.908486 9136 scope.go:117] "RemoveContainer" containerID="8ef60a565e77b47e29c85d100f4aaafe8ce0754e92c6f0f4b921e8ac07f1fea6" Dec 03 22:05:12.909052 master-0 kubenswrapper[9136]: E1203 22:05:12.908960 9136 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-6mvwr_openshift-kube-scheduler-operator(c1ee4db7-f2d3-4064-a189-f66fd0a021eb)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" podUID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" Dec 03 22:05:13.763603 master-0 kubenswrapper[9136]: I1203 22:05:13.763530 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:13.763603 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:13.763603 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:13.763603 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:13.764608 master-0 kubenswrapper[9136]: I1203 22:05:13.763625 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:14.763186 master-0 kubenswrapper[9136]: I1203 22:05:14.762933 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:14.763186 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:14.763186 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:14.763186 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:14.763186 master-0 kubenswrapper[9136]: I1203 22:05:14.763057 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:15.764698 master-0 kubenswrapper[9136]: I1203 22:05:15.764619 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:15.764698 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:15.764698 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:15.764698 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:15.766304 master-0 kubenswrapper[9136]: I1203 22:05:15.764709 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:15.908547 master-0 kubenswrapper[9136]: I1203 22:05:15.908459 9136 scope.go:117] "RemoveContainer" containerID="3ef5d22510c8de2958749a344d7b815a72b238bc55095fec3564929b0d79773a" Dec 03 22:05:15.908863 master-0 kubenswrapper[9136]: E1203 22:05:15.908815 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" 
with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator(6976b503-87da-48fc-b097-d1b315fbee3f)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" podUID="6976b503-87da-48fc-b097-d1b315fbee3f" Dec 03 22:05:16.464827 master-0 kubenswrapper[9136]: I1203 22:05:16.464746 9136 patch_prober.go:28] interesting pod/authentication-operator-7479ffdf48-fqnsm container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" start-of-body= Dec 03 22:05:16.465125 master-0 kubenswrapper[9136]: I1203 22:05:16.464842 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" podUID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.6:8443/healthz\": dial tcp 10.128.0.6:8443: connect: connection refused" Dec 03 22:05:16.763924 master-0 kubenswrapper[9136]: I1203 22:05:16.763818 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:16.763924 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:16.763924 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:16.763924 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:16.763924 master-0 kubenswrapper[9136]: I1203 22:05:16.763925 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:16.953875 master-0 kubenswrapper[9136]: I1203 22:05:16.953746 9136 patch_prober.go:28] interesting pod/package-server-manager-75b4d49d4c-psjj5 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" start-of-body= Dec 03 22:05:16.953875 master-0 kubenswrapper[9136]: I1203 22:05:16.953867 9136 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" podUID="04f5fc52-4ec2-48c3-8441-2b15ad632233" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" Dec 03 22:05:16.954840 master-0 kubenswrapper[9136]: I1203 22:05:16.953983 9136 patch_prober.go:28] interesting pod/package-server-manager-75b4d49d4c-psjj5 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" start-of-body= Dec 03 22:05:16.954840 master-0 kubenswrapper[9136]: I1203 22:05:16.954008 9136 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" podUID="04f5fc52-4ec2-48c3-8441-2b15ad632233" 
containerName="package-server-manager" probeResult="failure" output="Get \"http://10.128.0.11:8080/healthz\": dial tcp 10.128.0.11:8080: connect: connection refused" Dec 03 22:05:17.469797 master-0 kubenswrapper[9136]: I1203 22:05:17.469676 9136 generic.go:334] "Generic (PLEG): container finished" podID="a124c14f-20c6-4df3-956f-a858de0c73c9" containerID="af6236462e324978e8cd817baf06fb75710668886258289a9caf50532bbe9141" exitCode=0 Dec 03 22:05:17.469797 master-0 kubenswrapper[9136]: I1203 22:05:17.469796 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" event={"ID":"a124c14f-20c6-4df3-956f-a858de0c73c9","Type":"ContainerDied","Data":"af6236462e324978e8cd817baf06fb75710668886258289a9caf50532bbe9141"} Dec 03 22:05:17.470424 master-0 kubenswrapper[9136]: I1203 22:05:17.470381 9136 scope.go:117] "RemoveContainer" containerID="af6236462e324978e8cd817baf06fb75710668886258289a9caf50532bbe9141" Dec 03 22:05:17.474368 master-0 kubenswrapper[9136]: I1203 22:05:17.474285 9136 generic.go:334] "Generic (PLEG): container finished" podID="39f0e973-7864-4842-af8e-47718ab1804c" containerID="0eed9d981ff70eac9619ddc620b8f9e1ce7420952f2e8fb539ac72d9a0cb037f" exitCode=0 Dec 03 22:05:17.474625 master-0 kubenswrapper[9136]: I1203 22:05:17.474587 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" event={"ID":"39f0e973-7864-4842-af8e-47718ab1804c","Type":"ContainerDied","Data":"0eed9d981ff70eac9619ddc620b8f9e1ce7420952f2e8fb539ac72d9a0cb037f"} Dec 03 22:05:17.475580 master-0 kubenswrapper[9136]: I1203 22:05:17.475400 9136 scope.go:117] "RemoveContainer" containerID="0eed9d981ff70eac9619ddc620b8f9e1ce7420952f2e8fb539ac72d9a0cb037f" Dec 03 22:05:17.479157 master-0 kubenswrapper[9136]: I1203 22:05:17.479098 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-75b4d49d4c-psjj5_04f5fc52-4ec2-48c3-8441-2b15ad632233/package-server-manager/0.log" Dec 03 22:05:17.479708 master-0 kubenswrapper[9136]: I1203 22:05:17.479545 9136 generic.go:334] "Generic (PLEG): container finished" podID="04f5fc52-4ec2-48c3-8441-2b15ad632233" containerID="957fee7d4699a08d2cc2951a06021265bc26437f145867b0dd5f82dd49642db5" exitCode=1 Dec 03 22:05:17.479708 master-0 kubenswrapper[9136]: I1203 22:05:17.479640 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" event={"ID":"04f5fc52-4ec2-48c3-8441-2b15ad632233","Type":"ContainerDied","Data":"957fee7d4699a08d2cc2951a06021265bc26437f145867b0dd5f82dd49642db5"} Dec 03 22:05:17.480170 master-0 kubenswrapper[9136]: I1203 22:05:17.480111 9136 scope.go:117] "RemoveContainer" containerID="957fee7d4699a08d2cc2951a06021265bc26437f145867b0dd5f82dd49642db5" Dec 03 22:05:17.482284 master-0 kubenswrapper[9136]: I1203 22:05:17.482228 9136 generic.go:334] "Generic (PLEG): container finished" podID="1b47f2ef-9923-411f-9f2f-ddaea8bc7053" containerID="3d0456f7cf5468e13600eb8bcd1915c3323422019a6601c2df115d88e550552b" exitCode=0 Dec 03 22:05:17.482531 master-0 kubenswrapper[9136]: I1203 22:05:17.482309 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" event={"ID":"1b47f2ef-9923-411f-9f2f-ddaea8bc7053","Type":"ContainerDied","Data":"3d0456f7cf5468e13600eb8bcd1915c3323422019a6601c2df115d88e550552b"} Dec 03 22:05:17.483009 master-0 
kubenswrapper[9136]: I1203 22:05:17.482697 9136 scope.go:117] "RemoveContainer" containerID="3d0456f7cf5468e13600eb8bcd1915c3323422019a6601c2df115d88e550552b" Dec 03 22:05:17.484324 master-0 kubenswrapper[9136]: I1203 22:05:17.484273 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-96glt_b7f68d19-71d4-4129-a575-3ee57fa53493/cluster-node-tuning-operator/0.log" Dec 03 22:05:17.484324 master-0 kubenswrapper[9136]: I1203 22:05:17.484318 9136 generic.go:334] "Generic (PLEG): container finished" podID="b7f68d19-71d4-4129-a575-3ee57fa53493" containerID="ddc4b2406902641b1427a12cb2394dec3bff8dffab1d17cd293d7a2306efb279" exitCode=1 Dec 03 22:05:17.484509 master-0 kubenswrapper[9136]: I1203 22:05:17.484371 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" event={"ID":"b7f68d19-71d4-4129-a575-3ee57fa53493","Type":"ContainerDied","Data":"ddc4b2406902641b1427a12cb2394dec3bff8dffab1d17cd293d7a2306efb279"} Dec 03 22:05:17.484688 master-0 kubenswrapper[9136]: I1203 22:05:17.484641 9136 scope.go:117] "RemoveContainer" containerID="ddc4b2406902641b1427a12cb2394dec3bff8dffab1d17cd293d7a2306efb279" Dec 03 22:05:17.486667 master-0 kubenswrapper[9136]: I1203 22:05:17.486634 9136 generic.go:334] "Generic (PLEG): container finished" podID="0e9427b8-d62c-45f7-97d0-1f7667ff27aa" containerID="3ad0afa8f21e830ac9c2172cd6306aca03770c2953877a91f4130533970ae228" exitCode=0 Dec 03 22:05:17.486789 master-0 kubenswrapper[9136]: I1203 22:05:17.486682 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" event={"ID":"0e9427b8-d62c-45f7-97d0-1f7667ff27aa","Type":"ContainerDied","Data":"3ad0afa8f21e830ac9c2172cd6306aca03770c2953877a91f4130533970ae228"} Dec 03 22:05:17.487351 master-0 kubenswrapper[9136]: I1203 22:05:17.487321 9136 scope.go:117] "RemoveContainer" containerID="3ad0afa8f21e830ac9c2172cd6306aca03770c2953877a91f4130533970ae228" Dec 03 22:05:17.488452 master-0 kubenswrapper[9136]: I1203 22:05:17.488407 9136 generic.go:334] "Generic (PLEG): container finished" podID="64856d96-023f-46db-819c-02f1adea5aab" containerID="5af03b1fc8b9d3242cc113182968101569dfe657f33d984d37c023bd205c8309" exitCode=0 Dec 03 22:05:17.488543 master-0 kubenswrapper[9136]: I1203 22:05:17.488497 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" event={"ID":"64856d96-023f-46db-819c-02f1adea5aab","Type":"ContainerDied","Data":"5af03b1fc8b9d3242cc113182968101569dfe657f33d984d37c023bd205c8309"} Dec 03 22:05:17.489173 master-0 kubenswrapper[9136]: I1203 22:05:17.489137 9136 scope.go:117] "RemoveContainer" containerID="5af03b1fc8b9d3242cc113182968101569dfe657f33d984d37c023bd205c8309" Dec 03 22:05:17.489990 master-0 kubenswrapper[9136]: I1203 22:05:17.489951 9136 generic.go:334] "Generic (PLEG): container finished" podID="add88bf0-c88d-427d-94bb-897e088a1378" containerID="50ea9b3d8d8a684066d6791cddb4680be7db2c4667be5e468a6d5e22cffb259f" exitCode=0 Dec 03 22:05:17.490046 master-0 kubenswrapper[9136]: I1203 22:05:17.489989 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" event={"ID":"add88bf0-c88d-427d-94bb-897e088a1378","Type":"ContainerDied","Data":"50ea9b3d8d8a684066d6791cddb4680be7db2c4667be5e468a6d5e22cffb259f"} 
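
Editor's note: the recurring router-default Startup probe failures are evaluated by the kubelet's HTTP prober, which treats any response code outside the 200-399 range as a failure; the router's /healthz aggregates the sub-checks shown in these entries ([-]backend-http, [-]has-synced, [+]process-running) and keeps returning 500 until they all pass. A simplified sketch of that success criterion follows; the target URL and port are hypothetical, since the real probe endpoint comes from the pod spec, which is not part of this log:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// httpProbe mimics, in simplified form, how an HTTP startup probe is judged:
// the probe passes only when the response code is in [200, 400). The router
// entries above fail because /healthz returns 500 while the backend-http and
// has-synced checks are still failing.
func httpProbe(url string, timeout time.Duration) (bool, string) {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return false, err.Error()
	}
	defer resp.Body.Close()
	ok := resp.StatusCode >= 200 && resp.StatusCode < 400
	return ok, fmt.Sprintf("HTTP probe finished with statuscode: %d", resp.StatusCode)
}

func main() {
	// Hypothetical endpoint for illustration only.
	ok, out := httpProbe("http://127.0.0.1:1936/healthz", 3*time.Second)
	fmt.Println(ok, out)
}
```
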
Dec 03 22:05:17.490457 master-0 kubenswrapper[9136]: I1203 22:05:17.490415 9136 scope.go:117] "RemoveContainer" containerID="50ea9b3d8d8a684066d6791cddb4680be7db2c4667be5e468a6d5e22cffb259f" Dec 03 22:05:17.764428 master-0 kubenswrapper[9136]: I1203 22:05:17.764369 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:17.764428 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:17.764428 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:17.764428 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:17.764687 master-0 kubenswrapper[9136]: I1203 22:05:17.764456 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:17.908310 master-0 kubenswrapper[9136]: I1203 22:05:17.908254 9136 scope.go:117] "RemoveContainer" containerID="857211a86edae02d3593eaf3bb4ea9872abdabe69c74ef883ca26b0a2970ff26" Dec 03 22:05:17.908619 master-0 kubenswrapper[9136]: E1203 22:05:17.908574 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=etcd-operator pod=etcd-operator-7978bf889c-w8hsm_openshift-etcd-operator(82055cfc-b4ce-4a00-a51d-141059947693)\"" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" podUID="82055cfc-b4ce-4a00-a51d-141059947693" Dec 03 22:05:18.498208 master-0 kubenswrapper[9136]: I1203 22:05:18.498131 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-75b4d49d4c-psjj5_04f5fc52-4ec2-48c3-8441-2b15ad632233/package-server-manager/0.log" Dec 03 22:05:18.499005 master-0 kubenswrapper[9136]: I1203 22:05:18.498466 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" event={"ID":"04f5fc52-4ec2-48c3-8441-2b15ad632233","Type":"ContainerStarted","Data":"8228e6612fcdbeb47d78ca3a9c280fb0a173d695cb7eb6259b19c2dc07fd3780"} Dec 03 22:05:18.499409 master-0 kubenswrapper[9136]: I1203 22:05:18.499366 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 22:05:18.500974 master-0 kubenswrapper[9136]: I1203 22:05:18.500942 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" event={"ID":"39f0e973-7864-4842-af8e-47718ab1804c","Type":"ContainerStarted","Data":"27db1f316e5241d52a90addd5b305c9bbc612aa2e7109a1e6e9b02e4254c179b"} Dec 03 22:05:18.502942 master-0 kubenswrapper[9136]: I1203 22:05:18.502913 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" event={"ID":"1b47f2ef-9923-411f-9f2f-ddaea8bc7053","Type":"ContainerStarted","Data":"d7ccedd21fbdbdb42178159153f7d452981dcd4f1d6c82740c643331ae2c99e0"} Dec 03 22:05:18.505631 master-0 kubenswrapper[9136]: I1203 22:05:18.505587 9136 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-96glt_b7f68d19-71d4-4129-a575-3ee57fa53493/cluster-node-tuning-operator/0.log" Dec 03 22:05:18.505736 master-0 kubenswrapper[9136]: I1203 22:05:18.505679 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" event={"ID":"b7f68d19-71d4-4129-a575-3ee57fa53493","Type":"ContainerStarted","Data":"f9caf215e0a1391884d60a702f86fe3d61697c527130b0b103b16bc748406b63"} Dec 03 22:05:18.508082 master-0 kubenswrapper[9136]: I1203 22:05:18.508033 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" event={"ID":"0e9427b8-d62c-45f7-97d0-1f7667ff27aa","Type":"ContainerStarted","Data":"506d73ff5b4ee8ade8dbbf8733312ea54701b1e9afe3f532a84919ac12f290d2"} Dec 03 22:05:18.509906 master-0 kubenswrapper[9136]: I1203 22:05:18.509868 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" event={"ID":"64856d96-023f-46db-819c-02f1adea5aab","Type":"ContainerStarted","Data":"65df04b5fb3a155ca93bd902fc9d9e8367e9f5c39df53b66ea5a1b06073ed946"} Dec 03 22:05:18.510172 master-0 kubenswrapper[9136]: I1203 22:05:18.510122 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:05:18.515386 master-0 kubenswrapper[9136]: I1203 22:05:18.515339 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" event={"ID":"add88bf0-c88d-427d-94bb-897e088a1378","Type":"ContainerStarted","Data":"d8c844654edf6b38cf9abddd3133c3681b5ffd96e0993db9e0ec283fe5a88865"} Dec 03 22:05:18.516683 master-0 kubenswrapper[9136]: I1203 22:05:18.516626 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:05:18.528808 master-0 kubenswrapper[9136]: I1203 22:05:18.527430 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" event={"ID":"a124c14f-20c6-4df3-956f-a858de0c73c9","Type":"ContainerStarted","Data":"2bfbe7b03d4b29dd1a13ea8a9a89afda8c26aae25e7041848ceb237c1ef8ce65"} Dec 03 22:05:18.764880 master-0 kubenswrapper[9136]: I1203 22:05:18.764692 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:18.764880 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:18.764880 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:18.764880 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:18.764880 master-0 kubenswrapper[9136]: I1203 22:05:18.764757 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:18.907256 master-0 kubenswrapper[9136]: I1203 22:05:18.907186 9136 scope.go:117] "RemoveContainer" containerID="a88cd4d204a69ad50a5388826a51d98da82a35275f10e3130a3f06ed4ccdea64" Dec 03 
22:05:18.907490 master-0 kubenswrapper[9136]: E1203 22:05:18.907447 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-olm-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-olm-operator pod=cluster-olm-operator-589f5cdc9d-25qxh_openshift-cluster-olm-operator(c8da5d44-680e-4169-abc6-607bdc37a64d)\"" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" podUID="c8da5d44-680e-4169-abc6-607bdc37a64d" Dec 03 22:05:19.763027 master-0 kubenswrapper[9136]: I1203 22:05:19.762949 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:19.763027 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:19.763027 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:19.763027 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:19.763624 master-0 kubenswrapper[9136]: I1203 22:05:19.763053 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:20.763091 master-0 kubenswrapper[9136]: I1203 22:05:20.763034 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:20.763091 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:20.763091 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:20.763091 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:20.763678 master-0 kubenswrapper[9136]: I1203 22:05:20.763113 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:20.908045 master-0 kubenswrapper[9136]: I1203 22:05:20.907979 9136 scope.go:117] "RemoveContainer" containerID="9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f" Dec 03 22:05:21.550884 master-0 kubenswrapper[9136]: I1203 22:05:21.550827 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/3.log" Dec 03 22:05:21.552556 master-0 kubenswrapper[9136]: I1203 22:05:21.552518 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"05dd6e8e0dea56089da96190349dd4c1","Type":"ContainerStarted","Data":"e6e266ea8a0d197e496699cd337eabb6f247b692621f965213e4454b3e59b018"} Dec 03 22:05:21.763633 master-0 kubenswrapper[9136]: I1203 22:05:21.763476 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:21.763633 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:21.763633 master-0 
kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:21.763633 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:21.763633 master-0 kubenswrapper[9136]: I1203 22:05:21.763614 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:21.908397 master-0 kubenswrapper[9136]: I1203 22:05:21.908320 9136 scope.go:117] "RemoveContainer" containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" Dec 03 22:05:21.908693 master-0 kubenswrapper[9136]: E1203 22:05:21.908555 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:05:22.763599 master-0 kubenswrapper[9136]: I1203 22:05:22.763465 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:22.763599 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:22.763599 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:22.763599 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:22.763599 master-0 kubenswrapper[9136]: I1203 22:05:22.763597 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:22.907852 master-0 kubenswrapper[9136]: I1203 22:05:22.907752 9136 scope.go:117] "RemoveContainer" containerID="2dd13e4ab397a03d92f03a68ff67d5ce97bced0fc31e3d5a401542285c8badd1" Dec 03 22:05:22.908189 master-0 kubenswrapper[9136]: E1203 22:05:22.908045 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-vcd7b_openshift-kube-storage-version-migrator-operator(50076985-bbaa-4bcf-9d1a-cc25bed016a7)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" podUID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" Dec 03 22:05:23.763976 master-0 kubenswrapper[9136]: I1203 22:05:23.763897 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:23.763976 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:23.763976 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:23.763976 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:23.764670 master-0 kubenswrapper[9136]: I1203 22:05:23.763998 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:23.908861 master-0 kubenswrapper[9136]: I1203 22:05:23.908731 9136 scope.go:117] "RemoveContainer" containerID="964b789870e41d10c8d066861125f17066ab2d12be1bd7a27c3c121f0ff84160" Dec 03 22:05:23.909348 master-0 kubenswrapper[9136]: I1203 22:05:23.909046 9136 scope.go:117] "RemoveContainer" containerID="a0d8b8091c75c8d084f0ab3c649dec1f42114979d34c4d430f5908359e9d65f4" Dec 03 22:05:23.909481 master-0 kubenswrapper[9136]: E1203 22:05:23.909332 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-b5dddf8f5-llvrh_openshift-kube-controller-manager-operator(fdfbaebe-d655-4c1e-a039-08802c5c35c5)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" podUID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" Dec 03 22:05:24.574840 master-0 kubenswrapper[9136]: I1203 22:05:24.574309 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/3.log" Dec 03 22:05:24.574840 master-0 kubenswrapper[9136]: I1203 22:05:24.574374 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" event={"ID":"892d5611-debf-402f-abc5-3f99aa080159","Type":"ContainerStarted","Data":"b5c94a084beb0ffcfe11c04a4cc4a2c07cf27ba1d6e46c1318a1a571487a00c3"} Dec 03 22:05:24.764423 master-0 kubenswrapper[9136]: I1203 22:05:24.764225 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:24.764423 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:24.764423 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:24.764423 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:24.764423 master-0 kubenswrapper[9136]: I1203 22:05:24.764327 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:25.636172 master-0 kubenswrapper[9136]: I1203 22:05:25.636004 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:05:25.636172 master-0 kubenswrapper[9136]: I1203 22:05:25.636092 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:05:25.647345 master-0 kubenswrapper[9136]: I1203 22:05:25.647291 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:05:25.763440 master-0 kubenswrapper[9136]: I1203 22:05:25.763340 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:25.763440 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:25.763440 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:25.763440 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:25.763440 master-0 kubenswrapper[9136]: I1203 22:05:25.763413 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:26.565789 master-0 kubenswrapper[9136]: I1203 22:05:26.563029 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj"] Dec 03 22:05:26.565789 master-0 kubenswrapper[9136]: I1203 22:05:26.563539 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-78ddcf56f9-6b8qj"] Dec 03 22:05:26.765894 master-0 kubenswrapper[9136]: I1203 22:05:26.765848 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:26.765894 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:26.765894 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:26.765894 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:26.766244 master-0 kubenswrapper[9136]: I1203 22:05:26.766218 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:26.907533 master-0 kubenswrapper[9136]: I1203 22:05:26.907455 9136 scope.go:117] "RemoveContainer" containerID="8ef60a565e77b47e29c85d100f4aaafe8ce0754e92c6f0f4b921e8ac07f1fea6" Dec 03 22:05:26.907793 master-0 kubenswrapper[9136]: E1203 22:05:26.907652 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-6mvwr_openshift-kube-scheduler-operator(c1ee4db7-f2d3-4064-a189-f66fd0a021eb)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" podUID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" Dec 03 22:05:27.271878 master-0 kubenswrapper[9136]: I1203 22:05:27.268436 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nrcql"] Dec 03 22:05:27.274284 master-0 kubenswrapper[9136]: I1203 22:05:27.273754 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nrcql"] Dec 03 22:05:27.763279 master-0 kubenswrapper[9136]: I1203 22:05:27.763199 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:27.763279 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:27.763279 master-0 kubenswrapper[9136]: [+]process-running 
ok Dec 03 22:05:27.763279 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:27.764092 master-0 kubenswrapper[9136]: I1203 22:05:27.763284 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:27.928670 master-0 kubenswrapper[9136]: I1203 22:05:27.928590 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" path="/var/lib/kubelet/pods/134c10ef-9f37-4a77-8e8b-4f8326bc8f40/volumes" Dec 03 22:05:27.929755 master-0 kubenswrapper[9136]: I1203 22:05:27.929719 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="519e403c-28ab-4750-8143-34c74bf526ce" path="/var/lib/kubelet/pods/519e403c-28ab-4750-8143-34c74bf526ce/volumes" Dec 03 22:05:28.763624 master-0 kubenswrapper[9136]: I1203 22:05:28.763528 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:28.763624 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:28.763624 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:28.763624 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:28.764628 master-0 kubenswrapper[9136]: I1203 22:05:28.763634 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:29.763734 master-0 kubenswrapper[9136]: I1203 22:05:29.763657 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:29.763734 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:29.763734 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:29.763734 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:29.764696 master-0 kubenswrapper[9136]: I1203 22:05:29.763748 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:30.763587 master-0 kubenswrapper[9136]: I1203 22:05:30.763482 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:30.763587 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:30.763587 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:30.763587 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:30.763587 master-0 kubenswrapper[9136]: I1203 22:05:30.763580 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Dec 03 22:05:30.907445 master-0 kubenswrapper[9136]: I1203 22:05:30.907366 9136 scope.go:117] "RemoveContainer" containerID="3ef5d22510c8de2958749a344d7b815a72b238bc55095fec3564929b0d79773a" Dec 03 22:05:30.907828 master-0 kubenswrapper[9136]: E1203 22:05:30.907645 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator(6976b503-87da-48fc-b097-d1b315fbee3f)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" podUID="6976b503-87da-48fc-b097-d1b315fbee3f" Dec 03 22:05:31.763217 master-0 kubenswrapper[9136]: I1203 22:05:31.763139 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:31.763217 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:31.763217 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:31.763217 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:31.763659 master-0 kubenswrapper[9136]: I1203 22:05:31.763244 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:31.908487 master-0 kubenswrapper[9136]: I1203 22:05:31.908437 9136 scope.go:117] "RemoveContainer" containerID="857211a86edae02d3593eaf3bb4ea9872abdabe69c74ef883ca26b0a2970ff26" Dec 03 22:05:32.676643 master-0 kubenswrapper[9136]: I1203 22:05:32.676592 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm_82055cfc-b4ce-4a00-a51d-141059947693/etcd-operator/3.log" Dec 03 22:05:32.676902 master-0 kubenswrapper[9136]: I1203 22:05:32.676666 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" event={"ID":"82055cfc-b4ce-4a00-a51d-141059947693","Type":"ContainerStarted","Data":"34a306490c67d90d79ddf4d3e1284f7be3dacfd797ec71163e616887fa0440fd"} Dec 03 22:05:32.765010 master-0 kubenswrapper[9136]: I1203 22:05:32.764934 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:32.765010 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:32.765010 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:32.765010 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:32.765443 master-0 kubenswrapper[9136]: I1203 22:05:32.765413 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:32.908529 master-0 kubenswrapper[9136]: I1203 22:05:32.908457 9136 scope.go:117] "RemoveContainer" 
containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" Dec 03 22:05:32.909208 master-0 kubenswrapper[9136]: E1203 22:05:32.908800 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:05:33.763315 master-0 kubenswrapper[9136]: I1203 22:05:33.763231 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:33.763315 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:33.763315 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:33.763315 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:33.763686 master-0 kubenswrapper[9136]: I1203 22:05:33.763352 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:33.912445 master-0 kubenswrapper[9136]: I1203 22:05:33.912381 9136 scope.go:117] "RemoveContainer" containerID="2dd13e4ab397a03d92f03a68ff67d5ce97bced0fc31e3d5a401542285c8badd1" Dec 03 22:05:33.912993 master-0 kubenswrapper[9136]: E1203 22:05:33.912672 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-vcd7b_openshift-kube-storage-version-migrator-operator(50076985-bbaa-4bcf-9d1a-cc25bed016a7)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" podUID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" Dec 03 22:05:33.913041 master-0 kubenswrapper[9136]: I1203 22:05:33.912988 9136 scope.go:117] "RemoveContainer" containerID="a88cd4d204a69ad50a5388826a51d98da82a35275f10e3130a3f06ed4ccdea64" Dec 03 22:05:34.698065 master-0 kubenswrapper[9136]: I1203 22:05:34.697920 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh_c8da5d44-680e-4169-abc6-607bdc37a64d/cluster-olm-operator/3.log" Dec 03 22:05:34.699226 master-0 kubenswrapper[9136]: I1203 22:05:34.699152 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" event={"ID":"c8da5d44-680e-4169-abc6-607bdc37a64d","Type":"ContainerStarted","Data":"009941f991860db470ab2ff94c9b36becc3fa7c757b84a6b67916262e16fd678"} Dec 03 22:05:34.763983 master-0 kubenswrapper[9136]: I1203 22:05:34.763909 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:34.763983 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 
22:05:34.763983 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:34.763983 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:34.764421 master-0 kubenswrapper[9136]: I1203 22:05:34.764004 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:35.587661 master-0 kubenswrapper[9136]: I1203 22:05:35.587579 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8jwv5"] Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: E1203 22:05:35.588021 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="519e403c-28ab-4750-8143-34c74bf526ce" containerName="kube-multus-additional-cni-plugins" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588044 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="519e403c-28ab-4750-8143-34c74bf526ce" containerName="kube-multus-additional-cni-plugins" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: E1203 22:05:35.588074 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerName="multus-admission-controller" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588086 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerName="multus-admission-controller" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: E1203 22:05:35.588104 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerName="kube-rbac-proxy" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588115 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerName="kube-rbac-proxy" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: E1203 22:05:35.588145 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d60f02e-1803-461e-9606-667d91fcae14" containerName="installer" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588156 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d60f02e-1803-461e-9606-667d91fcae14" containerName="installer" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: E1203 22:05:35.588176 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25602d69-3aec-487d-8d62-c2c21f27e2b7" containerName="installer" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588186 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="25602d69-3aec-487d-8d62-c2c21f27e2b7" containerName="installer" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: E1203 22:05:35.588215 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e52a8c-7f9e-47fa-85ca-41f90dcb9747" containerName="installer" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588225 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e52a8c-7f9e-47fa-85ca-41f90dcb9747" containerName="installer" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588371 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="519e403c-28ab-4750-8143-34c74bf526ce" containerName="kube-multus-additional-cni-plugins" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588389 9136 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="70e52a8c-7f9e-47fa-85ca-41f90dcb9747" containerName="installer" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588403 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerName="multus-admission-controller" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588415 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d60f02e-1803-461e-9606-667d91fcae14" containerName="installer" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588434 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="25602d69-3aec-487d-8d62-c2c21f27e2b7" containerName="installer" Dec 03 22:05:35.588754 master-0 kubenswrapper[9136]: I1203 22:05:35.588454 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="134c10ef-9f37-4a77-8e8b-4f8326bc8f40" containerName="kube-rbac-proxy" Dec 03 22:05:35.590607 master-0 kubenswrapper[9136]: I1203 22:05:35.589564 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.594160 master-0 kubenswrapper[9136]: I1203 22:05:35.594102 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vqgbh"] Dec 03 22:05:35.599804 master-0 kubenswrapper[9136]: I1203 22:05:35.598379 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.629475 master-0 kubenswrapper[9136]: I1203 22:05:35.622058 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9kzsq"] Dec 03 22:05:35.629475 master-0 kubenswrapper[9136]: I1203 22:05:35.624170 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4ctxd"] Dec 03 22:05:35.629475 master-0 kubenswrapper[9136]: I1203 22:05:35.625012 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.629475 master-0 kubenswrapper[9136]: I1203 22:05:35.626335 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.636751 master-0 kubenswrapper[9136]: I1203 22:05:35.636692 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8jwv5"] Dec 03 22:05:35.642767 master-0 kubenswrapper[9136]: I1203 22:05:35.642682 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:05:35.663986 master-0 kubenswrapper[9136]: I1203 22:05:35.663900 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9kzsq"] Dec 03 22:05:35.670552 master-0 kubenswrapper[9136]: I1203 22:05:35.670502 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4ctxd"] Dec 03 22:05:35.682025 master-0 kubenswrapper[9136]: I1203 22:05:35.681918 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqgbh"] Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729293 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-catalog-content\") pod \"redhat-marketplace-8jwv5\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729350 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-catalog-content\") pod \"community-operators-4ctxd\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729379 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv7l4\" (UniqueName: \"kubernetes.io/projected/f6b69a98-5c9a-4136-9727-8d3af63a1158-kube-api-access-bv7l4\") pod \"community-operators-4ctxd\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729399 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-utilities\") pod \"community-operators-4ctxd\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729423 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7skv\" (UniqueName: \"kubernetes.io/projected/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-kube-api-access-d7skv\") pod \"redhat-marketplace-8jwv5\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729453 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-catalog-content\") pod \"redhat-operators-9kzsq\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") 
" pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729482 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-utilities\") pod \"certified-operators-vqgbh\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729515 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6s26\" (UniqueName: \"kubernetes.io/projected/8fbe1b30-1dfb-4d18-b674-c35d49556836-kube-api-access-c6s26\") pod \"certified-operators-vqgbh\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729548 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh9zw\" (UniqueName: \"kubernetes.io/projected/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-kube-api-access-fh9zw\") pod \"redhat-operators-9kzsq\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729585 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-catalog-content\") pod \"certified-operators-vqgbh\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729605 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-utilities\") pod \"redhat-marketplace-8jwv5\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.731806 master-0 kubenswrapper[9136]: I1203 22:05:35.729638 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-utilities\") pod \"redhat-operators-9kzsq\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.782800 master-0 kubenswrapper[9136]: I1203 22:05:35.777066 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:35.782800 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:35.782800 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:35.782800 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:35.782800 master-0 kubenswrapper[9136]: I1203 22:05:35.777128 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:35.830420 master-0 kubenswrapper[9136]: I1203 
22:05:35.830347 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-catalog-content\") pod \"community-operators-4ctxd\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.830420 master-0 kubenswrapper[9136]: I1203 22:05:35.830399 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv7l4\" (UniqueName: \"kubernetes.io/projected/f6b69a98-5c9a-4136-9727-8d3af63a1158-kube-api-access-bv7l4\") pod \"community-operators-4ctxd\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.830420 master-0 kubenswrapper[9136]: I1203 22:05:35.830420 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7skv\" (UniqueName: \"kubernetes.io/projected/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-kube-api-access-d7skv\") pod \"redhat-marketplace-8jwv5\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.830420 master-0 kubenswrapper[9136]: I1203 22:05:35.830435 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-utilities\") pod \"community-operators-4ctxd\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.830828 master-0 kubenswrapper[9136]: I1203 22:05:35.830465 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-catalog-content\") pod \"redhat-operators-9kzsq\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.830828 master-0 kubenswrapper[9136]: I1203 22:05:35.830496 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-utilities\") pod \"certified-operators-vqgbh\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.830828 master-0 kubenswrapper[9136]: I1203 22:05:35.830526 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6s26\" (UniqueName: \"kubernetes.io/projected/8fbe1b30-1dfb-4d18-b674-c35d49556836-kube-api-access-c6s26\") pod \"certified-operators-vqgbh\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.830828 master-0 kubenswrapper[9136]: I1203 22:05:35.830554 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh9zw\" (UniqueName: \"kubernetes.io/projected/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-kube-api-access-fh9zw\") pod \"redhat-operators-9kzsq\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.830828 master-0 kubenswrapper[9136]: I1203 22:05:35.830580 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-catalog-content\") pod \"certified-operators-vqgbh\" (UID: 
\"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.830828 master-0 kubenswrapper[9136]: I1203 22:05:35.830599 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-utilities\") pod \"redhat-marketplace-8jwv5\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.830828 master-0 kubenswrapper[9136]: I1203 22:05:35.830622 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-utilities\") pod \"redhat-operators-9kzsq\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.830828 master-0 kubenswrapper[9136]: I1203 22:05:35.830647 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-catalog-content\") pod \"redhat-marketplace-8jwv5\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.831136 master-0 kubenswrapper[9136]: I1203 22:05:35.831035 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-catalog-content\") pod \"redhat-marketplace-8jwv5\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.831679 master-0 kubenswrapper[9136]: I1203 22:05:35.831644 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-catalog-content\") pod \"community-operators-4ctxd\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.832225 master-0 kubenswrapper[9136]: I1203 22:05:35.832189 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-utilities\") pod \"community-operators-4ctxd\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.832464 master-0 kubenswrapper[9136]: I1203 22:05:35.832430 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-utilities\") pod \"certified-operators-vqgbh\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.832601 master-0 kubenswrapper[9136]: I1203 22:05:35.832569 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-catalog-content\") pod \"redhat-operators-9kzsq\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.832791 master-0 kubenswrapper[9136]: I1203 22:05:35.832732 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-utilities\") pod \"redhat-operators-9kzsq\" (UID: 
\"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.832840 master-0 kubenswrapper[9136]: I1203 22:05:35.832786 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-utilities\") pod \"redhat-marketplace-8jwv5\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.832840 master-0 kubenswrapper[9136]: I1203 22:05:35.832815 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-catalog-content\") pod \"certified-operators-vqgbh\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.848644 master-0 kubenswrapper[9136]: I1203 22:05:35.848511 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv7l4\" (UniqueName: \"kubernetes.io/projected/f6b69a98-5c9a-4136-9727-8d3af63a1158-kube-api-access-bv7l4\") pod \"community-operators-4ctxd\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:35.850198 master-0 kubenswrapper[9136]: I1203 22:05:35.850093 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6s26\" (UniqueName: \"kubernetes.io/projected/8fbe1b30-1dfb-4d18-b674-c35d49556836-kube-api-access-c6s26\") pod \"certified-operators-vqgbh\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.852019 master-0 kubenswrapper[9136]: I1203 22:05:35.851880 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh9zw\" (UniqueName: \"kubernetes.io/projected/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-kube-api-access-fh9zw\") pod \"redhat-operators-9kzsq\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:35.853715 master-0 kubenswrapper[9136]: I1203 22:05:35.853671 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7skv\" (UniqueName: \"kubernetes.io/projected/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-kube-api-access-d7skv\") pod \"redhat-marketplace-8jwv5\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.907791 master-0 kubenswrapper[9136]: I1203 22:05:35.907726 9136 scope.go:117] "RemoveContainer" containerID="964b789870e41d10c8d066861125f17066ab2d12be1bd7a27c3c121f0ff84160" Dec 03 22:05:35.960867 master-0 kubenswrapper[9136]: I1203 22:05:35.960807 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:35.981357 master-0 kubenswrapper[9136]: I1203 22:05:35.981027 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:35.999761 master-0 kubenswrapper[9136]: I1203 22:05:35.995420 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:36.019895 master-0 kubenswrapper[9136]: I1203 22:05:36.019832 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:36.446374 master-0 kubenswrapper[9136]: W1203 22:05:36.443050 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fbe1b30_1dfb_4d18_b674_c35d49556836.slice/crio-563be1da0f876d741321e2d104fce9671565598ae572f073eff32977e474e6c2 WatchSource:0}: Error finding container 563be1da0f876d741321e2d104fce9671565598ae572f073eff32977e474e6c2: Status 404 returned error can't find the container with id 563be1da0f876d741321e2d104fce9671565598ae572f073eff32977e474e6c2 Dec 03 22:05:36.450308 master-0 kubenswrapper[9136]: I1203 22:05:36.450090 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vqgbh"] Dec 03 22:05:36.458890 master-0 kubenswrapper[9136]: I1203 22:05:36.458839 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8jwv5"] Dec 03 22:05:36.552376 master-0 kubenswrapper[9136]: I1203 22:05:36.552317 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9kzsq"] Dec 03 22:05:36.565837 master-0 kubenswrapper[9136]: I1203 22:05:36.564341 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4ctxd"] Dec 03 22:05:36.572146 master-0 kubenswrapper[9136]: W1203 22:05:36.572093 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae1f77e6_98d7_42f7_9a3c_4b610058e28a.slice/crio-2d6f9d30ec65e16ccf339fda790075965a87731a037ec39fb70f4378b589de18 WatchSource:0}: Error finding container 2d6f9d30ec65e16ccf339fda790075965a87731a037ec39fb70f4378b589de18: Status 404 returned error can't find the container with id 2d6f9d30ec65e16ccf339fda790075965a87731a037ec39fb70f4378b589de18 Dec 03 22:05:36.721977 master-0 kubenswrapper[9136]: I1203 22:05:36.721888 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9kzsq" event={"ID":"ae1f77e6-98d7-42f7-9a3c-4b610058e28a","Type":"ContainerStarted","Data":"2d6f9d30ec65e16ccf339fda790075965a87731a037ec39fb70f4378b589de18"} Dec 03 22:05:36.724443 master-0 kubenswrapper[9136]: I1203 22:05:36.724393 9136 generic.go:334] "Generic (PLEG): container finished" podID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerID="4ae290599a5a4911d24c4c92e0c86018ad9b66d87070653c69e5a59a3d60884f" exitCode=0 Dec 03 22:05:36.724540 master-0 kubenswrapper[9136]: I1203 22:05:36.724445 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqgbh" event={"ID":"8fbe1b30-1dfb-4d18-b674-c35d49556836","Type":"ContainerDied","Data":"4ae290599a5a4911d24c4c92e0c86018ad9b66d87070653c69e5a59a3d60884f"} Dec 03 22:05:36.724540 master-0 kubenswrapper[9136]: I1203 22:05:36.724527 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqgbh" event={"ID":"8fbe1b30-1dfb-4d18-b674-c35d49556836","Type":"ContainerStarted","Data":"563be1da0f876d741321e2d104fce9671565598ae572f073eff32977e474e6c2"} Dec 03 22:05:36.727111 master-0 kubenswrapper[9136]: I1203 22:05:36.727085 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ctxd" event={"ID":"f6b69a98-5c9a-4136-9727-8d3af63a1158","Type":"ContainerStarted","Data":"ebcf1d9872796a117e8b1df7fbc1dd6951612877562af5c9f4cea496a5190261"} Dec 03 22:05:36.727314 master-0 
kubenswrapper[9136]: I1203 22:05:36.727258 9136 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:05:36.729807 master-0 kubenswrapper[9136]: I1203 22:05:36.729733 9136 generic.go:334] "Generic (PLEG): container finished" podID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerID="95060fd16626e19bd698399d02360e7153f4bb501e356a15fa9e52201d71349a" exitCode=0 Dec 03 22:05:36.729890 master-0 kubenswrapper[9136]: I1203 22:05:36.729836 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jwv5" event={"ID":"d1c9edca-31b4-49b0-b3e9-773c1019b1a0","Type":"ContainerDied","Data":"95060fd16626e19bd698399d02360e7153f4bb501e356a15fa9e52201d71349a"} Dec 03 22:05:36.729969 master-0 kubenswrapper[9136]: I1203 22:05:36.729900 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jwv5" event={"ID":"d1c9edca-31b4-49b0-b3e9-773c1019b1a0","Type":"ContainerStarted","Data":"318b1e9b339ee6ccb85534471e88c58789c49450d6bdc3ed2ab982253a8715f7"} Dec 03 22:05:36.734570 master-0 kubenswrapper[9136]: I1203 22:05:36.734516 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh_fdfbaebe-d655-4c1e-a039-08802c5c35c5/kube-controller-manager-operator/3.log" Dec 03 22:05:36.734645 master-0 kubenswrapper[9136]: I1203 22:05:36.734609 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" event={"ID":"fdfbaebe-d655-4c1e-a039-08802c5c35c5","Type":"ContainerStarted","Data":"2dbe9e7dca560d811f5731920672c6a8142f74b62253419fcbc40cba75b599c9"} Dec 03 22:05:36.763779 master-0 kubenswrapper[9136]: I1203 22:05:36.763701 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:36.763779 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:36.763779 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:36.763779 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:36.764092 master-0 kubenswrapper[9136]: I1203 22:05:36.763806 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:37.744722 master-0 kubenswrapper[9136]: I1203 22:05:37.744669 9136 generic.go:334] "Generic (PLEG): container finished" podID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerID="f75befd7dd16c7fc6c74159d7cf0cbf7ac8aa53eaad393738af4abf58ec2494b" exitCode=0 Dec 03 22:05:37.746189 master-0 kubenswrapper[9136]: I1203 22:05:37.744741 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ctxd" event={"ID":"f6b69a98-5c9a-4136-9727-8d3af63a1158","Type":"ContainerDied","Data":"f75befd7dd16c7fc6c74159d7cf0cbf7ac8aa53eaad393738af4abf58ec2494b"} Dec 03 22:05:37.748975 master-0 kubenswrapper[9136]: I1203 22:05:37.747789 9136 generic.go:334] "Generic (PLEG): container finished" podID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerID="0db144fefb9fca2874603f4484f7cc8f50877e75fefbfd1bf1e6f4adea250e9a" exitCode=0 Dec 03 22:05:37.748975 master-0 
kubenswrapper[9136]: I1203 22:05:37.747860 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jwv5" event={"ID":"d1c9edca-31b4-49b0-b3e9-773c1019b1a0","Type":"ContainerDied","Data":"0db144fefb9fca2874603f4484f7cc8f50877e75fefbfd1bf1e6f4adea250e9a"} Dec 03 22:05:37.750241 master-0 kubenswrapper[9136]: I1203 22:05:37.750209 9136 generic.go:334] "Generic (PLEG): container finished" podID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerID="13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a" exitCode=0 Dec 03 22:05:37.750241 master-0 kubenswrapper[9136]: I1203 22:05:37.750247 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9kzsq" event={"ID":"ae1f77e6-98d7-42f7-9a3c-4b610058e28a","Type":"ContainerDied","Data":"13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a"} Dec 03 22:05:37.753515 master-0 kubenswrapper[9136]: I1203 22:05:37.753482 9136 generic.go:334] "Generic (PLEG): container finished" podID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerID="ed87b579d1be8988182cd9e9665f3aa73d3dac88c71228bf0637f8c7fd2cf289" exitCode=0 Dec 03 22:05:37.753560 master-0 kubenswrapper[9136]: I1203 22:05:37.753517 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqgbh" event={"ID":"8fbe1b30-1dfb-4d18-b674-c35d49556836","Type":"ContainerDied","Data":"ed87b579d1be8988182cd9e9665f3aa73d3dac88c71228bf0637f8c7fd2cf289"} Dec 03 22:05:37.765904 master-0 kubenswrapper[9136]: I1203 22:05:37.764387 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:37.765904 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:37.765904 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:37.765904 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:37.765904 master-0 kubenswrapper[9136]: I1203 22:05:37.764455 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:38.763804 master-0 kubenswrapper[9136]: I1203 22:05:38.763650 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:38.763804 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:38.763804 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:38.763804 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:38.763804 master-0 kubenswrapper[9136]: I1203 22:05:38.763745 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:38.765020 master-0 kubenswrapper[9136]: I1203 22:05:38.764963 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ctxd" 
event={"ID":"f6b69a98-5c9a-4136-9727-8d3af63a1158","Type":"ContainerStarted","Data":"b72597a6b71548538c1b9f82efb552e2915fec481007d76b56a831d2c8c055e8"} Dec 03 22:05:38.768450 master-0 kubenswrapper[9136]: I1203 22:05:38.768395 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jwv5" event={"ID":"d1c9edca-31b4-49b0-b3e9-773c1019b1a0","Type":"ContainerStarted","Data":"784049508bfb28ea94397a63578a07a579626affbf56ae6c453d2f158696255d"} Dec 03 22:05:38.771098 master-0 kubenswrapper[9136]: I1203 22:05:38.771066 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9kzsq" event={"ID":"ae1f77e6-98d7-42f7-9a3c-4b610058e28a","Type":"ContainerStarted","Data":"77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b"} Dec 03 22:05:38.774217 master-0 kubenswrapper[9136]: I1203 22:05:38.774175 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqgbh" event={"ID":"8fbe1b30-1dfb-4d18-b674-c35d49556836","Type":"ContainerStarted","Data":"b19f83a1bc17540553f1d1a97ed4721ece481240fa4e790bc33ea24c422de1ac"} Dec 03 22:05:38.849719 master-0 kubenswrapper[9136]: I1203 22:05:38.849535 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vqgbh" podStartSLOduration=27.435424275 podStartE2EDuration="28.8495161s" podCreationTimestamp="2025-12-03 22:05:10 +0000 UTC" firstStartedPulling="2025-12-03 22:05:36.72716139 +0000 UTC m=+943.002337812" lastFinishedPulling="2025-12-03 22:05:38.141253205 +0000 UTC m=+944.416429637" observedRunningTime="2025-12-03 22:05:38.84568569 +0000 UTC m=+945.120862082" watchObservedRunningTime="2025-12-03 22:05:38.8495161 +0000 UTC m=+945.124692492" Dec 03 22:05:38.907957 master-0 kubenswrapper[9136]: I1203 22:05:38.907905 9136 scope.go:117] "RemoveContainer" containerID="8ef60a565e77b47e29c85d100f4aaafe8ce0754e92c6f0f4b921e8ac07f1fea6" Dec 03 22:05:38.908195 master-0 kubenswrapper[9136]: E1203 22:05:38.908144 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-scheduler-operator-container\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-scheduler-operator-container pod=openshift-kube-scheduler-operator-5f574c6c79-6mvwr_openshift-kube-scheduler-operator(c1ee4db7-f2d3-4064-a189-f66fd0a021eb)\"" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" podUID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" Dec 03 22:05:39.762302 master-0 kubenswrapper[9136]: I1203 22:05:39.762235 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:39.762302 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:39.762302 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:39.762302 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:39.762721 master-0 kubenswrapper[9136]: I1203 22:05:39.762322 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:39.783394 master-0 kubenswrapper[9136]: I1203 22:05:39.783322 9136 generic.go:334] "Generic 
(PLEG): container finished" podID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerID="77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b" exitCode=0 Dec 03 22:05:39.783394 master-0 kubenswrapper[9136]: I1203 22:05:39.783377 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9kzsq" event={"ID":"ae1f77e6-98d7-42f7-9a3c-4b610058e28a","Type":"ContainerDied","Data":"77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b"} Dec 03 22:05:39.785977 master-0 kubenswrapper[9136]: I1203 22:05:39.785897 9136 generic.go:334] "Generic (PLEG): container finished" podID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerID="b72597a6b71548538c1b9f82efb552e2915fec481007d76b56a831d2c8c055e8" exitCode=0 Dec 03 22:05:39.786996 master-0 kubenswrapper[9136]: I1203 22:05:39.786947 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ctxd" event={"ID":"f6b69a98-5c9a-4136-9727-8d3af63a1158","Type":"ContainerDied","Data":"b72597a6b71548538c1b9f82efb552e2915fec481007d76b56a831d2c8c055e8"} Dec 03 22:05:39.814894 master-0 kubenswrapper[9136]: I1203 22:05:39.814533 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8jwv5" podStartSLOduration=6.3404521 podStartE2EDuration="7.81450846s" podCreationTimestamp="2025-12-03 22:05:32 +0000 UTC" firstStartedPulling="2025-12-03 22:05:36.732345214 +0000 UTC m=+943.007521636" lastFinishedPulling="2025-12-03 22:05:38.206401584 +0000 UTC m=+944.481577996" observedRunningTime="2025-12-03 22:05:38.876248631 +0000 UTC m=+945.151425023" watchObservedRunningTime="2025-12-03 22:05:39.81450846 +0000 UTC m=+946.089684852" Dec 03 22:05:40.764429 master-0 kubenswrapper[9136]: I1203 22:05:40.764331 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:40.764429 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:40.764429 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:40.764429 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:40.764918 master-0 kubenswrapper[9136]: I1203 22:05:40.764435 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:40.798027 master-0 kubenswrapper[9136]: I1203 22:05:40.797891 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9kzsq" event={"ID":"ae1f77e6-98d7-42f7-9a3c-4b610058e28a","Type":"ContainerStarted","Data":"5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688"} Dec 03 22:05:40.801519 master-0 kubenswrapper[9136]: I1203 22:05:40.801462 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ctxd" event={"ID":"f6b69a98-5c9a-4136-9727-8d3af63a1158","Type":"ContainerStarted","Data":"9f9f69a854f2488eaf6149fa31953e2b8325568d37b62642bfc51b3205e58803"} Dec 03 22:05:40.835965 master-0 kubenswrapper[9136]: I1203 22:05:40.835839 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9kzsq" podStartSLOduration=6.310204319 podStartE2EDuration="8.835807041s" 
podCreationTimestamp="2025-12-03 22:05:32 +0000 UTC" firstStartedPulling="2025-12-03 22:05:37.751567009 +0000 UTC m=+944.026743391" lastFinishedPulling="2025-12-03 22:05:40.277169721 +0000 UTC m=+946.552346113" observedRunningTime="2025-12-03 22:05:40.830144473 +0000 UTC m=+947.105320915" watchObservedRunningTime="2025-12-03 22:05:40.835807041 +0000 UTC m=+947.110983483" Dec 03 22:05:40.862635 master-0 kubenswrapper[9136]: I1203 22:05:40.862522 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4ctxd" podStartSLOduration=28.278919304 podStartE2EDuration="30.86249113s" podCreationTimestamp="2025-12-03 22:05:10 +0000 UTC" firstStartedPulling="2025-12-03 22:05:37.746910522 +0000 UTC m=+944.022086904" lastFinishedPulling="2025-12-03 22:05:40.330482338 +0000 UTC m=+946.605658730" observedRunningTime="2025-12-03 22:05:40.862080387 +0000 UTC m=+947.137256839" watchObservedRunningTime="2025-12-03 22:05:40.86249113 +0000 UTC m=+947.137667542" Dec 03 22:05:41.763956 master-0 kubenswrapper[9136]: I1203 22:05:41.763874 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:41.763956 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:41.763956 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:41.763956 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:41.764372 master-0 kubenswrapper[9136]: I1203 22:05:41.763985 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:42.765490 master-0 kubenswrapper[9136]: I1203 22:05:42.765426 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:42.765490 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:42.765490 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:42.765490 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:42.766181 master-0 kubenswrapper[9136]: I1203 22:05:42.765512 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:42.907706 master-0 kubenswrapper[9136]: I1203 22:05:42.907624 9136 scope.go:117] "RemoveContainer" containerID="3ef5d22510c8de2958749a344d7b815a72b238bc55095fec3564929b0d79773a" Dec 03 22:05:42.908063 master-0 kubenswrapper[9136]: E1203 22:05:42.908015 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=openshift-controller-manager-operator pod=openshift-controller-manager-operator-7c4697b5f5-458zh_openshift-controller-manager-operator(6976b503-87da-48fc-b097-d1b315fbee3f)\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" 
podUID="6976b503-87da-48fc-b097-d1b315fbee3f" Dec 03 22:05:43.763658 master-0 kubenswrapper[9136]: I1203 22:05:43.763590 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:43.763658 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:43.763658 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:43.763658 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:43.764008 master-0 kubenswrapper[9136]: I1203 22:05:43.763681 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:43.915812 master-0 kubenswrapper[9136]: I1203 22:05:43.915722 9136 scope.go:117] "RemoveContainer" containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" Dec 03 22:05:43.916484 master-0 kubenswrapper[9136]: E1203 22:05:43.916105 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:05:44.763896 master-0 kubenswrapper[9136]: I1203 22:05:44.763656 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:44.763896 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:44.763896 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:44.763896 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:44.763896 master-0 kubenswrapper[9136]: I1203 22:05:44.763758 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:45.763354 master-0 kubenswrapper[9136]: I1203 22:05:45.763284 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:45.763354 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:45.763354 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:45.763354 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:45.764400 master-0 kubenswrapper[9136]: I1203 22:05:45.764089 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:45.961555 master-0 kubenswrapper[9136]: I1203 22:05:45.961501 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:45.961929 master-0 kubenswrapper[9136]: I1203 22:05:45.961913 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:45.981453 master-0 kubenswrapper[9136]: I1203 22:05:45.981391 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:45.981556 master-0 kubenswrapper[9136]: I1203 22:05:45.981478 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:45.996973 master-0 kubenswrapper[9136]: I1203 22:05:45.996923 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:45.997137 master-0 kubenswrapper[9136]: I1203 22:05:45.996994 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:46.021012 master-0 kubenswrapper[9136]: I1203 22:05:46.020714 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:46.021012 master-0 kubenswrapper[9136]: I1203 22:05:46.020878 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:46.027249 master-0 kubenswrapper[9136]: I1203 22:05:46.027189 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:46.058099 master-0 kubenswrapper[9136]: I1203 22:05:46.058027 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:46.074135 master-0 kubenswrapper[9136]: I1203 22:05:46.073223 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:46.763246 master-0 kubenswrapper[9136]: I1203 22:05:46.763156 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:46.763246 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:46.763246 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:46.763246 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:46.763246 master-0 kubenswrapper[9136]: I1203 22:05:46.763240 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:46.950618 master-0 kubenswrapper[9136]: I1203 22:05:46.950577 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:46.960863 master-0 kubenswrapper[9136]: I1203 22:05:46.960815 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:46.962732 master-0 kubenswrapper[9136]: I1203 22:05:46.962687 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:47.055739 master-0 kubenswrapper[9136]: I1203 22:05:47.055610 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9kzsq" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerName="registry-server" probeResult="failure" output=< Dec 03 22:05:47.055739 master-0 kubenswrapper[9136]: timeout: failed to connect service ":50051" within 1s Dec 03 22:05:47.055739 master-0 kubenswrapper[9136]: > Dec 03 22:05:47.762894 master-0 kubenswrapper[9136]: I1203 22:05:47.762742 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:47.762894 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:47.762894 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:47.762894 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:47.763371 master-0 kubenswrapper[9136]: I1203 22:05:47.762906 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:47.908669 master-0 kubenswrapper[9136]: I1203 22:05:47.908551 9136 scope.go:117] "RemoveContainer" containerID="2dd13e4ab397a03d92f03a68ff67d5ce97bced0fc31e3d5a401542285c8badd1" Dec 03 22:05:47.909582 master-0 kubenswrapper[9136]: E1203 22:05:47.908950 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-storage-version-migrator-operator pod=kube-storage-version-migrator-operator-67c4cff67d-vcd7b_openshift-kube-storage-version-migrator-operator(50076985-bbaa-4bcf-9d1a-cc25bed016a7)\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" podUID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" Dec 03 22:05:48.763821 master-0 kubenswrapper[9136]: I1203 22:05:48.763677 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:48.763821 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:48.763821 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:48.763821 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:48.763821 master-0 kubenswrapper[9136]: I1203 22:05:48.763797 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:49.195407 master-0 kubenswrapper[9136]: I1203 22:05:49.195302 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4ctxd"] Dec 03 22:05:49.196261 master-0 kubenswrapper[9136]: I1203 22:05:49.195741 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4ctxd" podUID="f6b69a98-5c9a-4136-9727-8d3af63a1158" 
containerName="registry-server" containerID="cri-o://9f9f69a854f2488eaf6149fa31953e2b8325568d37b62642bfc51b3205e58803" gracePeriod=2 Dec 03 22:05:49.234226 master-0 kubenswrapper[9136]: I1203 22:05:49.234143 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqgbh"] Dec 03 22:05:49.234574 master-0 kubenswrapper[9136]: I1203 22:05:49.234471 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vqgbh" podUID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerName="registry-server" containerID="cri-o://b19f83a1bc17540553f1d1a97ed4721ece481240fa4e790bc33ea24c422de1ac" gracePeriod=2 Dec 03 22:05:49.764395 master-0 kubenswrapper[9136]: I1203 22:05:49.763897 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:49.764395 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:49.764395 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:49.764395 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:49.764395 master-0 kubenswrapper[9136]: I1203 22:05:49.763981 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:49.892018 master-0 kubenswrapper[9136]: I1203 22:05:49.891929 9136 generic.go:334] "Generic (PLEG): container finished" podID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerID="b19f83a1bc17540553f1d1a97ed4721ece481240fa4e790bc33ea24c422de1ac" exitCode=0 Dec 03 22:05:49.892333 master-0 kubenswrapper[9136]: I1203 22:05:49.892024 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqgbh" event={"ID":"8fbe1b30-1dfb-4d18-b674-c35d49556836","Type":"ContainerDied","Data":"b19f83a1bc17540553f1d1a97ed4721ece481240fa4e790bc33ea24c422de1ac"} Dec 03 22:05:49.896109 master-0 kubenswrapper[9136]: I1203 22:05:49.895868 9136 generic.go:334] "Generic (PLEG): container finished" podID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerID="9f9f69a854f2488eaf6149fa31953e2b8325568d37b62642bfc51b3205e58803" exitCode=0 Dec 03 22:05:49.896208 master-0 kubenswrapper[9136]: I1203 22:05:49.896110 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ctxd" event={"ID":"f6b69a98-5c9a-4136-9727-8d3af63a1158","Type":"ContainerDied","Data":"9f9f69a854f2488eaf6149fa31953e2b8325568d37b62642bfc51b3205e58803"} Dec 03 22:05:50.079049 master-0 kubenswrapper[9136]: I1203 22:05:50.079015 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:50.232901 master-0 kubenswrapper[9136]: I1203 22:05:50.232846 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv7l4\" (UniqueName: \"kubernetes.io/projected/f6b69a98-5c9a-4136-9727-8d3af63a1158-kube-api-access-bv7l4\") pod \"f6b69a98-5c9a-4136-9727-8d3af63a1158\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " Dec 03 22:05:50.233932 master-0 kubenswrapper[9136]: I1203 22:05:50.233887 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-utilities\") pod \"f6b69a98-5c9a-4136-9727-8d3af63a1158\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " Dec 03 22:05:50.234228 master-0 kubenswrapper[9136]: I1203 22:05:50.234197 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-catalog-content\") pod \"f6b69a98-5c9a-4136-9727-8d3af63a1158\" (UID: \"f6b69a98-5c9a-4136-9727-8d3af63a1158\") " Dec 03 22:05:50.234968 master-0 kubenswrapper[9136]: I1203 22:05:50.234886 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-utilities" (OuterVolumeSpecName: "utilities") pod "f6b69a98-5c9a-4136-9727-8d3af63a1158" (UID: "f6b69a98-5c9a-4136-9727-8d3af63a1158"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:50.238005 master-0 kubenswrapper[9136]: I1203 22:05:50.237934 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b69a98-5c9a-4136-9727-8d3af63a1158-kube-api-access-bv7l4" (OuterVolumeSpecName: "kube-api-access-bv7l4") pod "f6b69a98-5c9a-4136-9727-8d3af63a1158" (UID: "f6b69a98-5c9a-4136-9727-8d3af63a1158"). InnerVolumeSpecName "kube-api-access-bv7l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:50.323337 master-0 kubenswrapper[9136]: I1203 22:05:50.323085 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6b69a98-5c9a-4136-9727-8d3af63a1158" (UID: "f6b69a98-5c9a-4136-9727-8d3af63a1158"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:50.337011 master-0 kubenswrapper[9136]: I1203 22:05:50.336923 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv7l4\" (UniqueName: \"kubernetes.io/projected/f6b69a98-5c9a-4136-9727-8d3af63a1158-kube-api-access-bv7l4\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:50.337011 master-0 kubenswrapper[9136]: I1203 22:05:50.336991 9136 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:50.337011 master-0 kubenswrapper[9136]: I1203 22:05:50.337015 9136 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6b69a98-5c9a-4136-9727-8d3af63a1158-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:50.764334 master-0 kubenswrapper[9136]: I1203 22:05:50.764277 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:50.764334 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:50.764334 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:50.764334 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:50.764334 master-0 kubenswrapper[9136]: I1203 22:05:50.764346 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:50.926142 master-0 kubenswrapper[9136]: I1203 22:05:50.925983 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4ctxd" event={"ID":"f6b69a98-5c9a-4136-9727-8d3af63a1158","Type":"ContainerDied","Data":"ebcf1d9872796a117e8b1df7fbc1dd6951612877562af5c9f4cea496a5190261"} Dec 03 22:05:50.926385 master-0 kubenswrapper[9136]: I1203 22:05:50.926104 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4ctxd" Dec 03 22:05:50.926478 master-0 kubenswrapper[9136]: I1203 22:05:50.926234 9136 scope.go:117] "RemoveContainer" containerID="9f9f69a854f2488eaf6149fa31953e2b8325568d37b62642bfc51b3205e58803" Dec 03 22:05:50.964895 master-0 kubenswrapper[9136]: I1203 22:05:50.964830 9136 scope.go:117] "RemoveContainer" containerID="b72597a6b71548538c1b9f82efb552e2915fec481007d76b56a831d2c8c055e8" Dec 03 22:05:50.989822 master-0 kubenswrapper[9136]: I1203 22:05:50.989744 9136 scope.go:117] "RemoveContainer" containerID="f75befd7dd16c7fc6c74159d7cf0cbf7ac8aa53eaad393738af4abf58ec2494b" Dec 03 22:05:51.048371 master-0 kubenswrapper[9136]: I1203 22:05:51.048301 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:51.147894 master-0 kubenswrapper[9136]: I1203 22:05:51.147846 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-catalog-content\") pod \"8fbe1b30-1dfb-4d18-b674-c35d49556836\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " Dec 03 22:05:51.148215 master-0 kubenswrapper[9136]: I1203 22:05:51.148193 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6s26\" (UniqueName: \"kubernetes.io/projected/8fbe1b30-1dfb-4d18-b674-c35d49556836-kube-api-access-c6s26\") pod \"8fbe1b30-1dfb-4d18-b674-c35d49556836\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " Dec 03 22:05:51.148408 master-0 kubenswrapper[9136]: I1203 22:05:51.148392 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-utilities\") pod \"8fbe1b30-1dfb-4d18-b674-c35d49556836\" (UID: \"8fbe1b30-1dfb-4d18-b674-c35d49556836\") " Dec 03 22:05:51.149891 master-0 kubenswrapper[9136]: I1203 22:05:51.149848 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-utilities" (OuterVolumeSpecName: "utilities") pod "8fbe1b30-1dfb-4d18-b674-c35d49556836" (UID: "8fbe1b30-1dfb-4d18-b674-c35d49556836"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:51.152139 master-0 kubenswrapper[9136]: I1203 22:05:51.152024 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbe1b30-1dfb-4d18-b674-c35d49556836-kube-api-access-c6s26" (OuterVolumeSpecName: "kube-api-access-c6s26") pod "8fbe1b30-1dfb-4d18-b674-c35d49556836" (UID: "8fbe1b30-1dfb-4d18-b674-c35d49556836"). InnerVolumeSpecName "kube-api-access-c6s26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:51.208520 master-0 kubenswrapper[9136]: I1203 22:05:51.208372 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fbe1b30-1dfb-4d18-b674-c35d49556836" (UID: "8fbe1b30-1dfb-4d18-b674-c35d49556836"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:51.251236 master-0 kubenswrapper[9136]: I1203 22:05:51.251115 9136 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:51.251236 master-0 kubenswrapper[9136]: I1203 22:05:51.251191 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6s26\" (UniqueName: \"kubernetes.io/projected/8fbe1b30-1dfb-4d18-b674-c35d49556836-kube-api-access-c6s26\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:51.251236 master-0 kubenswrapper[9136]: I1203 22:05:51.251216 9136 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fbe1b30-1dfb-4d18-b674-c35d49556836-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:51.598304 master-0 kubenswrapper[9136]: I1203 22:05:51.598179 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8jwv5"] Dec 03 22:05:51.598791 master-0 kubenswrapper[9136]: I1203 22:05:51.598728 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8jwv5" podUID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerName="registry-server" containerID="cri-o://784049508bfb28ea94397a63578a07a579626affbf56ae6c453d2f158696255d" gracePeriod=2 Dec 03 22:05:51.628810 master-0 kubenswrapper[9136]: I1203 22:05:51.623961 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4ctxd"] Dec 03 22:05:51.631501 master-0 kubenswrapper[9136]: I1203 22:05:51.630591 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4ctxd"] Dec 03 22:05:51.763431 master-0 kubenswrapper[9136]: I1203 22:05:51.763365 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:51.763431 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:51.763431 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:51.763431 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:51.763733 master-0 kubenswrapper[9136]: I1203 22:05:51.763439 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:51.909023 master-0 kubenswrapper[9136]: I1203 22:05:51.908951 9136 scope.go:117] "RemoveContainer" containerID="8ef60a565e77b47e29c85d100f4aaafe8ce0754e92c6f0f4b921e8ac07f1fea6" Dec 03 22:05:51.921979 master-0 kubenswrapper[9136]: I1203 22:05:51.921915 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b69a98-5c9a-4136-9727-8d3af63a1158" path="/var/lib/kubelet/pods/f6b69a98-5c9a-4136-9727-8d3af63a1158/volumes" Dec 03 22:05:51.939967 master-0 kubenswrapper[9136]: I1203 22:05:51.938665 9136 generic.go:334] "Generic (PLEG): container finished" podID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerID="784049508bfb28ea94397a63578a07a579626affbf56ae6c453d2f158696255d" exitCode=0 Dec 03 22:05:51.939967 master-0 kubenswrapper[9136]: I1203 22:05:51.938764 9136 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8jwv5" event={"ID":"d1c9edca-31b4-49b0-b3e9-773c1019b1a0","Type":"ContainerDied","Data":"784049508bfb28ea94397a63578a07a579626affbf56ae6c453d2f158696255d"} Dec 03 22:05:51.942744 master-0 kubenswrapper[9136]: I1203 22:05:51.942674 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vqgbh" event={"ID":"8fbe1b30-1dfb-4d18-b674-c35d49556836","Type":"ContainerDied","Data":"563be1da0f876d741321e2d104fce9671565598ae572f073eff32977e474e6c2"} Dec 03 22:05:51.942744 master-0 kubenswrapper[9136]: I1203 22:05:51.942753 9136 scope.go:117] "RemoveContainer" containerID="b19f83a1bc17540553f1d1a97ed4721ece481240fa4e790bc33ea24c422de1ac" Dec 03 22:05:51.943018 master-0 kubenswrapper[9136]: I1203 22:05:51.942872 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vqgbh" Dec 03 22:05:51.966123 master-0 kubenswrapper[9136]: I1203 22:05:51.966060 9136 scope.go:117] "RemoveContainer" containerID="ed87b579d1be8988182cd9e9665f3aa73d3dac88c71228bf0637f8c7fd2cf289" Dec 03 22:05:51.973461 master-0 kubenswrapper[9136]: I1203 22:05:51.973429 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vqgbh"] Dec 03 22:05:51.978876 master-0 kubenswrapper[9136]: I1203 22:05:51.978841 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vqgbh"] Dec 03 22:05:52.039373 master-0 kubenswrapper[9136]: I1203 22:05:52.039342 9136 scope.go:117] "RemoveContainer" containerID="4ae290599a5a4911d24c4c92e0c86018ad9b66d87070653c69e5a59a3d60884f" Dec 03 22:05:52.070798 master-0 kubenswrapper[9136]: I1203 22:05:52.060368 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:52.165294 master-0 kubenswrapper[9136]: I1203 22:05:52.165226 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7skv\" (UniqueName: \"kubernetes.io/projected/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-kube-api-access-d7skv\") pod \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " Dec 03 22:05:52.165399 master-0 kubenswrapper[9136]: I1203 22:05:52.165316 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-utilities\") pod \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " Dec 03 22:05:52.165399 master-0 kubenswrapper[9136]: I1203 22:05:52.165383 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-catalog-content\") pod \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\" (UID: \"d1c9edca-31b4-49b0-b3e9-773c1019b1a0\") " Dec 03 22:05:52.166457 master-0 kubenswrapper[9136]: I1203 22:05:52.166411 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-utilities" (OuterVolumeSpecName: "utilities") pod "d1c9edca-31b4-49b0-b3e9-773c1019b1a0" (UID: "d1c9edca-31b4-49b0-b3e9-773c1019b1a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:52.168615 master-0 kubenswrapper[9136]: I1203 22:05:52.168559 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-kube-api-access-d7skv" (OuterVolumeSpecName: "kube-api-access-d7skv") pod "d1c9edca-31b4-49b0-b3e9-773c1019b1a0" (UID: "d1c9edca-31b4-49b0-b3e9-773c1019b1a0"). InnerVolumeSpecName "kube-api-access-d7skv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:52.187855 master-0 kubenswrapper[9136]: I1203 22:05:52.187799 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1c9edca-31b4-49b0-b3e9-773c1019b1a0" (UID: "d1c9edca-31b4-49b0-b3e9-773c1019b1a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:52.267421 master-0 kubenswrapper[9136]: I1203 22:05:52.267355 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7skv\" (UniqueName: \"kubernetes.io/projected/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-kube-api-access-d7skv\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:52.267421 master-0 kubenswrapper[9136]: I1203 22:05:52.267413 9136 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:52.267421 master-0 kubenswrapper[9136]: I1203 22:05:52.267431 9136 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1c9edca-31b4-49b0-b3e9-773c1019b1a0-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:52.762828 master-0 kubenswrapper[9136]: I1203 22:05:52.762755 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:52.762828 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:52.762828 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:52.762828 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:52.763185 master-0 kubenswrapper[9136]: I1203 22:05:52.762851 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:52.959951 master-0 kubenswrapper[9136]: I1203 22:05:52.959855 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-6mvwr_c1ee4db7-f2d3-4064-a189-f66fd0a021eb/kube-scheduler-operator-container/3.log" Dec 03 22:05:52.960205 master-0 kubenswrapper[9136]: I1203 22:05:52.960020 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" event={"ID":"c1ee4db7-f2d3-4064-a189-f66fd0a021eb","Type":"ContainerStarted","Data":"b7aabfa670126d55be14ab2dcf46406bb16856dfd43426ec5a9877997370549e"} Dec 03 22:05:52.968556 master-0 kubenswrapper[9136]: I1203 22:05:52.968493 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-8jwv5" event={"ID":"d1c9edca-31b4-49b0-b3e9-773c1019b1a0","Type":"ContainerDied","Data":"318b1e9b339ee6ccb85534471e88c58789c49450d6bdc3ed2ab982253a8715f7"} Dec 03 22:05:52.968660 master-0 kubenswrapper[9136]: I1203 22:05:52.968557 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8jwv5" Dec 03 22:05:52.968727 master-0 kubenswrapper[9136]: I1203 22:05:52.968599 9136 scope.go:117] "RemoveContainer" containerID="784049508bfb28ea94397a63578a07a579626affbf56ae6c453d2f158696255d" Dec 03 22:05:52.990092 master-0 kubenswrapper[9136]: I1203 22:05:52.990026 9136 scope.go:117] "RemoveContainer" containerID="0db144fefb9fca2874603f4484f7cc8f50877e75fefbfd1bf1e6f4adea250e9a" Dec 03 22:05:53.029530 master-0 kubenswrapper[9136]: I1203 22:05:53.029489 9136 scope.go:117] "RemoveContainer" containerID="95060fd16626e19bd698399d02360e7153f4bb501e356a15fa9e52201d71349a" Dec 03 22:05:53.057555 master-0 kubenswrapper[9136]: I1203 22:05:53.057509 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8jwv5"] Dec 03 22:05:53.127053 master-0 kubenswrapper[9136]: I1203 22:05:53.126930 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8jwv5"] Dec 03 22:05:53.763105 master-0 kubenswrapper[9136]: I1203 22:05:53.763046 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:53.763105 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:53.763105 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:53.763105 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:53.764205 master-0 kubenswrapper[9136]: I1203 22:05:53.764154 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:53.924260 master-0 kubenswrapper[9136]: I1203 22:05:53.924177 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fbe1b30-1dfb-4d18-b674-c35d49556836" path="/var/lib/kubelet/pods/8fbe1b30-1dfb-4d18-b674-c35d49556836/volumes" Dec 03 22:05:53.927614 master-0 kubenswrapper[9136]: I1203 22:05:53.927559 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" path="/var/lib/kubelet/pods/d1c9edca-31b4-49b0-b3e9-773c1019b1a0/volumes" Dec 03 22:05:54.764552 master-0 kubenswrapper[9136]: I1203 22:05:54.764364 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:54.764552 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:54.764552 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:54.764552 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:54.764552 master-0 kubenswrapper[9136]: I1203 22:05:54.764467 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" 
podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:55.763514 master-0 kubenswrapper[9136]: I1203 22:05:55.763380 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:55.763514 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:55.763514 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:55.763514 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:55.764078 master-0 kubenswrapper[9136]: I1203 22:05:55.763551 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:55.909293 master-0 kubenswrapper[9136]: I1203 22:05:55.909176 9136 scope.go:117] "RemoveContainer" containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" Dec 03 22:05:55.910202 master-0 kubenswrapper[9136]: E1203 22:05:55.909614 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:05:55.924227 master-0 kubenswrapper[9136]: E1203 22:05:55.924156 9136 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/349c4959cc7ef2e778bda8ac811d0aed0a0cde34a8e4c5331e4546cf73ab1f7f/diff" to get inode usage: stat /var/lib/containers/storage/overlay/349c4959cc7ef2e778bda8ac811d0aed0a0cde34a8e4c5331e4546cf73ab1f7f/diff: no such file or directory, extraDiskErr: Dec 03 22:05:56.074082 master-0 kubenswrapper[9136]: I1203 22:05:56.073899 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:56.139796 master-0 kubenswrapper[9136]: I1203 22:05:56.139694 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:56.763368 master-0 kubenswrapper[9136]: I1203 22:05:56.763275 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:56.763368 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:56.763368 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:56.763368 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:56.763893 master-0 kubenswrapper[9136]: I1203 22:05:56.763389 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:56.908957 master-0 kubenswrapper[9136]: I1203 22:05:56.908871 9136 
scope.go:117] "RemoveContainer" containerID="3ef5d22510c8de2958749a344d7b815a72b238bc55095fec3564929b0d79773a" Dec 03 22:05:56.961171 master-0 kubenswrapper[9136]: I1203 22:05:56.961119 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 22:05:57.763121 master-0 kubenswrapper[9136]: I1203 22:05:57.762973 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:57.763121 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:57.763121 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:57.763121 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:57.763121 master-0 kubenswrapper[9136]: I1203 22:05:57.763082 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:58.018094 master-0 kubenswrapper[9136]: I1203 22:05:58.017873 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/3.log" Dec 03 22:05:58.018094 master-0 kubenswrapper[9136]: I1203 22:05:58.017971 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" event={"ID":"6976b503-87da-48fc-b097-d1b315fbee3f","Type":"ContainerStarted","Data":"f38a73a21bf43c943f6c760a1eee5629bdb9e8ace05ee8a36f35754b5eed0004"} Dec 03 22:05:58.763113 master-0 kubenswrapper[9136]: I1203 22:05:58.763027 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:58.763113 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:58.763113 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:58.763113 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:58.763113 master-0 kubenswrapper[9136]: I1203 22:05:58.763107 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:59.116252 master-0 kubenswrapper[9136]: I1203 22:05:59.116153 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9kzsq"] Dec 03 22:05:59.117339 master-0 kubenswrapper[9136]: I1203 22:05:59.116642 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9kzsq" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerName="registry-server" containerID="cri-o://5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688" gracePeriod=2 Dec 03 22:05:59.582957 master-0 kubenswrapper[9136]: I1203 22:05:59.582890 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:05:59.680130 master-0 kubenswrapper[9136]: I1203 22:05:59.680020 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-utilities\") pod \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " Dec 03 22:05:59.680498 master-0 kubenswrapper[9136]: I1203 22:05:59.680164 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh9zw\" (UniqueName: \"kubernetes.io/projected/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-kube-api-access-fh9zw\") pod \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " Dec 03 22:05:59.681356 master-0 kubenswrapper[9136]: I1203 22:05:59.681273 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-catalog-content\") pod \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\" (UID: \"ae1f77e6-98d7-42f7-9a3c-4b610058e28a\") " Dec 03 22:05:59.681946 master-0 kubenswrapper[9136]: I1203 22:05:59.681866 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-utilities" (OuterVolumeSpecName: "utilities") pod "ae1f77e6-98d7-42f7-9a3c-4b610058e28a" (UID: "ae1f77e6-98d7-42f7-9a3c-4b610058e28a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:59.683178 master-0 kubenswrapper[9136]: I1203 22:05:59.683136 9136 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:59.683998 master-0 kubenswrapper[9136]: I1203 22:05:59.683921 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-kube-api-access-fh9zw" (OuterVolumeSpecName: "kube-api-access-fh9zw") pod "ae1f77e6-98d7-42f7-9a3c-4b610058e28a" (UID: "ae1f77e6-98d7-42f7-9a3c-4b610058e28a"). InnerVolumeSpecName "kube-api-access-fh9zw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:05:59.765472 master-0 kubenswrapper[9136]: I1203 22:05:59.765383 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:05:59.765472 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:05:59.765472 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:05:59.765472 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:05:59.765472 master-0 kubenswrapper[9136]: I1203 22:05:59.765468 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:05:59.784610 master-0 kubenswrapper[9136]: I1203 22:05:59.784488 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh9zw\" (UniqueName: \"kubernetes.io/projected/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-kube-api-access-fh9zw\") on node \"master-0\" DevicePath \"\"" Dec 03 22:05:59.835079 master-0 kubenswrapper[9136]: I1203 22:05:59.835011 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae1f77e6-98d7-42f7-9a3c-4b610058e28a" (UID: "ae1f77e6-98d7-42f7-9a3c-4b610058e28a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:05:59.885784 master-0 kubenswrapper[9136]: I1203 22:05:59.885703 9136 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1f77e6-98d7-42f7-9a3c-4b610058e28a-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:06:00.043705 master-0 kubenswrapper[9136]: I1203 22:06:00.043514 9136 generic.go:334] "Generic (PLEG): container finished" podID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerID="5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688" exitCode=0 Dec 03 22:06:00.043705 master-0 kubenswrapper[9136]: I1203 22:06:00.043597 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9kzsq" event={"ID":"ae1f77e6-98d7-42f7-9a3c-4b610058e28a","Type":"ContainerDied","Data":"5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688"} Dec 03 22:06:00.043705 master-0 kubenswrapper[9136]: I1203 22:06:00.043612 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9kzsq" Dec 03 22:06:00.043705 master-0 kubenswrapper[9136]: I1203 22:06:00.043651 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9kzsq" event={"ID":"ae1f77e6-98d7-42f7-9a3c-4b610058e28a","Type":"ContainerDied","Data":"2d6f9d30ec65e16ccf339fda790075965a87731a037ec39fb70f4378b589de18"} Dec 03 22:06:00.043705 master-0 kubenswrapper[9136]: I1203 22:06:00.043678 9136 scope.go:117] "RemoveContainer" containerID="5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688" Dec 03 22:06:00.072260 master-0 kubenswrapper[9136]: I1203 22:06:00.072054 9136 scope.go:117] "RemoveContainer" containerID="77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b" Dec 03 22:06:00.085971 master-0 kubenswrapper[9136]: I1203 22:06:00.085710 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9kzsq"] Dec 03 22:06:00.097004 master-0 kubenswrapper[9136]: I1203 22:06:00.096928 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9kzsq"] Dec 03 22:06:00.105642 master-0 kubenswrapper[9136]: I1203 22:06:00.105578 9136 scope.go:117] "RemoveContainer" containerID="13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a" Dec 03 22:06:00.143046 master-0 kubenswrapper[9136]: I1203 22:06:00.142836 9136 scope.go:117] "RemoveContainer" containerID="5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688" Dec 03 22:06:00.143862 master-0 kubenswrapper[9136]: E1203 22:06:00.143759 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688\": container with ID starting with 5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688 not found: ID does not exist" containerID="5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688" Dec 03 22:06:00.143941 master-0 kubenswrapper[9136]: I1203 22:06:00.143871 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688"} err="failed to get container status \"5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688\": rpc error: code = NotFound desc = could not find container \"5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688\": container with ID starting with 5cb6df175950cd2ed6aedc491c3b4eb5dc0d119547c189b59053245118b0f688 not found: ID does not exist" Dec 03 22:06:00.143941 master-0 kubenswrapper[9136]: I1203 22:06:00.143908 9136 scope.go:117] "RemoveContainer" containerID="77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b" Dec 03 22:06:00.144742 master-0 kubenswrapper[9136]: E1203 22:06:00.144702 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b\": container with ID starting with 77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b not found: ID does not exist" containerID="77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b" Dec 03 22:06:00.144742 master-0 kubenswrapper[9136]: I1203 22:06:00.144733 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b"} err="failed to get 
container status \"77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b\": rpc error: code = NotFound desc = could not find container \"77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b\": container with ID starting with 77f8e3fed8fb66863fda3660af1c84c2a8b79ba9f686429a84fbccef07410c6b not found: ID does not exist" Dec 03 22:06:00.144880 master-0 kubenswrapper[9136]: I1203 22:06:00.144750 9136 scope.go:117] "RemoveContainer" containerID="13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a" Dec 03 22:06:00.145349 master-0 kubenswrapper[9136]: E1203 22:06:00.145293 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a\": container with ID starting with 13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a not found: ID does not exist" containerID="13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a" Dec 03 22:06:00.145425 master-0 kubenswrapper[9136]: I1203 22:06:00.145352 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a"} err="failed to get container status \"13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a\": rpc error: code = NotFound desc = could not find container \"13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a\": container with ID starting with 13e4bb5ae1095efae0f6074d6da18ddc010ddabf844a7161898373edd37bd96a not found: ID does not exist" Dec 03 22:06:00.763759 master-0 kubenswrapper[9136]: I1203 22:06:00.763669 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:00.763759 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:00.763759 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:00.763759 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:00.763759 master-0 kubenswrapper[9136]: I1203 22:06:00.763806 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:01.763949 master-0 kubenswrapper[9136]: I1203 22:06:01.763842 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:01.763949 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:01.763949 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:01.763949 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:01.765070 master-0 kubenswrapper[9136]: I1203 22:06:01.763983 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:01.907817 master-0 kubenswrapper[9136]: I1203 22:06:01.907683 9136 scope.go:117] "RemoveContainer" 
containerID="2dd13e4ab397a03d92f03a68ff67d5ce97bced0fc31e3d5a401542285c8badd1" Dec 03 22:06:01.920288 master-0 kubenswrapper[9136]: I1203 22:06:01.920209 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" path="/var/lib/kubelet/pods/ae1f77e6-98d7-42f7-9a3c-4b610058e28a/volumes" Dec 03 22:06:02.763573 master-0 kubenswrapper[9136]: I1203 22:06:02.763460 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:02.763573 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:02.763573 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:02.763573 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:02.764840 master-0 kubenswrapper[9136]: I1203 22:06:02.763579 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:03.077555 master-0 kubenswrapper[9136]: I1203 22:06:03.077354 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-vcd7b_50076985-bbaa-4bcf-9d1a-cc25bed016a7/kube-storage-version-migrator-operator/3.log" Dec 03 22:06:03.077555 master-0 kubenswrapper[9136]: I1203 22:06:03.077450 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" event={"ID":"50076985-bbaa-4bcf-9d1a-cc25bed016a7","Type":"ContainerStarted","Data":"8fd91e896a3a6520a16665f3495009c09635d88d3818df9ce9877a23eed7b7e0"} Dec 03 22:06:03.764153 master-0 kubenswrapper[9136]: I1203 22:06:03.764019 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:03.764153 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:03.764153 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:03.764153 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:03.765215 master-0 kubenswrapper[9136]: I1203 22:06:03.764163 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:04.764032 master-0 kubenswrapper[9136]: I1203 22:06:04.763855 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:04.764032 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:04.764032 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:04.764032 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:04.764032 master-0 kubenswrapper[9136]: I1203 22:06:04.763988 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:05.763391 master-0 kubenswrapper[9136]: I1203 22:06:05.763276 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:05.763391 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:05.763391 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:05.763391 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:05.763391 master-0 kubenswrapper[9136]: I1203 22:06:05.763376 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:06.763121 master-0 kubenswrapper[9136]: I1203 22:06:06.762972 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:06.763121 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:06.763121 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:06.763121 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:06.763121 master-0 kubenswrapper[9136]: I1203 22:06:06.763098 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:06.908799 master-0 kubenswrapper[9136]: I1203 22:06:06.908617 9136 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:06:06.908799 master-0 kubenswrapper[9136]: I1203 22:06:06.908685 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:06:06.908799 master-0 kubenswrapper[9136]: I1203 22:06:06.908705 9136 scope.go:117] "RemoveContainer" containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" Dec 03 22:06:06.909283 master-0 kubenswrapper[9136]: E1203 22:06:06.909224 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-config-operator\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=openshift-config-operator pod=openshift-config-operator-68c95b6cf5-2cs5d_openshift-config-operator(e50b85a6-7767-4fca-8133-8243bdd85e5d)\"" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" podUID="e50b85a6-7767-4fca-8133-8243bdd85e5d" Dec 03 22:06:06.931915 master-0 kubenswrapper[9136]: I1203 22:06:06.931539 9136 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Dec 03 22:06:06.934347 master-0 kubenswrapper[9136]: I1203 22:06:06.934250 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 03 22:06:06.941705 master-0 kubenswrapper[9136]: I1203 22:06:06.941628 9136 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-etcd/etcd-master-0"] Dec 03 22:06:06.966837 master-0 kubenswrapper[9136]: I1203 22:06:06.960147 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Dec 03 22:06:07.111616 master-0 kubenswrapper[9136]: I1203 22:06:07.111419 9136 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:06:07.111616 master-0 kubenswrapper[9136]: I1203 22:06:07.111475 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="23f3ae86-6968-4678-9abe-e15a3988201b" Dec 03 22:06:07.764192 master-0 kubenswrapper[9136]: I1203 22:06:07.764095 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:07.764192 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:07.764192 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:07.764192 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:07.765069 master-0 kubenswrapper[9136]: I1203 22:06:07.764211 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:08.764112 master-0 kubenswrapper[9136]: I1203 22:06:08.764023 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:08.764112 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:08.764112 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:08.764112 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:08.765193 master-0 kubenswrapper[9136]: I1203 22:06:08.764121 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:09.764105 master-0 kubenswrapper[9136]: I1203 22:06:09.763994 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:09.764105 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:09.764105 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:09.764105 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:09.764105 master-0 kubenswrapper[9136]: I1203 22:06:09.764097 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:10.763738 master-0 kubenswrapper[9136]: I1203 22:06:10.763640 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:10.763738 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:10.763738 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:10.763738 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:10.764413 master-0 kubenswrapper[9136]: I1203 22:06:10.763755 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:11.763191 master-0 kubenswrapper[9136]: I1203 22:06:11.763149 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:11.763191 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:11.763191 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:11.763191 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:11.763882 master-0 kubenswrapper[9136]: I1203 22:06:11.763854 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:12.763844 master-0 kubenswrapper[9136]: I1203 22:06:12.763720 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:12.763844 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:12.763844 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:12.763844 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:12.764838 master-0 kubenswrapper[9136]: I1203 22:06:12.763883 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:13.763254 master-0 kubenswrapper[9136]: I1203 22:06:13.763142 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:13.763254 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:13.763254 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:13.763254 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:13.763701 master-0 kubenswrapper[9136]: I1203 22:06:13.763260 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:13.970933 master-0 kubenswrapper[9136]: I1203 22:06:13.969798 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=7.969748719 podStartE2EDuration="7.969748719s" 
podCreationTimestamp="2025-12-03 22:06:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:06:13.966368102 +0000 UTC m=+980.241544514" watchObservedRunningTime="2025-12-03 22:06:13.969748719 +0000 UTC m=+980.244925141" Dec 03 22:06:14.764176 master-0 kubenswrapper[9136]: I1203 22:06:14.764001 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:14.764176 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:14.764176 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:14.764176 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:14.764847 master-0 kubenswrapper[9136]: I1203 22:06:14.764756 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:15.763709 master-0 kubenswrapper[9136]: I1203 22:06:15.763589 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:15.763709 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:15.763709 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:15.763709 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:15.764736 master-0 kubenswrapper[9136]: I1203 22:06:15.764679 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:16.764511 master-0 kubenswrapper[9136]: I1203 22:06:16.764400 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:16.764511 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:16.764511 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:16.764511 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:06:16.764511 master-0 kubenswrapper[9136]: I1203 22:06:16.764479 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:17.763553 master-0 kubenswrapper[9136]: I1203 22:06:17.763451 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:06:17.763553 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:06:17.763553 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:06:17.763553 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
22:06:17.763553 master-0 kubenswrapper[9136]: I1203 22:06:17.763553 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:06:17.764049 master-0 kubenswrapper[9136]: I1203 22:06:17.763613 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:06:17.764373 master-0 kubenswrapper[9136]: I1203 22:06:17.764327 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"b2f6734c8b53bbe2c29b93a23c7bd8fd22b8ba672473f7aeb1ceaade6ae54ba1"} pod="openshift-ingress/router-default-54f97f57-xq6ch" containerMessage="Container router failed startup probe, will be restarted" Dec 03 22:06:17.764453 master-0 kubenswrapper[9136]: I1203 22:06:17.764383 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" containerID="cri-o://b2f6734c8b53bbe2c29b93a23c7bd8fd22b8ba672473f7aeb1ceaade6ae54ba1" gracePeriod=3600 Dec 03 22:06:19.908198 master-0 kubenswrapper[9136]: I1203 22:06:19.908122 9136 scope.go:117] "RemoveContainer" containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" Dec 03 22:06:21.242098 master-0 kubenswrapper[9136]: I1203 22:06:21.240295 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/4.log" Dec 03 22:06:21.242098 master-0 kubenswrapper[9136]: I1203 22:06:21.240795 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" event={"ID":"e50b85a6-7767-4fca-8133-8243bdd85e5d","Type":"ContainerStarted","Data":"127c0d8a11a6a2f3bd6ce50d8f40bd2b33ab341450c467d68ce5454bc89363cf"} Dec 03 22:06:21.242098 master-0 kubenswrapper[9136]: I1203 22:06:21.241350 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:06:24.490109 master-0 kubenswrapper[9136]: I1203 22:06:24.490026 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:07:04.619827 master-0 kubenswrapper[9136]: I1203 22:07:04.619754 9136 generic.go:334] "Generic (PLEG): container finished" podID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerID="b2f6734c8b53bbe2c29b93a23c7bd8fd22b8ba672473f7aeb1ceaade6ae54ba1" exitCode=0 Dec 03 22:07:04.620593 master-0 kubenswrapper[9136]: I1203 22:07:04.619833 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerDied","Data":"b2f6734c8b53bbe2c29b93a23c7bd8fd22b8ba672473f7aeb1ceaade6ae54ba1"} Dec 03 22:07:04.620593 master-0 kubenswrapper[9136]: I1203 22:07:04.619871 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerStarted","Data":"cfa4dcf215847e24aecbc2956ffb90a60bdcdbbe3d57e1f15a1b92587d1579de"} Dec 03 22:07:04.620593 master-0 
kubenswrapper[9136]: I1203 22:07:04.619892 9136 scope.go:117] "RemoveContainer" containerID="78424a8a743f1d7a594eeae8a1082a555251b4e9cc2031fa13e2e6be36e4505e" Dec 03 22:07:04.760636 master-0 kubenswrapper[9136]: I1203 22:07:04.760410 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:07:04.760636 master-0 kubenswrapper[9136]: I1203 22:07:04.760472 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:07:04.764721 master-0 kubenswrapper[9136]: I1203 22:07:04.764628 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:04.764721 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:04.764721 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:04.764721 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:04.765471 master-0 kubenswrapper[9136]: I1203 22:07:04.764717 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:05.763262 master-0 kubenswrapper[9136]: I1203 22:07:05.763187 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:05.763262 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:05.763262 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:05.763262 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:05.764334 master-0 kubenswrapper[9136]: I1203 22:07:05.763269 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:06.763379 master-0 kubenswrapper[9136]: I1203 22:07:06.763290 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:06.763379 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:06.763379 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:06.763379 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:06.764496 master-0 kubenswrapper[9136]: I1203 22:07:06.764065 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:07.763874 master-0 kubenswrapper[9136]: I1203 22:07:07.763733 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:07:07.763874 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:07.763874 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:07.763874 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:07.763874 master-0 kubenswrapper[9136]: I1203 22:07:07.763863 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:08.763634 master-0 kubenswrapper[9136]: I1203 22:07:08.763545 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:08.763634 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:08.763634 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:08.763634 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:08.764657 master-0 kubenswrapper[9136]: I1203 22:07:08.763708 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:09.764428 master-0 kubenswrapper[9136]: I1203 22:07:09.764344 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:09.764428 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:09.764428 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:09.764428 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:09.764428 master-0 kubenswrapper[9136]: I1203 22:07:09.764428 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:10.764029 master-0 kubenswrapper[9136]: I1203 22:07:10.763904 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:10.764029 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:10.764029 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:10.764029 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:10.765249 master-0 kubenswrapper[9136]: I1203 22:07:10.764022 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:11.764340 master-0 kubenswrapper[9136]: I1203 22:07:11.764247 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:07:11.764340 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:11.764340 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:11.764340 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:11.764340 master-0 kubenswrapper[9136]: I1203 22:07:11.764339 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:12.765571 master-0 kubenswrapper[9136]: I1203 22:07:12.765450 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:12.765571 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:12.765571 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:12.765571 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:12.766739 master-0 kubenswrapper[9136]: I1203 22:07:12.765578 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:13.762835 master-0 kubenswrapper[9136]: I1203 22:07:13.762735 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:13.762835 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:13.762835 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:13.762835 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:13.763191 master-0 kubenswrapper[9136]: I1203 22:07:13.762860 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:14.763385 master-0 kubenswrapper[9136]: I1203 22:07:14.763218 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:14.763385 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:14.763385 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:14.763385 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:14.763385 master-0 kubenswrapper[9136]: I1203 22:07:14.763295 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:15.763647 master-0 kubenswrapper[9136]: I1203 22:07:15.763561 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:07:15.763647 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:15.763647 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:15.763647 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:15.765067 master-0 kubenswrapper[9136]: I1203 22:07:15.763661 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:16.763508 master-0 kubenswrapper[9136]: I1203 22:07:16.763419 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:16.763508 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:16.763508 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:16.763508 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:16.764568 master-0 kubenswrapper[9136]: I1203 22:07:16.763535 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:17.763203 master-0 kubenswrapper[9136]: I1203 22:07:17.763133 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:17.763203 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:17.763203 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:17.763203 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:17.763573 master-0 kubenswrapper[9136]: I1203 22:07:17.763219 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:18.763142 master-0 kubenswrapper[9136]: I1203 22:07:18.763059 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:18.763142 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:18.763142 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:18.763142 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:18.764112 master-0 kubenswrapper[9136]: I1203 22:07:18.763184 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:19.764074 master-0 kubenswrapper[9136]: I1203 22:07:19.763959 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:07:19.764074 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:19.764074 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:19.764074 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:19.765558 master-0 kubenswrapper[9136]: I1203 22:07:19.764111 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:20.763568 master-0 kubenswrapper[9136]: I1203 22:07:20.763406 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:20.763568 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:20.763568 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:20.763568 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:20.764117 master-0 kubenswrapper[9136]: I1203 22:07:20.763648 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:21.763546 master-0 kubenswrapper[9136]: I1203 22:07:21.763442 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:21.763546 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:21.763546 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:21.763546 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:21.765000 master-0 kubenswrapper[9136]: I1203 22:07:21.763552 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:22.763939 master-0 kubenswrapper[9136]: I1203 22:07:22.763807 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:22.763939 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:22.763939 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:22.763939 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:22.763939 master-0 kubenswrapper[9136]: I1203 22:07:22.763925 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:23.763401 master-0 kubenswrapper[9136]: I1203 22:07:23.763288 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:07:23.763401 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:23.763401 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:23.763401 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:23.763833 master-0 kubenswrapper[9136]: I1203 22:07:23.763434 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:24.764617 master-0 kubenswrapper[9136]: I1203 22:07:24.764390 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:24.764617 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:24.764617 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:24.764617 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:24.764617 master-0 kubenswrapper[9136]: I1203 22:07:24.764480 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:25.763939 master-0 kubenswrapper[9136]: I1203 22:07:25.763837 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:25.763939 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:25.763939 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:25.763939 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:25.764479 master-0 kubenswrapper[9136]: I1203 22:07:25.763943 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:26.764013 master-0 kubenswrapper[9136]: I1203 22:07:26.763925 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:26.764013 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:26.764013 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:26.764013 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:26.764013 master-0 kubenswrapper[9136]: I1203 22:07:26.763997 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:27.762714 master-0 kubenswrapper[9136]: I1203 22:07:27.762603 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:07:27.762714 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:27.762714 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:27.762714 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:27.762714 master-0 kubenswrapper[9136]: I1203 22:07:27.762707 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:28.763169 master-0 kubenswrapper[9136]: I1203 22:07:28.763087 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:28.763169 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:28.763169 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:28.763169 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:28.764091 master-0 kubenswrapper[9136]: I1203 22:07:28.763190 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:29.763917 master-0 kubenswrapper[9136]: I1203 22:07:29.763757 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:29.763917 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:29.763917 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:29.763917 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:29.763917 master-0 kubenswrapper[9136]: I1203 22:07:29.763902 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:30.764451 master-0 kubenswrapper[9136]: I1203 22:07:30.764381 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:30.764451 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:30.764451 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:30.764451 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:30.765462 master-0 kubenswrapper[9136]: I1203 22:07:30.765110 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:31.763082 master-0 kubenswrapper[9136]: I1203 22:07:31.763017 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:07:31.763082 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:31.763082 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:31.763082 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:31.763082 master-0 kubenswrapper[9136]: I1203 22:07:31.763090 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:32.763881 master-0 kubenswrapper[9136]: I1203 22:07:32.763752 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:32.763881 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:32.763881 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:32.763881 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:32.765081 master-0 kubenswrapper[9136]: I1203 22:07:32.763923 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:33.762716 master-0 kubenswrapper[9136]: I1203 22:07:33.762660 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:33.762716 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:33.762716 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:33.762716 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:33.763317 master-0 kubenswrapper[9136]: I1203 22:07:33.763285 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:34.762999 master-0 kubenswrapper[9136]: I1203 22:07:34.762924 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:34.762999 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:34.762999 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:34.762999 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:34.763970 master-0 kubenswrapper[9136]: I1203 22:07:34.763019 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:35.763048 master-0 kubenswrapper[9136]: I1203 22:07:35.762951 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:07:35.763048 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:35.763048 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:35.763048 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:35.764146 master-0 kubenswrapper[9136]: I1203 22:07:35.763058 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:36.764146 master-0 kubenswrapper[9136]: I1203 22:07:36.764064 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:36.764146 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:36.764146 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:36.764146 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:36.765180 master-0 kubenswrapper[9136]: I1203 22:07:36.764170 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:37.763250 master-0 kubenswrapper[9136]: I1203 22:07:37.763196 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:37.763250 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:37.763250 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:37.763250 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:37.763715 master-0 kubenswrapper[9136]: I1203 22:07:37.763270 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:38.763436 master-0 kubenswrapper[9136]: I1203 22:07:38.763231 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:38.763436 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:38.763436 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:38.763436 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:38.763436 master-0 kubenswrapper[9136]: I1203 22:07:38.763294 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:38.922251 master-0 kubenswrapper[9136]: I1203 22:07:38.922176 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/5.log" Dec 03 22:07:38.923226 master-0 kubenswrapper[9136]: I1203 22:07:38.923174 9136 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/4.log" Dec 03 22:07:38.923938 master-0 kubenswrapper[9136]: I1203 22:07:38.923856 9136 generic.go:334] "Generic (PLEG): container finished" podID="0869de9b-6f5b-4c31-81ad-02a9c8888193" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" exitCode=1 Dec 03 22:07:38.923938 master-0 kubenswrapper[9136]: I1203 22:07:38.923905 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerDied","Data":"e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082"} Dec 03 22:07:38.924200 master-0 kubenswrapper[9136]: I1203 22:07:38.923988 9136 scope.go:117] "RemoveContainer" containerID="fcb6c5c72064adc9aea55e70aa0d6dcc1b315956a26cfe78edc7f73a5d03d23f" Dec 03 22:07:38.925048 master-0 kubenswrapper[9136]: I1203 22:07:38.924932 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:07:38.925519 master-0 kubenswrapper[9136]: E1203 22:07:38.925426 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:07:39.764733 master-0 kubenswrapper[9136]: I1203 22:07:39.764650 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:39.764733 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:39.764733 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:39.764733 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:39.764733 master-0 kubenswrapper[9136]: I1203 22:07:39.764735 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:39.932532 master-0 kubenswrapper[9136]: I1203 22:07:39.932457 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/5.log" Dec 03 22:07:40.764275 master-0 kubenswrapper[9136]: I1203 22:07:40.764189 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:40.764275 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:40.764275 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:40.764275 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:40.764743 master-0 kubenswrapper[9136]: I1203 22:07:40.764316 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:41.763747 master-0 kubenswrapper[9136]: I1203 22:07:41.763671 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:41.763747 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:41.763747 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:41.763747 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:41.764770 master-0 kubenswrapper[9136]: I1203 22:07:41.763754 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:42.763643 master-0 kubenswrapper[9136]: I1203 22:07:42.763571 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:42.763643 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:42.763643 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:42.763643 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:42.764645 master-0 kubenswrapper[9136]: I1203 22:07:42.763673 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:43.762607 master-0 kubenswrapper[9136]: I1203 22:07:43.762510 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:43.762607 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:43.762607 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:43.762607 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:43.763211 master-0 kubenswrapper[9136]: I1203 22:07:43.762619 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:44.763470 master-0 kubenswrapper[9136]: I1203 22:07:44.763392 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:44.763470 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:44.763470 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:44.763470 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:44.763470 master-0 kubenswrapper[9136]: I1203 22:07:44.763472 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:45.763147 master-0 kubenswrapper[9136]: I1203 22:07:45.763021 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:45.763147 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:45.763147 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:45.763147 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:45.763147 master-0 kubenswrapper[9136]: I1203 22:07:45.763144 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:46.763277 master-0 kubenswrapper[9136]: I1203 22:07:46.763214 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:46.763277 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:46.763277 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:46.763277 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:46.764254 master-0 kubenswrapper[9136]: I1203 22:07:46.764149 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:47.764138 master-0 kubenswrapper[9136]: I1203 22:07:47.764057 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:47.764138 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:47.764138 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:47.764138 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:47.765174 master-0 kubenswrapper[9136]: I1203 22:07:47.764153 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:48.762754 master-0 kubenswrapper[9136]: I1203 22:07:48.762665 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:48.762754 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:48.762754 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:48.762754 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:48.763098 master-0 kubenswrapper[9136]: I1203 22:07:48.762797 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:49.763545 master-0 kubenswrapper[9136]: I1203 22:07:49.763459 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:49.763545 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:49.763545 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:49.763545 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:49.764725 master-0 kubenswrapper[9136]: I1203 22:07:49.763557 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:50.765817 master-0 kubenswrapper[9136]: I1203 22:07:50.765733 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:50.765817 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:50.765817 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:50.765817 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:50.766536 master-0 kubenswrapper[9136]: I1203 22:07:50.765830 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:51.763801 master-0 kubenswrapper[9136]: I1203 22:07:51.763692 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:51.763801 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:51.763801 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:51.763801 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:51.764313 master-0 kubenswrapper[9136]: I1203 22:07:51.763822 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:52.763893 master-0 kubenswrapper[9136]: I1203 22:07:52.763754 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:52.763893 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:52.763893 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:52.763893 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:52.763893 master-0 kubenswrapper[9136]: I1203 22:07:52.763875 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:52.907865 master-0 kubenswrapper[9136]: I1203 22:07:52.907754 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:07:52.908266 master-0 kubenswrapper[9136]: E1203 22:07:52.908220 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:07:53.764086 master-0 kubenswrapper[9136]: I1203 22:07:53.764026 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:53.764086 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:53.764086 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:53.764086 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:53.765366 master-0 kubenswrapper[9136]: I1203 22:07:53.765294 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:54.763656 master-0 kubenswrapper[9136]: I1203 22:07:54.763584 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:54.763656 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:54.763656 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:54.763656 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:54.763656 master-0 kubenswrapper[9136]: I1203 22:07:54.763658 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:55.762737 master-0 kubenswrapper[9136]: I1203 22:07:55.762647 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:55.762737 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:55.762737 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:55.762737 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:55.763527 master-0 kubenswrapper[9136]: I1203 22:07:55.762762 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:07:56.763017 master-0 kubenswrapper[9136]: I1203 22:07:56.762956 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:56.763017 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:56.763017 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:56.763017 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:56.763657 master-0 kubenswrapper[9136]: I1203 22:07:56.763036 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:57.764022 master-0 kubenswrapper[9136]: I1203 22:07:57.763938 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:57.764022 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:57.764022 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:57.764022 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:57.765330 master-0 kubenswrapper[9136]: I1203 22:07:57.764047 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:58.762840 master-0 kubenswrapper[9136]: I1203 22:07:58.762721 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:58.762840 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:58.762840 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:58.762840 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:58.763322 master-0 kubenswrapper[9136]: I1203 22:07:58.762848 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:07:59.763907 master-0 kubenswrapper[9136]: I1203 22:07:59.763809 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:07:59.763907 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:07:59.763907 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:07:59.763907 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:07:59.764877 master-0 kubenswrapper[9136]: I1203 22:07:59.763930 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:08:00.762835 master-0 kubenswrapper[9136]: I1203 22:08:00.762789 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:00.762835 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:00.762835 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:00.762835 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:00.763215 master-0 kubenswrapper[9136]: I1203 22:08:00.763187 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:01.762924 master-0 kubenswrapper[9136]: I1203 22:08:01.762850 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:01.762924 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:01.762924 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:01.762924 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:01.762924 master-0 kubenswrapper[9136]: I1203 22:08:01.762923 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:02.764056 master-0 kubenswrapper[9136]: I1203 22:08:02.763963 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:02.764056 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:02.764056 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:02.764056 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:02.764921 master-0 kubenswrapper[9136]: I1203 22:08:02.764056 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:03.762800 master-0 kubenswrapper[9136]: I1203 22:08:03.762670 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:03.762800 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:03.762800 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:03.762800 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:03.763188 master-0 kubenswrapper[9136]: I1203 22:08:03.762824 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:08:04.763284 master-0 kubenswrapper[9136]: I1203 22:08:04.763168 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:04.763284 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:04.763284 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:04.763284 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:04.764305 master-0 kubenswrapper[9136]: I1203 22:08:04.763302 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:05.763639 master-0 kubenswrapper[9136]: I1203 22:08:05.763564 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:05.763639 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:05.763639 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:05.763639 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:05.764280 master-0 kubenswrapper[9136]: I1203 22:08:05.763650 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:06.763448 master-0 kubenswrapper[9136]: I1203 22:08:06.763369 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:06.763448 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:06.763448 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:06.763448 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:06.764086 master-0 kubenswrapper[9136]: I1203 22:08:06.763479 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:06.908908 master-0 kubenswrapper[9136]: I1203 22:08:06.908826 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:08:06.909237 master-0 kubenswrapper[9136]: E1203 22:08:06.909158 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:08:07.763743 master-0 kubenswrapper[9136]: I1203 22:08:07.763672 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:07.763743 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:07.763743 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:07.763743 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:07.764444 master-0 kubenswrapper[9136]: I1203 22:08:07.763756 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:08.762787 master-0 kubenswrapper[9136]: I1203 22:08:08.762689 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:08.762787 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:08.762787 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:08.762787 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:08.762787 master-0 kubenswrapper[9136]: I1203 22:08:08.762760 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:09.762932 master-0 kubenswrapper[9136]: I1203 22:08:09.762843 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:09.762932 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:09.762932 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:09.762932 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:09.762932 master-0 kubenswrapper[9136]: I1203 22:08:09.762934 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:10.763233 master-0 kubenswrapper[9136]: I1203 22:08:10.763168 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:10.763233 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:10.763233 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:10.763233 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:10.763233 master-0 kubenswrapper[9136]: I1203 22:08:10.763245 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:11.763584 master-0 kubenswrapper[9136]: I1203 22:08:11.763508 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:11.763584 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:11.763584 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:11.763584 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:11.763584 master-0 kubenswrapper[9136]: I1203 22:08:11.763582 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:12.763448 master-0 kubenswrapper[9136]: I1203 22:08:12.763353 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:12.763448 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:12.763448 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:12.763448 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:12.763448 master-0 kubenswrapper[9136]: I1203 22:08:12.763458 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:13.763450 master-0 kubenswrapper[9136]: I1203 22:08:13.763319 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:13.763450 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:13.763450 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:13.763450 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:13.763450 master-0 kubenswrapper[9136]: I1203 22:08:13.763433 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:14.763253 master-0 kubenswrapper[9136]: I1203 22:08:14.763166 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:14.763253 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:14.763253 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:14.763253 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:14.763634 master-0 kubenswrapper[9136]: I1203 22:08:14.763271 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:15.763334 master-0 kubenswrapper[9136]: I1203 22:08:15.763257 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:15.763334 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:15.763334 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:15.763334 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:15.763977 master-0 kubenswrapper[9136]: I1203 22:08:15.763375 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:16.763526 master-0 kubenswrapper[9136]: I1203 22:08:16.763465 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:16.763526 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:16.763526 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:16.763526 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:16.764323 master-0 kubenswrapper[9136]: I1203 22:08:16.763538 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:17.762695 master-0 kubenswrapper[9136]: I1203 22:08:17.762558 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:17.762695 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:17.762695 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:17.762695 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:17.762695 master-0 kubenswrapper[9136]: I1203 22:08:17.762659 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:17.908465 master-0 kubenswrapper[9136]: I1203 22:08:17.908394 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:08:17.909538 master-0 kubenswrapper[9136]: E1203 22:08:17.908635 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:08:18.763665 master-0 kubenswrapper[9136]: I1203 22:08:18.763595 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:18.763665 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 
03 22:08:18.763665 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:18.763665 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:18.764125 master-0 kubenswrapper[9136]: I1203 22:08:18.763692 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:19.763737 master-0 kubenswrapper[9136]: I1203 22:08:19.763620 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:19.763737 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:19.763737 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:19.763737 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:19.765118 master-0 kubenswrapper[9136]: I1203 22:08:19.763747 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:20.763583 master-0 kubenswrapper[9136]: I1203 22:08:20.763523 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:20.763583 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:20.763583 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:20.763583 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:20.764476 master-0 kubenswrapper[9136]: I1203 22:08:20.764439 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:21.762961 master-0 kubenswrapper[9136]: I1203 22:08:21.762901 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:21.762961 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:21.762961 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:21.762961 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:21.763291 master-0 kubenswrapper[9136]: I1203 22:08:21.762970 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:22.763912 master-0 kubenswrapper[9136]: I1203 22:08:22.763809 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:22.763912 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 
22:08:22.763912 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:22.763912 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:22.763912 master-0 kubenswrapper[9136]: I1203 22:08:22.763914 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:23.763444 master-0 kubenswrapper[9136]: I1203 22:08:23.763369 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:23.763444 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:23.763444 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:23.763444 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:23.763444 master-0 kubenswrapper[9136]: I1203 22:08:23.763441 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:24.762415 master-0 kubenswrapper[9136]: I1203 22:08:24.762069 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:24.762415 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:24.762415 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:24.762415 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:24.762415 master-0 kubenswrapper[9136]: I1203 22:08:24.762142 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:25.763487 master-0 kubenswrapper[9136]: I1203 22:08:25.763383 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:25.763487 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:25.763487 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:25.763487 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:25.764232 master-0 kubenswrapper[9136]: I1203 22:08:25.763535 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:26.763656 master-0 kubenswrapper[9136]: I1203 22:08:26.763584 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:26.763656 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 
22:08:26.763656 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:26.763656 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:26.764482 master-0 kubenswrapper[9136]: I1203 22:08:26.763701 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:27.421548 master-0 kubenswrapper[9136]: I1203 22:08:27.421499 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-mbm46"] Dec 03 22:08:27.421879 master-0 kubenswrapper[9136]: E1203 22:08:27.421852 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerName="registry-server" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: I1203 22:08:27.421881 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerName="registry-server" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: E1203 22:08:27.421903 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerName="registry-server" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: I1203 22:08:27.421915 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerName="registry-server" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: E1203 22:08:27.421933 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerName="extract-utilities" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: I1203 22:08:27.421942 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerName="extract-utilities" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: E1203 22:08:27.421957 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerName="registry-server" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: I1203 22:08:27.421966 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerName="registry-server" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: E1203 22:08:27.421983 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerName="extract-utilities" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: I1203 22:08:27.421994 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerName="extract-utilities" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: E1203 22:08:27.422008 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerName="extract-content" Dec 03 22:08:27.422020 master-0 kubenswrapper[9136]: I1203 22:08:27.422016 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerName="extract-content" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: E1203 22:08:27.422032 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerName="extract-utilities" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: I1203 22:08:27.422041 9136 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerName="extract-utilities" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: E1203 22:08:27.422057 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerName="extract-content" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: I1203 22:08:27.422065 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerName="extract-content" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: E1203 22:08:27.422085 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerName="extract-content" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: I1203 22:08:27.422093 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerName="extract-content" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: E1203 22:08:27.422110 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerName="registry-server" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: I1203 22:08:27.422119 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerName="registry-server" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: E1203 22:08:27.422133 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerName="extract-content" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: I1203 22:08:27.422141 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerName="extract-content" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: E1203 22:08:27.422156 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerName="extract-utilities" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: I1203 22:08:27.422164 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerName="extract-utilities" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: I1203 22:08:27.422307 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c9edca-31b4-49b0-b3e9-773c1019b1a0" containerName="registry-server" Dec 03 22:08:27.422326 master-0 kubenswrapper[9136]: I1203 22:08:27.422331 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbe1b30-1dfb-4d18-b674-c35d49556836" containerName="registry-server" Dec 03 22:08:27.422694 master-0 kubenswrapper[9136]: I1203 22:08:27.422346 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae1f77e6-98d7-42f7-9a3c-4b610058e28a" containerName="registry-server" Dec 03 22:08:27.422694 master-0 kubenswrapper[9136]: I1203 22:08:27.422369 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b69a98-5c9a-4136-9727-8d3af63a1158" containerName="registry-server" Dec 03 22:08:27.422923 master-0 kubenswrapper[9136]: I1203 22:08:27.422899 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.424647 master-0 kubenswrapper[9136]: I1203 22:08:27.424606 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-5mjhs" Dec 03 22:08:27.424893 master-0 kubenswrapper[9136]: I1203 22:08:27.424877 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Dec 03 22:08:27.446343 master-0 kubenswrapper[9136]: I1203 22:08:27.446279 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d577ada1-1db4-4af7-9982-e5ad0945ae83-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.446484 master-0 kubenswrapper[9136]: I1203 22:08:27.446384 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d577ada1-1db4-4af7-9982-e5ad0945ae83-ready\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.446484 master-0 kubenswrapper[9136]: I1203 22:08:27.446444 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d577ada1-1db4-4af7-9982-e5ad0945ae83-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.446637 master-0 kubenswrapper[9136]: I1203 22:08:27.446602 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqx9w\" (UniqueName: \"kubernetes.io/projected/d577ada1-1db4-4af7-9982-e5ad0945ae83-kube-api-access-hqx9w\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.547491 master-0 kubenswrapper[9136]: I1203 22:08:27.547427 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d577ada1-1db4-4af7-9982-e5ad0945ae83-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.547491 master-0 kubenswrapper[9136]: I1203 22:08:27.547492 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d577ada1-1db4-4af7-9982-e5ad0945ae83-ready\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.547743 master-0 kubenswrapper[9136]: I1203 22:08:27.547523 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d577ada1-1db4-4af7-9982-e5ad0945ae83-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.547743 master-0 kubenswrapper[9136]: I1203 22:08:27.547593 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hqx9w\" (UniqueName: \"kubernetes.io/projected/d577ada1-1db4-4af7-9982-e5ad0945ae83-kube-api-access-hqx9w\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.548048 master-0 kubenswrapper[9136]: I1203 22:08:27.548015 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d577ada1-1db4-4af7-9982-e5ad0945ae83-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.548482 master-0 kubenswrapper[9136]: I1203 22:08:27.548447 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d577ada1-1db4-4af7-9982-e5ad0945ae83-ready\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.548950 master-0 kubenswrapper[9136]: I1203 22:08:27.548918 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d577ada1-1db4-4af7-9982-e5ad0945ae83-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.563195 master-0 kubenswrapper[9136]: I1203 22:08:27.563147 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqx9w\" (UniqueName: \"kubernetes.io/projected/d577ada1-1db4-4af7-9982-e5ad0945ae83-kube-api-access-hqx9w\") pod \"cni-sysctl-allowlist-ds-mbm46\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.752179 master-0 kubenswrapper[9136]: I1203 22:08:27.752028 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:27.763387 master-0 kubenswrapper[9136]: I1203 22:08:27.763339 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:27.763387 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:27.763387 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:27.763387 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:27.764095 master-0 kubenswrapper[9136]: I1203 22:08:27.763411 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:27.778350 master-0 kubenswrapper[9136]: W1203 22:08:27.778293 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd577ada1_1db4_4af7_9982_e5ad0945ae83.slice/crio-b012d7a06a0f3515da44351c854cdc116d20fc976e6dc7f228c5ddfc0f9836b5 WatchSource:0}: Error finding container b012d7a06a0f3515da44351c854cdc116d20fc976e6dc7f228c5ddfc0f9836b5: Status 404 returned error can't find the container with id b012d7a06a0f3515da44351c854cdc116d20fc976e6dc7f228c5ddfc0f9836b5 Dec 03 22:08:28.360247 master-0 kubenswrapper[9136]: I1203 22:08:28.360160 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" event={"ID":"d577ada1-1db4-4af7-9982-e5ad0945ae83","Type":"ContainerStarted","Data":"b012d7a06a0f3515da44351c854cdc116d20fc976e6dc7f228c5ddfc0f9836b5"} Dec 03 22:08:28.763727 master-0 kubenswrapper[9136]: I1203 22:08:28.763628 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:28.763727 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:28.763727 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:28.763727 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:28.764807 master-0 kubenswrapper[9136]: I1203 22:08:28.763734 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:29.368543 master-0 kubenswrapper[9136]: I1203 22:08:29.368470 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" event={"ID":"d577ada1-1db4-4af7-9982-e5ad0945ae83","Type":"ContainerStarted","Data":"02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c"} Dec 03 22:08:29.368841 master-0 kubenswrapper[9136]: I1203 22:08:29.368757 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:29.387465 master-0 kubenswrapper[9136]: I1203 22:08:29.387374 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:08:29.505879 master-0 kubenswrapper[9136]: I1203 22:08:29.504609 9136 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" podStartSLOduration=2.504586015 podStartE2EDuration="2.504586015s" podCreationTimestamp="2025-12-03 22:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:08:29.501292833 +0000 UTC m=+1115.776469225" watchObservedRunningTime="2025-12-03 22:08:29.504586015 +0000 UTC m=+1115.779762407" Dec 03 22:08:29.762402 master-0 kubenswrapper[9136]: I1203 22:08:29.762267 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:29.762402 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:29.762402 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:29.762402 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:29.762402 master-0 kubenswrapper[9136]: I1203 22:08:29.762352 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:29.909126 master-0 kubenswrapper[9136]: I1203 22:08:29.908998 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:08:29.910171 master-0 kubenswrapper[9136]: E1203 22:08:29.909561 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:08:30.518925 master-0 kubenswrapper[9136]: I1203 22:08:30.518857 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-mbm46"] Dec 03 22:08:30.763135 master-0 kubenswrapper[9136]: I1203 22:08:30.763071 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:30.763135 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:30.763135 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:30.763135 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:30.763622 master-0 kubenswrapper[9136]: I1203 22:08:30.763163 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:31.762571 master-0 kubenswrapper[9136]: I1203 22:08:31.762516 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:31.762571 master-0 kubenswrapper[9136]: [-]has-synced failed: 
reason withheld Dec 03 22:08:31.762571 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:31.762571 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:31.763548 master-0 kubenswrapper[9136]: I1203 22:08:31.763508 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:32.394472 master-0 kubenswrapper[9136]: I1203 22:08:32.394338 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" podUID="d577ada1-1db4-4af7-9982-e5ad0945ae83" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" gracePeriod=30 Dec 03 22:08:32.763560 master-0 kubenswrapper[9136]: I1203 22:08:32.763423 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:32.763560 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:32.763560 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:32.763560 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:32.763560 master-0 kubenswrapper[9136]: I1203 22:08:32.763496 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:33.762890 master-0 kubenswrapper[9136]: I1203 22:08:33.762792 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:33.762890 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:33.762890 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:33.762890 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:33.763352 master-0 kubenswrapper[9136]: I1203 22:08:33.762900 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:34.479455 master-0 kubenswrapper[9136]: I1203 22:08:34.479344 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 03 22:08:34.482730 master-0 kubenswrapper[9136]: I1203 22:08:34.480628 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.484269 master-0 kubenswrapper[9136]: I1203 22:08:34.484229 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-tdkqf" Dec 03 22:08:34.485167 master-0 kubenswrapper[9136]: I1203 22:08:34.485119 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 22:08:34.509660 master-0 kubenswrapper[9136]: I1203 22:08:34.498566 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 03 22:08:34.553287 master-0 kubenswrapper[9136]: I1203 22:08:34.553172 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-var-lock\") pod \"installer-3-master-0\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.553641 master-0 kubenswrapper[9136]: I1203 22:08:34.553301 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31ee3291-2979-4903-98a2-355855cedd55-kube-api-access\") pod \"installer-3-master-0\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.553641 master-0 kubenswrapper[9136]: I1203 22:08:34.553413 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.654953 master-0 kubenswrapper[9136]: I1203 22:08:34.654870 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-var-lock\") pod \"installer-3-master-0\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.654953 master-0 kubenswrapper[9136]: I1203 22:08:34.654953 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31ee3291-2979-4903-98a2-355855cedd55-kube-api-access\") pod \"installer-3-master-0\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.655283 master-0 kubenswrapper[9136]: I1203 22:08:34.655017 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-var-lock\") pod \"installer-3-master-0\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.655283 master-0 kubenswrapper[9136]: I1203 22:08:34.655038 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " pod="openshift-kube-controller-manager/installer-3-master-0" 
Dec 03 22:08:34.655283 master-0 kubenswrapper[9136]: I1203 22:08:34.655100 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.687636 master-0 kubenswrapper[9136]: I1203 22:08:34.687558 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31ee3291-2979-4903-98a2-355855cedd55-kube-api-access\") pod \"installer-3-master-0\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.763416 master-0 kubenswrapper[9136]: I1203 22:08:34.763213 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:34.763416 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:34.763416 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:34.763416 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:34.764064 master-0 kubenswrapper[9136]: I1203 22:08:34.763355 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:34.819579 master-0 kubenswrapper[9136]: I1203 22:08:34.819347 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:08:34.889403 master-0 kubenswrapper[9136]: I1203 22:08:34.889315 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-retry-1-master-0"] Dec 03 22:08:34.892600 master-0 kubenswrapper[9136]: I1203 22:08:34.892556 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:34.897491 master-0 kubenswrapper[9136]: I1203 22:08:34.897442 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-wkdbt" Dec 03 22:08:34.897831 master-0 kubenswrapper[9136]: I1203 22:08:34.897750 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 22:08:34.904871 master-0 kubenswrapper[9136]: I1203 22:08:34.904109 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-retry-1-master-0"] Dec 03 22:08:34.961626 master-0 kubenswrapper[9136]: I1203 22:08:34.961568 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:34.961951 master-0 kubenswrapper[9136]: I1203 22:08:34.961855 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:34.961951 master-0 kubenswrapper[9136]: I1203 22:08:34.961903 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc154774-70d5-4742-8520-60e90471c530-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:35.063450 master-0 kubenswrapper[9136]: I1203 22:08:35.063328 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:35.063450 master-0 kubenswrapper[9136]: I1203 22:08:35.063374 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc154774-70d5-4742-8520-60e90471c530-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:35.063450 master-0 kubenswrapper[9136]: I1203 22:08:35.063411 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:35.063701 master-0 kubenswrapper[9136]: I1203 22:08:35.063498 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " 
pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:35.063701 master-0 kubenswrapper[9136]: I1203 22:08:35.063534 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:35.081462 master-0 kubenswrapper[9136]: I1203 22:08:35.081218 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc154774-70d5-4742-8520-60e90471c530-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:35.242635 master-0 kubenswrapper[9136]: I1203 22:08:35.242540 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:08:35.319755 master-0 kubenswrapper[9136]: I1203 22:08:35.319709 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Dec 03 22:08:35.321283 master-0 kubenswrapper[9136]: W1203 22:08:35.321225 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod31ee3291_2979_4903_98a2_355855cedd55.slice/crio-b933c1402e8992ef5246dfe34ff5daed65d9df17d247f67abc5c136f20ffdb22 WatchSource:0}: Error finding container b933c1402e8992ef5246dfe34ff5daed65d9df17d247f67abc5c136f20ffdb22: Status 404 returned error can't find the container with id b933c1402e8992ef5246dfe34ff5daed65d9df17d247f67abc5c136f20ffdb22 Dec 03 22:08:35.429932 master-0 kubenswrapper[9136]: I1203 22:08:35.429864 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"31ee3291-2979-4903-98a2-355855cedd55","Type":"ContainerStarted","Data":"b933c1402e8992ef5246dfe34ff5daed65d9df17d247f67abc5c136f20ffdb22"} Dec 03 22:08:35.675370 master-0 kubenswrapper[9136]: I1203 22:08:35.675305 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-retry-1-master-0"] Dec 03 22:08:35.684144 master-0 kubenswrapper[9136]: W1203 22:08:35.684083 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddc154774_70d5_4742_8520_60e90471c530.slice/crio-a244ac17047640e6572dd01dfbf0346d713e69fdd83c35ae5ec0098c9545c491 WatchSource:0}: Error finding container a244ac17047640e6572dd01dfbf0346d713e69fdd83c35ae5ec0098c9545c491: Status 404 returned error can't find the container with id a244ac17047640e6572dd01dfbf0346d713e69fdd83c35ae5ec0098c9545c491 Dec 03 22:08:35.762598 master-0 kubenswrapper[9136]: I1203 22:08:35.762511 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:35.762598 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:35.762598 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:35.762598 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:35.762598 master-0 kubenswrapper[9136]: I1203 22:08:35.762579 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:36.445186 master-0 kubenswrapper[9136]: I1203 22:08:36.445111 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"31ee3291-2979-4903-98a2-355855cedd55","Type":"ContainerStarted","Data":"18687aca035ea2cd91fda30a36ce8489d00076f5b5f568339084de2e1d467700"} Dec 03 22:08:36.447542 master-0 kubenswrapper[9136]: I1203 22:08:36.447474 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"dc154774-70d5-4742-8520-60e90471c530","Type":"ContainerStarted","Data":"ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b"} Dec 03 22:08:36.447542 master-0 kubenswrapper[9136]: I1203 22:08:36.447517 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"dc154774-70d5-4742-8520-60e90471c530","Type":"ContainerStarted","Data":"a244ac17047640e6572dd01dfbf0346d713e69fdd83c35ae5ec0098c9545c491"} Dec 03 22:08:36.466410 master-0 kubenswrapper[9136]: I1203 22:08:36.466330 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=2.466311016 podStartE2EDuration="2.466311016s" podCreationTimestamp="2025-12-03 22:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:08:36.461548419 +0000 UTC m=+1122.736724831" watchObservedRunningTime="2025-12-03 22:08:36.466311016 +0000 UTC m=+1122.741487398" Dec 03 22:08:36.481417 master-0 kubenswrapper[9136]: I1203 22:08:36.481310 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" podStartSLOduration=2.481287127 podStartE2EDuration="2.481287127s" podCreationTimestamp="2025-12-03 22:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:08:36.477792019 +0000 UTC m=+1122.752968431" watchObservedRunningTime="2025-12-03 22:08:36.481287127 +0000 UTC m=+1122.756463509" Dec 03 22:08:36.763024 master-0 kubenswrapper[9136]: I1203 22:08:36.762868 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:36.763024 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:36.763024 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:36.763024 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:36.763024 master-0 kubenswrapper[9136]: I1203 22:08:36.762942 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:37.755220 master-0 kubenswrapper[9136]: E1203 22:08:37.755091 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:08:37.756849 master-0 kubenswrapper[9136]: E1203 22:08:37.756747 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:08:37.759833 master-0 kubenswrapper[9136]: E1203 22:08:37.759718 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:08:37.759833 master-0 kubenswrapper[9136]: E1203 22:08:37.759807 9136 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" podUID="d577ada1-1db4-4af7-9982-e5ad0945ae83" containerName="kube-multus-additional-cni-plugins" Dec 03 22:08:37.762948 master-0 kubenswrapper[9136]: I1203 22:08:37.762881 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:37.762948 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:37.762948 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:37.762948 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:37.762948 master-0 kubenswrapper[9136]: I1203 22:08:37.762941 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:38.763465 master-0 kubenswrapper[9136]: I1203 22:08:38.763384 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:38.763465 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:38.763465 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:38.763465 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:38.764269 master-0 kubenswrapper[9136]: I1203 22:08:38.763473 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:38.874916 master-0 kubenswrapper[9136]: I1203 22:08:38.874835 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-retry-1-master-0"] Dec 03 22:08:38.875445 master-0 kubenswrapper[9136]: I1203 22:08:38.875359 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" 
podUID="dc154774-70d5-4742-8520-60e90471c530" containerName="installer" containerID="cri-o://ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b" gracePeriod=30 Dec 03 22:08:39.763580 master-0 kubenswrapper[9136]: I1203 22:08:39.763497 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:39.763580 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:39.763580 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:39.763580 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:39.763580 master-0 kubenswrapper[9136]: I1203 22:08:39.763578 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:40.764055 master-0 kubenswrapper[9136]: I1203 22:08:40.763938 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:40.764055 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:40.764055 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:40.764055 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:40.764869 master-0 kubenswrapper[9136]: I1203 22:08:40.764061 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:41.764188 master-0 kubenswrapper[9136]: I1203 22:08:41.764086 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:41.764188 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:41.764188 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:41.764188 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:41.764860 master-0 kubenswrapper[9136]: I1203 22:08:41.764210 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:42.068476 master-0 kubenswrapper[9136]: I1203 22:08:42.068277 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 03 22:08:42.069964 master-0 kubenswrapper[9136]: I1203 22:08:42.069896 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.070984 master-0 kubenswrapper[9136]: I1203 22:08:42.070903 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.071108 master-0 kubenswrapper[9136]: I1203 22:08:42.071022 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-var-lock\") pod \"installer-3-master-0\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.071189 master-0 kubenswrapper[9136]: I1203 22:08:42.071146 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.108360 master-0 kubenswrapper[9136]: I1203 22:08:42.108254 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 03 22:08:42.173374 master-0 kubenswrapper[9136]: I1203 22:08:42.173324 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-var-lock\") pod \"installer-3-master-0\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.173813 master-0 kubenswrapper[9136]: I1203 22:08:42.173729 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.174138 master-0 kubenswrapper[9136]: I1203 22:08:42.173458 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-var-lock\") pod \"installer-3-master-0\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.174338 master-0 kubenswrapper[9136]: I1203 22:08:42.174300 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.174537 master-0 kubenswrapper[9136]: I1203 22:08:42.174405 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.193837 master-0 kubenswrapper[9136]: I1203 22:08:42.193743 9136 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kube-api-access\") pod \"installer-3-master-0\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.418271 master-0 kubenswrapper[9136]: I1203 22:08:42.413246 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:08:42.764202 master-0 kubenswrapper[9136]: I1203 22:08:42.764051 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:42.764202 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:42.764202 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:42.764202 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:42.764202 master-0 kubenswrapper[9136]: I1203 22:08:42.764124 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:42.927389 master-0 kubenswrapper[9136]: I1203 22:08:42.927306 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 03 22:08:42.939903 master-0 kubenswrapper[9136]: W1203 22:08:42.939833 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb56c556d_dac0_4c12_a992_44e5f2eb32e2.slice/crio-1204437a6fc239716e874f0192dfd99393826c54111a382098102d88fcf7224b WatchSource:0}: Error finding container 1204437a6fc239716e874f0192dfd99393826c54111a382098102d88fcf7224b: Status 404 returned error can't find the container with id 1204437a6fc239716e874f0192dfd99393826c54111a382098102d88fcf7224b Dec 03 22:08:43.508877 master-0 kubenswrapper[9136]: I1203 22:08:43.508736 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"b56c556d-dac0-4c12-a992-44e5f2eb32e2","Type":"ContainerStarted","Data":"85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d"} Dec 03 22:08:43.508877 master-0 kubenswrapper[9136]: I1203 22:08:43.508846 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"b56c556d-dac0-4c12-a992-44e5f2eb32e2","Type":"ContainerStarted","Data":"1204437a6fc239716e874f0192dfd99393826c54111a382098102d88fcf7224b"} Dec 03 22:08:43.538741 master-0 kubenswrapper[9136]: I1203 22:08:43.538624 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=1.538589891 podStartE2EDuration="1.538589891s" podCreationTimestamp="2025-12-03 22:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:08:43.532679299 +0000 UTC m=+1129.807855741" watchObservedRunningTime="2025-12-03 22:08:43.538589891 +0000 UTC m=+1129.813766323" Dec 03 22:08:43.764067 master-0 kubenswrapper[9136]: I1203 22:08:43.763858 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:43.764067 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:43.764067 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:43.764067 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:43.764067 master-0 kubenswrapper[9136]: I1203 22:08:43.763964 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:43.915153 master-0 kubenswrapper[9136]: I1203 22:08:43.915064 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:08:43.915580 master-0 kubenswrapper[9136]: E1203 22:08:43.915516 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:08:44.762900 master-0 kubenswrapper[9136]: I1203 22:08:44.762830 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:44.762900 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:44.762900 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:44.762900 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:44.763258 master-0 kubenswrapper[9136]: I1203 22:08:44.762909 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:45.762589 master-0 kubenswrapper[9136]: I1203 22:08:45.762536 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:45.762589 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:45.762589 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:45.762589 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:45.763391 master-0 kubenswrapper[9136]: I1203 22:08:45.762600 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:46.764030 master-0 kubenswrapper[9136]: I1203 22:08:46.763902 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:46.764030 master-0 kubenswrapper[9136]: [-]has-synced failed: 
reason withheld Dec 03 22:08:46.764030 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:46.764030 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:46.764030 master-0 kubenswrapper[9136]: I1203 22:08:46.764008 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:47.756077 master-0 kubenswrapper[9136]: E1203 22:08:47.755945 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:08:47.758107 master-0 kubenswrapper[9136]: E1203 22:08:47.758025 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:08:47.760670 master-0 kubenswrapper[9136]: E1203 22:08:47.760520 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:08:47.760877 master-0 kubenswrapper[9136]: E1203 22:08:47.760709 9136 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" podUID="d577ada1-1db4-4af7-9982-e5ad0945ae83" containerName="kube-multus-additional-cni-plugins" Dec 03 22:08:47.763799 master-0 kubenswrapper[9136]: I1203 22:08:47.763648 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:47.763799 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:47.763799 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:47.763799 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:47.765553 master-0 kubenswrapper[9136]: I1203 22:08:47.763847 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:48.763317 master-0 kubenswrapper[9136]: I1203 22:08:48.763189 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:48.763317 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:48.763317 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:48.763317 master-0 kubenswrapper[9136]: healthz check failed 
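(Editor's illustrative sketch, not part of the captured log and not kubelet source.) The repeated "Startup probe status=failure ... HTTP probe failed with statuscode: 500" entries above, and the later "Container router failed startup probe, will be restarted" / "Killing container with a grace period ... gracePeriod=3600" entries, follow the standard startup-probe pattern: the probe is retried every period, and once consecutive failures exceed the failure threshold the container is killed (honoring the pod's grace period) and restarted. The Python sketch below mimics that loop under assumed settings; the period, failure threshold, and probe URL for router-default are assumptions, since the Deployment spec is not shown in this log. Only gracePeriod=3600 is taken from the log itself.

# Illustrative sketch only -- NOT kubelet source code. It mimics the
# startup-probe behaviour visible in this log. period_seconds and
# failure_threshold are assumed values, not read from the cluster.
import time
from typing import Callable

def run_startup_probe(probe: Callable[[], int],
                      period_seconds: float = 1.0,      # assumed; entries above are ~1s apart
                      failure_threshold: int = 120,     # assumed, not from the log
                      grace_period_seconds: int = 3600  # matches gracePeriod=3600 seen later
                      ) -> None:
    failures = 0
    while True:
        status = probe()
        if status == 200:
            print("startup probe succeeded; readiness/liveness probes take over")
            return
        failures += 1
        print(f"Probe failed: HTTP probe failed with statuscode: {status} "
              f"({failures}/{failure_threshold})")
        if failures >= failure_threshold:
            print("Container failed startup probe, will be restarted "
                  f"(killing with gracePeriod={grace_period_seconds})")
            return
        time.sleep(period_seconds)

if __name__ == "__main__":
    # Stub probe that always returns 500, like the router healthz endpoint
    # above where [-]backend-http and [-]has-synced are failing.
    run_startup_probe(lambda: 500, period_seconds=0.01, failure_threshold=5)
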
Dec 03 22:08:48.763317 master-0 kubenswrapper[9136]: I1203 22:08:48.763307 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:49.763721 master-0 kubenswrapper[9136]: I1203 22:08:49.763606 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:49.763721 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:49.763721 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:49.763721 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:49.763721 master-0 kubenswrapper[9136]: I1203 22:08:49.763716 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:50.764173 master-0 kubenswrapper[9136]: I1203 22:08:50.764044 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:50.764173 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:50.764173 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:50.764173 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:50.765393 master-0 kubenswrapper[9136]: I1203 22:08:50.764186 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:51.763208 master-0 kubenswrapper[9136]: I1203 22:08:51.763103 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:51.763208 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:51.763208 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:51.763208 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:51.763604 master-0 kubenswrapper[9136]: I1203 22:08:51.763232 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:52.764034 master-0 kubenswrapper[9136]: I1203 22:08:52.763947 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:52.764034 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:52.764034 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:52.764034 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
22:08:52.764034 master-0 kubenswrapper[9136]: I1203 22:08:52.764029 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:53.762766 master-0 kubenswrapper[9136]: I1203 22:08:53.762690 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:53.762766 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:53.762766 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:53.762766 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:53.762766 master-0 kubenswrapper[9136]: I1203 22:08:53.762795 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:54.762737 master-0 kubenswrapper[9136]: I1203 22:08:54.762693 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:54.762737 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:54.762737 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:54.762737 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:54.763748 master-0 kubenswrapper[9136]: I1203 22:08:54.763714 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:55.764558 master-0 kubenswrapper[9136]: I1203 22:08:55.764439 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:55.764558 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:55.764558 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:55.764558 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:55.765539 master-0 kubenswrapper[9136]: I1203 22:08:55.764566 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:56.763481 master-0 kubenswrapper[9136]: I1203 22:08:56.763378 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:56.763481 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:56.763481 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:56.763481 master-0 kubenswrapper[9136]: healthz check failed Dec 03 
22:08:56.763481 master-0 kubenswrapper[9136]: I1203 22:08:56.763479 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:57.756096 master-0 kubenswrapper[9136]: E1203 22:08:57.755996 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:08:57.758016 master-0 kubenswrapper[9136]: E1203 22:08:57.757917 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:08:57.760303 master-0 kubenswrapper[9136]: E1203 22:08:57.760247 9136 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" cmd=["/bin/bash","-c","test -f /ready/ready"] Dec 03 22:08:57.760303 master-0 kubenswrapper[9136]: E1203 22:08:57.760291 9136 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" podUID="d577ada1-1db4-4af7-9982-e5ad0945ae83" containerName="kube-multus-additional-cni-plugins" Dec 03 22:08:57.763523 master-0 kubenswrapper[9136]: I1203 22:08:57.763435 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:57.763523 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:57.763523 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:57.763523 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:57.763993 master-0 kubenswrapper[9136]: I1203 22:08:57.763532 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:58.763762 master-0 kubenswrapper[9136]: I1203 22:08:58.763672 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:58.763762 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:58.763762 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:58.763762 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:58.763762 master-0 kubenswrapper[9136]: I1203 22:08:58.763748 9136 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:59.764100 master-0 kubenswrapper[9136]: I1203 22:08:59.763987 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:08:59.764100 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:08:59.764100 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:08:59.764100 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:08:59.764100 master-0 kubenswrapper[9136]: I1203 22:08:59.764102 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:08:59.908332 master-0 kubenswrapper[9136]: I1203 22:08:59.908226 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:08:59.908686 master-0 kubenswrapper[9136]: E1203 22:08:59.908642 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:09:00.763073 master-0 kubenswrapper[9136]: I1203 22:09:00.762962 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:00.763073 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:00.763073 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:00.763073 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:00.763662 master-0 kubenswrapper[9136]: I1203 22:09:00.763082 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:01.763498 master-0 kubenswrapper[9136]: I1203 22:09:01.763406 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:01.763498 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:01.763498 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:01.763498 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:01.763498 master-0 kubenswrapper[9136]: I1203 22:09:01.763485 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:09:02.669433 master-0 kubenswrapper[9136]: I1203 22:09:02.669275 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-mbm46_d577ada1-1db4-4af7-9982-e5ad0945ae83/kube-multus-additional-cni-plugins/0.log" Dec 03 22:09:02.669433 master-0 kubenswrapper[9136]: I1203 22:09:02.669365 9136 generic.go:334] "Generic (PLEG): container finished" podID="d577ada1-1db4-4af7-9982-e5ad0945ae83" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" exitCode=137 Dec 03 22:09:02.669725 master-0 kubenswrapper[9136]: I1203 22:09:02.669435 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" event={"ID":"d577ada1-1db4-4af7-9982-e5ad0945ae83","Type":"ContainerDied","Data":"02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c"} Dec 03 22:09:02.735043 master-0 kubenswrapper[9136]: I1203 22:09:02.734977 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-mbm46_d577ada1-1db4-4af7-9982-e5ad0945ae83/kube-multus-additional-cni-plugins/0.log" Dec 03 22:09:02.735297 master-0 kubenswrapper[9136]: I1203 22:09:02.735080 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:09:02.764505 master-0 kubenswrapper[9136]: I1203 22:09:02.764408 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:02.764505 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:02.764505 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:02.764505 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:02.765216 master-0 kubenswrapper[9136]: I1203 22:09:02.764518 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:02.810981 master-0 kubenswrapper[9136]: I1203 22:09:02.810899 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqx9w\" (UniqueName: \"kubernetes.io/projected/d577ada1-1db4-4af7-9982-e5ad0945ae83-kube-api-access-hqx9w\") pod \"d577ada1-1db4-4af7-9982-e5ad0945ae83\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " Dec 03 22:09:02.811230 master-0 kubenswrapper[9136]: I1203 22:09:02.811025 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d577ada1-1db4-4af7-9982-e5ad0945ae83-cni-sysctl-allowlist\") pod \"d577ada1-1db4-4af7-9982-e5ad0945ae83\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " Dec 03 22:09:02.811230 master-0 kubenswrapper[9136]: I1203 22:09:02.811120 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d577ada1-1db4-4af7-9982-e5ad0945ae83-ready\") pod \"d577ada1-1db4-4af7-9982-e5ad0945ae83\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " Dec 03 22:09:02.811230 master-0 kubenswrapper[9136]: I1203 22:09:02.811164 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/d577ada1-1db4-4af7-9982-e5ad0945ae83-tuning-conf-dir\") pod \"d577ada1-1db4-4af7-9982-e5ad0945ae83\" (UID: \"d577ada1-1db4-4af7-9982-e5ad0945ae83\") " Dec 03 22:09:02.811815 master-0 kubenswrapper[9136]: I1203 22:09:02.811698 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d577ada1-1db4-4af7-9982-e5ad0945ae83-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "d577ada1-1db4-4af7-9982-e5ad0945ae83" (UID: "d577ada1-1db4-4af7-9982-e5ad0945ae83"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:02.811923 master-0 kubenswrapper[9136]: I1203 22:09:02.811891 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d577ada1-1db4-4af7-9982-e5ad0945ae83-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "d577ada1-1db4-4af7-9982-e5ad0945ae83" (UID: "d577ada1-1db4-4af7-9982-e5ad0945ae83"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:09:02.812178 master-0 kubenswrapper[9136]: I1203 22:09:02.812146 9136 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d577ada1-1db4-4af7-9982-e5ad0945ae83-tuning-conf-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:02.812178 master-0 kubenswrapper[9136]: I1203 22:09:02.812164 9136 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d577ada1-1db4-4af7-9982-e5ad0945ae83-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:02.812378 master-0 kubenswrapper[9136]: I1203 22:09:02.812294 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d577ada1-1db4-4af7-9982-e5ad0945ae83-ready" (OuterVolumeSpecName: "ready") pod "d577ada1-1db4-4af7-9982-e5ad0945ae83" (UID: "d577ada1-1db4-4af7-9982-e5ad0945ae83"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:09:02.814911 master-0 kubenswrapper[9136]: I1203 22:09:02.813749 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d577ada1-1db4-4af7-9982-e5ad0945ae83-kube-api-access-hqx9w" (OuterVolumeSpecName: "kube-api-access-hqx9w") pod "d577ada1-1db4-4af7-9982-e5ad0945ae83" (UID: "d577ada1-1db4-4af7-9982-e5ad0945ae83"). InnerVolumeSpecName "kube-api-access-hqx9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:09:02.914192 master-0 kubenswrapper[9136]: I1203 22:09:02.914105 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqx9w\" (UniqueName: \"kubernetes.io/projected/d577ada1-1db4-4af7-9982-e5ad0945ae83-kube-api-access-hqx9w\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:02.914192 master-0 kubenswrapper[9136]: I1203 22:09:02.914175 9136 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/d577ada1-1db4-4af7-9982-e5ad0945ae83-ready\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:03.682280 master-0 kubenswrapper[9136]: I1203 22:09:03.682219 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-mbm46_d577ada1-1db4-4af7-9982-e5ad0945ae83/kube-multus-additional-cni-plugins/0.log" Dec 03 22:09:03.682542 master-0 kubenswrapper[9136]: I1203 22:09:03.682316 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" event={"ID":"d577ada1-1db4-4af7-9982-e5ad0945ae83","Type":"ContainerDied","Data":"b012d7a06a0f3515da44351c854cdc116d20fc976e6dc7f228c5ddfc0f9836b5"} Dec 03 22:09:03.682542 master-0 kubenswrapper[9136]: I1203 22:09:03.682376 9136 scope.go:117] "RemoveContainer" containerID="02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c" Dec 03 22:09:03.682542 master-0 kubenswrapper[9136]: I1203 22:09:03.682427 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-mbm46" Dec 03 22:09:03.745869 master-0 kubenswrapper[9136]: I1203 22:09:03.745743 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-mbm46"] Dec 03 22:09:03.753219 master-0 kubenswrapper[9136]: I1203 22:09:03.753160 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-mbm46"] Dec 03 22:09:03.762973 master-0 kubenswrapper[9136]: I1203 22:09:03.762900 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:03.762973 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:03.762973 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:03.762973 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:03.763375 master-0 kubenswrapper[9136]: I1203 22:09:03.762986 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:03.763375 master-0 kubenswrapper[9136]: I1203 22:09:03.763054 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:09:03.764017 master-0 kubenswrapper[9136]: I1203 22:09:03.763955 9136 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"cfa4dcf215847e24aecbc2956ffb90a60bdcdbbe3d57e1f15a1b92587d1579de"} pod="openshift-ingress/router-default-54f97f57-xq6ch" containerMessage="Container router failed startup probe, will be restarted" Dec 03 22:09:03.764162 master-0 kubenswrapper[9136]: I1203 22:09:03.764039 
9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" containerID="cri-o://cfa4dcf215847e24aecbc2956ffb90a60bdcdbbe3d57e1f15a1b92587d1579de" gracePeriod=3600 Dec 03 22:09:03.920168 master-0 kubenswrapper[9136]: I1203 22:09:03.920054 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d577ada1-1db4-4af7-9982-e5ad0945ae83" path="/var/lib/kubelet/pods/d577ada1-1db4-4af7-9982-e5ad0945ae83/volumes" Dec 03 22:09:06.874922 master-0 kubenswrapper[9136]: E1203 22:09:06.874835 9136 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-poddc154774_70d5_4742_8520_60e90471c530.slice/crio-ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd577ada1_1db4_4af7_9982_e5ad0945ae83.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-poddc154774_70d5_4742_8520_60e90471c530.slice/crio-conmon-ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd577ada1_1db4_4af7_9982_e5ad0945ae83.slice/crio-conmon-02125b582b21b46d85295cd97fb3344c0a933b8843c964ca138b8c1c3656ba5c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd577ada1_1db4_4af7_9982_e5ad0945ae83.slice/crio-b012d7a06a0f3515da44351c854cdc116d20fc976e6dc7f228c5ddfc0f9836b5\": RecentStats: unable to find data in memory cache]" Dec 03 22:09:06.875543 master-0 kubenswrapper[9136]: E1203 22:09:06.875173 9136 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-poddc154774_70d5_4742_8520_60e90471c530.slice/crio-conmon-ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:09:07.617651 master-0 kubenswrapper[9136]: I1203 22:09:07.617257 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-retry-1-master-0_dc154774-70d5-4742-8520-60e90471c530/installer/0.log" Dec 03 22:09:07.617651 master-0 kubenswrapper[9136]: I1203 22:09:07.617329 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:09:07.728869 master-0 kubenswrapper[9136]: I1203 22:09:07.728821 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-retry-1-master-0_dc154774-70d5-4742-8520-60e90471c530/installer/0.log" Dec 03 22:09:07.728869 master-0 kubenswrapper[9136]: I1203 22:09:07.728871 9136 generic.go:334] "Generic (PLEG): container finished" podID="dc154774-70d5-4742-8520-60e90471c530" containerID="ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b" exitCode=1 Dec 03 22:09:07.729228 master-0 kubenswrapper[9136]: I1203 22:09:07.728902 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"dc154774-70d5-4742-8520-60e90471c530","Type":"ContainerDied","Data":"ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b"} Dec 03 22:09:07.729228 master-0 kubenswrapper[9136]: I1203 22:09:07.728931 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"dc154774-70d5-4742-8520-60e90471c530","Type":"ContainerDied","Data":"a244ac17047640e6572dd01dfbf0346d713e69fdd83c35ae5ec0098c9545c491"} Dec 03 22:09:07.729228 master-0 kubenswrapper[9136]: I1203 22:09:07.728949 9136 scope.go:117] "RemoveContainer" containerID="ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b" Dec 03 22:09:07.729228 master-0 kubenswrapper[9136]: I1203 22:09:07.728993 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Dec 03 22:09:07.753129 master-0 kubenswrapper[9136]: I1203 22:09:07.753073 9136 scope.go:117] "RemoveContainer" containerID="ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b" Dec 03 22:09:07.753998 master-0 kubenswrapper[9136]: E1203 22:09:07.753930 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b\": container with ID starting with ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b not found: ID does not exist" containerID="ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b" Dec 03 22:09:07.754089 master-0 kubenswrapper[9136]: I1203 22:09:07.754020 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b"} err="failed to get container status \"ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b\": rpc error: code = NotFound desc = could not find container \"ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b\": container with ID starting with ea37be2fed82dfa41a195078ccfba8737b077eab010d209c73c3b68bd31e2b5b not found: ID does not exist" Dec 03 22:09:07.776368 master-0 kubenswrapper[9136]: I1203 22:09:07.776283 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-kubelet-dir\") pod \"dc154774-70d5-4742-8520-60e90471c530\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " Dec 03 22:09:07.776368 master-0 kubenswrapper[9136]: I1203 22:09:07.776371 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-var-lock\") pod \"dc154774-70d5-4742-8520-60e90471c530\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " Dec 03 22:09:07.776711 master-0 kubenswrapper[9136]: I1203 22:09:07.776456 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc154774-70d5-4742-8520-60e90471c530-kube-api-access\") pod \"dc154774-70d5-4742-8520-60e90471c530\" (UID: \"dc154774-70d5-4742-8520-60e90471c530\") " Dec 03 22:09:07.776711 master-0 kubenswrapper[9136]: I1203 22:09:07.776469 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dc154774-70d5-4742-8520-60e90471c530" (UID: "dc154774-70d5-4742-8520-60e90471c530"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:07.776711 master-0 kubenswrapper[9136]: I1203 22:09:07.776539 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-var-lock" (OuterVolumeSpecName: "var-lock") pod "dc154774-70d5-4742-8520-60e90471c530" (UID: "dc154774-70d5-4742-8520-60e90471c530"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:07.776969 master-0 kubenswrapper[9136]: I1203 22:09:07.776927 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:07.776969 master-0 kubenswrapper[9136]: I1203 22:09:07.776960 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc154774-70d5-4742-8520-60e90471c530-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:07.781647 master-0 kubenswrapper[9136]: I1203 22:09:07.781574 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc154774-70d5-4742-8520-60e90471c530-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dc154774-70d5-4742-8520-60e90471c530" (UID: "dc154774-70d5-4742-8520-60e90471c530"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:09:07.878405 master-0 kubenswrapper[9136]: I1203 22:09:07.878311 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc154774-70d5-4742-8520-60e90471c530-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:08.065137 master-0 kubenswrapper[9136]: I1203 22:09:08.065007 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-retry-1-master-0"] Dec 03 22:09:08.073284 master-0 kubenswrapper[9136]: I1203 22:09:08.073202 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-retry-1-master-0"] Dec 03 22:09:08.467698 master-0 kubenswrapper[9136]: I1203 22:09:08.467637 9136 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:09:08.467998 master-0 kubenswrapper[9136]: I1203 22:09:08.467956 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager" containerID="cri-o://ddeb5f184ad031c33b8f3f52c83dc7bc7558153b3f92fd847bf08cf2b2c45bec" gracePeriod=30 Dec 03 22:09:08.468119 master-0 kubenswrapper[9136]: I1203 22:09:08.468089 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" containerID="cri-o://e6e266ea8a0d197e496699cd337eabb6f247b692621f965213e4454b3e59b018" gracePeriod=30 Dec 03 22:09:08.468219 master-0 kubenswrapper[9136]: I1203 22:09:08.468125 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://fbbfa9bd2a1be205a1331c5dee6338e477cf874c4552e42ed29c6c6059d3ca04" gracePeriod=30 Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.468124 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://5ea02cb72627330e0d9cfb5b5ff03c2263f176b244172690291c3611f9a855f2" gracePeriod=30 Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.469469 9136 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: E1203 22:09:08.469814 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d577ada1-1db4-4af7-9982-e5ad0945ae83" containerName="kube-multus-additional-cni-plugins" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.469832 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="d577ada1-1db4-4af7-9982-e5ad0945ae83" containerName="kube-multus-additional-cni-plugins" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: E1203 22:09:08.469850 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.469908 
9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: E1203 22:09:08.469923 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.469932 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: E1203 22:09:08.469940 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager-cert-syncer" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.469947 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager-cert-syncer" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: E1203 22:09:08.469967 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.469974 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: E1203 22:09:08.469984 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.469991 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: E1203 22:09:08.470000 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager-recovery-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.470007 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager-recovery-controller" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: E1203 22:09:08.470020 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc154774-70d5-4742-8520-60e90471c530" containerName="installer" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.470028 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc154774-70d5-4742-8520-60e90471c530" containerName="installer" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: E1203 22:09:08.470045 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager" Dec 03 22:09:08.470128 master-0 kubenswrapper[9136]: I1203 22:09:08.470053 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470198 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470212 9136 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470226 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="d577ada1-1db4-4af7-9982-e5ad0945ae83" containerName="kube-multus-additional-cni-plugins" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470247 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc154774-70d5-4742-8520-60e90471c530" containerName="installer" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470257 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470267 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager-recovery-controller" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470278 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470286 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="kube-controller-manager-cert-syncer" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: E1203 22:09:08.470410 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470422 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470572 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.471175 master-0 kubenswrapper[9136]: I1203 22:09:08.470596 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:09:08.589908 master-0 kubenswrapper[9136]: I1203 22:09:08.589747 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:08.589908 master-0 kubenswrapper[9136]: I1203 22:09:08.589829 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:08.660290 master-0 kubenswrapper[9136]: I1203 22:09:08.660229 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/3.log" Dec 03 22:09:08.661239 master-0 kubenswrapper[9136]: I1203 22:09:08.661208 9136 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/kube-controller-manager-cert-syncer/0.log" Dec 03 22:09:08.661647 master-0 kubenswrapper[9136]: I1203 22:09:08.661614 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:08.667648 master-0 kubenswrapper[9136]: I1203 22:09:08.667554 9136 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="05dd6e8e0dea56089da96190349dd4c1" podUID="287a70e59cc5430b23b208b9a03b5ac7" Dec 03 22:09:08.692037 master-0 kubenswrapper[9136]: I1203 22:09:08.691956 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:08.692263 master-0 kubenswrapper[9136]: I1203 22:09:08.692036 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:08.692263 master-0 kubenswrapper[9136]: I1203 22:09:08.692125 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:08.692347 master-0 kubenswrapper[9136]: I1203 22:09:08.692277 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:08.738020 master-0 kubenswrapper[9136]: I1203 22:09:08.737858 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/cluster-policy-controller/3.log" Dec 03 22:09:08.738865 master-0 kubenswrapper[9136]: I1203 22:09:08.738837 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/kube-controller-manager-cert-syncer/0.log" Dec 03 22:09:08.739319 master-0 kubenswrapper[9136]: I1203 22:09:08.739260 9136 generic.go:334] "Generic (PLEG): container finished" podID="05dd6e8e0dea56089da96190349dd4c1" containerID="e6e266ea8a0d197e496699cd337eabb6f247b692621f965213e4454b3e59b018" exitCode=0 Dec 03 22:09:08.739417 master-0 kubenswrapper[9136]: I1203 22:09:08.739314 9136 generic.go:334] "Generic (PLEG): container finished" podID="05dd6e8e0dea56089da96190349dd4c1" containerID="fbbfa9bd2a1be205a1331c5dee6338e477cf874c4552e42ed29c6c6059d3ca04" exitCode=0 Dec 03 22:09:08.739417 master-0 kubenswrapper[9136]: I1203 22:09:08.739336 
9136 generic.go:334] "Generic (PLEG): container finished" podID="05dd6e8e0dea56089da96190349dd4c1" containerID="5ea02cb72627330e0d9cfb5b5ff03c2263f176b244172690291c3611f9a855f2" exitCode=2 Dec 03 22:09:08.739417 master-0 kubenswrapper[9136]: I1203 22:09:08.739360 9136 generic.go:334] "Generic (PLEG): container finished" podID="05dd6e8e0dea56089da96190349dd4c1" containerID="ddeb5f184ad031c33b8f3f52c83dc7bc7558153b3f92fd847bf08cf2b2c45bec" exitCode=0 Dec 03 22:09:08.739417 master-0 kubenswrapper[9136]: I1203 22:09:08.739401 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf20ff99c26562f4b242d4f427025d07982e61e3333d9eba9727037ba6c8316" Dec 03 22:09:08.739759 master-0 kubenswrapper[9136]: I1203 22:09:08.739440 9136 scope.go:117] "RemoveContainer" containerID="9ecdc29233a09147c3f19e627f474272f5bd98113e6fc6d4ddc9ffef0668320f" Dec 03 22:09:08.739759 master-0 kubenswrapper[9136]: I1203 22:09:08.739545 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:08.742616 master-0 kubenswrapper[9136]: I1203 22:09:08.742554 9136 generic.go:334] "Generic (PLEG): container finished" podID="31ee3291-2979-4903-98a2-355855cedd55" containerID="18687aca035ea2cd91fda30a36ce8489d00076f5b5f568339084de2e1d467700" exitCode=0 Dec 03 22:09:08.742758 master-0 kubenswrapper[9136]: I1203 22:09:08.742616 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"31ee3291-2979-4903-98a2-355855cedd55","Type":"ContainerDied","Data":"18687aca035ea2cd91fda30a36ce8489d00076f5b5f568339084de2e1d467700"} Dec 03 22:09:08.744119 master-0 kubenswrapper[9136]: I1203 22:09:08.744061 9136 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="05dd6e8e0dea56089da96190349dd4c1" podUID="287a70e59cc5430b23b208b9a03b5ac7" Dec 03 22:09:08.793549 master-0 kubenswrapper[9136]: I1203 22:09:08.793473 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-cert-dir\") pod \"05dd6e8e0dea56089da96190349dd4c1\" (UID: \"05dd6e8e0dea56089da96190349dd4c1\") " Dec 03 22:09:08.793754 master-0 kubenswrapper[9136]: I1203 22:09:08.793605 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-resource-dir\") pod \"05dd6e8e0dea56089da96190349dd4c1\" (UID: \"05dd6e8e0dea56089da96190349dd4c1\") " Dec 03 22:09:08.793754 master-0 kubenswrapper[9136]: I1203 22:09:08.793710 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "05dd6e8e0dea56089da96190349dd4c1" (UID: "05dd6e8e0dea56089da96190349dd4c1"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:08.793934 master-0 kubenswrapper[9136]: I1203 22:09:08.793802 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "05dd6e8e0dea56089da96190349dd4c1" (UID: "05dd6e8e0dea56089da96190349dd4c1"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:08.794181 master-0 kubenswrapper[9136]: I1203 22:09:08.794133 9136 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:08.794181 master-0 kubenswrapper[9136]: I1203 22:09:08.794168 9136 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/05dd6e8e0dea56089da96190349dd4c1-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:09.059451 master-0 kubenswrapper[9136]: I1203 22:09:09.059378 9136 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="05dd6e8e0dea56089da96190349dd4c1" podUID="287a70e59cc5430b23b208b9a03b5ac7" Dec 03 22:09:09.755889 master-0 kubenswrapper[9136]: I1203 22:09:09.755816 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_05dd6e8e0dea56089da96190349dd4c1/kube-controller-manager-cert-syncer/0.log" Dec 03 22:09:09.922041 master-0 kubenswrapper[9136]: I1203 22:09:09.921976 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05dd6e8e0dea56089da96190349dd4c1" path="/var/lib/kubelet/pods/05dd6e8e0dea56089da96190349dd4c1/volumes" Dec 03 22:09:09.923604 master-0 kubenswrapper[9136]: I1203 22:09:09.923451 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc154774-70d5-4742-8520-60e90471c530" path="/var/lib/kubelet/pods/dc154774-70d5-4742-8520-60e90471c530/volumes" Dec 03 22:09:10.085702 master-0 kubenswrapper[9136]: I1203 22:09:10.085658 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:09:10.231012 master-0 kubenswrapper[9136]: I1203 22:09:10.230906 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31ee3291-2979-4903-98a2-355855cedd55-kube-api-access\") pod \"31ee3291-2979-4903-98a2-355855cedd55\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " Dec 03 22:09:10.231354 master-0 kubenswrapper[9136]: I1203 22:09:10.231080 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-kubelet-dir\") pod \"31ee3291-2979-4903-98a2-355855cedd55\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " Dec 03 22:09:10.231354 master-0 kubenswrapper[9136]: I1203 22:09:10.231198 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-var-lock\") pod \"31ee3291-2979-4903-98a2-355855cedd55\" (UID: \"31ee3291-2979-4903-98a2-355855cedd55\") " Dec 03 22:09:10.231354 master-0 kubenswrapper[9136]: I1203 22:09:10.231209 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "31ee3291-2979-4903-98a2-355855cedd55" (UID: "31ee3291-2979-4903-98a2-355855cedd55"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:10.231637 master-0 kubenswrapper[9136]: I1203 22:09:10.231418 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-var-lock" (OuterVolumeSpecName: "var-lock") pod "31ee3291-2979-4903-98a2-355855cedd55" (UID: "31ee3291-2979-4903-98a2-355855cedd55"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:10.231881 master-0 kubenswrapper[9136]: I1203 22:09:10.231760 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:10.231881 master-0 kubenswrapper[9136]: I1203 22:09:10.231827 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/31ee3291-2979-4903-98a2-355855cedd55-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:10.234424 master-0 kubenswrapper[9136]: I1203 22:09:10.234352 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ee3291-2979-4903-98a2-355855cedd55-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "31ee3291-2979-4903-98a2-355855cedd55" (UID: "31ee3291-2979-4903-98a2-355855cedd55"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:09:10.334064 master-0 kubenswrapper[9136]: I1203 22:09:10.333866 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/31ee3291-2979-4903-98a2-355855cedd55-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:10.766600 master-0 kubenswrapper[9136]: I1203 22:09:10.766499 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"31ee3291-2979-4903-98a2-355855cedd55","Type":"ContainerDied","Data":"b933c1402e8992ef5246dfe34ff5daed65d9df17d247f67abc5c136f20ffdb22"} Dec 03 22:09:10.766600 master-0 kubenswrapper[9136]: I1203 22:09:10.766583 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b933c1402e8992ef5246dfe34ff5daed65d9df17d247f67abc5c136f20ffdb22" Dec 03 22:09:10.766600 master-0 kubenswrapper[9136]: I1203 22:09:10.766590 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:09:10.907746 master-0 kubenswrapper[9136]: I1203 22:09:10.907635 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:09:10.908275 master-0 kubenswrapper[9136]: E1203 22:09:10.908212 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:09:14.860418 master-0 kubenswrapper[9136]: I1203 22:09:14.860282 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 03 22:09:14.861048 master-0 kubenswrapper[9136]: I1203 22:09:14.860511 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-3-master-0" podUID="b56c556d-dac0-4c12-a992-44e5f2eb32e2" containerName="installer" containerID="cri-o://85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d" gracePeriod=30 Dec 03 22:09:19.463390 master-0 kubenswrapper[9136]: I1203 22:09:19.463224 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 03 22:09:19.464045 master-0 kubenswrapper[9136]: E1203 22:09:19.463553 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ee3291-2979-4903-98a2-355855cedd55" containerName="installer" Dec 03 22:09:19.464045 master-0 kubenswrapper[9136]: I1203 22:09:19.463572 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ee3291-2979-4903-98a2-355855cedd55" containerName="installer" Dec 03 22:09:19.464045 master-0 kubenswrapper[9136]: I1203 22:09:19.463740 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="31ee3291-2979-4903-98a2-355855cedd55" containerName="installer" Dec 03 22:09:19.464331 master-0 kubenswrapper[9136]: I1203 22:09:19.464295 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.472806 master-0 kubenswrapper[9136]: I1203 22:09:19.472700 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kube-api-access\") pod \"installer-4-master-0\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.472941 master-0 kubenswrapper[9136]: I1203 22:09:19.472870 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-var-lock\") pod \"installer-4-master-0\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.472988 master-0 kubenswrapper[9136]: I1203 22:09:19.472956 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.496231 master-0 kubenswrapper[9136]: I1203 22:09:19.495951 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 03 22:09:19.575157 master-0 kubenswrapper[9136]: I1203 22:09:19.575049 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kube-api-access\") pod \"installer-4-master-0\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.575157 master-0 kubenswrapper[9136]: I1203 22:09:19.575155 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-var-lock\") pod \"installer-4-master-0\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.575579 master-0 kubenswrapper[9136]: I1203 22:09:19.575204 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.575579 master-0 kubenswrapper[9136]: I1203 22:09:19.575370 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-var-lock\") pod \"installer-4-master-0\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.575579 master-0 kubenswrapper[9136]: I1203 22:09:19.575388 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.597049 master-0 kubenswrapper[9136]: I1203 22:09:19.596944 9136 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kube-api-access\") pod \"installer-4-master-0\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.801410 master-0 kubenswrapper[9136]: I1203 22:09:19.801237 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:19.909536 master-0 kubenswrapper[9136]: I1203 22:09:19.908865 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:19.925612 master-0 kubenswrapper[9136]: I1203 22:09:19.925296 9136 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dbea974b-42c4-49f6-98dd-1512d74ac16c" Dec 03 22:09:19.925612 master-0 kubenswrapper[9136]: I1203 22:09:19.925338 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dbea974b-42c4-49f6-98dd-1512d74ac16c" Dec 03 22:09:19.948051 master-0 kubenswrapper[9136]: I1203 22:09:19.946378 9136 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:19.952945 master-0 kubenswrapper[9136]: I1203 22:09:19.952887 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:09:19.957538 master-0 kubenswrapper[9136]: I1203 22:09:19.957485 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:09:19.963655 master-0 kubenswrapper[9136]: I1203 22:09:19.963612 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:19.969810 master-0 kubenswrapper[9136]: I1203 22:09:19.969754 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:09:20.234547 master-0 kubenswrapper[9136]: I1203 22:09:20.234434 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 03 22:09:20.237844 master-0 kubenswrapper[9136]: W1203 22:09:20.237754 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod81cc2f2c_0d28_4249_bf30_8ad5904b74fc.slice/crio-630c7d863696298e26882f27666493c0d5a92f1ca8a4b46cc24618b97dedfc75 WatchSource:0}: Error finding container 630c7d863696298e26882f27666493c0d5a92f1ca8a4b46cc24618b97dedfc75: Status 404 returned error can't find the container with id 630c7d863696298e26882f27666493c0d5a92f1ca8a4b46cc24618b97dedfc75 Dec 03 22:09:20.851461 master-0 kubenswrapper[9136]: I1203 22:09:20.851385 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"affb4bc279c4e26b0213bf26fa803d2a6b54fe054c87700ae68e278a97fca108"} Dec 03 22:09:20.851461 master-0 kubenswrapper[9136]: I1203 22:09:20.851435 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"5ecce271c239c91c72a501424ca7e835ff72e1bf3e5847efbd0d8ee1120b7b78"} Dec 03 22:09:20.851461 master-0 kubenswrapper[9136]: I1203 22:09:20.851446 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"7f76635edb60bf491a762b76e0a7b8207d158079fd706ba1977dae6cce9c45ed"} Dec 03 22:09:20.851461 master-0 kubenswrapper[9136]: I1203 22:09:20.851456 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"263e27d05b8311eaef7fd597428646a031a70345e2468d3a197a3e76a71409ad"} Dec 03 22:09:20.855497 master-0 kubenswrapper[9136]: I1203 22:09:20.855444 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"81cc2f2c-0d28-4249-bf30-8ad5904b74fc","Type":"ContainerStarted","Data":"d5823dca0e66bcae32de0643ec3f5a04f5163b818c3068e900288ada8226f9ae"} Dec 03 22:09:20.855593 master-0 kubenswrapper[9136]: I1203 22:09:20.855505 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"81cc2f2c-0d28-4249-bf30-8ad5904b74fc","Type":"ContainerStarted","Data":"630c7d863696298e26882f27666493c0d5a92f1ca8a4b46cc24618b97dedfc75"} Dec 03 22:09:20.877080 master-0 kubenswrapper[9136]: I1203 22:09:20.875542 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=1.875520151 podStartE2EDuration="1.875520151s" podCreationTimestamp="2025-12-03 22:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:09:20.871594991 +0000 UTC 
m=+1167.146771373" watchObservedRunningTime="2025-12-03 22:09:20.875520151 +0000 UTC m=+1167.150696533" Dec 03 22:09:21.864107 master-0 kubenswrapper[9136]: I1203 22:09:21.864037 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"f632f4d3a35c98012f1cece56605d69139c5283e86fa145d2f6236cf3af716de"} Dec 03 22:09:21.889869 master-0 kubenswrapper[9136]: I1203 22:09:21.889711 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.8896842339999997 podStartE2EDuration="2.889684234s" podCreationTimestamp="2025-12-03 22:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:09:21.887652272 +0000 UTC m=+1168.162828684" watchObservedRunningTime="2025-12-03 22:09:21.889684234 +0000 UTC m=+1168.164860636" Dec 03 22:09:22.908814 master-0 kubenswrapper[9136]: I1203 22:09:22.908641 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:09:22.910030 master-0 kubenswrapper[9136]: E1203 22:09:22.909220 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:09:29.964191 master-0 kubenswrapper[9136]: I1203 22:09:29.964116 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:29.964191 master-0 kubenswrapper[9136]: I1203 22:09:29.964192 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:29.964191 master-0 kubenswrapper[9136]: I1203 22:09:29.964205 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:29.964191 master-0 kubenswrapper[9136]: I1203 22:09:29.964216 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:29.965269 master-0 kubenswrapper[9136]: I1203 22:09:29.964712 9136 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Dec 03 22:09:29.965269 master-0 kubenswrapper[9136]: I1203 22:09:29.964854 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Dec 03 22:09:29.971253 master-0 kubenswrapper[9136]: I1203 22:09:29.971207 9136 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:30.945880 master-0 kubenswrapper[9136]: I1203 22:09:30.945807 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:33.660719 master-0 kubenswrapper[9136]: I1203 22:09:33.660651 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 03 22:09:33.661360 master-0 kubenswrapper[9136]: I1203 22:09:33.660952 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-4-master-0" podUID="81cc2f2c-0d28-4249-bf30-8ad5904b74fc" containerName="installer" containerID="cri-o://d5823dca0e66bcae32de0643ec3f5a04f5163b818c3068e900288ada8226f9ae" gracePeriod=30 Dec 03 22:09:33.763482 master-0 kubenswrapper[9136]: E1203 22:09:33.763389 9136 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod81cc2f2c_0d28_4249_bf30_8ad5904b74fc.slice/crio-d5823dca0e66bcae32de0643ec3f5a04f5163b818c3068e900288ada8226f9ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod81cc2f2c_0d28_4249_bf30_8ad5904b74fc.slice/crio-conmon-d5823dca0e66bcae32de0643ec3f5a04f5163b818c3068e900288ada8226f9ae.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:09:33.966162 master-0 kubenswrapper[9136]: I1203 22:09:33.966028 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_81cc2f2c-0d28-4249-bf30-8ad5904b74fc/installer/0.log" Dec 03 22:09:33.966162 master-0 kubenswrapper[9136]: I1203 22:09:33.966096 9136 generic.go:334] "Generic (PLEG): container finished" podID="81cc2f2c-0d28-4249-bf30-8ad5904b74fc" containerID="d5823dca0e66bcae32de0643ec3f5a04f5163b818c3068e900288ada8226f9ae" exitCode=1 Dec 03 22:09:33.966162 master-0 kubenswrapper[9136]: I1203 22:09:33.966133 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"81cc2f2c-0d28-4249-bf30-8ad5904b74fc","Type":"ContainerDied","Data":"d5823dca0e66bcae32de0643ec3f5a04f5163b818c3068e900288ada8226f9ae"} Dec 03 22:09:34.130363 master-0 kubenswrapper[9136]: I1203 22:09:34.130301 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_81cc2f2c-0d28-4249-bf30-8ad5904b74fc/installer/0.log" Dec 03 22:09:34.130695 master-0 kubenswrapper[9136]: I1203 22:09:34.130386 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:34.313431 master-0 kubenswrapper[9136]: I1203 22:09:34.313255 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kubelet-dir\") pod \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " Dec 03 22:09:34.313680 master-0 kubenswrapper[9136]: I1203 22:09:34.313453 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kube-api-access\") pod \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " Dec 03 22:09:34.313680 master-0 kubenswrapper[9136]: I1203 22:09:34.313511 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "81cc2f2c-0d28-4249-bf30-8ad5904b74fc" (UID: "81cc2f2c-0d28-4249-bf30-8ad5904b74fc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:34.313680 master-0 kubenswrapper[9136]: I1203 22:09:34.313532 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-var-lock\") pod \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\" (UID: \"81cc2f2c-0d28-4249-bf30-8ad5904b74fc\") " Dec 03 22:09:34.313680 master-0 kubenswrapper[9136]: I1203 22:09:34.313598 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-var-lock" (OuterVolumeSpecName: "var-lock") pod "81cc2f2c-0d28-4249-bf30-8ad5904b74fc" (UID: "81cc2f2c-0d28-4249-bf30-8ad5904b74fc"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:34.314273 master-0 kubenswrapper[9136]: I1203 22:09:34.314228 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:34.314273 master-0 kubenswrapper[9136]: I1203 22:09:34.314258 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:34.319058 master-0 kubenswrapper[9136]: I1203 22:09:34.318978 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "81cc2f2c-0d28-4249-bf30-8ad5904b74fc" (UID: "81cc2f2c-0d28-4249-bf30-8ad5904b74fc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:09:34.415272 master-0 kubenswrapper[9136]: I1203 22:09:34.415173 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81cc2f2c-0d28-4249-bf30-8ad5904b74fc-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:34.979379 master-0 kubenswrapper[9136]: I1203 22:09:34.979300 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_81cc2f2c-0d28-4249-bf30-8ad5904b74fc/installer/0.log" Dec 03 22:09:34.980894 master-0 kubenswrapper[9136]: I1203 22:09:34.979393 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"81cc2f2c-0d28-4249-bf30-8ad5904b74fc","Type":"ContainerDied","Data":"630c7d863696298e26882f27666493c0d5a92f1ca8a4b46cc24618b97dedfc75"} Dec 03 22:09:34.980894 master-0 kubenswrapper[9136]: I1203 22:09:34.979453 9136 scope.go:117] "RemoveContainer" containerID="d5823dca0e66bcae32de0643ec3f5a04f5163b818c3068e900288ada8226f9ae" Dec 03 22:09:34.980894 master-0 kubenswrapper[9136]: I1203 22:09:34.979493 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Dec 03 22:09:35.038497 master-0 kubenswrapper[9136]: I1203 22:09:35.038381 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 03 22:09:35.042687 master-0 kubenswrapper[9136]: I1203 22:09:35.042613 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Dec 03 22:09:35.916277 master-0 kubenswrapper[9136]: I1203 22:09:35.916184 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81cc2f2c-0d28-4249-bf30-8ad5904b74fc" path="/var/lib/kubelet/pods/81cc2f2c-0d28-4249-bf30-8ad5904b74fc/volumes" Dec 03 22:09:36.811870 master-0 kubenswrapper[9136]: I1203 22:09:36.811816 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Dec 03 22:09:36.812473 master-0 kubenswrapper[9136]: E1203 22:09:36.812120 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81cc2f2c-0d28-4249-bf30-8ad5904b74fc" containerName="installer" Dec 03 22:09:36.812473 master-0 kubenswrapper[9136]: I1203 22:09:36.812136 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="81cc2f2c-0d28-4249-bf30-8ad5904b74fc" containerName="installer" Dec 03 22:09:36.812473 master-0 kubenswrapper[9136]: I1203 22:09:36.812291 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="81cc2f2c-0d28-4249-bf30-8ad5904b74fc" containerName="installer" Dec 03 22:09:36.812831 master-0 kubenswrapper[9136]: I1203 22:09:36.812808 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:36.815019 master-0 kubenswrapper[9136]: I1203 22:09:36.814980 9136 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-m76lw" Dec 03 22:09:36.815134 master-0 kubenswrapper[9136]: I1203 22:09:36.814995 9136 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Dec 03 22:09:36.820648 master-0 kubenswrapper[9136]: I1203 22:09:36.820592 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Dec 03 22:09:36.857518 master-0 kubenswrapper[9136]: I1203 22:09:36.857404 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kube-api-access\") pod \"installer-5-master-0\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:36.857518 master-0 kubenswrapper[9136]: I1203 22:09:36.857532 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-var-lock\") pod \"installer-5-master-0\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:36.857849 master-0 kubenswrapper[9136]: I1203 22:09:36.857620 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:36.959000 master-0 kubenswrapper[9136]: I1203 22:09:36.958932 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-var-lock\") pod \"installer-5-master-0\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:36.959300 master-0 kubenswrapper[9136]: I1203 22:09:36.959080 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-var-lock\") pod \"installer-5-master-0\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:36.959300 master-0 kubenswrapper[9136]: I1203 22:09:36.959109 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:36.959447 master-0 kubenswrapper[9136]: I1203 22:09:36.959181 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:36.959447 master-0 kubenswrapper[9136]: I1203 22:09:36.959345 9136 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kube-api-access\") pod \"installer-5-master-0\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:36.993255 master-0 kubenswrapper[9136]: I1203 22:09:36.993196 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kube-api-access\") pod \"installer-5-master-0\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:37.153376 master-0 kubenswrapper[9136]: I1203 22:09:37.153307 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:09:37.620466 master-0 kubenswrapper[9136]: I1203 22:09:37.620415 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Dec 03 22:09:37.625416 master-0 kubenswrapper[9136]: W1203 22:09:37.625354 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod237bf861_d24e_4fd7_9aee_24b6a79cd6c2.slice/crio-2e2a3b62358bd4a4bbb52a3d9085933220738b643e13a1ead30da4f7f3bef1f4 WatchSource:0}: Error finding container 2e2a3b62358bd4a4bbb52a3d9085933220738b643e13a1ead30da4f7f3bef1f4: Status 404 returned error can't find the container with id 2e2a3b62358bd4a4bbb52a3d9085933220738b643e13a1ead30da4f7f3bef1f4 Dec 03 22:09:37.864641 master-0 kubenswrapper[9136]: I1203 22:09:37.864558 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Dec 03 22:09:37.866810 master-0 kubenswrapper[9136]: I1203 22:09:37.866719 9136 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:37.877493 master-0 kubenswrapper[9136]: I1203 22:09:37.877356 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:37.877877 master-0 kubenswrapper[9136]: I1203 22:09:37.877846 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:37.878164 master-0 kubenswrapper[9136]: I1203 22:09:37.878133 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:37.878459 master-0 kubenswrapper[9136]: I1203 22:09:37.878434 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Dec 03 22:09:37.909041 master-0 kubenswrapper[9136]: I1203 22:09:37.908085 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:09:37.909041 master-0 kubenswrapper[9136]: E1203 22:09:37.908486 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:09:37.979941 master-0 kubenswrapper[9136]: I1203 22:09:37.979886 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:37.980327 master-0 kubenswrapper[9136]: I1203 22:09:37.980296 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:37.980589 master-0 kubenswrapper[9136]: I1203 22:09:37.980522 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:37.980750 master-0 kubenswrapper[9136]: I1203 22:09:37.980534 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:37.980750 master-0 kubenswrapper[9136]: I1203 22:09:37.980087 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:38.013098 master-0 kubenswrapper[9136]: I1203 22:09:38.012990 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"237bf861-d24e-4fd7-9aee-24b6a79cd6c2","Type":"ContainerStarted","Data":"2e2a3b62358bd4a4bbb52a3d9085933220738b643e13a1ead30da4f7f3bef1f4"} Dec 03 22:09:38.041784 master-0 kubenswrapper[9136]: I1203 22:09:38.041705 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:38.202566 master-0 kubenswrapper[9136]: I1203 22:09:38.202353 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:09:38.627292 master-0 kubenswrapper[9136]: I1203 22:09:38.627191 9136 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Dec 03 22:09:38.633082 master-0 kubenswrapper[9136]: W1203 22:09:38.633031 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5c8c7291_3150_46a5_9d14_57a23bb51cc0.slice/crio-b00c511cb2afcbb4dd9070f0c3fa6c145e466d21cddbb067c88db9c3040e7fce WatchSource:0}: Error finding container b00c511cb2afcbb4dd9070f0c3fa6c145e466d21cddbb067c88db9c3040e7fce: Status 404 returned error can't find the container with id b00c511cb2afcbb4dd9070f0c3fa6c145e466d21cddbb067c88db9c3040e7fce Dec 03 22:09:39.032175 master-0 kubenswrapper[9136]: I1203 22:09:39.032048 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"5c8c7291-3150-46a5-9d14-57a23bb51cc0","Type":"ContainerStarted","Data":"b00c511cb2afcbb4dd9070f0c3fa6c145e466d21cddbb067c88db9c3040e7fce"} Dec 03 22:09:39.035368 master-0 kubenswrapper[9136]: I1203 22:09:39.035297 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"237bf861-d24e-4fd7-9aee-24b6a79cd6c2","Type":"ContainerStarted","Data":"18479610f2526d3c7733e22ac3dd7ab3ea47a3f6618f2f508eccebdebab45285"} Dec 03 22:09:39.062892 master-0 kubenswrapper[9136]: I1203 22:09:39.060077 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=3.060042361 podStartE2EDuration="3.060042361s" podCreationTimestamp="2025-12-03 22:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:09:39.056737439 +0000 UTC m=+1185.331913851" watchObservedRunningTime="2025-12-03 22:09:39.060042361 +0000 UTC m=+1185.335218743" Dec 03 22:09:39.969109 master-0 kubenswrapper[9136]: I1203 22:09:39.969053 
9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:39.974386 master-0 kubenswrapper[9136]: I1203 22:09:39.974319 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:09:40.045184 master-0 kubenswrapper[9136]: I1203 22:09:40.045064 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"5c8c7291-3150-46a5-9d14-57a23bb51cc0","Type":"ContainerStarted","Data":"404a35f8de80b669ffc0285b3b2f8a18d00355d93f5f0568bb71370a69a93877"} Dec 03 22:09:40.068042 master-0 kubenswrapper[9136]: I1203 22:09:40.067888 9136 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=3.067851249 podStartE2EDuration="3.067851249s" podCreationTimestamp="2025-12-03 22:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:09:40.066741814 +0000 UTC m=+1186.341918286" watchObservedRunningTime="2025-12-03 22:09:40.067851249 +0000 UTC m=+1186.343027721" Dec 03 22:09:44.842860 master-0 kubenswrapper[9136]: I1203 22:09:44.842809 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_b56c556d-dac0-4c12-a992-44e5f2eb32e2/installer/0.log" Dec 03 22:09:44.843347 master-0 kubenswrapper[9136]: I1203 22:09:44.842884 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:09:44.990636 master-0 kubenswrapper[9136]: I1203 22:09:44.990574 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-var-lock\") pod \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " Dec 03 22:09:44.990893 master-0 kubenswrapper[9136]: I1203 22:09:44.990704 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kube-api-access\") pod \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " Dec 03 22:09:44.990893 master-0 kubenswrapper[9136]: I1203 22:09:44.990825 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kubelet-dir\") pod \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\" (UID: \"b56c556d-dac0-4c12-a992-44e5f2eb32e2\") " Dec 03 22:09:44.991112 master-0 kubenswrapper[9136]: I1203 22:09:44.991071 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b56c556d-dac0-4c12-a992-44e5f2eb32e2" (UID: "b56c556d-dac0-4c12-a992-44e5f2eb32e2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:44.991247 master-0 kubenswrapper[9136]: I1203 22:09:44.991225 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:44.991910 master-0 kubenswrapper[9136]: I1203 22:09:44.991860 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-var-lock" (OuterVolumeSpecName: "var-lock") pod "b56c556d-dac0-4c12-a992-44e5f2eb32e2" (UID: "b56c556d-dac0-4c12-a992-44e5f2eb32e2"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:09:44.994548 master-0 kubenswrapper[9136]: I1203 22:09:44.994492 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b56c556d-dac0-4c12-a992-44e5f2eb32e2" (UID: "b56c556d-dac0-4c12-a992-44e5f2eb32e2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:09:45.090749 master-0 kubenswrapper[9136]: I1203 22:09:45.090606 9136 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_b56c556d-dac0-4c12-a992-44e5f2eb32e2/installer/0.log" Dec 03 22:09:45.090749 master-0 kubenswrapper[9136]: I1203 22:09:45.090692 9136 generic.go:334] "Generic (PLEG): container finished" podID="b56c556d-dac0-4c12-a992-44e5f2eb32e2" containerID="85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d" exitCode=1 Dec 03 22:09:45.090749 master-0 kubenswrapper[9136]: I1203 22:09:45.090735 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"b56c556d-dac0-4c12-a992-44e5f2eb32e2","Type":"ContainerDied","Data":"85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d"} Dec 03 22:09:45.090999 master-0 kubenswrapper[9136]: I1203 22:09:45.090808 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"b56c556d-dac0-4c12-a992-44e5f2eb32e2","Type":"ContainerDied","Data":"1204437a6fc239716e874f0192dfd99393826c54111a382098102d88fcf7224b"} Dec 03 22:09:45.090999 master-0 kubenswrapper[9136]: I1203 22:09:45.090840 9136 scope.go:117] "RemoveContainer" containerID="85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d" Dec 03 22:09:45.090999 master-0 kubenswrapper[9136]: I1203 22:09:45.090845 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Dec 03 22:09:45.092871 master-0 kubenswrapper[9136]: I1203 22:09:45.092786 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b56c556d-dac0-4c12-a992-44e5f2eb32e2-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:45.092871 master-0 kubenswrapper[9136]: I1203 22:09:45.092816 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56c556d-dac0-4c12-a992-44e5f2eb32e2-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:09:45.115395 master-0 kubenswrapper[9136]: I1203 22:09:45.115358 9136 scope.go:117] "RemoveContainer" containerID="85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d" Dec 03 22:09:45.117065 master-0 kubenswrapper[9136]: E1203 22:09:45.116109 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d\": container with ID starting with 85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d not found: ID does not exist" containerID="85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d" Dec 03 22:09:45.117173 master-0 kubenswrapper[9136]: I1203 22:09:45.117103 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d"} err="failed to get container status \"85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d\": rpc error: code = NotFound desc = could not find container \"85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d\": container with ID starting with 85dcc32d871cde0ddd586cf9e7d1328fe4b69f53eb3d8b5d846ee07038f9a74d not found: ID does not exist" Dec 03 22:09:45.138488 master-0 kubenswrapper[9136]: I1203 22:09:45.138429 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 03 22:09:45.156528 master-0 kubenswrapper[9136]: I1203 22:09:45.156475 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Dec 03 22:09:45.919965 master-0 kubenswrapper[9136]: I1203 22:09:45.919896 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b56c556d-dac0-4c12-a992-44e5f2eb32e2" path="/var/lib/kubelet/pods/b56c556d-dac0-4c12-a992-44e5f2eb32e2/volumes" Dec 03 22:09:50.142066 master-0 kubenswrapper[9136]: I1203 22:09:50.141985 9136 generic.go:334] "Generic (PLEG): container finished" podID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerID="cfa4dcf215847e24aecbc2956ffb90a60bdcdbbe3d57e1f15a1b92587d1579de" exitCode=0 Dec 03 22:09:50.142710 master-0 kubenswrapper[9136]: I1203 22:09:50.142079 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerDied","Data":"cfa4dcf215847e24aecbc2956ffb90a60bdcdbbe3d57e1f15a1b92587d1579de"} Dec 03 22:09:50.142710 master-0 kubenswrapper[9136]: I1203 22:09:50.142171 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-54f97f57-xq6ch" event={"ID":"698e6d87-1a58-493c-8b69-d22c89d26ac5","Type":"ContainerStarted","Data":"7519b630b49d0e71a1321981181a030a9b593519d5516a9945f43cf1f405f9ca"} Dec 03 22:09:50.142710 master-0 kubenswrapper[9136]: I1203 22:09:50.142205 9136 
scope.go:117] "RemoveContainer" containerID="b2f6734c8b53bbe2c29b93a23c7bd8fd22b8ba672473f7aeb1ceaade6ae54ba1" Dec 03 22:09:50.760807 master-0 kubenswrapper[9136]: I1203 22:09:50.760678 9136 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:09:50.764020 master-0 kubenswrapper[9136]: I1203 22:09:50.763965 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:50.764020 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:50.764020 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:50.764020 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:50.764567 master-0 kubenswrapper[9136]: I1203 22:09:50.764510 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:51.763641 master-0 kubenswrapper[9136]: I1203 22:09:51.763547 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:51.763641 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:51.763641 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:51.763641 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:51.764417 master-0 kubenswrapper[9136]: I1203 22:09:51.763669 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:52.763811 master-0 kubenswrapper[9136]: I1203 22:09:52.763672 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:52.763811 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:52.763811 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:52.763811 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:52.764936 master-0 kubenswrapper[9136]: I1203 22:09:52.763826 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:52.908115 master-0 kubenswrapper[9136]: I1203 22:09:52.908044 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:09:52.908353 master-0 kubenswrapper[9136]: E1203 22:09:52.908331 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" 
pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:09:53.762995 master-0 kubenswrapper[9136]: I1203 22:09:53.762931 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:53.762995 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:53.762995 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:53.762995 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:53.763345 master-0 kubenswrapper[9136]: I1203 22:09:53.763013 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:54.690006 master-0 kubenswrapper[9136]: I1203 22:09:54.689955 9136 scope.go:117] "RemoveContainer" containerID="fbbfa9bd2a1be205a1331c5dee6338e477cf874c4552e42ed29c6c6059d3ca04" Dec 03 22:09:54.706641 master-0 kubenswrapper[9136]: I1203 22:09:54.706395 9136 scope.go:117] "RemoveContainer" containerID="ddeb5f184ad031c33b8f3f52c83dc7bc7558153b3f92fd847bf08cf2b2c45bec" Dec 03 22:09:54.725485 master-0 kubenswrapper[9136]: I1203 22:09:54.725437 9136 scope.go:117] "RemoveContainer" containerID="5ea02cb72627330e0d9cfb5b5ff03c2263f176b244172690291c3611f9a855f2" Dec 03 22:09:54.760205 master-0 kubenswrapper[9136]: I1203 22:09:54.760162 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:09:54.762212 master-0 kubenswrapper[9136]: I1203 22:09:54.762171 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:54.762212 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:54.762212 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:54.762212 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:54.762350 master-0 kubenswrapper[9136]: I1203 22:09:54.762228 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:55.762289 master-0 kubenswrapper[9136]: I1203 22:09:55.762224 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:55.762289 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:55.762289 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:55.762289 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:55.762289 master-0 kubenswrapper[9136]: I1203 22:09:55.762290 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:09:56.762590 master-0 kubenswrapper[9136]: I1203 22:09:56.762513 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:56.762590 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:56.762590 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:56.762590 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:56.763391 master-0 kubenswrapper[9136]: I1203 22:09:56.762613 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:57.762971 master-0 kubenswrapper[9136]: I1203 22:09:57.762890 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:57.762971 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:57.762971 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:57.762971 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:57.762971 master-0 kubenswrapper[9136]: I1203 22:09:57.762959 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:58.763622 master-0 kubenswrapper[9136]: I1203 22:09:58.763519 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:58.763622 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:58.763622 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:58.763622 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:58.763622 master-0 kubenswrapper[9136]: I1203 22:09:58.763603 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:09:59.764132 master-0 kubenswrapper[9136]: I1203 22:09:59.764080 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:09:59.764132 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:09:59.764132 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:09:59.764132 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:09:59.765006 master-0 kubenswrapper[9136]: I1203 22:09:59.764904 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:10:00.764124 master-0 kubenswrapper[9136]: I1203 22:10:00.764037 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:00.764124 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:00.764124 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:00.764124 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:00.765265 master-0 kubenswrapper[9136]: I1203 22:10:00.764145 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:01.763406 master-0 kubenswrapper[9136]: I1203 22:10:01.763313 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:01.763406 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:01.763406 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:01.763406 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:01.763406 master-0 kubenswrapper[9136]: I1203 22:10:01.763395 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:02.763248 master-0 kubenswrapper[9136]: I1203 22:10:02.763138 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:02.763248 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:02.763248 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:02.763248 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:02.763248 master-0 kubenswrapper[9136]: I1203 22:10:02.763240 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:03.763584 master-0 kubenswrapper[9136]: I1203 22:10:03.763488 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:03.763584 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:03.763584 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:03.763584 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:03.763584 master-0 kubenswrapper[9136]: I1203 22:10:03.763574 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 
22:10:04.762987 master-0 kubenswrapper[9136]: I1203 22:10:04.762904 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:04.762987 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:04.762987 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:04.762987 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:04.763528 master-0 kubenswrapper[9136]: I1203 22:10:04.763014 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:05.763387 master-0 kubenswrapper[9136]: I1203 22:10:05.763324 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:05.763387 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:05.763387 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:05.763387 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:05.764135 master-0 kubenswrapper[9136]: I1203 22:10:05.763406 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:05.909984 master-0 kubenswrapper[9136]: I1203 22:10:05.908836 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:10:05.909984 master-0 kubenswrapper[9136]: E1203 22:10:05.909177 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:10:06.762887 master-0 kubenswrapper[9136]: I1203 22:10:06.762766 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:06.762887 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:06.762887 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:06.762887 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:06.762887 master-0 kubenswrapper[9136]: I1203 22:10:06.762887 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:07.763709 master-0 kubenswrapper[9136]: I1203 22:10:07.763609 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:07.763709 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:07.763709 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:07.763709 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:07.763709 master-0 kubenswrapper[9136]: I1203 22:10:07.763702 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:08.762627 master-0 kubenswrapper[9136]: I1203 22:10:08.762543 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:08.762627 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:08.762627 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:08.762627 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:08.762627 master-0 kubenswrapper[9136]: I1203 22:10:08.762633 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:09.358636 master-0 kubenswrapper[9136]: I1203 22:10:09.358568 9136 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 03 22:10:09.359318 master-0 kubenswrapper[9136]: I1203 22:10:09.358915 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" containerID="cri-o://c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac" gracePeriod=30 Dec 03 22:10:09.360398 master-0 kubenswrapper[9136]: I1203 22:10:09.360064 9136 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 03 22:10:09.360444 master-0 kubenswrapper[9136]: E1203 22:10:09.360414 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56c556d-dac0-4c12-a992-44e5f2eb32e2" containerName="installer" Dec 03 22:10:09.360444 master-0 kubenswrapper[9136]: I1203 22:10:09.360431 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56c556d-dac0-4c12-a992-44e5f2eb32e2" containerName="installer" Dec 03 22:10:09.360510 master-0 kubenswrapper[9136]: E1203 22:10:09.360450 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 22:10:09.360510 master-0 kubenswrapper[9136]: I1203 22:10:09.360459 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 22:10:09.360510 master-0 kubenswrapper[9136]: E1203 22:10:09.360473 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 22:10:09.360510 master-0 kubenswrapper[9136]: I1203 22:10:09.360483 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" 
Dec 03 22:10:09.360684 master-0 kubenswrapper[9136]: I1203 22:10:09.360645 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56c556d-dac0-4c12-a992-44e5f2eb32e2" containerName="installer" Dec 03 22:10:09.360684 master-0 kubenswrapper[9136]: I1203 22:10:09.360679 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 22:10:09.360758 master-0 kubenswrapper[9136]: I1203 22:10:09.360693 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 22:10:09.360758 master-0 kubenswrapper[9136]: I1203 22:10:09.360711 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 22:10:09.360901 master-0 kubenswrapper[9136]: E1203 22:10:09.360874 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 22:10:09.360901 master-0 kubenswrapper[9136]: I1203 22:10:09.360891 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="d78739a7694769882b7e47ea5ac08a10" containerName="kube-scheduler" Dec 03 22:10:09.362179 master-0 kubenswrapper[9136]: I1203 22:10:09.362141 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:09.527125 master-0 kubenswrapper[9136]: I1203 22:10:09.527034 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:09.527385 master-0 kubenswrapper[9136]: I1203 22:10:09.527205 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:09.600079 master-0 kubenswrapper[9136]: I1203 22:10:09.599983 9136 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 03 22:10:09.628895 master-0 kubenswrapper[9136]: I1203 22:10:09.628824 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:09.629143 master-0 kubenswrapper[9136]: I1203 22:10:09.628921 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:09.629143 master-0 kubenswrapper[9136]: I1203 22:10:09.628969 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:09.629143 master-0 kubenswrapper[9136]: I1203 22:10:09.628984 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:09.763002 master-0 kubenswrapper[9136]: I1203 22:10:09.762884 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:09.763002 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:09.763002 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:09.763002 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:09.763002 master-0 kubenswrapper[9136]: I1203 22:10:09.763004 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:09.893018 master-0 kubenswrapper[9136]: I1203 22:10:09.892822 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:09.917973 master-0 kubenswrapper[9136]: W1203 22:10:09.917915 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd2fa610bb2a39c39fcdd00db03a511a.slice/crio-7dded9f66e38346f58eab7668d020f179ca7449f5ac3538a2108b54cbabe523d WatchSource:0}: Error finding container 7dded9f66e38346f58eab7668d020f179ca7449f5ac3538a2108b54cbabe523d: Status 404 returned error can't find the container with id 7dded9f66e38346f58eab7668d020f179ca7449f5ac3538a2108b54cbabe523d Dec 03 22:10:10.299217 master-0 kubenswrapper[9136]: I1203 22:10:10.299141 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 22:10:10.304126 master-0 kubenswrapper[9136]: I1203 22:10:10.304082 9136 generic.go:334] "Generic (PLEG): container finished" podID="fd2fa610bb2a39c39fcdd00db03a511a" containerID="8c83e2edc2c0d9a9e848a5cf55074f9b40879a47fa830dc6cab1377b18fd6f6a" exitCode=0 Dec 03 22:10:10.304334 master-0 kubenswrapper[9136]: I1203 22:10:10.304197 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerDied","Data":"8c83e2edc2c0d9a9e848a5cf55074f9b40879a47fa830dc6cab1377b18fd6f6a"} Dec 03 22:10:10.304334 master-0 kubenswrapper[9136]: I1203 22:10:10.304267 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"7dded9f66e38346f58eab7668d020f179ca7449f5ac3538a2108b54cbabe523d"} Dec 03 22:10:10.306142 master-0 kubenswrapper[9136]: I1203 22:10:10.306093 9136 generic.go:334] "Generic (PLEG): container finished" podID="237bf861-d24e-4fd7-9aee-24b6a79cd6c2" containerID="18479610f2526d3c7733e22ac3dd7ab3ea47a3f6618f2f508eccebdebab45285" exitCode=0 Dec 03 22:10:10.306248 master-0 kubenswrapper[9136]: I1203 22:10:10.306174 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"237bf861-d24e-4fd7-9aee-24b6a79cd6c2","Type":"ContainerDied","Data":"18479610f2526d3c7733e22ac3dd7ab3ea47a3f6618f2f508eccebdebab45285"} Dec 03 22:10:10.308510 master-0 kubenswrapper[9136]: I1203 22:10:10.308479 9136 generic.go:334] "Generic (PLEG): container finished" podID="d78739a7694769882b7e47ea5ac08a10" containerID="c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac" exitCode=0 Dec 03 22:10:10.308648 master-0 kubenswrapper[9136]: I1203 22:10:10.308536 9136 scope.go:117] "RemoveContainer" containerID="c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac" Dec 03 22:10:10.308648 master-0 kubenswrapper[9136]: I1203 22:10:10.308538 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Dec 03 22:10:10.336212 master-0 kubenswrapper[9136]: I1203 22:10:10.336165 9136 scope.go:117] "RemoveContainer" containerID="f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26" Dec 03 22:10:10.363149 master-0 kubenswrapper[9136]: I1203 22:10:10.362661 9136 scope.go:117] "RemoveContainer" containerID="c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac" Dec 03 22:10:10.363149 master-0 kubenswrapper[9136]: E1203 22:10:10.363096 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac\": container with ID starting with c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac not found: ID does not exist" containerID="c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac" Dec 03 22:10:10.363149 master-0 kubenswrapper[9136]: I1203 22:10:10.363132 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac"} err="failed to get container status \"c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac\": rpc error: code = NotFound desc = could not find container \"c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac\": container with ID starting with c37b6d8d35e4b765e494025a008e55fe665237a67df50b7c3c4756845027a1ac not found: ID does not exist" Dec 03 22:10:10.363733 master-0 kubenswrapper[9136]: I1203 22:10:10.363161 9136 scope.go:117] "RemoveContainer" containerID="f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26" Dec 03 22:10:10.363733 master-0 kubenswrapper[9136]: E1203 22:10:10.363470 9136 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26\": container with ID starting with f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26 not found: ID does not exist" containerID="f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26" Dec 03 22:10:10.363733 master-0 kubenswrapper[9136]: I1203 22:10:10.363489 9136 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26"} err="failed to get container status \"f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26\": rpc error: code = NotFound desc = could not find container \"f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26\": container with ID starting with f4703285d4a8783bd97e162d56593bf91baa2b6d06429f7c4f21034e93b27d26 not found: ID does not exist" Dec 03 22:10:10.443873 master-0 kubenswrapper[9136]: I1203 22:10:10.443619 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") pod \"d78739a7694769882b7e47ea5ac08a10\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " Dec 03 22:10:10.443873 master-0 kubenswrapper[9136]: I1203 22:10:10.443747 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") pod \"d78739a7694769882b7e47ea5ac08a10\" (UID: \"d78739a7694769882b7e47ea5ac08a10\") " Dec 03 22:10:10.443873 master-0 kubenswrapper[9136]: I1203 
22:10:10.443785 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs" (OuterVolumeSpecName: "logs") pod "d78739a7694769882b7e47ea5ac08a10" (UID: "d78739a7694769882b7e47ea5ac08a10"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:10.444384 master-0 kubenswrapper[9136]: I1203 22:10:10.443895 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets" (OuterVolumeSpecName: "secrets") pod "d78739a7694769882b7e47ea5ac08a10" (UID: "d78739a7694769882b7e47ea5ac08a10"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:10.444384 master-0 kubenswrapper[9136]: I1203 22:10:10.444310 9136 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:10.444384 master-0 kubenswrapper[9136]: I1203 22:10:10.444333 9136 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/d78739a7694769882b7e47ea5ac08a10-secrets\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:10.762517 master-0 kubenswrapper[9136]: I1203 22:10:10.762447 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:10.762517 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:10.762517 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:10.762517 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:10.762879 master-0 kubenswrapper[9136]: I1203 22:10:10.762526 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:11.317988 master-0 kubenswrapper[9136]: I1203 22:10:11.317420 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"dcc6040d2844c3f3c3156ab1bc868c6f57b5cdffda04d188c4419d66f9b0084e"} Dec 03 22:10:11.317988 master-0 kubenswrapper[9136]: I1203 22:10:11.317473 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"4469364ea09f6c46de0174b9f26041a598b86281f7be9ab5063b1bb3fbaf0cc1"} Dec 03 22:10:11.317988 master-0 kubenswrapper[9136]: I1203 22:10:11.317485 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"2ef318f46b5f8577afb9290b7c23fb5a9f5de2346d418c5ad5d719a19e57967f"} Dec 03 22:10:11.317988 master-0 kubenswrapper[9136]: I1203 22:10:11.317825 9136 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:11.339260 master-0 kubenswrapper[9136]: I1203 22:10:11.339193 9136 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.339174354 podStartE2EDuration="2.339174354s" podCreationTimestamp="2025-12-03 22:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:11.335514728 +0000 UTC m=+1217.610691120" watchObservedRunningTime="2025-12-03 22:10:11.339174354 +0000 UTC m=+1217.614350736" Dec 03 22:10:11.667922 master-0 kubenswrapper[9136]: I1203 22:10:11.667800 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:10:11.670126 master-0 kubenswrapper[9136]: I1203 22:10:11.670093 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kubelet-dir\") pod \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " Dec 03 22:10:11.670211 master-0 kubenswrapper[9136]: I1203 22:10:11.670201 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-var-lock\") pod \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " Dec 03 22:10:11.670277 master-0 kubenswrapper[9136]: I1203 22:10:11.670201 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "237bf861-d24e-4fd7-9aee-24b6a79cd6c2" (UID: "237bf861-d24e-4fd7-9aee-24b6a79cd6c2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:11.670277 master-0 kubenswrapper[9136]: I1203 22:10:11.670240 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kube-api-access\") pod \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\" (UID: \"237bf861-d24e-4fd7-9aee-24b6a79cd6c2\") " Dec 03 22:10:11.670277 master-0 kubenswrapper[9136]: I1203 22:10:11.670256 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-var-lock" (OuterVolumeSpecName: "var-lock") pod "237bf861-d24e-4fd7-9aee-24b6a79cd6c2" (UID: "237bf861-d24e-4fd7-9aee-24b6a79cd6c2"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:11.670471 master-0 kubenswrapper[9136]: I1203 22:10:11.670451 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:11.670471 master-0 kubenswrapper[9136]: I1203 22:10:11.670470 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:11.675265 master-0 kubenswrapper[9136]: I1203 22:10:11.675216 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "237bf861-d24e-4fd7-9aee-24b6a79cd6c2" (UID: "237bf861-d24e-4fd7-9aee-24b6a79cd6c2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:10:11.763067 master-0 kubenswrapper[9136]: I1203 22:10:11.762986 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:11.763067 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:11.763067 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:11.763067 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:11.763067 master-0 kubenswrapper[9136]: I1203 22:10:11.763070 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:11.772449 master-0 kubenswrapper[9136]: I1203 22:10:11.772397 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/237bf861-d24e-4fd7-9aee-24b6a79cd6c2-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:11.927269 master-0 kubenswrapper[9136]: I1203 22:10:11.922858 9136 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d78739a7694769882b7e47ea5ac08a10" path="/var/lib/kubelet/pods/d78739a7694769882b7e47ea5ac08a10/volumes" Dec 03 22:10:11.927269 master-0 kubenswrapper[9136]: I1203 22:10:11.923195 9136 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Dec 03 22:10:11.945353 master-0 kubenswrapper[9136]: I1203 22:10:11.945270 9136 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 03 22:10:11.945353 master-0 kubenswrapper[9136]: I1203 22:10:11.945345 9136 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="bedd8813-eb3a-46ce-ade5-0694be86acc3" Dec 03 22:10:11.952860 master-0 kubenswrapper[9136]: I1203 22:10:11.952799 9136 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Dec 03 22:10:11.952860 master-0 kubenswrapper[9136]: I1203 22:10:11.952853 9136 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="bedd8813-eb3a-46ce-ade5-0694be86acc3" 
Dec 03 22:10:12.330135 master-0 kubenswrapper[9136]: I1203 22:10:12.329951 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"237bf861-d24e-4fd7-9aee-24b6a79cd6c2","Type":"ContainerDied","Data":"2e2a3b62358bd4a4bbb52a3d9085933220738b643e13a1ead30da4f7f3bef1f4"} Dec 03 22:10:12.330135 master-0 kubenswrapper[9136]: I1203 22:10:12.330032 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e2a3b62358bd4a4bbb52a3d9085933220738b643e13a1ead30da4f7f3bef1f4" Dec 03 22:10:12.330135 master-0 kubenswrapper[9136]: I1203 22:10:12.329992 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:10:12.763499 master-0 kubenswrapper[9136]: I1203 22:10:12.763423 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:12.763499 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:12.763499 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:12.763499 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:12.764257 master-0 kubenswrapper[9136]: I1203 22:10:12.763532 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:13.762925 master-0 kubenswrapper[9136]: I1203 22:10:13.762818 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:13.762925 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:13.762925 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:13.762925 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:13.763564 master-0 kubenswrapper[9136]: I1203 22:10:13.762932 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:14.762451 master-0 kubenswrapper[9136]: I1203 22:10:14.762388 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:14.762451 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:14.762451 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:14.762451 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:14.763104 master-0 kubenswrapper[9136]: I1203 22:10:14.763062 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:15.763474 master-0 kubenswrapper[9136]: I1203 22:10:15.763397 9136 patch_prober.go:28] interesting 
pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:15.763474 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:15.763474 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:15.763474 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:15.764167 master-0 kubenswrapper[9136]: I1203 22:10:15.763492 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:16.763872 master-0 kubenswrapper[9136]: I1203 22:10:16.763809 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:16.763872 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:16.763872 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:16.763872 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:16.765081 master-0 kubenswrapper[9136]: I1203 22:10:16.763881 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:16.907735 master-0 kubenswrapper[9136]: I1203 22:10:16.907694 9136 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:10:16.908304 master-0 kubenswrapper[9136]: E1203 22:10:16.908282 9136 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ingress-operator\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=ingress-operator pod=ingress-operator-85dbd94574-2hxlh_openshift-ingress-operator(0869de9b-6f5b-4c31-81ad-02a9c8888193)\"" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" podUID="0869de9b-6f5b-4c31-81ad-02a9c8888193" Dec 03 22:10:17.762266 master-0 kubenswrapper[9136]: I1203 22:10:17.762212 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:17.762266 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:17.762266 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:17.762266 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:17.762717 master-0 kubenswrapper[9136]: I1203 22:10:17.762681 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:18.764229 master-0 kubenswrapper[9136]: I1203 22:10:18.764115 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld 
Dec 03 22:10:18.764229 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:18.764229 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:18.764229 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:18.764229 master-0 kubenswrapper[9136]: I1203 22:10:18.764212 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:19.763969 master-0 kubenswrapper[9136]: I1203 22:10:19.763876 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:19.763969 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:19.763969 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:19.763969 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:19.763969 master-0 kubenswrapper[9136]: I1203 22:10:19.763966 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:20.763267 master-0 kubenswrapper[9136]: I1203 22:10:20.763158 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:20.763267 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:20.763267 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:20.763267 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:20.763267 master-0 kubenswrapper[9136]: I1203 22:10:20.763247 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:21.764046 master-0 kubenswrapper[9136]: I1203 22:10:21.763966 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:21.764046 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:21.764046 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:21.764046 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:21.765046 master-0 kubenswrapper[9136]: I1203 22:10:21.764067 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:22.764596 master-0 kubenswrapper[9136]: I1203 22:10:22.764508 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:10:22.764596 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:22.764596 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:22.764596 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:22.765645 master-0 kubenswrapper[9136]: I1203 22:10:22.764606 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:23.762673 master-0 kubenswrapper[9136]: I1203 22:10:23.762609 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:23.762673 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:23.762673 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:23.762673 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:23.763040 master-0 kubenswrapper[9136]: I1203 22:10:23.762675 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:24.763253 master-0 kubenswrapper[9136]: I1203 22:10:24.763195 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:24.763253 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:24.763253 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:24.763253 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:24.763253 master-0 kubenswrapper[9136]: I1203 22:10:24.763264 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:25.762695 master-0 kubenswrapper[9136]: I1203 22:10:25.762606 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:25.762695 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:25.762695 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:25.762695 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:25.762695 master-0 kubenswrapper[9136]: I1203 22:10:25.762678 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:26.766941 master-0 kubenswrapper[9136]: I1203 22:10:26.765280 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
22:10:26.766941 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:26.766941 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:26.766941 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:26.767794 master-0 kubenswrapper[9136]: I1203 22:10:26.766989 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:26.969150 master-0 kubenswrapper[9136]: I1203 22:10:26.969073 9136 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 22:10:26.969526 master-0 kubenswrapper[9136]: E1203 22:10:26.969483 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="237bf861-d24e-4fd7-9aee-24b6a79cd6c2" containerName="installer" Dec 03 22:10:26.969526 master-0 kubenswrapper[9136]: I1203 22:10:26.969510 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="237bf861-d24e-4fd7-9aee-24b6a79cd6c2" containerName="installer" Dec 03 22:10:26.969684 master-0 kubenswrapper[9136]: I1203 22:10:26.969633 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="237bf861-d24e-4fd7-9aee-24b6a79cd6c2" containerName="installer" Dec 03 22:10:26.970314 master-0 kubenswrapper[9136]: I1203 22:10:26.970057 9136 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 03 22:10:26.970803 master-0 kubenswrapper[9136]: I1203 22:10:26.970698 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:26.971072 master-0 kubenswrapper[9136]: I1203 22:10:26.970997 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6c639bc69c380d96355863b219dedf37b52fea28072ae5913704ecc349c5d8c1" gracePeriod=15 Dec 03 22:10:26.971176 master-0 kubenswrapper[9136]: I1203 22:10:26.970943 9136 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" containerID="cri-o://70791961c08656ffa1017f6f966c1f2ba603d16610929ed1a69c881343d1bfec" gracePeriod=15 Dec 03 22:10:26.983820 master-0 kubenswrapper[9136]: I1203 22:10:26.981683 9136 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 22:10:26.983999 master-0 kubenswrapper[9136]: E1203 22:10:26.983844 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" Dec 03 22:10:26.983999 master-0 kubenswrapper[9136]: I1203 22:10:26.983884 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" Dec 03 22:10:26.983999 master-0 kubenswrapper[9136]: E1203 22:10:26.983935 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" Dec 03 22:10:26.983999 master-0 kubenswrapper[9136]: I1203 22:10:26.983949 9136 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" Dec 03 22:10:26.984133 master-0 kubenswrapper[9136]: E1203 22:10:26.984015 9136 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13238af3704fe583f617f61e755cf4c2" containerName="setup" Dec 03 22:10:26.984133 master-0 kubenswrapper[9136]: I1203 22:10:26.984029 9136 state_mem.go:107] "Deleted CPUSet assignment" podUID="13238af3704fe583f617f61e755cf4c2" containerName="setup" Dec 03 22:10:26.984877 master-0 kubenswrapper[9136]: I1203 22:10:26.984859 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" Dec 03 22:10:26.984965 master-0 kubenswrapper[9136]: I1203 22:10:26.984953 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" Dec 03 22:10:26.985029 master-0 kubenswrapper[9136]: I1203 22:10:26.985019 9136 memory_manager.go:354] "RemoveStaleState removing state" podUID="13238af3704fe583f617f61e755cf4c2" containerName="setup" Dec 03 22:10:26.989944 master-0 kubenswrapper[9136]: I1203 22:10:26.989913 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.042629 master-0 kubenswrapper[9136]: E1203 22:10:27.042494 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.066818 master-0 kubenswrapper[9136]: E1203 22:10:27.066440 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.117155 master-0 kubenswrapper[9136]: I1203 22:10:27.117090 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.117313 master-0 kubenswrapper[9136]: I1203 22:10:27.117182 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.117313 master-0 kubenswrapper[9136]: I1203 22:10:27.117255 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.117313 master-0 kubenswrapper[9136]: I1203 22:10:27.117287 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-log\") 
pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.117313 master-0 kubenswrapper[9136]: I1203 22:10:27.117309 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.117449 master-0 kubenswrapper[9136]: I1203 22:10:27.117331 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.117449 master-0 kubenswrapper[9136]: I1203 22:10:27.117358 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.117449 master-0 kubenswrapper[9136]: I1203 22:10:27.117391 9136 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.219621 master-0 kubenswrapper[9136]: I1203 22:10:27.219539 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.219644 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.219674 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.219711 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 
22:10:27.219812 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.219838 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.219874 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.219905 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.219884 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.219967 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.220013 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.220027 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.220084 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" 
(UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.220086 9136 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.220137 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.221154 master-0 kubenswrapper[9136]: I1203 22:10:27.220020 9136 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.344561 master-0 kubenswrapper[9136]: I1203 22:10:27.344354 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:27.368501 master-0 kubenswrapper[9136]: I1203 22:10:27.368373 9136 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:27.401929 master-0 kubenswrapper[9136]: W1203 22:10:27.401860 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf08d11be0e2919664ff2ea4b2440d0e0.slice/crio-f3df99b2206cbdec4d3c14a3658db30c624bd0d8a8285e540bec8d439783f20e WatchSource:0}: Error finding container f3df99b2206cbdec4d3c14a3658db30c624bd0d8a8285e540bec8d439783f20e: Status 404 returned error can't find the container with id f3df99b2206cbdec4d3c14a3658db30c624bd0d8a8285e540bec8d439783f20e Dec 03 22:10:27.411912 master-0 kubenswrapper[9136]: E1203 22:10:27.411747 9136 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.187dd417bc75744a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:f08d11be0e2919664ff2ea4b2440d0e0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:91cbda9693e888881e7c45cd6e504b91ba8a203fe0596237a4a17b3ca4e18eef\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 22:10:27.410252874 +0000 UTC m=+1233.685429276,LastTimestamp:2025-12-03 22:10:27.410252874 +0000 UTC m=+1233.685429276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Dec 03 22:10:27.419323 master-0 
kubenswrapper[9136]: W1203 22:10:27.419181 9136 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a00233b22d19df39b2e1c8ba133b3c2.slice/crio-6140791f4e5979c7be1bc34a3f290363c6ce9b4c0931d1ef70bec66f9fb74445 WatchSource:0}: Error finding container 6140791f4e5979c7be1bc34a3f290363c6ce9b4c0931d1ef70bec66f9fb74445: Status 404 returned error can't find the container with id 6140791f4e5979c7be1bc34a3f290363c6ce9b4c0931d1ef70bec66f9fb74445 Dec 03 22:10:27.463356 master-0 kubenswrapper[9136]: I1203 22:10:27.463310 9136 generic.go:334] "Generic (PLEG): container finished" podID="5c8c7291-3150-46a5-9d14-57a23bb51cc0" containerID="404a35f8de80b669ffc0285b3b2f8a18d00355d93f5f0568bb71370a69a93877" exitCode=0 Dec 03 22:10:27.463740 master-0 kubenswrapper[9136]: I1203 22:10:27.463433 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"5c8c7291-3150-46a5-9d14-57a23bb51cc0","Type":"ContainerDied","Data":"404a35f8de80b669ffc0285b3b2f8a18d00355d93f5f0568bb71370a69a93877"} Dec 03 22:10:27.465159 master-0 kubenswrapper[9136]: I1203 22:10:27.465106 9136 status_manager.go:851] "Failed to get status for pod" podUID="5c8c7291-3150-46a5-9d14-57a23bb51cc0" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:10:27.465745 master-0 kubenswrapper[9136]: I1203 22:10:27.465701 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f08d11be0e2919664ff2ea4b2440d0e0","Type":"ContainerStarted","Data":"f3df99b2206cbdec4d3c14a3658db30c624bd0d8a8285e540bec8d439783f20e"} Dec 03 22:10:27.468798 master-0 kubenswrapper[9136]: I1203 22:10:27.468719 9136 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="6c639bc69c380d96355863b219dedf37b52fea28072ae5913704ecc349c5d8c1" exitCode=0 Dec 03 22:10:27.470253 master-0 kubenswrapper[9136]: I1203 22:10:27.470214 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerStarted","Data":"6140791f4e5979c7be1bc34a3f290363c6ce9b4c0931d1ef70bec66f9fb74445"} Dec 03 22:10:27.763073 master-0 kubenswrapper[9136]: I1203 22:10:27.762987 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:27.763073 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:27.763073 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:27.763073 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:27.763502 master-0 kubenswrapper[9136]: I1203 22:10:27.763110 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:28.482060 master-0 kubenswrapper[9136]: I1203 22:10:28.481962 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f08d11be0e2919664ff2ea4b2440d0e0","Type":"ContainerStarted","Data":"42ac76745cd48697a9d60b7a3008b7dd9f6c94eb9ad7c1bc7b99f348cd44c91a"} Dec 03 22:10:28.483842 master-0 kubenswrapper[9136]: I1203 22:10:28.483670 9136 status_manager.go:851] "Failed to get status for pod" podUID="5c8c7291-3150-46a5-9d14-57a23bb51cc0" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:10:28.484058 master-0 kubenswrapper[9136]: E1203 22:10:28.483933 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:28.484740 master-0 kubenswrapper[9136]: I1203 22:10:28.484690 9136 generic.go:334] "Generic (PLEG): container finished" podID="8a00233b22d19df39b2e1c8ba133b3c2" containerID="d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402" exitCode=0 Dec 03 22:10:28.485083 master-0 kubenswrapper[9136]: I1203 22:10:28.485019 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerDied","Data":"d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402"} Dec 03 22:10:28.486306 master-0 kubenswrapper[9136]: I1203 22:10:28.486215 9136 status_manager.go:851] "Failed to get status for pod" podUID="5c8c7291-3150-46a5-9d14-57a23bb51cc0" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:10:28.486589 master-0 kubenswrapper[9136]: E1203 22:10:28.486537 9136 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:28.763624 master-0 kubenswrapper[9136]: I1203 22:10:28.763458 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:28.763624 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:28.763624 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:28.763624 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:28.763624 master-0 kubenswrapper[9136]: I1203 22:10:28.763548 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:28.886081 master-0 kubenswrapper[9136]: I1203 22:10:28.885991 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:28.888297 master-0 kubenswrapper[9136]: I1203 22:10:28.887500 9136 status_manager.go:851] "Failed to get status for pod" podUID="5c8c7291-3150-46a5-9d14-57a23bb51cc0" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:10:29.052419 master-0 kubenswrapper[9136]: I1203 22:10:29.050938 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") pod \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " Dec 03 22:10:29.052419 master-0 kubenswrapper[9136]: I1203 22:10:29.051070 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " Dec 03 22:10:29.052419 master-0 kubenswrapper[9136]: I1203 22:10:29.051276 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") pod \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " Dec 03 22:10:29.052419 master-0 kubenswrapper[9136]: I1203 22:10:29.051746 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c8c7291-3150-46a5-9d14-57a23bb51cc0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:29.052419 master-0 kubenswrapper[9136]: I1203 22:10:29.051822 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock" (OuterVolumeSpecName: "var-lock") pod "5c8c7291-3150-46a5-9d14-57a23bb51cc0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:29.068788 master-0 kubenswrapper[9136]: I1203 22:10:29.068685 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c8c7291-3150-46a5-9d14-57a23bb51cc0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:10:29.155016 master-0 kubenswrapper[9136]: I1203 22:10:29.154860 9136 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:29.155016 master-0 kubenswrapper[9136]: I1203 22:10:29.155027 9136 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:29.155285 master-0 kubenswrapper[9136]: I1203 22:10:29.155041 9136 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:29.494912 master-0 kubenswrapper[9136]: I1203 22:10:29.494729 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"5c8c7291-3150-46a5-9d14-57a23bb51cc0","Type":"ContainerDied","Data":"b00c511cb2afcbb4dd9070f0c3fa6c145e466d21cddbb067c88db9c3040e7fce"} Dec 03 22:10:29.494912 master-0 kubenswrapper[9136]: I1203 22:10:29.494787 9136 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b00c511cb2afcbb4dd9070f0c3fa6c145e466d21cddbb067c88db9c3040e7fce" Dec 03 22:10:29.494912 master-0 kubenswrapper[9136]: I1203 22:10:29.494884 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:29.498121 master-0 kubenswrapper[9136]: I1203 22:10:29.498064 9136 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="70791961c08656ffa1017f6f966c1f2ba603d16610929ed1a69c881343d1bfec" exitCode=0 Dec 03 22:10:29.502605 master-0 kubenswrapper[9136]: I1203 22:10:29.502519 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerStarted","Data":"55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35"} Dec 03 22:10:29.502883 master-0 kubenswrapper[9136]: I1203 22:10:29.502591 9136 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerStarted","Data":"0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b"} Dec 03 22:10:29.763693 master-0 kubenswrapper[9136]: I1203 22:10:29.763631 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:29.763693 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:29.763693 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:29.763693 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:29.763990 master-0 kubenswrapper[9136]: I1203 22:10:29.763713 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:29.796706 master-0 kubenswrapper[9136]: I1203 
22:10:29.796638 9136 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 22:10:29.936532 master-0 kubenswrapper[9136]: I1203 22:10:29.936477 9136 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 03 22:10:29.969336 master-0 kubenswrapper[9136]: I1203 22:10:29.969287 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 22:10:29.969468 master-0 kubenswrapper[9136]: I1203 22:10:29.969340 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 22:10:29.969468 master-0 kubenswrapper[9136]: I1203 22:10:29.969371 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 22:10:29.969564 master-0 kubenswrapper[9136]: I1203 22:10:29.969475 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 22:10:29.969564 master-0 kubenswrapper[9136]: I1203 22:10:29.969509 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 22:10:29.969564 master-0 kubenswrapper[9136]: I1203 22:10:29.969528 9136 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") pod \"13238af3704fe583f617f61e755cf4c2\" (UID: \"13238af3704fe583f617f61e755cf4c2\") " Dec 03 22:10:29.969966 master-0 kubenswrapper[9136]: I1203 22:10:29.969939 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:29.970049 master-0 kubenswrapper[9136]: I1203 22:10:29.969983 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs" (OuterVolumeSpecName: "logs") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:29.970049 master-0 kubenswrapper[9136]: I1203 22:10:29.970005 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets" (OuterVolumeSpecName: "secrets") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:29.970049 master-0 kubenswrapper[9136]: I1203 22:10:29.970024 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:29.970049 master-0 kubenswrapper[9136]: I1203 22:10:29.970043 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config" (OuterVolumeSpecName: "config") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:29.970235 master-0 kubenswrapper[9136]: I1203 22:10:29.970062 9136 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "13238af3704fe583f617f61e755cf4c2" (UID: "13238af3704fe583f617f61e755cf4c2"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:30.072207 master-0 kubenswrapper[9136]: I1203 22:10:30.072095 9136 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:30.072692 master-0 kubenswrapper[9136]: I1203 22:10:30.072663 9136 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-secrets\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:30.072756 master-0 kubenswrapper[9136]: I1203 22:10:30.072691 9136 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:30.072756 master-0 kubenswrapper[9136]: I1203 22:10:30.072705 9136 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:30.072756 master-0 kubenswrapper[9136]: I1203 22:10:30.072716 9136 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:30.072756 master-0 kubenswrapper[9136]: I1203 22:10:30.072729 9136 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/13238af3704fe583f617f61e755cf4c2-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:30.518489 master-0 kubenswrapper[9136]: I1203 22:10:30.518357 9136 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Dec 03 22:10:30.774878 master-0 kubenswrapper[9136]: I1203 22:10:30.769418 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:30.774878 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:30.774878 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:30.774878 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:30.774878 master-0 kubenswrapper[9136]: I1203 22:10:30.769483 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:31.762511 master-0 kubenswrapper[9136]: I1203 22:10:31.762433 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:31.762511 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:31.762511 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:31.762511 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:31.762511 master-0 kubenswrapper[9136]: I1203 22:10:31.762507 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:32.764169 master-0 kubenswrapper[9136]: I1203 22:10:32.764074 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:32.764169 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:32.764169 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:32.764169 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:32.765375 master-0 kubenswrapper[9136]: I1203 22:10:32.765310 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:33.763856 master-0 kubenswrapper[9136]: I1203 22:10:33.763800 9136 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:33.763856 master-0 kubenswrapper[9136]: [-]has-synced failed: reason withheld Dec 03 22:10:33.763856 master-0 kubenswrapper[9136]: [+]process-running ok Dec 03 22:10:33.763856 master-0 kubenswrapper[9136]: healthz check failed Dec 03 22:10:33.764214 master-0 kubenswrapper[9136]: I1203 22:10:33.763859 9136 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:34.646152 master-0 systemd[1]: Stopping Kubernetes Kubelet... Dec 03 22:10:34.668244 master-0 systemd[1]: kubelet.service: Deactivated successfully. Dec 03 22:10:34.668730 master-0 systemd[1]: Stopped Kubernetes Kubelet. Dec 03 22:10:34.674725 master-0 systemd[1]: kubelet.service: Consumed 3min 18.842s CPU time. Dec 03 22:10:34.703314 master-0 systemd[1]: Starting Kubernetes Kubelet... Dec 03 22:10:34.859220 master-0 kubenswrapper[36504]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 22:10:34.859220 master-0 kubenswrapper[36504]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 22:10:34.859220 master-0 kubenswrapper[36504]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 22:10:34.859220 master-0 kubenswrapper[36504]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 22:10:34.859220 master-0 kubenswrapper[36504]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 22:10:34.859220 master-0 kubenswrapper[36504]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 22:10:34.860501 master-0 kubenswrapper[36504]: I1203 22:10:34.859541 36504 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 22:10:34.867950 master-0 kubenswrapper[36504]: W1203 22:10:34.867903 36504 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 22:10:34.868131 master-0 kubenswrapper[36504]: W1203 22:10:34.868110 36504 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 22:10:34.868251 master-0 kubenswrapper[36504]: W1203 22:10:34.868233 36504 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 22:10:34.868360 master-0 kubenswrapper[36504]: W1203 22:10:34.868342 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 22:10:34.868467 master-0 kubenswrapper[36504]: W1203 22:10:34.868450 36504 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 22:10:34.868574 master-0 kubenswrapper[36504]: W1203 22:10:34.868557 36504 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 22:10:34.868738 master-0 kubenswrapper[36504]: W1203 22:10:34.868715 36504 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 22:10:34.868930 master-0 kubenswrapper[36504]: W1203 22:10:34.868906 36504 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 22:10:34.869050 master-0 kubenswrapper[36504]: W1203 22:10:34.869032 36504 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 22:10:34.869163 master-0 kubenswrapper[36504]: W1203 22:10:34.869145 36504 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 22:10:34.869294 master-0 kubenswrapper[36504]: W1203 22:10:34.869270 36504 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 22:10:34.869459 master-0 kubenswrapper[36504]: W1203 22:10:34.869428 36504 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 22:10:34.869598 master-0 kubenswrapper[36504]: W1203 22:10:34.869577 36504 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 22:10:34.869739 master-0 kubenswrapper[36504]: W1203 22:10:34.869720 36504 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 22:10:34.869911 master-0 kubenswrapper[36504]: W1203 22:10:34.869891 36504 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 22:10:34.870026 master-0 kubenswrapper[36504]: W1203 22:10:34.870008 36504 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 22:10:34.870134 master-0 kubenswrapper[36504]: W1203 22:10:34.870117 36504 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 22:10:34.870241 master-0 kubenswrapper[36504]: W1203 22:10:34.870224 36504 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 22:10:34.870347 master-0 kubenswrapper[36504]: W1203 22:10:34.870330 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 22:10:34.870471 master-0 kubenswrapper[36504]: W1203 22:10:34.870454 36504 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 22:10:34.870592 master-0 kubenswrapper[36504]: W1203 22:10:34.870574 36504 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 22:10:34.870702 master-0 kubenswrapper[36504]: W1203 22:10:34.870684 36504 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 22:10:34.870869 master-0 kubenswrapper[36504]: W1203 22:10:34.870846 36504 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 22:10:34.871022 master-0 kubenswrapper[36504]: W1203 22:10:34.871000 36504 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 22:10:34.871132 master-0 kubenswrapper[36504]: W1203 22:10:34.871115 36504 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 22:10:34.871240 master-0 kubenswrapper[36504]: W1203 22:10:34.871222 36504 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 22:10:34.871345 master-0 kubenswrapper[36504]: W1203 22:10:34.871329 36504 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 22:10:34.871458 master-0 kubenswrapper[36504]: W1203 22:10:34.871441 36504 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 22:10:34.871564 master-0 kubenswrapper[36504]: W1203 22:10:34.871548 36504 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 22:10:34.871675 master-0 kubenswrapper[36504]: W1203 22:10:34.871658 36504 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 22:10:34.871809 master-0 kubenswrapper[36504]: W1203 22:10:34.871763 36504 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 22:10:34.871931 master-0 kubenswrapper[36504]: W1203 22:10:34.871913 36504 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 22:10:34.872039 master-0 kubenswrapper[36504]: W1203 22:10:34.872023 36504 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 22:10:34.872153 master-0 kubenswrapper[36504]: W1203 22:10:34.872137 36504 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 22:10:34.872260 master-0 kubenswrapper[36504]: W1203 22:10:34.872242 36504 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 22:10:34.872373 master-0 kubenswrapper[36504]: W1203 22:10:34.872355 36504 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 22:10:34.872485 master-0 kubenswrapper[36504]: W1203 22:10:34.872467 36504 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 22:10:34.872643 master-0 kubenswrapper[36504]: W1203 22:10:34.872583 36504 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 22:10:34.872792 master-0 kubenswrapper[36504]: W1203 22:10:34.872750 36504 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 22:10:34.872919 master-0 kubenswrapper[36504]: W1203 22:10:34.872900 36504 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 22:10:34.873025 master-0 kubenswrapper[36504]: W1203 22:10:34.873007 36504 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 22:10:34.873139 master-0 kubenswrapper[36504]: W1203 22:10:34.873122 36504 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 22:10:34.873255 master-0 kubenswrapper[36504]: W1203 22:10:34.873238 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 22:10:34.873387 master-0 kubenswrapper[36504]: W1203 22:10:34.873363 36504 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 22:10:34.873550 master-0 kubenswrapper[36504]: W1203 22:10:34.873526 36504 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 22:10:34.873745 master-0 kubenswrapper[36504]: W1203 22:10:34.873726 36504 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 22:10:34.873897 master-0 kubenswrapper[36504]: W1203 22:10:34.873878 36504 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 22:10:34.874013 master-0 kubenswrapper[36504]: W1203 22:10:34.873996 36504 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 22:10:34.874123 master-0 kubenswrapper[36504]: W1203 22:10:34.874106 36504 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 22:10:34.874230 master-0 kubenswrapper[36504]: W1203 22:10:34.874213 36504 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 22:10:34.874334 master-0 kubenswrapper[36504]: W1203 22:10:34.874317 36504 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 22:10:34.874454 master-0 kubenswrapper[36504]: W1203 22:10:34.874437 36504 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 22:10:34.874569 master-0 kubenswrapper[36504]: W1203 22:10:34.874549 36504 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 22:10:34.874678 master-0 kubenswrapper[36504]: W1203 22:10:34.874661 36504 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 22:10:34.874821 master-0 kubenswrapper[36504]: W1203 22:10:34.874796 36504 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 22:10:34.875000 master-0 kubenswrapper[36504]: W1203 22:10:34.874974 36504 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
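Most of the startup noise is the feature_gate.go:330 "unrecognized feature gate" warning: names such as GCPLabelsTags or RouteAdvertisements appear to be cluster-level (OpenShift) gates that the kubelet's upstream gate registry does not know, so it warns for each one on every pass over the gate map, while the feature_gate.go:353 and :351 lines cover gates it does recognize but that are GA or deprecated (ValidatingAdmissionPolicy, CloudDualStackNodeIPs, KMSv1). A minimal sketch, under the same kubelet.log assumption, that collapses the repeated warnings into one count per gate:

import re
from collections import Counter

# Minimal sketch: the kubelet walks its gate map several times during
# startup, so every unknown gate is warned about repeatedly; counting the
# occurrences reduces the noise to one line per gate.
# Assumes the excerpt is saved as "kubelet.log" (hypothetical).
UNRECOGNIZED = re.compile(r"unrecognized feature gate: (\w+)")

def unknown_gates(path: str = "kubelet.log") -> Counter:
    counts: Counter = Counter()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            match = UNRECOGNIZED.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts

if __name__ == "__main__":
    for gate, seen in sorted(unknown_gates().items()):
        print(f"{gate}: {seen}x")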
Dec 03 22:10:34.875171 master-0 kubenswrapper[36504]: W1203 22:10:34.875143 36504 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 22:10:34.875333 master-0 kubenswrapper[36504]: W1203 22:10:34.875307 36504 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 22:10:34.875468 master-0 kubenswrapper[36504]: W1203 22:10:34.875450 36504 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 22:10:34.875585 master-0 kubenswrapper[36504]: W1203 22:10:34.875567 36504 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 22:10:34.875717 master-0 kubenswrapper[36504]: W1203 22:10:34.875693 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 22:10:34.875971 master-0 kubenswrapper[36504]: W1203 22:10:34.875944 36504 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 22:10:34.876153 master-0 kubenswrapper[36504]: W1203 22:10:34.876128 36504 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 22:10:34.876280 master-0 kubenswrapper[36504]: W1203 22:10:34.876262 36504 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 22:10:34.876389 master-0 kubenswrapper[36504]: W1203 22:10:34.876372 36504 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 22:10:34.876659 master-0 kubenswrapper[36504]: W1203 22:10:34.876639 36504 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 22:10:34.876804 master-0 kubenswrapper[36504]: W1203 22:10:34.876760 36504 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 22:10:34.876929 master-0 kubenswrapper[36504]: W1203 22:10:34.876911 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 22:10:34.877038 master-0 kubenswrapper[36504]: W1203 22:10:34.877020 36504 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 22:10:34.877152 master-0 kubenswrapper[36504]: W1203 22:10:34.877134 36504 feature_gate.go:330] unrecognized feature gate: Example Dec 03 22:10:34.877287 master-0 kubenswrapper[36504]: W1203 22:10:34.877264 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 22:10:34.877419 master-0 kubenswrapper[36504]: W1203 22:10:34.877399 36504 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 22:10:34.877756 master-0 kubenswrapper[36504]: I1203 22:10:34.877725 36504 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 22:10:34.877942 master-0 kubenswrapper[36504]: I1203 22:10:34.877914 36504 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 22:10:34.878116 master-0 kubenswrapper[36504]: I1203 22:10:34.878041 36504 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 22:10:34.878308 master-0 kubenswrapper[36504]: I1203 22:10:34.878282 36504 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 22:10:34.878460 master-0 kubenswrapper[36504]: I1203 22:10:34.878433 36504 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 22:10:34.878624 master-0 kubenswrapper[36504]: I1203 22:10:34.878596 36504 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 22:10:34.878750 master-0 kubenswrapper[36504]: I1203 22:10:34.878727 36504 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 22:10:34.878964 master-0 kubenswrapper[36504]: I1203 22:10:34.878943 36504 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 22:10:34.879082 master-0 
kubenswrapper[36504]: I1203 22:10:34.879063 36504 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 22:10:34.879196 master-0 kubenswrapper[36504]: I1203 22:10:34.879176 36504 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 22:10:34.879314 master-0 kubenswrapper[36504]: I1203 22:10:34.879293 36504 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 22:10:34.879425 master-0 kubenswrapper[36504]: I1203 22:10:34.879404 36504 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 22:10:34.879533 master-0 kubenswrapper[36504]: I1203 22:10:34.879515 36504 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 22:10:34.879642 master-0 kubenswrapper[36504]: I1203 22:10:34.879624 36504 flags.go:64] FLAG: --cgroup-root="" Dec 03 22:10:34.879761 master-0 kubenswrapper[36504]: I1203 22:10:34.879742 36504 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 22:10:34.879916 master-0 kubenswrapper[36504]: I1203 22:10:34.879894 36504 flags.go:64] FLAG: --client-ca-file="" Dec 03 22:10:34.880020 master-0 kubenswrapper[36504]: I1203 22:10:34.880001 36504 flags.go:64] FLAG: --cloud-config="" Dec 03 22:10:34.880128 master-0 kubenswrapper[36504]: I1203 22:10:34.880110 36504 flags.go:64] FLAG: --cloud-provider="" Dec 03 22:10:34.880240 master-0 kubenswrapper[36504]: I1203 22:10:34.880216 36504 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 22:10:34.880348 master-0 kubenswrapper[36504]: I1203 22:10:34.880331 36504 flags.go:64] FLAG: --cluster-domain="" Dec 03 22:10:34.880455 master-0 kubenswrapper[36504]: I1203 22:10:34.880437 36504 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 22:10:34.880576 master-0 kubenswrapper[36504]: I1203 22:10:34.880554 36504 flags.go:64] FLAG: --config-dir="" Dec 03 22:10:34.880757 master-0 kubenswrapper[36504]: I1203 22:10:34.880729 36504 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 22:10:34.880980 master-0 kubenswrapper[36504]: I1203 22:10:34.880947 36504 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 22:10:34.881145 master-0 kubenswrapper[36504]: I1203 22:10:34.881116 36504 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 03 22:10:34.881305 master-0 kubenswrapper[36504]: I1203 22:10:34.881278 36504 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 22:10:34.881526 master-0 kubenswrapper[36504]: I1203 22:10:34.881495 36504 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 22:10:34.881709 master-0 kubenswrapper[36504]: I1203 22:10:34.881680 36504 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 22:10:34.881932 master-0 kubenswrapper[36504]: I1203 22:10:34.881899 36504 flags.go:64] FLAG: --contention-profiling="false" Dec 03 22:10:34.882089 master-0 kubenswrapper[36504]: I1203 22:10:34.882064 36504 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 22:10:34.882245 master-0 kubenswrapper[36504]: I1203 22:10:34.882217 36504 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 03 22:10:34.882394 master-0 kubenswrapper[36504]: I1203 22:10:34.882370 36504 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 22:10:34.882561 master-0 kubenswrapper[36504]: I1203 22:10:34.882526 36504 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882797 36504 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882827 36504 flags.go:64] FLAG: 
--enable-controller-attach-detach="true" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882839 36504 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882851 36504 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882864 36504 flags.go:64] FLAG: --enable-server="true" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882876 36504 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882892 36504 flags.go:64] FLAG: --event-burst="100" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882905 36504 flags.go:64] FLAG: --event-qps="50" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882918 36504 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882930 36504 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882942 36504 flags.go:64] FLAG: --eviction-hard="" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882958 36504 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882970 36504 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882986 36504 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.882999 36504 flags.go:64] FLAG: --eviction-soft="" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883011 36504 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883023 36504 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883035 36504 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883046 36504 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883058 36504 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883071 36504 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883083 36504 flags.go:64] FLAG: --feature-gates="" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883114 36504 flags.go:64] FLAG: --file-check-frequency="20s" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883126 36504 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 03 22:10:34.884935 master-0 kubenswrapper[36504]: I1203 22:10:34.883139 36504 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883151 36504 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883163 36504 flags.go:64] FLAG: --healthz-port="10248" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883175 36504 flags.go:64] FLAG: --help="false" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883187 36504 flags.go:64] FLAG: 
--hostname-override="" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883199 36504 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883211 36504 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883223 36504 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883234 36504 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883247 36504 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883259 36504 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883270 36504 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883282 36504 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883293 36504 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883306 36504 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883319 36504 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883331 36504 flags.go:64] FLAG: --kube-reserved="" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883342 36504 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883352 36504 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883364 36504 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883376 36504 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883388 36504 flags.go:64] FLAG: --lock-file="" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883399 36504 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883410 36504 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883422 36504 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883443 36504 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 22:10:34.886617 master-0 kubenswrapper[36504]: I1203 22:10:34.883455 36504 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883467 36504 flags.go:64] FLAG: --log-text-split-stream="false" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883478 36504 flags.go:64] FLAG: --logging-format="text" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883490 36504 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883502 36504 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 
22:10:34.883514 36504 flags.go:64] FLAG: --manifest-url="" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883526 36504 flags.go:64] FLAG: --manifest-url-header="" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883542 36504 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883554 36504 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883570 36504 flags.go:64] FLAG: --max-pods="110" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883584 36504 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883597 36504 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883608 36504 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883621 36504 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883633 36504 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883646 36504 flags.go:64] FLAG: --node-ip="192.168.32.10" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883659 36504 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883690 36504 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883702 36504 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883714 36504 flags.go:64] FLAG: --oom-score-adj="-999" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883726 36504 flags.go:64] FLAG: --pod-cidr="" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883738 36504 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fff930cf757e23d388d86d05942b76e44d3bda5e387b299c239e4d12545d26dd" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883758 36504 flags.go:64] FLAG: --pod-manifest-path="" Dec 03 22:10:34.888174 master-0 kubenswrapper[36504]: I1203 22:10:34.883841 36504 flags.go:64] FLAG: --pod-max-pids="-1" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883858 36504 flags.go:64] FLAG: --pods-per-core="0" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883870 36504 flags.go:64] FLAG: --port="10250" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883883 36504 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883895 36504 flags.go:64] FLAG: --provider-id="" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883906 36504 flags.go:64] FLAG: --qos-reserved="" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883926 36504 flags.go:64] FLAG: --read-only-port="10255" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883938 36504 flags.go:64] FLAG: --register-node="true" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883950 36504 flags.go:64] 
FLAG: --register-schedulable="true" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883961 36504 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883982 36504 flags.go:64] FLAG: --registry-burst="10" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.883993 36504 flags.go:64] FLAG: --registry-qps="5" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884009 36504 flags.go:64] FLAG: --reserved-cpus="" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884024 36504 flags.go:64] FLAG: --reserved-memory="" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884053 36504 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884071 36504 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884082 36504 flags.go:64] FLAG: --rotate-certificates="false" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884094 36504 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884106 36504 flags.go:64] FLAG: --runonce="false" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884118 36504 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884130 36504 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884142 36504 flags.go:64] FLAG: --seccomp-default="false" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884153 36504 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884164 36504 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884177 36504 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884189 36504 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 03 22:10:34.889542 master-0 kubenswrapper[36504]: I1203 22:10:34.884201 36504 flags.go:64] FLAG: --storage-driver-password="root" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884213 36504 flags.go:64] FLAG: --storage-driver-secure="false" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884223 36504 flags.go:64] FLAG: --storage-driver-table="stats" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884233 36504 flags.go:64] FLAG: --storage-driver-user="root" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884242 36504 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884251 36504 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884260 36504 flags.go:64] FLAG: --system-cgroups="" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884270 36504 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884285 36504 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 03 22:10:34.890973 master-0 
kubenswrapper[36504]: I1203 22:10:34.884294 36504 flags.go:64] FLAG: --tls-cert-file="" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884303 36504 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884316 36504 flags.go:64] FLAG: --tls-min-version="" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884328 36504 flags.go:64] FLAG: --tls-private-key-file="" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884337 36504 flags.go:64] FLAG: --topology-manager-policy="none" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884346 36504 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884355 36504 flags.go:64] FLAG: --topology-manager-scope="container" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884364 36504 flags.go:64] FLAG: --v="2" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884377 36504 flags.go:64] FLAG: --version="false" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884393 36504 flags.go:64] FLAG: --vmodule="" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884407 36504 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: I1203 22:10:34.884420 36504 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: W1203 22:10:34.884703 36504 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: W1203 22:10:34.884720 36504 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: W1203 22:10:34.884728 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 22:10:34.890973 master-0 kubenswrapper[36504]: W1203 22:10:34.884737 36504 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884745 36504 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884753 36504 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884760 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884808 36504 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884819 36504 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884827 36504 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884835 36504 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884842 36504 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884853 36504 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884864 36504 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884873 36504 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884882 36504 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884891 36504 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884901 36504 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884967 36504 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884982 36504 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.884994 36504 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.885003 36504 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 22:10:34.892307 master-0 kubenswrapper[36504]: W1203 22:10:34.885012 36504 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885023 36504 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885032 36504 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885040 36504 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885049 36504 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885058 36504 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885068 36504 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885078 36504 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885086 36504 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885095 36504 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885110 36504 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885120 36504 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885131 36504 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885142 36504 feature_gate.go:330] 
unrecognized feature gate: BareMetalLoadBalancer Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885152 36504 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885164 36504 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885175 36504 feature_gate.go:330] unrecognized feature gate: Example Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885187 36504 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885195 36504 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885204 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 22:10:34.893497 master-0 kubenswrapper[36504]: W1203 22:10:34.885213 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885220 36504 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885248 36504 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885284 36504 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885295 36504 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885307 36504 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885317 36504 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885326 36504 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885335 36504 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885343 36504 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885353 36504 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885363 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885374 36504 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885385 36504 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885399 36504 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885411 36504 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885422 36504 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885433 36504 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885447 36504 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 22:10:34.894589 master-0 kubenswrapper[36504]: W1203 22:10:34.885460 36504 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885471 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885482 36504 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885496 36504 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885505 36504 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885513 36504 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885521 36504 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885531 36504 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885542 36504 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885552 36504 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.885563 36504 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: I1203 22:10:34.885599 36504 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: I1203 22:10:34.894052 36504 server.go:491] "Kubelet version" kubeletVersion="v1.31.13" Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: I1203 22:10:34.894094 36504 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.894228 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 22:10:34.895721 master-0 kubenswrapper[36504]: W1203 22:10:34.894241 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack 
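After the per-gate warnings the kubelet prints the gate set it actually applied as a single "feature gates: {map[...]}" line (feature_gate.go:386), repeated each time the gates are re-applied. A minimal sketch that turns that summary into a Python dict; the summary string below reproduces only a few of the entries from the line above, purely for brevity:

import re

# Minimal sketch: parse the kubelet's "feature gates: {map[...]}" summary
# (feature_gate.go:386) into a dict of gate name -> bool. Only a subset of
# the gates from the log line above is reproduced here.
summary = (
    "feature gates: {map[CloudDualStackNodeIPs:true "
    "DisableKubeletCloudCredentialProviders:true KMSv1:true "
    "ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}"
)

effective = {
    name: value == "true"
    for name, value in re.findall(r"(\w+):(true|false)", summary)
}
print(effective)
# e.g. {'CloudDualStackNodeIPs': True, ..., 'VolumeAttributesClass': False}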
Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894251 36504 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894259 36504 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894267 36504 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894278 36504 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894291 36504 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894299 36504 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894308 36504 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894316 36504 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894324 36504 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894332 36504 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894340 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894348 36504 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894356 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894364 36504 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894372 36504 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894380 36504 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894388 36504 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 22:10:34.896618 master-0 kubenswrapper[36504]: W1203 22:10:34.894396 36504 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894406 36504 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894415 36504 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894424 36504 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894434 36504 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894442 36504 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894451 36504 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894459 36504 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894468 36504 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894476 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894484 36504 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894495 36504 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894504 36504 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894514 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894522 36504 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894532 36504 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894540 36504 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894548 36504 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894556 36504 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 22:10:34.897586 master-0 kubenswrapper[36504]: W1203 22:10:34.894563 36504 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894571 36504 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894579 36504 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894587 36504 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894596 36504 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894605 36504 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 22:10:34.898617 master-0 
kubenswrapper[36504]: W1203 22:10:34.894613 36504 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894621 36504 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894629 36504 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894637 36504 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894645 36504 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894653 36504 feature_gate.go:330] unrecognized feature gate: Example Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894661 36504 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894669 36504 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894677 36504 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894684 36504 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894693 36504 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894701 36504 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894711 36504 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894721 36504 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 22:10:34.898617 master-0 kubenswrapper[36504]: W1203 22:10:34.894731 36504 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894743 36504 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894753 36504 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894763 36504 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894803 36504 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894814 36504 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894824 36504 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894839 36504 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894851 36504 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894859 36504 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894867 36504 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894875 36504 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894896 36504 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.894904 36504 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: I1203 22:10:34.894917 36504 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 22:10:34.899699 master-0 kubenswrapper[36504]: W1203 22:10:34.895143 36504 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895158 36504 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895168 36504 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895177 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895186 36504 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895194 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895202 36504 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895211 36504 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895220 36504 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895228 36504 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895236 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895244 36504 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895252 36504 feature_gate.go:330] unrecognized 
feature gate: ManagedBootImages Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895260 36504 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895268 36504 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895276 36504 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895284 36504 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895292 36504 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895300 36504 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895308 36504 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 22:10:34.900590 master-0 kubenswrapper[36504]: W1203 22:10:34.895319 36504 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895329 36504 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895338 36504 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895347 36504 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895356 36504 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895365 36504 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895375 36504 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895384 36504 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895393 36504 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895401 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895409 36504 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895417 36504 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895425 36504 feature_gate.go:330] unrecognized feature gate: Example Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895434 36504 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895444 36504 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895454 36504 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 
03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895464 36504 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895474 36504 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895484 36504 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895494 36504 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 22:10:34.901667 master-0 kubenswrapper[36504]: W1203 22:10:34.895504 36504 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895517 36504 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895529 36504 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895538 36504 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895546 36504 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895554 36504 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895565 36504 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895574 36504 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895583 36504 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895591 36504 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895600 36504 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895609 36504 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895617 36504 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895625 36504 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895634 36504 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895641 36504 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895649 36504 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895657 36504 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895666 36504 feature_gate.go:330] unrecognized feature gate: 
VSphereDriverConfiguration Dec 03 22:10:34.902750 master-0 kubenswrapper[36504]: W1203 22:10:34.895674 36504 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895682 36504 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895689 36504 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895697 36504 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895705 36504 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895712 36504 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895721 36504 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895728 36504 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895736 36504 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895744 36504 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895753 36504 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895761 36504 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: W1203 22:10:34.895794 36504 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: I1203 22:10:34.895806 36504 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:false StreamingCollectionEncodingToProtobuf:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: I1203 22:10:34.896049 36504 server.go:940] "Client rotation is on, will bootstrap in background" Dec 03 22:10:34.904221 master-0 kubenswrapper[36504]: I1203 22:10:34.898878 36504 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 03 22:10:34.906896 master-0 kubenswrapper[36504]: I1203 22:10:34.899027 36504 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 03 22:10:34.906896 master-0 kubenswrapper[36504]: I1203 22:10:34.899431 36504 server.go:997] "Starting client certificate rotation" Dec 03 22:10:34.906896 master-0 kubenswrapper[36504]: I1203 22:10:34.899451 36504 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 03 22:10:34.906896 master-0 kubenswrapper[36504]: I1203 22:10:34.899691 36504 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2025-12-04 21:39:22 +0000 UTC, rotation deadline is 2025-12-04 18:59:00.729903036 +0000 UTC Dec 03 22:10:34.906896 master-0 kubenswrapper[36504]: I1203 22:10:34.899836 36504 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h48m25.830070983s for next certificate rotation Dec 03 22:10:34.906896 master-0 kubenswrapper[36504]: I1203 22:10:34.900708 36504 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 22:10:34.906896 master-0 kubenswrapper[36504]: I1203 22:10:34.903144 36504 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 22:10:34.906896 master-0 kubenswrapper[36504]: I1203 22:10:34.906165 36504 log.go:25] "Validated CRI v1 runtime API" Dec 03 22:10:34.914292 master-0 kubenswrapper[36504]: I1203 22:10:34.914231 36504 log.go:25] "Validated CRI v1 image API" Dec 03 22:10:34.915578 master-0 kubenswrapper[36504]: I1203 22:10:34.915515 36504 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 03 22:10:34.934744 master-0 kubenswrapper[36504]: I1203 22:10:34.934614 36504 fs.go:135] Filesystem UUIDs: map[3c671a63-22b6-47f8-bf0c-b9acbe18afb0:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Dec 03 22:10:34.937035 master-0 kubenswrapper[36504]: I1203 22:10:34.934708 36504 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/01271153cce146a7d369c7e83af60def6d65f55d525d13eb3b7d571a58f0b254/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/01271153cce146a7d369c7e83af60def6d65f55d525d13eb3b7d571a58f0b254/userdata/shm major:0 minor:1619 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/01563ec1e9dc14e44ec2552850bf0825026fe0a03259d8a4f99d746be494b365/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/01563ec1e9dc14e44ec2552850bf0825026fe0a03259d8a4f99d746be494b365/userdata/shm major:0 minor:1554 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/09da87e2dab0559242037544c26bbf09555449a73117b70108fdb02f60b3cce2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/09da87e2dab0559242037544c26bbf09555449a73117b70108fdb02f60b3cce2/userdata/shm major:0 minor:960 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688/userdata/shm major:0 minor:335 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/0d649d9574e5890c562a660d0fd6b1f641cfe7fa0750549ecdfb69ce6e896311/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0d649d9574e5890c562a660d0fd6b1f641cfe7fa0750549ecdfb69ce6e896311/userdata/shm major:0 minor:324 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977/userdata/shm major:0 minor:165 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1775a3bb94266b5fadb3db6f198c4d7c599f8a235c13e9243ec7e800b61882bd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1775a3bb94266b5fadb3db6f198c4d7c599f8a235c13e9243ec7e800b61882bd/userdata/shm major:0 minor:743 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/19084c46f9e406b8c5bea90a33fad5659fbc69d12b59150d714d618ccc77cc8a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/19084c46f9e406b8c5bea90a33fad5659fbc69d12b59150d714d618ccc77cc8a/userdata/shm major:0 minor:319 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/23abcfb0261fbbab58bbff5305c7ff7c6aa8a2b495f632704ea4c9372088e68e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/23abcfb0261fbbab58bbff5305c7ff7c6aa8a2b495f632704ea4c9372088e68e/userdata/shm major:0 minor:93 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/263e27d05b8311eaef7fd597428646a031a70345e2468d3a197a3e76a71409ad/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/263e27d05b8311eaef7fd597428646a031a70345e2468d3a197a3e76a71409ad/userdata/shm major:0 minor:61 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/27cf3d4301968602620f0710474b0cd1874a47ae80ca26e646bde5b1b38a2e9d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/27cf3d4301968602620f0710474b0cd1874a47ae80ca26e646bde5b1b38a2e9d/userdata/shm major:0 minor:67 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/27e1c0f49ae3c796e84e3254dcdc228d57bf21685fbfb51038f19f9e52199432/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/27e1c0f49ae3c796e84e3254dcdc228d57bf21685fbfb51038f19f9e52199432/userdata/shm major:0 minor:1431 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2ae62ce3b48419372763c717979fe81f4210258cf21652aeea2900df29d7ef00/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2ae62ce3b48419372763c717979fe81f4210258cf21652aeea2900df29d7ef00/userdata/shm major:0 minor:580 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2e64dfdf0acb39b47217a06b11c64af7640d431c0b9e7c2bcb9ff0068e0337e8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2e64dfdf0acb39b47217a06b11c64af7640d431c0b9e7c2bcb9ff0068e0337e8/userdata/shm major:0 minor:1357 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b/userdata/shm major:0 minor:322 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/302f177926043f4c13db47cefcd709d813569fccfeda81cb1d3c813d238c6d9a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/302f177926043f4c13db47cefcd709d813569fccfeda81cb1d3c813d238c6d9a/userdata/shm major:0 minor:1347 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/33442f434ca7c4c52187f1b4d1183975995d4960e8685cc35c4e0acc9b058c70/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/33442f434ca7c4c52187f1b4d1183975995d4960e8685cc35c4e0acc9b058c70/userdata/shm major:0 minor:468 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3446e10d150ba6b0a0d4376dd59be02274fd256ae9b67434db5d0c00d0a96a36/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3446e10d150ba6b0a0d4376dd59be02274fd256ae9b67434db5d0c00d0a96a36/userdata/shm major:0 minor:1471 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/39570e307c085c431159acfdbb22b12b2a8620889b6ed8e3fd6e335c0088192a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/39570e307c085c431159acfdbb22b12b2a8620889b6ed8e3fd6e335c0088192a/userdata/shm major:0 minor:526 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3bea9a551a545f4989f29c1018afc27685829e9f353eed2bd20ae41b3e512f7f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3bea9a551a545f4989f29c1018afc27685829e9f353eed2bd20ae41b3e512f7f/userdata/shm major:0 minor:1355 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3cbd73f86ab3e16fe0a3db2c1ea58704181c81848bd40dec7b7bf4aa0ba51d20/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3cbd73f86ab3e16fe0a3db2c1ea58704181c81848bd40dec7b7bf4aa0ba51d20/userdata/shm major:0 minor:1487 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3d9073cce7422ee3531e5bcb005037bb8c3536325d2ddba91234eef2010050ed/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3d9073cce7422ee3531e5bcb005037bb8c3536325d2ddba91234eef2010050ed/userdata/shm major:0 minor:578 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/42c8513568caf77d747657eb0bb6e5e7f16c48cd244adc21c473c785507cff41/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/42c8513568caf77d747657eb0bb6e5e7f16c48cd244adc21c473c785507cff41/userdata/shm major:0 minor:1107 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4b5dd9433f686e30d70877cae31ac505a20b05bd1de2322e270f022f7fa31aa9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4b5dd9433f686e30d70877cae31ac505a20b05bd1de2322e270f022f7fa31aa9/userdata/shm major:0 minor:1017 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4ffb33ed0c17e44b8ef16a80bc6cabdd86e0fa10c888372300239522e481a90e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4ffb33ed0c17e44b8ef16a80bc6cabdd86e0fa10c888372300239522e481a90e/userdata/shm major:0 minor:1014 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9/userdata/shm major:0 minor:173 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/51af5c76c79fbb742ccd27b8ec8e1df376daaccc120618135f6c44b3a44f3b5f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/51af5c76c79fbb742ccd27b8ec8e1df376daaccc120618135f6c44b3a44f3b5f/userdata/shm major:0 minor:1361 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/5225581315ea5da37a467d8053b6d7cca0c7b3fadc88ca04e9c3e4145300fad4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5225581315ea5da37a467d8053b6d7cca0c7b3fadc88ca04e9c3e4145300fad4/userdata/shm major:0 minor:930 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/54de27477181cce7acb3d618802b3284f638cd785e8880f9dfe2503277e5076e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/54de27477181cce7acb3d618802b3284f638cd785e8880f9dfe2503277e5076e/userdata/shm major:0 minor:737 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5703a6e2ab5e982bbe6de5731742c9b48a97f1ee525f754cfdb8e3ab5ff893fc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5703a6e2ab5e982bbe6de5731742c9b48a97f1ee525f754cfdb8e3ab5ff893fc/userdata/shm major:0 minor:994 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5a958a4641ec8ffd3efa6fee5b0f524a6a79e473a45d1d057045bc0b833ac5d9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5a958a4641ec8ffd3efa6fee5b0f524a6a79e473a45d1d057045bc0b833ac5d9/userdata/shm major:0 minor:111 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5b457e4b5344867e1c07054947dfb70a261a415f7e6f2a317dd28940b878f21e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5b457e4b5344867e1c07054947dfb70a261a415f7e6f2a317dd28940b878f21e/userdata/shm major:0 minor:996 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5c77db9cd763bf167454cf085e8ce314f6fbd68437892b13394158027937a983/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5c77db9cd763bf167454cf085e8ce314f6fbd68437892b13394158027937a983/userdata/shm major:0 minor:1125 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8/userdata/shm major:0 minor:359 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5db32eafb49c5ea2131a9ee6fac6a97f01a676cfb60265016c5dae78b38833fb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5db32eafb49c5ea2131a9ee6fac6a97f01a676cfb60265016c5dae78b38833fb/userdata/shm major:0 minor:741 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6140791f4e5979c7be1bc34a3f290363c6ce9b4c0931d1ef70bec66f9fb74445/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6140791f4e5979c7be1bc34a3f290363c6ce9b4c0931d1ef70bec66f9fb74445/userdata/shm major:0 minor:107 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/66e225cef033a22130cb863563d5cf26aea983f2cbfcbc0547f78c06ef7d3704/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/66e225cef033a22130cb863563d5cf26aea983f2cbfcbc0547f78c06ef7d3704/userdata/shm major:0 minor:1349 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/670cc7960481086b8965fb2da3afeafd568cc7eac708174b6ff365eec4bad5b9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/670cc7960481086b8965fb2da3afeafd568cc7eac708174b6ff365eec4bad5b9/userdata/shm major:0 minor:968 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6cee457a91036b49aea08a36be55afac46ecdcc7958eeb7e1aefc04a6a5aeae8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6cee457a91036b49aea08a36be55afac46ecdcc7958eeb7e1aefc04a6a5aeae8/userdata/shm major:0 minor:665 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/788f405f215c107f0aaae844e0357bd2af8b176524b9b27b9d45876c6c07c516/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/788f405f215c107f0aaae844e0357bd2af8b176524b9b27b9d45876c6c07c516/userdata/shm major:0 minor:473 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7dded9f66e38346f58eab7668d020f179ca7449f5ac3538a2108b54cbabe523d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7dded9f66e38346f58eab7668d020f179ca7449f5ac3538a2108b54cbabe523d/userdata/shm major:0 minor:68 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7f5e855d5ea32f73759be5b29eddccce01f7fec70b6614f70377310dbf597215/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7f5e855d5ea32f73759be5b29eddccce01f7fec70b6614f70377310dbf597215/userdata/shm major:0 minor:997 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/82b2e653c93ba4a69b0223c8268dab1e869db2d0513ffbaf110dba2876358bfa/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/82b2e653c93ba4a69b0223c8268dab1e869db2d0513ffbaf110dba2876358bfa/userdata/shm major:0 minor:635 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00/userdata/shm major:0 minor:127 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/86b3974531c12f65b34f84b67c60273ab31a4569e9157b7dd04d59eef5e8591d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/86b3974531c12f65b34f84b67c60273ab31a4569e9157b7dd04d59eef5e8591d/userdata/shm major:0 minor:783 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8a7e6b82c1580d9d7cf8afc8bf7e406625525f7c96b5b72cc9bd2e8e78541ba8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8a7e6b82c1580d9d7cf8afc8bf7e406625525f7c96b5b72cc9bd2e8e78541ba8/userdata/shm major:0 minor:714 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5/userdata/shm major:0 minor:334 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7/userdata/shm major:0 minor:361 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/963f097bb66f564ca2cfab23207306eb31cee167ee206a1a05491adca1aa31a9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/963f097bb66f564ca2cfab23207306eb31cee167ee206a1a05491adca1aa31a9/userdata/shm major:0 minor:772 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9a12bcf4fc9a3a0e9cde90a88b7956f38f3ab4e2e145a1f0eb84cd99548fc153/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9a12bcf4fc9a3a0e9cde90a88b7956f38f3ab4e2e145a1f0eb84cd99548fc153/userdata/shm major:0 minor:1469 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9c6d88ab184c0e7fbb3e1e5fe4f3291f9c0461b4a5747d2c7087ffe16e8ba75d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9c6d88ab184c0e7fbb3e1e5fe4f3291f9c0461b4a5747d2c7087ffe16e8ba75d/userdata/shm major:0 minor:76 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a5901bdc085b43cef6161fde550ec83afa0dc23a3c44d303a53b3a2d9164d54f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a5901bdc085b43cef6161fde550ec83afa0dc23a3c44d303a53b3a2d9164d54f/userdata/shm major:0 minor:486 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a7ee3d19bf23284e7bcdb3c58da723e7cde6ebcface7f54320c36b317f73830b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a7ee3d19bf23284e7bcdb3c58da723e7cde6ebcface7f54320c36b317f73830b/userdata/shm major:0 minor:959 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ab4d3d9867608a716d1a07ef53b79868ef1de9f69ebee20baa3f154072a5867f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ab4d3d9867608a716d1a07ef53b79868ef1de9f69ebee20baa3f154072a5867f/userdata/shm major:0 minor:859 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b0c53893b7d584ad782050fd24c3bd810082c269ba863ea58500a5c74b322c5a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b0c53893b7d584ad782050fd24c3bd810082c269ba863ea58500a5c74b322c5a/userdata/shm major:0 minor:739 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b3ab62262f6fcfe451dadb0b4353828f4d4962cfc03eadf6453b600925623b4d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b3ab62262f6fcfe451dadb0b4353828f4d4962cfc03eadf6453b600925623b4d/userdata/shm major:0 minor:78 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b8716994da0d4c9873141d2aa21edd03e4ae031b26b1f8173e339c8acb9d84d8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b8716994da0d4c9873141d2aa21edd03e4ae031b26b1f8173e339c8acb9d84d8/userdata/shm major:0 minor:829 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bc976a2c91b624ecb3225ef5481f2ee770ab314872bf257e672f9145ca896171/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bc976a2c91b624ecb3225ef5481f2ee770ab314872bf257e672f9145ca896171/userdata/shm major:0 minor:581 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bff07e162c1a61e585cb91d0f10b7d6f531c3b6d280120c518b63384810a9b36/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bff07e162c1a61e585cb91d0f10b7d6f531c3b6d280120c518b63384810a9b36/userdata/shm major:0 minor:353 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0/userdata/shm major:0 minor:355 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c899fa45611ca8adf80b1957f02ef9824a27d5c29a283564fa77a53c34bc01e4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c899fa45611ca8adf80b1957f02ef9824a27d5c29a283564fa77a53c34bc01e4/userdata/shm major:0 minor:595 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ceb2ba2506d4333662e8bd2c848d863fe6f93e5f0c5666caa6e24d3ad37cad8a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ceb2ba2506d4333662e8bd2c848d863fe6f93e5f0c5666caa6e24d3ad37cad8a/userdata/shm major:0 minor:1344 
fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d4dcd3a3c921a750603eee714d78acc1c3287857fa9a3fa51cbb8e99aa49ea09/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d4dcd3a3c921a750603eee714d78acc1c3287857fa9a3fa51cbb8e99aa49ea09/userdata/shm major:0 minor:318 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c/userdata/shm major:0 minor:354 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d8be7a677acefc6a4ee45682e61c271ff822b15dc97b6ca66fdfa8ca84749787/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d8be7a677acefc6a4ee45682e61c271ff822b15dc97b6ca66fdfa8ca84749787/userdata/shm major:0 minor:828 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dd3ce2cf375c8b287635e37da192dc2c907d3dab5e23c6437d445a1478d56fac/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dd3ce2cf375c8b287635e37da192dc2c907d3dab5e23c6437d445a1478d56fac/userdata/shm major:0 minor:1000 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e/userdata/shm major:0 minor:147 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/dfe68018d93ee1e8ecdb8139e17ee4db21f7dabb8af64d6ea19c76db0b09dab1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/dfe68018d93ee1e8ecdb8139e17ee4db21f7dabb8af64d6ea19c76db0b09dab1/userdata/shm major:0 minor:189 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e10b9a0a37f16109552ec2b60e216265154523259956cbf4150a13cac8e5cf2e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e10b9a0a37f16109552ec2b60e216265154523259956cbf4150a13cac8e5cf2e/userdata/shm major:0 minor:546 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e3f5f9ddf501f4d5ccacebd325174974a148fba4c7c203f9e2de430d8e8e4795/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e3f5f9ddf501f4d5ccacebd325174974a148fba4c7c203f9e2de430d8e8e4795/userdata/shm major:0 minor:479 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95/userdata/shm major:0 minor:123 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e81517a49f1ce4beaf68d43f0efeea95ff0f8821517abab2c4989ee77cd41248/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e81517a49f1ce4beaf68d43f0efeea95ff0f8821517abab2c4989ee77cd41248/userdata/shm major:0 minor:600 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ead64d1f04c9817f87eca391b24ce53dab18082c97baff6cee6cb8262295728f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ead64d1f04c9817f87eca391b24ce53dab18082c97baff6cee6cb8262295728f/userdata/shm major:0 minor:360 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/ee51a7c7f756f465a21b62d37220086693d4cf6214b3eb70fad749bc4447332b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ee51a7c7f756f465a21b62d37220086693d4cf6214b3eb70fad749bc4447332b/userdata/shm major:0 minor:1018 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f3df99b2206cbdec4d3c14a3658db30c624bd0d8a8285e540bec8d439783f20e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f3df99b2206cbdec4d3c14a3658db30c624bd0d8a8285e540bec8d439783f20e/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f8227287562fdcce3f6bc21dbb0cca3acb31c35d8825110980ae93ba96b9894f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f8227287562fdcce3f6bc21dbb0cca3acb31c35d8825110980ae93ba96b9894f/userdata/shm major:0 minor:705 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f/volumes/kubernetes.io~projected/kube-api-access-v28xw:{mountpoint:/var/lib/kubelet/pods/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f/volumes/kubernetes.io~projected/kube-api-access-v28xw major:0 minor:929 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:928 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/04f5fc52-4ec2-48c3-8441-2b15ad632233/volumes/kubernetes.io~projected/kube-api-access-tgddm:{mountpoint:/var/lib/kubelet/pods/04f5fc52-4ec2-48c3-8441-2b15ad632233/volumes/kubernetes.io~projected/kube-api-access-tgddm major:0 minor:332 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/04f5fc52-4ec2-48c3-8441-2b15ad632233/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/04f5fc52-4ec2-48c3-8441-2b15ad632233/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:731 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~projected/kube-api-access major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~secret/serving-cert major:0 minor:287 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:309 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/kube-api-access-2kfg5:{mountpoint:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/kube-api-access-2kfg5 major:0 minor:304 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~secret/metrics-tls major:0 minor:576 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0a49c320-f31d-4f6d-98c3-48d24346b873/volumes/kubernetes.io~projected/kube-api-access-s7dfd:{mountpoint:/var/lib/kubelet/pods/0a49c320-f31d-4f6d-98c3-48d24346b873/volumes/kubernetes.io~projected/kube-api-access-s7dfd major:0 minor:1121 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0e9427b8-d62c-45f7-97d0-1f7667ff27aa/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/0e9427b8-d62c-45f7-97d0-1f7667ff27aa/volumes/kubernetes.io~projected/kube-api-access major:0 minor:713 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0e9427b8-d62c-45f7-97d0-1f7667ff27aa/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0e9427b8-d62c-45f7-97d0-1f7667ff27aa/volumes/kubernetes.io~secret/serving-cert major:0 minor:712 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/10fc6516-cd4d-4291-a26d-8376ba0affef/volumes/kubernetes.io~projected/kube-api-access-h9pcw:{mountpoint:/var/lib/kubelet/pods/10fc6516-cd4d-4291-a26d-8376ba0affef/volumes/kubernetes.io~projected/kube-api-access-h9pcw major:0 minor:94 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~projected/kube-api-access-lsr8k:{mountpoint:/var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~projected/kube-api-access-lsr8k major:0 minor:188 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~secret/webhook-cert major:0 minor:187 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1b47f2ef-9923-411f-9f2f-ddaea8bc7053/volumes/kubernetes.io~projected/kube-api-access-dx9sj:{mountpoint:/var/lib/kubelet/pods/1b47f2ef-9923-411f-9f2f-ddaea8bc7053/volumes/kubernetes.io~projected/kube-api-access-dx9sj major:0 minor:501 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1b47f2ef-9923-411f-9f2f-ddaea8bc7053/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/1b47f2ef-9923-411f-9f2f-ddaea8bc7053/volumes/kubernetes.io~secret/signing-key major:0 minor:500 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~projected/kube-api-access-5h2wx:{mountpoint:/var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~projected/kube-api-access-5h2wx major:0 minor:545 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~secret/encryption-config major:0 minor:547 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~secret/etcd-client major:0 minor:548 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~secret/serving-cert major:0 minor:549 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/28c42112-a09e-4b7a-b23b-c06bef69cbfb/volumes/kubernetes.io~projected/kube-api-access-89p9d:{mountpoint:/var/lib/kubelet/pods/28c42112-a09e-4b7a-b23b-c06bef69cbfb/volumes/kubernetes.io~projected/kube-api-access-89p9d major:0 minor:466 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~projected/kube-api-access-p4mbz:{mountpoint:/var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~projected/kube-api-access-p4mbz major:0 minor:164 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:163 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2b014bee-5931-4856-b9e8-e38a134a1b6b/volumes/kubernetes.io~projected/kube-api-access-ntg2z:{mountpoint:/var/lib/kubelet/pods/2b014bee-5931-4856-b9e8-e38a134a1b6b/volumes/kubernetes.io~projected/kube-api-access-ntg2z major:0 minor:470 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2d592f19-c7b9-4b29-9ca2-848572067908/volumes/kubernetes.io~projected/kube-api-access-8hfrr:{mountpoint:/var/lib/kubelet/pods/2d592f19-c7b9-4b29-9ca2-848572067908/volumes/kubernetes.io~projected/kube-api-access-8hfrr major:0 minor:1467 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2d592f19-c7b9-4b29-9ca2-848572067908/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/2d592f19-c7b9-4b29-9ca2-848572067908/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:1461 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2d592f19-c7b9-4b29-9ca2-848572067908/volumes/kubernetes.io~secret/kube-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/2d592f19-c7b9-4b29-9ca2-848572067908/volumes/kubernetes.io~secret/kube-state-metrics-tls major:0 minor:1475 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:314 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/kube-api-access-4fsxc:{mountpoint:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/kube-api-access-4fsxc major:0 minor:317 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:575 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3a7e0eea-3da8-43de-87bc-d10231e7c239/volumes/kubernetes.io~projected/kube-api-access-6kvxd:{mountpoint:/var/lib/kubelet/pods/3a7e0eea-3da8-43de-87bc-d10231e7c239/volumes/kubernetes.io~projected/kube-api-access-6kvxd major:0 minor:956 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3a7e0eea-3da8-43de-87bc-d10231e7c239/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/3a7e0eea-3da8-43de-87bc-d10231e7c239/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:955 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~projected/kube-api-access-npkww:{mountpoint:/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~projected/kube-api-access-npkww major:0 minor:1553 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~secret/client-ca-bundle:{mountpoint:/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~secret/client-ca-bundle major:0 minor:1550 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~secret/secret-metrics-client-certs:{mountpoint:/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~secret/secret-metrics-client-certs major:0 minor:1551 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~secret/secret-metrics-server-tls:{mountpoint:/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~secret/secret-metrics-server-tls major:0 minor:1552 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/452012bf-eae1-4e69-9ba1-034309e9f2c8/volumes/kubernetes.io~projected/kube-api-access-b9q7k:{mountpoint:/var/lib/kubelet/pods/452012bf-eae1-4e69-9ba1-034309e9f2c8/volumes/kubernetes.io~projected/kube-api-access-b9q7k major:0 minor:1618 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/452012bf-eae1-4e69-9ba1-034309e9f2c8/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/452012bf-eae1-4e69-9ba1-034309e9f2c8/volumes/kubernetes.io~secret/webhook-certs major:0 minor:1617 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~projected/kube-api-access-jvw7p:{mountpoint:/var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~projected/kube-api-access-jvw7p major:0 minor:301 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~secret/serving-cert major:0 minor:296 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~projected/kube-api-access-nvkz7:{mountpoint:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~projected/kube-api-access-nvkz7 major:0 minor:172 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:171 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/54767c36-ca29-4c91-9a8a-9699ecfa4afb/volumes/kubernetes.io~projected/kube-api-access-bl7tn:{mountpoint:/var/lib/kubelet/pods/54767c36-ca29-4c91-9a8a-9699ecfa4afb/volumes/kubernetes.io~projected/kube-api-access-bl7tn major:0 minor:692 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/54767c36-ca29-4c91-9a8a-9699ecfa4afb/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/54767c36-ca29-4c91-9a8a-9699ecfa4afb/volumes/kubernetes.io~secret/metrics-tls major:0 minor:704 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/578b2d03-b8b3-4c75-adde-73899c472ad7/volumes/kubernetes.io~projected/kube-api-access-gfd7g:{mountpoint:/var/lib/kubelet/pods/578b2d03-b8b3-4c75-adde-73899c472ad7/volumes/kubernetes.io~projected/kube-api-access-gfd7g major:0 minor:991 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/578b2d03-b8b3-4c75-adde-73899c472ad7/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/578b2d03-b8b3-4c75-adde-73899c472ad7/volumes/kubernetes.io~secret/serving-cert major:0 minor:990 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~projected/kube-api-access-94vvg:{mountpoint:/var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~projected/kube-api-access-94vvg major:0 minor:316 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~secret/serving-cert major:0 minor:297 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/62b43fe1-63f5-4d29-90a2-f36cb9e880ff/volumes/kubernetes.io~projected/kube-api-access-wgwxt:{mountpoint:/var/lib/kubelet/pods/62b43fe1-63f5-4d29-90a2-f36cb9e880ff/volumes/kubernetes.io~projected/kube-api-access-wgwxt major:0 minor:958 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/62b43fe1-63f5-4d29-90a2-f36cb9e880ff/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/62b43fe1-63f5-4d29-90a2-f36cb9e880ff/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:957 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/64856d96-023f-46db-819c-02f1adea5aab/volumes/kubernetes.io~projected/kube-api-access-h2tqg:{mountpoint:/var/lib/kubelet/pods/64856d96-023f-46db-819c-02f1adea5aab/volumes/kubernetes.io~projected/kube-api-access-h2tqg major:0 minor:699 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/64856d96-023f-46db-819c-02f1adea5aab/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/64856d96-023f-46db-819c-02f1adea5aab/volumes/kubernetes.io~secret/serving-cert major:0 minor:427 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66aa2598-f4b6-4d3a-9623-aeb707e4912b/volumes/kubernetes.io~projected/kube-api-access-mw7l6:{mountpoint:/var/lib/kubelet/pods/66aa2598-f4b6-4d3a-9623-aeb707e4912b/volumes/kubernetes.io~projected/kube-api-access-mw7l6 major:0 minor:703 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66f7b08c-52e8-4795-9cf0-74402a9cc0bb/volumes/kubernetes.io~projected/kube-api-access-b658f:{mountpoint:/var/lib/kubelet/pods/66f7b08c-52e8-4795-9cf0-74402a9cc0bb/volumes/kubernetes.io~projected/kube-api-access-b658f major:0 minor:1430 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66f7b08c-52e8-4795-9cf0-74402a9cc0bb/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/66f7b08c-52e8-4795-9cf0-74402a9cc0bb/volumes/kubernetes.io~secret/certs major:0 minor:1428 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/66f7b08c-52e8-4795-9cf0-74402a9cc0bb/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/66f7b08c-52e8-4795-9cf0-74402a9cc0bb/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:1429 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~projected/kube-api-access-vx6rt:{mountpoint:/var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~projected/kube-api-access-vx6rt major:0 minor:311 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~secret/serving-cert major:0 minor:292 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~projected/kube-api-access-97kqz:{mountpoint:/var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~projected/kube-api-access-97kqz major:0 minor:1341 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~secret/default-certificate major:0 minor:1335 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~secret/metrics-certs major:0 minor:1338 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~secret/stats-auth major:0 minor:1336 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e96335e-1866-41c8-b128-b95e783a9be4/volumes/kubernetes.io~projected/kube-api-access-v8rjd:{mountpoint:/var/lib/kubelet/pods/6e96335e-1866-41c8-b128-b95e783a9be4/volumes/kubernetes.io~projected/kube-api-access-v8rjd major:0 minor:1005 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e96335e-1866-41c8-b128-b95e783a9be4/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/6e96335e-1866-41c8-b128-b95e783a9be4/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:1003 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/77e36f4e-845b-4b82-8abc-b634636c087a/volumes/kubernetes.io~projected/kube-api-access-s87hj:{mountpoint:/var/lib/kubelet/pods/77e36f4e-845b-4b82-8abc-b634636c087a/volumes/kubernetes.io~projected/kube-api-access-s87hj major:0 minor:1124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/77e36f4e-845b-4b82-8abc-b634636c087a/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/77e36f4e-845b-4b82-8abc-b634636c087a/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:1122 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/77e36f4e-845b-4b82-8abc-b634636c087a/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/77e36f4e-845b-4b82-8abc-b634636c087a/volumes/kubernetes.io~secret/webhook-cert major:0 minor:1123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~projected/kube-api-access-j6m8f:{mountpoint:/var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~projected/kube-api-access-j6m8f major:0 minor:308 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~secret/serving-cert major:0 minor:294 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/787c50e1-35b5-43d7-9c26-8dd5399693d3/volumes/kubernetes.io~projected/kube-api-access-jzl8x:{mountpoint:/var/lib/kubelet/pods/787c50e1-35b5-43d7-9c26-8dd5399693d3/volumes/kubernetes.io~projected/kube-api-access-jzl8x major:0 minor:1339 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7c8ec36d-9179-40ab-a448-440b4501b3e0/volumes/kubernetes.io~projected/kube-api-access-t2xtm:{mountpoint:/var/lib/kubelet/pods/7c8ec36d-9179-40ab-a448-440b4501b3e0/volumes/kubernetes.io~projected/kube-api-access-t2xtm major:0 minor:391 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7c8ec36d-9179-40ab-a448-440b4501b3e0/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/7c8ec36d-9179-40ab-a448-440b4501b3e0/volumes/kubernetes.io~secret/cert major:0 minor:599 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/812401c0-d1ac-4857-b939-217b7b07f8bc/volumes/kubernetes.io~projected/kube-api-access-mxl5r:{mountpoint:/var/lib/kubelet/pods/812401c0-d1ac-4857-b939-217b7b07f8bc/volumes/kubernetes.io~projected/kube-api-access-mxl5r major:0 minor:151 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/812401c0-d1ac-4857-b939-217b7b07f8bc/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/812401c0-d1ac-4857-b939-217b7b07f8bc/volumes/kubernetes.io~secret/metrics-certs major:0 minor:728 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/814c8acf-fb8d-4f57-b8db-21304402c1f1/volumes/kubernetes.io~projected/kube-api-access-x7fsz:{mountpoint:/var/lib/kubelet/pods/814c8acf-fb8d-4f57-b8db-21304402c1f1/volumes/kubernetes.io~projected/kube-api-access-x7fsz major:0 minor:333 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~projected/kube-api-access-4clxk:{mountpoint:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~projected/kube-api-access-4clxk major:0 minor:306 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/etcd-client major:0 minor:291 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/serving-cert major:0 minor:288 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/858384f3-5741-4e67-8669-2eb2b2dcaf7f/volumes/kubernetes.io~projected/kube-api-access-qgrbd:{mountpoint:/var/lib/kubelet/pods/858384f3-5741-4e67-8669-2eb2b2dcaf7f/volumes/kubernetes.io~projected/kube-api-access-qgrbd major:0 minor:849 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/858384f3-5741-4e67-8669-2eb2b2dcaf7f/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/858384f3-5741-4e67-8669-2eb2b2dcaf7f/volumes/kubernetes.io~secret/cert major:0 minor:848 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~projected/kube-api-access-bvmxp:{mountpoint:/var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~projected/kube-api-access-bvmxp major:0 minor:66 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8b56c318-09b7-47f0-a7bf-32eb96e836ca/volumes/kubernetes.io~projected/kube-api-access-qtx6m:{mountpoint:/var/lib/kubelet/pods/8b56c318-09b7-47f0-a7bf-32eb96e836ca/volumes/kubernetes.io~projected/kube-api-access-qtx6m major:0 minor:1102 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8b56c318-09b7-47f0-a7bf-32eb96e836ca/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/8b56c318-09b7-47f0-a7bf-32eb96e836ca/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:1101 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/922419d4-b528-472e-8215-4a55a96dab08/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/922419d4-b528-472e-8215-4a55a96dab08/volumes/kubernetes.io~secret/tls-certificates major:0 minor:1337 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/volumes/kubernetes.io~projected/ca-certs major:0 minor:562 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/volumes/kubernetes.io~projected/kube-api-access-4z9vv:{mountpoint:/var/lib/kubelet/pods/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/volumes/kubernetes.io~projected/kube-api-access-4z9vv major:0 minor:563 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:664 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a124c14f-20c6-4df3-956f-a858de0c73c9/volumes/kubernetes.io~projected/kube-api-access-fq55c:{mountpoint:/var/lib/kubelet/pods/a124c14f-20c6-4df3-956f-a858de0c73c9/volumes/kubernetes.io~projected/kube-api-access-fq55c major:0 minor:1343 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a124c14f-20c6-4df3-956f-a858de0c73c9/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/a124c14f-20c6-4df3-956f-a858de0c73c9/volumes/kubernetes.io~secret/proxy-tls major:0 minor:1342 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a4399d20-f9a6-4ab1-86be-e2845394eaba/volumes/kubernetes.io~projected/kube-api-access-2krlg:{mountpoint:/var/lib/kubelet/pods/a4399d20-f9a6-4ab1-86be-e2845394eaba/volumes/kubernetes.io~projected/kube-api-access-2krlg major:0 minor:310 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a4399d20-f9a6-4ab1-86be-e2845394eaba/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/a4399d20-f9a6-4ab1-86be-e2845394eaba/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:733 fsType:tmpfs blockSize:0} / Dec 03 22:10:34.937888 master-0 kubenswrapper[36504]: var/lib/kubelet/pods/a9940ff5-36a6-4c04-a51d-66f7d83bea7c/volumes/kubernetes.io~projected/kube-api-access-thgv2:{mountpoint:/var/lib/kubelet/pods/a9940ff5-36a6-4c04-a51d-66f7d83bea7c/volumes/kubernetes.io~projected/kube-api-access-thgv2 major:0 minor:832 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9940ff5-36a6-4c04-a51d-66f7d83bea7c/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/a9940ff5-36a6-4c04-a51d-66f7d83bea7c/volumes/kubernetes.io~secret/proxy-tls major:0 minor:1007 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9a3f403-a742-4977-901a-cf4a8eb7df5a/volumes/kubernetes.io~projected/kube-api-access-575vn:{mountpoint:/var/lib/kubelet/pods/a9a3f403-a742-4977-901a-cf4a8eb7df5a/volumes/kubernetes.io~projected/kube-api-access-575vn major:0 minor:300 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9a3f403-a742-4977-901a-cf4a8eb7df5a/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/a9a3f403-a742-4977-901a-cf4a8eb7df5a/volumes/kubernetes.io~secret/metrics-tls 
major:0 minor:577 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac3d3235-531e-4c7d-9fc9-e65c97970d0f/volumes/kubernetes.io~projected/kube-api-access-hzjtq:{mountpoint:/var/lib/kubelet/pods/ac3d3235-531e-4c7d-9fc9-e65c97970d0f/volumes/kubernetes.io~projected/kube-api-access-hzjtq major:0 minor:993 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac3d3235-531e-4c7d-9fc9-e65c97970d0f/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/ac3d3235-531e-4c7d-9fc9-e65c97970d0f/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:988 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ac3d3235-531e-4c7d-9fc9-e65c97970d0f/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/ac3d3235-531e-4c7d-9fc9-e65c97970d0f/volumes/kubernetes.io~secret/srv-cert major:0 minor:911 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/add88bf0-c88d-427d-94bb-897e088a1378/volumes/kubernetes.io~projected/kube-api-access-hkhtr:{mountpoint:/var/lib/kubelet/pods/add88bf0-c88d-427d-94bb-897e088a1378/volumes/kubernetes.io~projected/kube-api-access-hkhtr major:0 minor:305 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:566 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/volumes/kubernetes.io~empty-dir/tmp major:0 minor:565 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/volumes/kubernetes.io~projected/kube-api-access-df8nl:{mountpoint:/var/lib/kubelet/pods/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/volumes/kubernetes.io~projected/kube-api-access-df8nl major:0 minor:567 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~projected/kube-api-access-b7zsw:{mountpoint:/var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~projected/kube-api-access-b7zsw major:0 minor:592 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~secret/encryption-config major:0 minor:591 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~secret/etcd-client major:0 minor:564 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~secret/serving-cert major:0 minor:632 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~projected/kube-api-access-t4c4r:{mountpoint:/var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~projected/kube-api-access-t4c4r major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:524 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:525 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b8194009-3743-4da7-baf1-f9bb0afd6187/volumes/kubernetes.io~projected/kube-api-access-rd9vn:{mountpoint:/var/lib/kubelet/pods/b8194009-3743-4da7-baf1-f9bb0afd6187/volumes/kubernetes.io~projected/kube-api-access-rd9vn major:0 minor:122 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba624ed0-32cc-4c87-81a5-708a8a8a7f88/volumes/kubernetes.io~projected/kube-api-access-n4gds:{mountpoint:/var/lib/kubelet/pods/ba624ed0-32cc-4c87-81a5-708a8a8a7f88/volumes/kubernetes.io~projected/kube-api-access-n4gds major:0 minor:803 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba624ed0-32cc-4c87-81a5-708a8a8a7f88/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/ba624ed0-32cc-4c87-81a5-708a8a8a7f88/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:802 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bcbec7ef-0b98-4346-8c6b-c5fa37e90286/volumes/kubernetes.io~projected/kube-api-access-dwdzk:{mountpoint:/var/lib/kubelet/pods/bcbec7ef-0b98-4346-8c6b-c5fa37e90286/volumes/kubernetes.io~projected/kube-api-access-dwdzk major:0 minor:1468 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bcbec7ef-0b98-4346-8c6b-c5fa37e90286/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/bcbec7ef-0b98-4346-8c6b-c5fa37e90286/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config major:0 minor:1465 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bcbec7ef-0b98-4346-8c6b-c5fa37e90286/volumes/kubernetes.io~secret/node-exporter-tls:{mountpoint:/var/lib/kubelet/pods/bcbec7ef-0b98-4346-8c6b-c5fa37e90286/volumes/kubernetes.io~secret/node-exporter-tls major:0 minor:1462 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd18a700-53b2-430c-a34f-dbb6331cfbe5/volumes/kubernetes.io~projected/kube-api-access-ll9bs:{mountpoint:/var/lib/kubelet/pods/bd18a700-53b2-430c-a34f-dbb6331cfbe5/volumes/kubernetes.io~projected/kube-api-access-ll9bs major:0 minor:697 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd18a700-53b2-430c-a34f-dbb6331cfbe5/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/bd18a700-53b2-430c-a34f-dbb6331cfbe5/volumes/kubernetes.io~secret/proxy-tls major:0 minor:696 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bebd69d2-5b0f-4b66-8722-d6861eba3e12/volumes/kubernetes.io~projected/kube-api-access-n7qqf:{mountpoint:/var/lib/kubelet/pods/bebd69d2-5b0f-4b66-8722-d6861eba3e12/volumes/kubernetes.io~projected/kube-api-access-n7qqf major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bebd69d2-5b0f-4b66-8722-d6861eba3e12/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/bebd69d2-5b0f-4b66-8722-d6861eba3e12/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:730 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~projected/kube-api-access major:0 minor:315 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~secret/serving-cert major:0 minor:290 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c807d487-5b8f-4747-87ee-df0637e2e11f/volumes/kubernetes.io~projected/kube-api-access-nlw7s:{mountpoint:/var/lib/kubelet/pods/c807d487-5b8f-4747-87ee-df0637e2e11f/volumes/kubernetes.io~projected/kube-api-access-nlw7s major:0 minor:104 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~projected/kube-api-access-pm2l8:{mountpoint:/var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~projected/kube-api-access-pm2l8 major:0 minor:312 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:293 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d0be52f3-b318-4630-b4da-f3c4a57d5818/volumes/kubernetes.io~projected/kube-api-access-qnfgr:{mountpoint:/var/lib/kubelet/pods/d0be52f3-b318-4630-b4da-f3c4a57d5818/volumes/kubernetes.io~projected/kube-api-access-qnfgr major:0 minor:1466 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d0be52f3-b318-4630-b4da-f3c4a57d5818/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/d0be52f3-b318-4630-b4da-f3c4a57d5818/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config major:0 minor:1464 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d0be52f3-b318-4630-b4da-f3c4a57d5818/volumes/kubernetes.io~secret/openshift-state-metrics-tls:{mountpoint:/var/lib/kubelet/pods/d0be52f3-b318-4630-b4da-f3c4a57d5818/volumes/kubernetes.io~secret/openshift-state-metrics-tls major:0 minor:1463 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d6fafa97-812d-4588-95f8-7c4d85f53098/volumes/kubernetes.io~projected/kube-api-access-gcb88:{mountpoint:/var/lib/kubelet/pods/d6fafa97-812d-4588-95f8-7c4d85f53098/volumes/kubernetes.io~projected/kube-api-access-gcb88 major:0 minor:1445 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d6fafa97-812d-4588-95f8-7c4d85f53098/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/d6fafa97-812d-4588-95f8-7c4d85f53098/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:1444 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d6fafa97-812d-4588-95f8-7c4d85f53098/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/d6fafa97-812d-4588-95f8-7c4d85f53098/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:1443 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/da949cf7-ab12-43ff-8e45-da1c2fd46e20/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/da949cf7-ab12-43ff-8e45-da1c2fd46e20/volumes/kubernetes.io~projected/ca-certs major:0 minor:590 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/da949cf7-ab12-43ff-8e45-da1c2fd46e20/volumes/kubernetes.io~projected/kube-api-access-dtxdk:{mountpoint:/var/lib/kubelet/pods/da949cf7-ab12-43ff-8e45-da1c2fd46e20/volumes/kubernetes.io~projected/kube-api-access-dtxdk major:0 minor:593 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/def52ba3-77c1-4e0c-8a0d-44ff4d677607/volumes/kubernetes.io~projected/kube-api-access-dcvkk:{mountpoint:/var/lib/kubelet/pods/def52ba3-77c1-4e0c-8a0d-44ff4d677607/volumes/kubernetes.io~projected/kube-api-access-dcvkk major:0 minor:1346 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/def52ba3-77c1-4e0c-8a0d-44ff4d677607/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/def52ba3-77c1-4e0c-8a0d-44ff4d677607/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:1187 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e403ab42-1840-4292-a37c-a8d4feeb54ca/volumes/kubernetes.io~projected/kube-api-access-tkg95:{mountpoint:/var/lib/kubelet/pods/e403ab42-1840-4292-a37c-a8d4feeb54ca/volumes/kubernetes.io~projected/kube-api-access-tkg95 major:0 minor:1120 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e50b85a6-7767-4fca-8133-8243bdd85e5d/volumes/kubernetes.io~projected/kube-api-access-z5q4k:{mountpoint:/var/lib/kubelet/pods/e50b85a6-7767-4fca-8133-8243bdd85e5d/volumes/kubernetes.io~projected/kube-api-access-z5q4k major:0 minor:992 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e50b85a6-7767-4fca-8133-8243bdd85e5d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e50b85a6-7767-4fca-8133-8243bdd85e5d/volumes/kubernetes.io~secret/serving-cert major:0 minor:989 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e6d5d61a-c5de-4619-9afb-7fad63ba0525/volumes/kubernetes.io~projected/kube-api-access-s68fd:{mountpoint:/var/lib/kubelet/pods/e6d5d61a-c5de-4619-9afb-7fad63ba0525/volumes/kubernetes.io~projected/kube-api-access-s68fd major:0 minor:281 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~projected/kube-api-access-mzklx:{mountpoint:/var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~projected/kube-api-access-mzklx major:0 minor:313 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~secret/serving-cert major:0 minor:295 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f6498ac1-7d07-4a5f-a968-d8bda72d1002/volumes/kubernetes.io~projected/kube-api-access-bt8l5:{mountpoint:/var/lib/kubelet/pods/f6498ac1-7d07-4a5f-a968-d8bda72d1002/volumes/kubernetes.io~projected/kube-api-access-bt8l5 major:0 minor:426 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f6498ac1-7d07-4a5f-a968-d8bda72d1002/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/f6498ac1-7d07-4a5f-a968-d8bda72d1002/volumes/kubernetes.io~secret/serving-cert major:0 minor:392 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9a3f900-60e4-49c2-85ec-88d19852d1b9/volumes/kubernetes.io~projected/kube-api-access-jvzqm:{mountpoint:/var/lib/kubelet/pods/f9a3f900-60e4-49c2-85ec-88d19852d1b9/volumes/kubernetes.io~projected/kube-api-access-jvzqm major:0 minor:1006 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9a3f900-60e4-49c2-85ec-88d19852d1b9/volumes/kubernetes.io~secret/profile-collector-cert:{mountpoint:/var/lib/kubelet/pods/f9a3f900-60e4-49c2-85ec-88d19852d1b9/volumes/kubernetes.io~secret/profile-collector-cert major:0 minor:1002 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f9a3f900-60e4-49c2-85ec-88d19852d1b9/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/f9a3f900-60e4-49c2-85ec-88d19852d1b9/volumes/kubernetes.io~secret/srv-cert major:0 
minor:1004 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fa9b5917-d4f3-4372-a200-45b57412f92f/volumes/kubernetes.io~projected/kube-api-access-pj79k:{mountpoint:/var/lib/kubelet/pods/fa9b5917-d4f3-4372-a200-45b57412f92f/volumes/kubernetes.io~projected/kube-api-access-pj79k major:0 minor:910 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fa9b5917-d4f3-4372-a200-45b57412f92f/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/fa9b5917-d4f3-4372-a200-45b57412f92f/volumes/kubernetes.io~secret/cert major:0 minor:839 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fa9b5917-d4f3-4372-a200-45b57412f92f/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/fa9b5917-d4f3-4372-a200-45b57412f92f/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:840 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~projected/kube-api-access major:0 minor:298 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~secret/serving-cert major:0 minor:289 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ffad8fc8-4378-44de-8864-dd2f666ade68/volumes/kubernetes.io~projected/kube-api-access-xcq9j:{mountpoint:/var/lib/kubelet/pods/ffad8fc8-4378-44de-8864-dd2f666ade68/volumes/kubernetes.io~projected/kube-api-access-xcq9j major:0 minor:146 fsType:tmpfs blockSize:0} overlay_0-1008:{mountpoint:/var/lib/containers/storage/overlay/2f05ced1dbb170fc8f6604d1bb7a28eac942de966d6400ed69a61f7b0fba1072/merged major:0 minor:1008 fsType:overlay blockSize:0} overlay_0-1010:{mountpoint:/var/lib/containers/storage/overlay/bfa50d4f6a1b30b25e0fa6b8a32ad448e13ab372f7ed38ff96f2e864dbca0bfa/merged major:0 minor:1010 fsType:overlay blockSize:0} overlay_0-1012:{mountpoint:/var/lib/containers/storage/overlay/bd6358528ecdac85d12fc76d1fba5f6fa98d65cb9a288b6cff68c5133b8e2a76/merged major:0 minor:1012 fsType:overlay blockSize:0} overlay_0-1015:{mountpoint:/var/lib/containers/storage/overlay/3d1bcc9bcd87beced884785050d78fde3f7c01a05fe62717c624df3c6d5e8b91/merged major:0 minor:1015 fsType:overlay blockSize:0} overlay_0-1022:{mountpoint:/var/lib/containers/storage/overlay/d1774e4892f07b4a2eaf860a2bd1af6014fb1c833678fd795f12007960da6eee/merged major:0 minor:1022 fsType:overlay blockSize:0} overlay_0-1024:{mountpoint:/var/lib/containers/storage/overlay/267ace796e59e6d08f99879d568eba8f7fd74515e8c53b8b8a99a8f7b4b2f4d8/merged major:0 minor:1024 fsType:overlay blockSize:0} overlay_0-1044:{mountpoint:/var/lib/containers/storage/overlay/af0010f256bb55454020c0d0f55414d3eaf0aeabdd43500a35b67509c93020b7/merged major:0 minor:1044 fsType:overlay blockSize:0} overlay_0-105:{mountpoint:/var/lib/containers/storage/overlay/590a1e8dc52c51ed6c6cfae4d53e17d2858c48667204113d1ee99e51caebe99b/merged major:0 minor:105 fsType:overlay blockSize:0} overlay_0-1052:{mountpoint:/var/lib/containers/storage/overlay/b743b07980bbcac7fbaac0dcae3adc4548c7ceb34725157182d3d3c1e8ef56b0/merged major:0 minor:1052 fsType:overlay blockSize:0} overlay_0-1053:{mountpoint:/var/lib/containers/storage/overlay/c76ccc10591d22a8d9220a7740ef843f044580ecc9af8d3a7835d85fc09f552d/merged major:0 minor:1053 fsType:overlay blockSize:0} 
overlay_0-1055:{mountpoint:/var/lib/containers/storage/overlay/aef040b93fea9cd1b6214e2ce40ef614cf4ec4cad1fa7952611a6c6c85816a36/merged major:0 minor:1055 fsType:overlay blockSize:0} overlay_0-1057:{mountpoint:/var/lib/containers/storage/overlay/dcaa286a5122f8cec162ed38e8562f25030b7d20f50a0bc729e6e22bc2a3933d/merged major:0 minor:1057 fsType:overlay blockSize:0} overlay_0-1059:{mountpoint:/var/lib/containers/storage/overlay/9230e972d1feb6d9b273de45b5cd7b0e25dd1f96662e699382e60bc05b29ff6f/merged major:0 minor:1059 fsType:overlay blockSize:0} overlay_0-1061:{mountpoint:/var/lib/containers/storage/overlay/e63590beac98ea56699fc3823dc4b3303b8ca4823c332bdfbfb3300959776f33/merged major:0 minor:1061 fsType:overlay blockSize:0} overlay_0-1063:{mountpoint:/var/lib/containers/storage/overlay/7a5588416adefddd4613d7131e4fc0e5eb4b3431ae893997f2df4164382e0394/merged major:0 minor:1063 fsType:overlay blockSize:0} overlay_0-1089:{mountpoint:/var/lib/containers/storage/overlay/d7f247896180780846b560f870e7a9973c4515a783cd958534510ce075c1d370/merged major:0 minor:1089 fsType:overlay blockSize:0} overlay_0-1091:{mountpoint:/var/lib/containers/storage/overlay/a3aa37f2bc2bde22937393def538e0b8a5bc6ac4a994c3052fc1d26496511d7d/merged major:0 minor:1091 fsType:overlay blockSize:0} overlay_0-1093:{mountpoint:/var/lib/containers/storage/overlay/19f05b57de273f7f2f30d2bd9b62457f970ff2ba7121bb2ce1bb4e296e3b24c7/merged major:0 minor:1093 fsType:overlay blockSize:0} overlay_0-1106:{mountpoint:/var/lib/containers/storage/overlay/de884ddff4ecf02523c4147ffa212a752ffaaa78c1978c0d3e6fef6c4373d064/merged major:0 minor:1106 fsType:overlay blockSize:0} overlay_0-1109:{mountpoint:/var/lib/containers/storage/overlay/0c3050a387fce79b765a5b68f396ad30de08022562c4dd04db1c2481f10790d0/merged major:0 minor:1109 fsType:overlay blockSize:0} overlay_0-1111:{mountpoint:/var/lib/containers/storage/overlay/a49ad5aaf647d6657d7e783ba90fbacc65d4cb8f9ab9c015b7f3f4089d2e6524/merged major:0 minor:1111 fsType:overlay blockSize:0} overlay_0-112:{mountpoint:/var/lib/containers/storage/overlay/f6ec8e027d8aa9846b48416c94fed7fb6797c45ceda07880aa558bc8af4d9e69/merged major:0 minor:112 fsType:overlay blockSize:0} overlay_0-1126:{mountpoint:/var/lib/containers/storage/overlay/a2b85ce618eb58270eadbe9e7aaeb8e34a621042cd6bcd0561edff69b6708345/merged major:0 minor:1126 fsType:overlay blockSize:0} overlay_0-1132:{mountpoint:/var/lib/containers/storage/overlay/8fccc96478da519db97d0c83dee0cc0b5805abe563678b065f600f0c2cef2efa/merged major:0 minor:1132 fsType:overlay blockSize:0} overlay_0-1139:{mountpoint:/var/lib/containers/storage/overlay/cf5c6affd2b86c30b1f4437f98b1376fe433700b512f867b300fc4c248afb7aa/merged major:0 minor:1139 fsType:overlay blockSize:0} overlay_0-1146:{mountpoint:/var/lib/containers/storage/overlay/e9dede8d9cae735f35adb8da06b431e7a5744fcbe4ce32c39e5675fd4df0ae87/merged major:0 minor:1146 fsType:overlay blockSize:0} overlay_0-1147:{mountpoint:/var/lib/containers/storage/overlay/6956633162db90ef9a8f9f239cc6e2722695d842a0fe1e0d36dc1114361bcf50/merged major:0 minor:1147 fsType:overlay blockSize:0} overlay_0-1148:{mountpoint:/var/lib/containers/storage/overlay/6decd91b7726164586425724a51b48fcad70d35d31f100362136895b02ca06dd/merged major:0 minor:1148 fsType:overlay blockSize:0} overlay_0-115:{mountpoint:/var/lib/containers/storage/overlay/20e00314b6d8a4ee73246ec097b917acfef5a2a43cced2020c5c1e7c28b37d39/merged major:0 minor:115 fsType:overlay blockSize:0} 
overlay_0-1150:{mountpoint:/var/lib/containers/storage/overlay/22859735a62f644c1dbe9b7373d47a8e90df261147fe77cb851037bcb0973726/merged major:0 minor:1150 fsType:overlay blockSize:0} overlay_0-1151:{mountpoint:/var/lib/containers/storage/overlay/8c515b9db939150e5d3d32a126bcb5b67ff0e9535053af13e95397dfa01c20d8/merged major:0 minor:1151 fsType:overlay blockSize:0} overlay_0-1159:{mountpoint:/var/lib/containers/storage/overlay/08cf5d45e0a639afb3c1af07da94a703bd1a12d4657bb407ada61cfcda556439/merged major:0 minor:1159 fsType:overlay blockSize:0} overlay_0-1160:{mountpoint:/var/lib/containers/storage/overlay/b24966f9bfbc134390413a08db05465aca6523b5dc6a27aa1b21234189bbf9dd/merged major:0 minor:1160 fsType:overlay blockSize:0} overlay_0-1166:{mountpoint:/var/lib/containers/storage/overlay/62e88170730dcb179eca30b92e21592235dd2cf7d58327548d9940344270cb90/merged major:0 minor:1166 fsType:overlay blockSize:0} overlay_0-1168:{mountpoint:/var/lib/containers/storage/overlay/35ef480b53e5ee85e8b89913c4acabe1aabd5ccfb65ad0c95d2023f401b2e07b/merged major:0 minor:1168 fsType:overlay blockSize:0} overlay_0-117:{mountpoint:/var/lib/containers/storage/overlay/9af5ea4fd08878bc6a91e0d39392e14d43e2d3f27152bf96d3a2fb50d72a3bbd/merged major:0 minor:117 fsType:overlay blockSize:0} overlay_0-1188:{mountpoint:/var/lib/containers/storage/overlay/0000433b47b98f65a962f8d779a4bf22d4ef3798d89d94bc14bbdede6e50b506/merged major:0 minor:1188 fsType:overlay blockSize:0} overlay_0-119:{mountpoint:/var/lib/containers/storage/overlay/e41b9f76c37ea07fe15a1ac6e4a00b0dbc5efbe0624de2d64e4998209470942d/merged major:0 minor:119 fsType:overlay blockSize:0} overlay_0-1192:{mountpoint:/var/lib/containers/storage/overlay/9e1f6b3e6f32beb23c54dc7746523939a23b46bcbd07ef8fb0a5286c5cbc999a/merged major:0 minor:1192 fsType:overlay blockSize:0} overlay_0-1194:{mountpoint:/var/lib/containers/storage/overlay/d5bc08eb029e60ae3e37ea9a5e5b731da9b789b5bb49a02687e87233df6d5b33/merged major:0 minor:1194 fsType:overlay blockSize:0} overlay_0-1197:{mountpoint:/var/lib/containers/storage/overlay/768512f7955d9f46a7a97038758da809fcf71a8f64b19fad495fb962f3073ef2/merged major:0 minor:1197 fsType:overlay blockSize:0} overlay_0-1199:{mountpoint:/var/lib/containers/storage/overlay/36215699a8aa2329e9dc638b1e807e4c001e85aa63df6b166645842973e9a0c5/merged major:0 minor:1199 fsType:overlay blockSize:0} overlay_0-120:{mountpoint:/var/lib/containers/storage/overlay/fef38fb1272fc4ca3411f86b4e86a029fd771892d80dce51fa3b9184d5bc115d/merged major:0 minor:120 fsType:overlay blockSize:0} overlay_0-1200:{mountpoint:/var/lib/containers/storage/overlay/6bc1834924545f4e201058bdc179db9631557c10885468b9f970d2855162ea1b/merged major:0 minor:1200 fsType:overlay blockSize:0} overlay_0-1210:{mountpoint:/var/lib/containers/storage/overlay/a7e7f6425f326d9aa270f3e339825284ce82b4620e5e348b1c7de249a34f8975/merged major:0 minor:1210 fsType:overlay blockSize:0} overlay_0-1212:{mountpoint:/var/lib/containers/storage/overlay/33bd864fad218b70979d70e5bdccf9dd215b2cc54b6999449e8e5af76c2b25f7/merged major:0 minor:1212 fsType:overlay blockSize:0} overlay_0-1218:{mountpoint:/var/lib/containers/storage/overlay/45dc6ef1981ce482a3f3e5c30bf06866102097e635796446fa737c885cc329f3/merged major:0 minor:1218 fsType:overlay blockSize:0} overlay_0-1224:{mountpoint:/var/lib/containers/storage/overlay/db11e75f1268938bc72fb02defffbd8107650869cbe7ae141c63f156879f164e/merged major:0 minor:1224 fsType:overlay blockSize:0} 
overlay_0-1226:{mountpoint:/var/lib/containers/storage/overlay/24bd6206e5cfed85e58907a77bc934a08a95be58e75e93aaebeb2474609b7edf/merged major:0 minor:1226 fsType:overlay blockSize:0} overlay_0-1228:{mountpoint:/var/lib/containers/storage/overlay/a8d523d6594e42b0fa66c89d95d657a17642d53cca546518ac04b6b249dd7b7f/merged major:0 minor:1228 fsType:overlay blockSize:0} overlay_0-1239:{mountpoint:/var/lib/containers/storage/overlay/375413ed900675cd393cecc796fabf80c697ac97e0441b01375fd4c9ea20bfd1/merged major:0 minor:1239 fsType:overlay blockSize:0} overlay_0-1245:{mountpoint:/var/lib/containers/storage/overlay/9041028eab4ddfba29bb32d59fd9079fbe7dbc7eb05902d7929d54a6dae2a2f5/merged major:0 minor:1245 fsType:overlay blockSize:0} overlay_0-125:{mountpoint:/var/lib/containers/storage/overlay/c78474e73745df18fc4119e551e0a7ee0a0f86e3be2eea8c2995b259c9b2e7c2/merged major:0 minor:125 fsType:overlay blockSize:0} overlay_0-1250:{mountpoint:/var/lib/containers/storage/overlay/7cf0a4eb0c9349bfd9b697c74d128c0660ffae4bb07925f851f5f00efadc3504/merged major:0 minor:1250 fsType:overlay blockSize:0} overlay_0-1257:{mountpoint:/var/lib/containers/storage/overlay/3d43f4d8f3b97b18d712cdc775578943e2fb50d345068728028743a71dc7f910/merged major:0 minor:1257 fsType:overlay blockSize:0} overlay_0-1265:{mountpoint:/var/lib/containers/storage/overlay/9726194c02ecb3133c4398ed1d27a1613f616e745351a8d245994a0aad7908ce/merged major:0 minor:1265 fsType:overlay blockSize:0} overlay_0-1272:{mountpoint:/var/lib/containers/storage/overlay/29edee483a11c3adbcd0a7ebe05f9e9f73ef08d1a50aec67bf676bf0635a013e/merged major:0 minor:1272 fsType:overlay blockSize:0} overlay_0-1278:{mountpoint:/var/lib/containers/storage/overlay/b40f7675587a3f4a2a6a698e33ff3d883845871afdf29cee823bbaf1bc7accb6/merged major:0 minor:1278 fsType:overlay blockSize:0} overlay_0-1283:{mountpoint:/var/lib/containers/storage/overlay/179c8f2e06bca60f6ac32780858e9b0a66709ee061e6e086c557ffd9e2548ddc/merged major:0 minor:1283 fsType:overlay blockSize:0} overlay_0-1288:{mountpoint:/var/lib/containers/storage/overlay/efbeb96cb8bd92ebc81a17b7f813030cc4f26b4dc011e2dc8ec93da7dcfac289/merged major:0 minor:1288 fsType:overlay blockSize:0} overlay_0-129:{mountpoint:/var/lib/containers/storage/overlay/ce5fceb8ef24fb11f6bd52653b19cd0a25ee077f2e896cadc9daefda4a7c7d13/merged major:0 minor:129 fsType:overlay blockSize:0} overlay_0-1290:{mountpoint:/var/lib/containers/storage/overlay/ea080ddfe4687b71997bd4c06df8e808bb2ad8fac993e66e6090b304a80fd8fa/merged major:0 minor:1290 fsType:overlay blockSize:0} overlay_0-1294:{mountpoint:/var/lib/containers/storage/overlay/c8dfbfe262340cc1a28ea9190bd13a9eee5bba3ef85e485d3443689a6bd617d0/merged major:0 minor:1294 fsType:overlay blockSize:0} overlay_0-1296:{mountpoint:/var/lib/containers/storage/overlay/a31c5489b2d9fd88ec80d9064a4c5ca1f0ce011440821f73ad38207cd186e515/merged major:0 minor:1296 fsType:overlay blockSize:0} overlay_0-1306:{mountpoint:/var/lib/containers/storage/overlay/c44f0aad91758cc67e58559d31a05d04afe6ba4151d02c850dafafab6dd1a4ab/merged major:0 minor:1306 fsType:overlay blockSize:0} overlay_0-1308:{mountpoint:/var/lib/containers/storage/overlay/9dded182f85e16aef08635f201cf6955097a42aec7d6b8ca5760f749897b8395/merged major:0 minor:1308 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/670172869ec86f15fc22c4d3cc63f01f4f7c54bfd1e6688b7d38bdf8fc04a8f9/merged major:0 minor:131 fsType:overlay blockSize:0} 
overlay_0-1312:{mountpoint:/var/lib/containers/storage/overlay/7d5bbb90f355a5ce03cdf60a590ee71f507ffd886742241ef9195ac113913a5a/merged major:0 minor:1312 fsType:overlay blockSize:0} overlay_0-1315:{mountpoint:/var/lib/containers/storage/overlay/369cf60c53dbc58c879ea49b10bb369da4aa178c8e6d746ec4a7131c3d5cd3d0/merged major:0 minor:1315 fsType:overlay blockSize:0} overlay_0-1324:{mountpoint:/var/lib/containers/storage/overlay/8d104f0290f79bb0e2d0b30bbbf1079c60a86b97c26dd5224c2e61006ac54e8a/merged major:0 minor:1324 fsType:overlay blockSize:0} overlay_0-1333:{mountpoint:/var/lib/containers/storage/overlay/82b7fc81da1ad0cb0aba22536ec5710bbe39e677878ffcd616c5043defecfcad/merged major:0 minor:1333 fsType:overlay blockSize:0} overlay_0-1351:{mountpoint:/var/lib/containers/storage/overlay/8c8cff273463b5a3cf98fc3420235441b0818924d0558449e6a192c0e2fc6b01/merged major:0 minor:1351 fsType:overlay blockSize:0} overlay_0-1353:{mountpoint:/var/lib/containers/storage/overlay/55b46af98e6b1f2d4cf6e8d99ecc7ef6033f127a9b384f9ff3b48ab5889c247b/merged major:0 minor:1353 fsType:overlay blockSize:0} overlay_0-1359:{mountpoint:/var/lib/containers/storage/overlay/97132ba790829423b8cf447729088771d5d6341ab89b715a47a52941ac0e3701/merged major:0 minor:1359 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/6cd390b2ab9c2361c90a8588bb338ce64c1124616f97c87d4e5cf53bce14e8fe/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-1369:{mountpoint:/var/lib/containers/storage/overlay/f731ae2ddbac632ac0b89a1116536ebc3fa060c162731d8fcdbcfbc8472ab098/merged major:0 minor:1369 fsType:overlay blockSize:0} overlay_0-1371:{mountpoint:/var/lib/containers/storage/overlay/06c41f9385b085faa4e7d9faf11c7186602f704f3850d01ec95c1eff2cd3ee39/merged major:0 minor:1371 fsType:overlay blockSize:0} overlay_0-1379:{mountpoint:/var/lib/containers/storage/overlay/7340cb6e2177437ed93f077198ae1182649d9f31617a608e92c1fd304b3af909/merged major:0 minor:1379 fsType:overlay blockSize:0} overlay_0-1381:{mountpoint:/var/lib/containers/storage/overlay/ff473b0476e90b3f78d55ebe35c98be93831d056862426d75298d74c6bb8bb14/merged major:0 minor:1381 fsType:overlay blockSize:0} overlay_0-1383:{mountpoint:/var/lib/containers/storage/overlay/a1dc7515d362457a6aaf72d52462ea2413791c29f69bb7b1a8934687ba1e6c8e/merged major:0 minor:1383 fsType:overlay blockSize:0} overlay_0-1385:{mountpoint:/var/lib/containers/storage/overlay/c64931eaab8cb881716af01dab7aa5f0fb9dfcce24315a22870749a06fe5efbc/merged major:0 minor:1385 fsType:overlay blockSize:0} overlay_0-1387:{mountpoint:/var/lib/containers/storage/overlay/550470a4623a781d22af0de557927f1d7766020cac9e6f139c9650f8ed1427d0/merged major:0 minor:1387 fsType:overlay blockSize:0} overlay_0-1389:{mountpoint:/var/lib/containers/storage/overlay/916b56d14d6a8cd9f6dda80124c4874a26b6161a8cc0e5406e657563ee006d3a/merged major:0 minor:1389 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/140dd17c3fa8cc7025e291bdd330c841e90a06401db43cf773f3a08e1c9daca0/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-1401:{mountpoint:/var/lib/containers/storage/overlay/654c1fcf1670811a6a51159223a87dd879268b364286ae6bc1247aa6e0b8d4ad/merged major:0 minor:1401 fsType:overlay blockSize:0} overlay_0-1419:{mountpoint:/var/lib/containers/storage/overlay/5e93e662f09d950de8e50c65d5e4a877aa44f128dc135bd1f0a6ddb29c13a2ca/merged major:0 minor:1419 fsType:overlay blockSize:0} 
overlay_0-142:{mountpoint:/var/lib/containers/storage/overlay/3e648002952854c491f2edad57ce367d7ffd8efa09787e074a39b464a2c9356e/merged major:0 minor:142 fsType:overlay blockSize:0} overlay_0-1433:{mountpoint:/var/lib/containers/storage/overlay/022f8f2ff49ba81b62a6d444e28cc3708084c9908ca9f1196e23b4a657512ebb/merged major:0 minor:1433 fsType:overlay blockSize:0} overlay_0-1435:{mountpoint:/var/lib/containers/storage/overlay/a281beb14ac54171cbaaa6652075ac19796dd15c9b0930f0ddf3e6aa480a43fb/merged major:0 minor:1435 fsType:overlay blockSize:0} overlay_0-144:{mountpoint:/var/lib/containers/storage/overlay/3f2c80f9d36c1214227d6c455044025129b5bdb1db3ccf98b1e430095e0ba503/merged major:0 minor:144 fsType:overlay blockSize:0} overlay_0-1453:{mountpoint:/var/lib/containers/storage/overlay/5e8a0d1a03b941fda0141f205faac4dcc24d2b3f158779aa8e36fed52a231e18/merged major:0 minor:1453 fsType:overlay blockSize:0} overlay_0-1473:{mountpoint:/var/lib/containers/storage/overlay/52650831d9729a1e1ee0b2e69e9b436bb7625b0cad4635ad68eb60a007e5c2e2/merged major:0 minor:1473 fsType:overlay blockSize:0} overlay_0-1476:{mountpoint:/var/lib/containers/storage/overlay/af260f66a1ec14d88ac99fe00db3e90ebf9bf420d430f56b0d719e75733f6a22/merged major:0 minor:1476 fsType:overlay blockSize:0} overlay_0-1478:{mountpoint:/var/lib/containers/storage/overlay/79f2abd2d0c77d3543fec6ba3fac6961acea89f74ed35aa68cb8e2dc9f51ac73/merged major:0 minor:1478 fsType:overlay blockSize:0} overlay_0-1489:{mountpoint:/var/lib/containers/storage/overlay/226fc1ab0375c6a637346e97528fd44eaa6da31f5e2d0796614c4d8f7fb1ad92/merged major:0 minor:1489 fsType:overlay blockSize:0} overlay_0-149:{mountpoint:/var/lib/containers/storage/overlay/f3b3a86dfaad08c9f6fd6775630eb930f4f331fbab06d418ebca7a7c0f1c88e8/merged major:0 minor:149 fsType:overlay blockSize:0} overlay_0-1497:{mountpoint:/var/lib/containers/storage/overlay/57fa87af87f7b9b06c774bf9e5e4b09b6d11b4851df40cd91c5e00a5cccaddea/merged major:0 minor:1497 fsType:overlay blockSize:0} overlay_0-1499:{mountpoint:/var/lib/containers/storage/overlay/7b91429c0cd877bf304c275a14fed95b7685af742c257ae0c94c80ebac53ef90/merged major:0 minor:1499 fsType:overlay blockSize:0} overlay_0-1501:{mountpoint:/var/lib/containers/storage/overlay/b71398d0aac1c20cb574f56f456590183afcb30cf48302bc987d7ed5f6c9b3d1/merged major:0 minor:1501 fsType:overlay blockSize:0} overlay_0-1503:{mountpoint:/var/lib/containers/storage/overlay/d033ea2ff0543a36aa94f709e3c65756049c593fb9f9d2bee5086087786df5d8/merged major:0 minor:1503 fsType:overlay blockSize:0} overlay_0-1505:{mountpoint:/var/lib/containers/storage/overlay/6e534081271f78d960a20e4a7c670b54ef46e5e2143ac551177ab6b13a9b41b4/merged major:0 minor:1505 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/70e163bd10a7208244060625759610af887d2dc1579c6d63af517e075d054fdc/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-1526:{mountpoint:/var/lib/containers/storage/overlay/6f6cd4bcbb83fbcb2e098bcc7bd7fccf5a8c18f39e61e58eba0bd2296dad84a8/merged major:0 minor:1526 fsType:overlay blockSize:0} overlay_0-1528:{mountpoint:/var/lib/containers/storage/overlay/dd93bfdb859bcdada2d74e3d9ea1bc13953110edae1bc5e8660ed358471bc1b9/merged major:0 minor:1528 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/48709c21f3b233abb92173be272ef9a899bb14db874090ff0da5df2752ba1852/merged major:0 minor:154 fsType:overlay blockSize:0} 
overlay_0-1542:{mountpoint:/var/lib/containers/storage/overlay/82cf7fd86bfcfecdb4f51810a67264697e0061e2f8c5b816c0abff3668a0e760/merged major:0 minor:1542 fsType:overlay blockSize:0} overlay_0-1556:{mountpoint:/var/lib/containers/storage/overlay/288b0d3446042eb06ab00675eba675b52a08666650769e535110281ed0128d12/merged major:0 minor:1556 fsType:overlay blockSize:0} overlay_0-1558:{mountpoint:/var/lib/containers/storage/overlay/4e7241cbf7253266a628f1fc246281844ffbd4e115f5ba436aa86283975d1ca8/merged major:0 minor:1558 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/48931cb0ccc6d2e83f761bc34f71f6c7739f9c81e58294cab3cb582976d50501/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-157:{mountpoint:/var/lib/containers/storage/overlay/5c892911cb2e26892133f8dbbec9dbc9485fa62fb7723c25c659460301fa579c/merged major:0 minor:157 fsType:overlay blockSize:0} overlay_0-1582:{mountpoint:/var/lib/containers/storage/overlay/227c3cf99ae9cdd6ceb603d0d536ee76eb93699e617000599fe3302ff18a70e4/merged major:0 minor:1582 fsType:overlay blockSize:0} overlay_0-1587:{mountpoint:/var/lib/containers/storage/overlay/b243530ab624fa2094ab382bd888cae63f0e811f7995cec00f84cac80d71eaaf/merged major:0 minor:1587 fsType:overlay blockSize:0} overlay_0-1592:{mountpoint:/var/lib/containers/storage/overlay/811e1334ea3b787e260d31a5b2eb15b16a8f3485ed07e22ca5fe8900f57a7a6c/merged major:0 minor:1592 fsType:overlay blockSize:0} overlay_0-1597:{mountpoint:/var/lib/containers/storage/overlay/c002ab9667a3f805b54f0d90e580c09f9de0ea7213ad256f0ce7fee0bfcdef01/merged major:0 minor:1597 fsType:overlay blockSize:0} overlay_0-1598:{mountpoint:/var/lib/containers/storage/overlay/df497b3355d418a77d3fc3221c849474dba16702fce6cdc23f514cc893ce8109/merged major:0 minor:1598 fsType:overlay blockSize:0} overlay_0-1599:{mountpoint:/var/lib/containers/storage/overlay/33b6047db912879820e0021eee5dc3e416baa36307b38cc2b227636d2efc4bf9/merged major:0 minor:1599 fsType:overlay blockSize:0} overlay_0-1610:{mountpoint:/var/lib/containers/storage/overlay/862206f2e07353d334658cb6f4144e9cf4ba8d12449cf3ade2dd2a3bbd253928/merged major:0 minor:1610 fsType:overlay blockSize:0} overlay_0-1611:{mountpoint:/var/lib/containers/storage/overlay/919c5fe75161ee8966ad4b8539911744a117e6fef72edfc40683f0b095e61f52/merged major:0 minor:1611 fsType:overlay blockSize:0} overlay_0-1613:{mountpoint:/var/lib/containers/storage/overlay/cfccf8c67f2da8520d7b415ea1ede34cf25cb80764329d2202ec155221d274f5/merged major:0 minor:1613 fsType:overlay blockSize:0} overlay_0-1621:{mountpoint:/var/lib/containers/storage/overlay/0264c6f2f444523653dda6ad20a852a061bd1ce2d443e45512eb16e35fcdb6d1/merged major:0 minor:1621 fsType:overlay blockSize:0} overlay_0-1623:{mountpoint:/var/lib/containers/storage/overlay/8225f5fd730cc3ca02a86068e2f390d907eaaa55457fe1ffc6e845d14c7161d6/merged major:0 minor:1623 fsType:overlay blockSize:0} overlay_0-1635:{mountpoint:/var/lib/containers/storage/overlay/f7fceb7ea387f3c57de0cf585b547492f91a195bc25646788cbe684ca7b6a0a9/merged major:0 minor:1635 fsType:overlay blockSize:0} overlay_0-1646:{mountpoint:/var/lib/containers/storage/overlay/f3bfd2e93ce9c5153ad6dfb23a06e1b9530b274a3aa3770c67e6bef70d5d63fd/merged major:0 minor:1646 fsType:overlay blockSize:0} overlay_0-1664:{mountpoint:/var/lib/containers/storage/overlay/b96b6b5d4f0a495380375af45f2106212d2007cd68da41b969c6b0016f567a63/merged major:0 minor:1664 fsType:overlay blockSize:0} 
overlay_0-167:{mountpoint:/var/lib/containers/storage/overlay/ae75840c61385c5258e4899df0519b4fc1bd42057cbb230354e66b75d367058f/merged major:0 minor:167 fsType:overlay blockSize:0} overlay_0-169:{mountpoint:/var/lib/containers/storage/overlay/3d2ea3ba6aa2eedd59e0ddd94af4b0be82da6c6f6a993fd08bc02f95d3e2f755/merged major:0 minor:169 fsType:overlay blockSize:0} overlay_0-175:{mountpoint:/var/lib/containers/storage/overlay/9657116123c72154776220102c999d69450f0f27e96206be1cfc89776e64341f/merged major:0 minor:175 fsType:overlay blockSize:0} overlay_0-183:{mountpoint:/var/lib/containers/storage/overlay/04494bcbf20385417053af1b64a08fb3152ffe5dc436ef336583e68a5e70cf78/merged major:0 minor:183 fsType:overlay blockSize:0} overlay_0-185:{mountpoint:/var/lib/containers/storage/overlay/d8c35ed9e289828a058a37888a3f608e2b8a81f948b18171e9fa3bf3081cc96b/merged major:0 minor:185 fsType:overlay blockSize:0} overlay_0-191:{mountpoint:/var/lib/containers/storage/overlay/b55d61767c34d3779531161ebb5ce207629fd0223f94d80090602ec8b2346372/merged major:0 minor:191 fsType:overlay blockSize:0} overlay_0-193:{mountpoint:/var/lib/containers/storage/overlay/9436f93540c593f61efefffeee729587d3c39a874576f9824beef1963400f3a8/merged major:0 minor:193 fsType:overlay blockSize:0} overlay_0-197:{mountpoint:/var/lib/containers/storage/overlay/2624ff8655e84b1b9d16a3c838e7bcc6fadb6ad147a54812b33a1d091f6d4000/merged major:0 minor:197 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/9075e26dd4918d536c65d7dfb72f454bac15a5820cbc32b196010ed7e78e1c37/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-207:{mountpoint:/var/lib/containers/storage/overlay/cecadc37bfe270f9ea4fbfdebb5738483ba1bdb8d11f7725699a9dd02e1a9717/merged major:0 minor:207 fsType:overlay blockSize:0} overlay_0-215:{mountpoint:/var/lib/containers/storage/overlay/f6bbc1c38ac949d8b86ee9c49f228896209d14217f05b7f4df0381546670f512/merged major:0 minor:215 fsType:overlay blockSize:0} overlay_0-216:{mountpoint:/var/lib/containers/storage/overlay/0c0a8a257a360fffccf016c662dd83bf83999899561602573780bcaaa4074c6b/merged major:0 minor:216 fsType:overlay blockSize:0} overlay_0-218:{mountpoint:/var/lib/containers/storage/overlay/6329ca7393a9f73f98e8f8a1a4346f8749b1f1845e2769970cb4684da211f6f4/merged major:0 minor:218 fsType:overlay blockSize:0} overlay_0-219:{mountpoint:/var/lib/containers/storage/overlay/d47df9c87d27be1f7f9575ff4c0d4e51213f958b3465c17c214ce4f459156f94/merged major:0 minor:219 fsType:overlay blockSize:0} overlay_0-221:{mountpoint:/var/lib/containers/storage/overlay/671e8998c43f6f89afd55bc00e156b6bc8e364cb85b1bd7d4b538d2de1825e1c/merged major:0 minor:221 fsType:overlay blockSize:0} overlay_0-226:{mountpoint:/var/lib/containers/storage/overlay/82c83d72088c8e73033d616c2ac12014cfe074c3f7e72b030359ddaaa0b25ebf/merged major:0 minor:226 fsType:overlay blockSize:0} overlay_0-234:{mountpoint:/var/lib/containers/storage/overlay/a4a92a504395e42a41d06894832717265578006fe34fc39609125eb1c2b88a77/merged major:0 minor:234 fsType:overlay blockSize:0} overlay_0-242:{mountpoint:/var/lib/containers/storage/overlay/ed5929c73cfe6aee4c73efbeb1ee350fe3a9fdc038927b1c2a4f44531981f251/merged major:0 minor:242 fsType:overlay blockSize:0} overlay_0-250:{mountpoint:/var/lib/containers/storage/overlay/c5a1f58678f64ba6c7c48de01046065da1042e4ad5fa3012c036642d7bbcec9f/merged major:0 minor:250 fsType:overlay blockSize:0} overlay_0-258:{mountpoint:/var/lib/containers/storage/overlay/eb34961304626bdb305888ccd53ff1d7fda87f1e17b287f3165305d2e5a99b64/merged 
major:0 minor:258 fsType:overlay blockSize:0} overlay_0-266:{mountpoint:/var/lib/containers/storage/overlay/b445c92596425989acedddffc5e8d685ca202e2ed201157087e8ea57e771db2c/merged major:0 minor:266 fsType:overlay blockSize:0} overlay_0-268:{mountpoint:/var/lib/containers/storage/overlay/391629bf5309395cf5220ffd2d0431e657f46b8c978e3c564a5c70e7075fcf43/merged major:0 minor:268 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/f9de57d207775bc31331b29eccbafc5435a1b2ade0950320c9c8e0b7a91a8577/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-282:{mountpoint:/var/lib/containers/storage/overlay/4b45a9cf3cd3696113fd5ef2be01012b560625e7a2d4a640ce7d63d86f0828e2/merged major:0 minor:282 fsType:overlay blockSize:0} overlay_0-325:{mountpoint:/var/lib/containers/storage/overlay/491d62c17bbf975642a16104c22aeca1e5f9471f6e20e437a7fff7f26b536843/merged major:0 minor:325 fsType:overlay blockSize:0} overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/5f9efb85b82089ad9e6dc5101304368f65b4828c11598f9e2d6c25077990d94a/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-331:{mountpoint:/var/lib/containers/storage/overlay/d22c0d96ffa4b3a76bd4cde8c8b73eca758897759ab324ca141f6a1da2631a43/merged major:0 minor:331 fsType:overlay blockSize:0} overlay_0-338:{mountpoint:/var/lib/containers/storage/overlay/442ff403c23d374db51dd6e266ecb381a14c3ca42957ce4647c57421ca48ea6a/merged major:0 minor:338 fsType:overlay blockSize:0} overlay_0-340:{mountpoint:/var/lib/containers/storage/overlay/eec39460c398a0ab3c2f75ee9c8b9750a0cb42ee3910f03cca59b647a4c9b2d0/merged major:0 minor:340 fsType:overlay blockSize:0} overlay_0-342:{mountpoint:/var/lib/containers/storage/overlay/74bcbe88e852483b8ddbae485282b7e996612893370f341ead0e70911e6fb8cc/merged major:0 minor:342 fsType:overlay blockSize:0} overlay_0-344:{mountpoint:/var/lib/containers/storage/overlay/151e96ff8b9f1c9154cec47175544b2dacf0b89cb9ce9f3f793cc899126d858d/merged major:0 minor:344 fsType:overlay blockSize:0} overlay_0-347:{mountpoint:/var/lib/containers/storage/overlay/58adb855510ec1f5f85ff7decb21ba799ad6edca08632b31dddb14b5d63efa61/merged major:0 minor:347 fsType:overlay blockSize:0} overlay_0-351:{mountpoint:/var/lib/containers/storage/overlay/b6871501cb57a88a12f85ad9bfdc415efbf104c0d3bd65c8c659400f31101e2e/merged major:0 minor:351 fsType:overlay blockSize:0} overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/95b53ddb8228d468a65c529d8b907e1ab02e943dc086d2c528c09f4e9135fbb2/merged major:0 minor:352 fsType:overlay blockSize:0} overlay_0-365:{mountpoint:/var/lib/containers/storage/overlay/c20d70107a41357862503fe8119a772dc12bfb905128ffc55cd4b0c3fa1fdf13/merged major:0 minor:365 fsType:overlay blockSize:0} overlay_0-367:{mountpoint:/var/lib/containers/storage/overlay/92836f0969e4f5c5a7e9f48ee95c144ab95f2be5a7629fa831315b74d198775f/merged major:0 minor:367 fsType:overlay blockSize:0} overlay_0-369:{mountpoint:/var/lib/containers/storage/overlay/131762c92a195cfc40f1b154ec252900e54c22cb4d7393690cf866419282ce6b/merged major:0 minor:369 fsType:overlay blockSize:0} overlay_0-371:{mountpoint:/var/lib/containers/storage/overlay/ced1b2fb93774536673c0a838679b65783435dd49a16eb2120e630bc760e169c/merged major:0 minor:371 fsType:overlay blockSize:0} overlay_0-373:{mountpoint:/var/lib/containers/storage/overlay/bb91aa3247cc91d349c02bb268333b9e911fc85be6b73108352b8547af53f0c4/merged major:0 minor:373 fsType:overlay blockSize:0} 
overlay_0-375:{mountpoint:/var/lib/containers/storage/overlay/c8d9e3cc6f4825415f76ae06b60e7f449cd2dd00a6d497b0879587a09a535edc/merged major:0 minor:375 fsType:overlay blockSize:0} overlay_0-380:{mountpoint:/var/lib/containers/storage/overlay/42b0a60920bde8c3432f79db355fa6ebca9b27e0c07397dcdc49bce7ce1676dd/merged major:0 minor:380 fsType:overlay blockSize:0} overlay_0-381:{mountpoint:/var/lib/containers/storage/overlay/2e2b3f2997e582e2560685d4f23e08975df1ec64f5546c56aa78b01ade594881/merged major:0 minor:381 fsType:overlay blockSize:0} overlay_0-383:{mountpoint:/var/lib/containers/storage/overlay/9931adf3bbcb6ac6bfc89f929ab6fb8a653bc68224f3b4b78dad0007b0e0fe0a/merged major:0 minor:383 fsType:overlay blockSize:0} overlay_0-387:{mountpoint:/var/lib/containers/storage/overlay/12ed03e3a6bcefcbaf9dc4e2988b8d4bbcba46ca40d96ae32ac969f1f8e68f57/merged major:0 minor:387 fsType:overlay blockSize:0} overlay_0-389:{mountpoint:/var/lib/containers/storage/overlay/d2689d66fe7e98aaf66c12d818336d0f41bd52e4e7d218cec74e02860136aca0/merged major:0 minor:389 fsType:overlay blockSize:0} overlay_0-393:{mountpoint:/var/lib/containers/storage/overlay/dd01770093a9ccc895f31c5fa6059472a49593a195a42922a252c8155b6f7da5/merged major:0 minor:393 fsType:overlay blockSize:0} overlay_0-41:{mountpoint:/var/lib/containers/storage/overlay/b4a85660e270e700f74ea12469341a1404e29d22cf196bcefc2c22230e441cf9/merged major:0 minor:41 fsType:overlay blockSize:0} overlay_0-412:{mountpoint:/var/lib/containers/storage/overlay/6230ec8ea3df054394c4a88da4c446c31454d30e77629b129a774c37ea6f04b9/merged major:0 minor:412 fsType:overlay blockSize:0} overlay_0-414:{mountpoint:/var/lib/containers/storage/overlay/ffc667c251eca3796c1bca9e400f28b5db798d35b5bc5340b1874f85530e3191/merged major:0 minor:414 fsType:overlay blockSize:0} overlay_0-416:{mountpoint:/var/lib/containers/storage/overlay/7525346cd62720c20fc99858a0f999ed50bc58cb579d08c4d89c5cbafeefad0d/merged major:0 minor:416 fsType:overlay blockSize:0} overlay_0-418:{mountpoint:/var/lib/containers/storage/overlay/04d471edf827c4a0eceb71ec99288894dba2104e92c61243f97c793fe8fd4b78/merged major:0 minor:418 fsType:overlay blockSize:0} overlay_0-42:{mountpoint:/var/lib/containers/storage/overlay/c818f21e8b102268f04d8e5826621dcdbff302762d38dd24d75dd3ad16354ec8/merged major:0 minor:42 fsType:overlay blockSize:0} overlay_0-420:{mountpoint:/var/lib/containers/storage/overlay/14b1f93129e5b9c6165dbcc7dcde784a9cac28b895035e93fc08eb249cf1260a/merged major:0 minor:420 fsType:overlay blockSize:0} overlay_0-425:{mountpoint:/var/lib/containers/storage/overlay/bce9f1e74ea37f3a6d3e9fc43120b15a0ee056456290a83b7a8365b6278c352e/merged major:0 minor:425 fsType:overlay blockSize:0} overlay_0-442:{mountpoint:/var/lib/containers/storage/overlay/cc8a9980cbb20fd57f404c51d82f36aaaa0f4012a12b1731feed7bd68108cbd7/merged major:0 minor:442 fsType:overlay blockSize:0} overlay_0-451:{mountpoint:/var/lib/containers/storage/overlay/9505782716c13e9413b9b55edbc14b5e4091546413ce477a80fb818481e9e1f0/merged major:0 minor:451 fsType:overlay blockSize:0} overlay_0-457:{mountpoint:/var/lib/containers/storage/overlay/a1f30d713cb9607f5db05ce62af3ca960cad61f59cd27ac868f94403b7c87f7d/merged major:0 minor:457 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/a8c0f3fd5d81c84f3840c372a0143eda5936bac0fbf477a55057d9715aecf07f/merged major:0 minor:46 fsType:overlay blockSize:0} 
overlay_0-475:{mountpoint:/var/lib/containers/storage/overlay/9579a37e9d728f0abae49b5167fcaa626a2383e465dd453437e8ca34cb064a58/merged major:0 minor:475 fsType:overlay blockSize:0} overlay_0-477:{mountpoint:/var/lib/containers/storage/overlay/4270ffcf76e0472e37a42c3169877c80fbc6fd2672a2618d030cfb9b58777017/merged major:0 minor:477 fsType:overlay blockSize:0} overlay_0-480:{mountpoint:/var/lib/containers/storage/overlay/c703216059d0ce4824b38d35940003d16eaeb567152bc707a67570cba497444a/merged major:0 minor:480 fsType:overlay blockSize:0} overlay_0-482:{mountpoint:/var/lib/containers/storage/overlay/b29c592880312f28b1bdc4c7d39559b00ffc2bee9831ccbe6be894367cbc14ba/merged major:0 minor:482 fsType:overlay blockSize:0} overlay_0-484:{mountpoint:/var/lib/containers/storage/overlay/8e98b26757e75837a676a44a422dfe089f12b85eb9a1abfd82e2a407dcc18639/merged major:0 minor:484 fsType:overlay blockSize:0} overlay_0-50:{mountpoint:/var/lib/containers/storage/overlay/d2f8339ba46012dcdf62bf942d3b01fab75c400210d71c5addd21f39cc470853/merged major:0 minor:50 fsType:overlay blockSize:0} overlay_0-508:{mountpoint:/var/lib/containers/storage/overlay/0f529d94a9043c629d896de2637758c873ebb165fd0018c187f43087ea025ac8/merged major:0 minor:508 fsType:overlay blockSize:0} overlay_0-512:{mountpoint:/var/lib/containers/storage/overlay/40f25ce4f4626a5fca4388a0646160a468d21af4f3d5e041c703e6f71c2daf5b/merged major:0 minor:512 fsType:overlay blockSize:0} overlay_0-514:{mountpoint:/var/lib/containers/storage/overlay/04868449af57118a34494a51d5078aa99c6192e7ec624f819026145863856124/merged major:0 minor:514 fsType:overlay blockSize:0} overlay_0-516:{mountpoint:/var/lib/containers/storage/overlay/8258d780d0eb3ba28d67064c932d730f526a20548e13231cb4964c1ffb56320b/merged major:0 minor:516 fsType:overlay blockSize:0} overlay_0-518:{mountpoint:/var/lib/containers/storage/overlay/d8672bc15b453cdc7397d232c9e483dcf817aa329d243a6cd6222ebb5f1e9a26/merged major:0 minor:518 fsType:overlay blockSize:0} overlay_0-520:{mountpoint:/var/lib/containers/storage/overlay/4fdc439ec39b625b73bcf2e4c7a7cb21a57862589a5b7642d408c2ba21cefd2f/merged major:0 minor:520 fsType:overlay blockSize:0} overlay_0-523:{mountpoint:/var/lib/containers/storage/overlay/7153ef94af7e70a456358d9c3e321def4dc8e509402cf35c7e602a015cc46244/merged major:0 minor:523 fsType:overlay blockSize:0} overlay_0-529:{mountpoint:/var/lib/containers/storage/overlay/916c92ca0de3353a8a0e13d032f4f7b662953a25fbdd37c164162ba04da00449/merged major:0 minor:529 fsType:overlay blockSize:0} overlay_0-533:{mountpoint:/var/lib/containers/storage/overlay/9e4c6c2719adca5435c7b7457baa9311309ef1f61af290a4c22794b57e703b4c/merged major:0 minor:533 fsType:overlay blockSize:0} overlay_0-542:{mountpoint:/var/lib/containers/storage/overlay/886707d197329de38fe57e4299a3dbe51990cd95617278ae7956c38b05dae57e/merged major:0 minor:542 fsType:overlay blockSize:0} overlay_0-553:{mountpoint:/var/lib/containers/storage/overlay/a27ee7f0968b51d45de25faff2c96d1bed813b32991238bd87b0977e428d7ee7/merged major:0 minor:553 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/f2b8b1a80178106765403cff9ebb04424a4436ba2b927b6a4200cef9199337fb/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-568:{mountpoint:/var/lib/containers/storage/overlay/dd2e0b51841f5c579ba85cfffbf1bcd7bf27923278eec61e076cac17663be590/merged major:0 minor:568 fsType:overlay blockSize:0} overlay_0-570:{mountpoint:/var/lib/containers/storage/overlay/dcd3ee6d2060f02b78116def0011f764aa4b4ee877dd88725348b7b75b82145d/merged 
major:0 minor:570 fsType:overlay blockSize:0} overlay_0-584:{mountpoint:/var/lib/containers/storage/overlay/0f33c806dd3ae68b2ea26f0d5e0384cf5f4743afaa614cac11c74b93d126e9f0/merged major:0 minor:584 fsType:overlay blockSize:0} overlay_0-586:{mountpoint:/var/lib/containers/storage/overlay/b412f5b1a78d54073f7fe8261c15e5399f5337420b600a706605c1a70f71762a/merged major:0 minor:586 fsType:overlay blockSize:0} overlay_0-588:{mountpoint:/var/lib/containers/storage/overlay/8d5730dcf3c12dd288843f004f37397d56c46701371c2b0ed958519f08a6f398/merged major:0 minor:588 fsType:overlay blockSize:0} overlay_0-597:{mountpoint:/var/lib/containers/storage/overlay/9828b9745bbedb47cc02548cb018c0a029611d3f33e2905d4292d792cda49adc/merged major:0 minor:597 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/d809b294c764c9c0241a5e7db8e62ebfac0ec21a59a4c6c0bc0922195053fab8/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-601:{mountpoint:/var/lib/containers/storage/overlay/c051fdca3954ec908e2f4429f9094a6451e07e022c97690c80ab5dfb589b0db7/merged major:0 minor:601 fsType:overlay blockSize:0} overlay_0-604:{mountpoint:/var/lib/containers/storage/overlay/8a5f458b41aee19417499a11ecc57d8f9bbaa00d16926dc23d376d6ab1bfb463/merged major:0 minor:604 fsType:overlay blockSize:0} overlay_0-607:{mountpoint:/var/lib/containers/storage/overlay/a2053f243260219ebe04dacfedf62c1534763ed86396871e7b58d0e95c6ff0d7/merged major:0 minor:607 fsType:overlay blockSize:0} overlay_0-609:{mountpoint:/var/lib/containers/storage/overlay/6c2ec033ccfd92d7db79b473eb482af60b3b8e84519cfa3ba3174a554b1cf851/merged major:0 minor:609 fsType:overlay blockSize:0} overlay_0-617:{mountpoint:/var/lib/containers/storage/overlay/14ddff0cc877ef556c24c921da0bc6923c666621ae40ff2f6e4b9798543bb8aa/merged major:0 minor:617 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/6a5889e75ea70adf3f7acf9c8e161d969b5764f45e222ab8b0ebd29d18618a49/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-633:{mountpoint:/var/lib/containers/storage/overlay/9ff1438435a1c054ed60d55b408fd542fd028750876d15b75cf9972ba58a3cc2/merged major:0 minor:633 fsType:overlay blockSize:0} overlay_0-643:{mountpoint:/var/lib/containers/storage/overlay/572d8b79a53a6c5e7bcbceae8b39713c893136f54d78243110a79bdc79aefd6c/merged major:0 minor:643 fsType:overlay blockSize:0} overlay_0-645:{mountpoint:/var/lib/containers/storage/overlay/84ab5abe49d2fa12fce0258cd142911c76aa2790756fee4c53d7a376ba583e90/merged major:0 minor:645 fsType:overlay blockSize:0} overlay_0-653:{mountpoint:/var/lib/containers/storage/overlay/986039d05c9f5427703990e97b60442ae684ed196c0f288cb3f3f9ec8d779a8b/merged major:0 minor:653 fsType:overlay blockSize:0} overlay_0-656:{mountpoint:/var/lib/containers/storage/overlay/2d683eafc1e2d3aeab217767e674060285897327541aaa8cbbda8f6c52e7a979/merged major:0 minor:656 fsType:overlay blockSize:0} overlay_0-669:{mountpoint:/var/lib/containers/storage/overlay/b1f4edc84ac56af150dab6f04f547e6e7084fc09c42144086080993aa7e28c0e/merged major:0 minor:669 fsType:overlay blockSize:0} overlay_0-671:{mountpoint:/var/lib/containers/storage/overlay/e74abaa278f0cc7de13a889daf95e82556ec5376383156c1bb12b1ad81e796f1/merged major:0 minor:671 fsType:overlay blockSize:0} overlay_0-684:{mountpoint:/var/lib/containers/storage/overlay/5f9d4c79a257767be3a4933ca3aaaa9b9b6d3671c054932d8c9aa29bfc1417c7/merged major:0 minor:684 fsType:overlay blockSize:0} 
overlay_0-686:{mountpoint:/var/lib/containers/storage/overlay/06dd7b04a505c90c8606bdc5a9ea077f214aadb3ea3bedb21d3e0d40dbc838f9/merged major:0 minor:686 fsType:overlay blockSize:0} overlay_0-690:{mountpoint:/var/lib/containers/storage/overlay/835ad4e099025b96249e17494575ec46f2029db53559385610f5f41ab0533e8a/merged major:0 minor:690 fsType:overlay blockSize:0} overlay_0-693:{mountpoint:/var/lib/containers/storage/overlay/0e3d19f5f49ec01052737e8486910b93b8ccb24315bca547ae8125130f242614/merged major:0 minor:693 fsType:overlay blockSize:0} overlay_0-70:{mountpoint:/var/lib/containers/storage/overlay/c9f24280275a9a1565bb74eb69501946c801ae768cdc4e5c9d0e6d17a697c6c3/merged major:0 minor:70 fsType:overlay blockSize:0} overlay_0-707:{mountpoint:/var/lib/containers/storage/overlay/3c38ad51f6d8226161793fd5ec5892a073f3cc5c7050c5088a934795dad40ef6/merged major:0 minor:707 fsType:overlay blockSize:0} overlay_0-709:{mountpoint:/var/lib/containers/storage/overlay/4683a943905d6a287ca915f1b013bf131ebc60360f9c93f4428341227de23ff6/merged major:0 minor:709 fsType:overlay blockSize:0} overlay_0-716:{mountpoint:/var/lib/containers/storage/overlay/3facd3f4fb9d88eddf9811fb658ebfb7b37037615ab17030c8bc861a675be8a7/merged major:0 minor:716 fsType:overlay blockSize:0} overlay_0-718:{mountpoint:/var/lib/containers/storage/overlay/0ea4e157458e5af3d1da98f0fa9d269288fc3bd1d5474b80461e17db3e1b597f/merged major:0 minor:718 fsType:overlay blockSize:0} overlay_0-72:{mountpoint:/var/lib/containers/storage/overlay/35ee205ab705c3dbe110a3c5307f13e75b16fe0623f49f794718beed8320ad57/merged major:0 minor:72 fsType:overlay blockSize:0} overlay_0-726:{mountpoint:/var/lib/containers/storage/overlay/3fe8edc506f5ad6f888740d5cc01c4043f9d71454853c6205a0ce6f03ccce636/merged major:0 minor:726 fsType:overlay blockSize:0} overlay_0-729:{mountpoint:/var/lib/containers/storage/overlay/101aaf1a27606e0c3af9775ebb687833dafee541f3c3640f2f7cbec6fbdb286f/merged major:0 minor:729 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/744de430ea5bbde88ccd9622fa91250d2aa2bdf3281ef587dd9afb15dae512ae/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-740:{mountpoint:/var/lib/containers/storage/overlay/43397402e34c4811465ecb51431c0ddc7501834220a84928e722c914a67bfb03/merged major:0 minor:740 fsType:overlay blockSize:0} overlay_0-751:{mountpoint:/var/lib/containers/storage/overlay/68b41087b1c59d683aa3325438516283b67f4bf74f1467353a99689f141978da/merged major:0 minor:751 fsType:overlay blockSize:0} overlay_0-753:{mountpoint:/var/lib/containers/storage/overlay/ff1cda2e700ab6ee5369cea0f102b3eb804e0e36b14e766f314076df1e2fae7b/merged major:0 minor:753 fsType:overlay blockSize:0} overlay_0-755:{mountpoint:/var/lib/containers/storage/overlay/a7550e2ab9ff37a60ae1927f839b0eb303a8086e66b5193b6c04bbeedd43d30e/merged major:0 minor:755 fsType:overlay blockSize:0} overlay_0-757:{mountpoint:/var/lib/containers/storage/overlay/1c94429a2ef5ea539f6368e7a1b13b025e76825ddea6358d27b6428674044a8f/merged major:0 minor:757 fsType:overlay blockSize:0} overlay_0-760:{mountpoint:/var/lib/containers/storage/overlay/f33eed65b5edcacee9949398ade49d0733957aa348a1b173bf6f034f6a97096b/merged major:0 minor:760 fsType:overlay blockSize:0} overlay_0-762:{mountpoint:/var/lib/containers/storage/overlay/8431d99b6603622a1e2f36990da2b70898e9a75d3a265fc2e93caa11fe332230/merged major:0 minor:762 fsType:overlay blockSize:0} overlay_0-764:{mountpoint:/var/lib/containers/storage/overlay/078440c514d277aa5280f74c53c743c6b0c1e8f52bc17e6b1e80123e6c8e5202/merged 
major:0 minor:764 fsType:overlay blockSize:0} overlay_0-777:{mountpoint:/var/lib/containers/storage/overlay/b48e792fcbb7e451ec6320157c890d482a6d53ca83256df56c753d8c1d2f7392/merged major:0 minor:777 fsType:overlay blockSize:0} overlay_0-779:{mountpoint:/var/lib/containers/storage/overlay/a3aefa6cca0bd1265cc9be3aa66af1edcc2f75afb2b00d850825435e7eec459a/merged major:0 minor:779 fsType:overlay blockSize:0} overlay_0-785:{mountpoint:/var/lib/containers/storage/overlay/101c5f4f6dfe4877108c6ee3243a61232133da485912ce40d5244126d0ed5a12/merged major:0 minor:785 fsType:overlay blockSize:0} overlay_0-787:{mountpoint:/var/lib/containers/storage/overlay/2c2b9f30ffbda812993fadb2e10e8564652b9e704038199b2c388b64ebe23719/merged major:0 minor:787 fsType:overlay blockSize:0} overlay_0-79:{mountpoint:/var/lib/containers/storage/overlay/79dc96e9da32f80a0c72ed346f46f811657809900e8fc35be863c236487015d3/merged major:0 minor:79 fsType:overlay blockSize:0} overlay_0-804:{mountpoint:/var/lib/containers/storage/overlay/5ed8b11580df4c2e05037aaf8555fd83059a69400beb4c3b7ef8c1fe838eb1d8/merged major:0 minor:804 fsType:overlay blockSize:0} overlay_0-806:{mountpoint:/var/lib/containers/storage/overlay/526e93046d54d0adffcd2225a68d2862536702609e78f7812b010e90bfb25f92/merged major:0 minor:806 fsType:overlay blockSize:0} overlay_0-808:{mountpoint:/var/lib/containers/storage/overlay/06b05f62c17623ff5ecd71dfa0649a9c680104d130ac70c65c660394c363144c/merged major:0 minor:808 fsType:overlay blockSize:0} overlay_0-81:{mountpoint:/var/lib/containers/storage/overlay/bf2b289465adeb351ac8e353e32879ad6092237e7a857819d5351b12aacd2c5e/merged major:0 minor:81 fsType:overlay blockSize:0} overlay_0-818:{mountpoint:/var/lib/containers/storage/overlay/47454f329e5a77e061fc3137328b6ab10a2e56bcbbd970415b109cb7eef36841/merged major:0 minor:818 fsType:overlay blockSize:0} overlay_0-835:{mountpoint:/var/lib/containers/storage/overlay/f54b2d328ab9cc8563cd7c94c944b1981e594bbf2f714030b5d281230dc05ca1/merged major:0 minor:835 fsType:overlay blockSize:0} overlay_0-837:{mountpoint:/var/lib/containers/storage/overlay/20bd904466690e5df06a948c4a9a2e5569aa3b71e02017c574d79d3f232304cb/merged major:0 minor:837 fsType:overlay blockSize:0} overlay_0-844:{mountpoint:/var/lib/containers/storage/overlay/6eaff9b3738f8a6b7d44b390b805d2e192dc55d86dc4672886fb583dac9173b8/merged major:0 minor:844 fsType:overlay blockSize:0} overlay_0-850:{mountpoint:/var/lib/containers/storage/overlay/2ea0c618f279163f382c2a538895d905f944920488ae9e58b3c0efeba6287d68/merged major:0 minor:850 fsType:overlay blockSize:0} overlay_0-86:{mountpoint:/var/lib/containers/storage/overlay/4d9b8624dc010bcf70b781618b9ad6f5fec7d8bf3ee5bf1d12013f3f0c05ddbf/merged major:0 minor:86 fsType:overlay blockSize:0} overlay_0-861:{mountpoint:/var/lib/containers/storage/overlay/841d25fd0bfd6beabb071e9e65058da23e7c1ffaba3f9408a199b4c4e42f02f6/merged major:0 minor:861 fsType:overlay blockSize:0} overlay_0-863:{mountpoint:/var/lib/containers/storage/overlay/8365490acd6af9bc8a59051102addff0a2bf13b32ce49ec91319a1b960cfc4cf/merged major:0 minor:863 fsType:overlay blockSize:0} overlay_0-870:{mountpoint:/var/lib/containers/storage/overlay/9b2031bd63cc3db6a513b2014d560038596d1a9eafe845af06b77de26ea1fe09/merged major:0 minor:870 fsType:overlay blockSize:0} overlay_0-872:{mountpoint:/var/lib/containers/storage/overlay/8505e6be84e13d54544c1a02e0b357f3ce6d001de7219eb445d0313cd0c0979c/merged major:0 minor:872 fsType:overlay blockSize:0} 
overlay_0-88:{mountpoint:/var/lib/containers/storage/overlay/49777b4ec12550fbd57f6136b9196ac49766216a57538956caf901b6378abfe1/merged major:0 minor:88 fsType:overlay blockSize:0} overlay_0-883:{mountpoint:/var/lib/containers/storage/overlay/468706368f08634d0bae68dee0c690628bb10179092bd18aa5546cf0058c5a8b/merged major:0 minor:883 fsType:overlay blockSize:0} overlay_0-893:{mountpoint:/var/lib/containers/storage/overlay/fc7347b00e4afd771a3138cdd5c1a05efc40cf6961a161c863fd8df1b00d403b/merged major:0 minor:893 fsType:overlay blockSize:0} overlay_0-90:{mountpoint:/var/lib/containers/storage/overlay/90d3d17cc3f05b0de756ba8c719590b5adee50d4563af4ba5d26bff7958557d7/merged major:0 minor:90 fsType:overlay blockSize:0} overlay_0-900:{mountpoint:/var/lib/containers/storage/overlay/39fe5e96b0731e5de018659a6ed0f8009536664eefa4707d4bcf1f9f790ba452/merged major:0 minor:900 fsType:overlay blockSize:0} overlay_0-913:{mountpoint:/var/lib/containers/storage/overlay/43acfac2758fa8c1c4c05841ce07b3bfe06795cb48b8e623de1a6ad84c2b0a2d/merged major:0 minor:913 fsType:overlay blockSize:0} overlay_0-914:{mountpoint:/var/lib/containers/storage/overlay/f7213a58b78a4132cab40936c3e0fb7c6a82371c155ef60fd6065b251251eca5/merged major:0 minor:914 fsType:overlay blockSize:0} overlay_0-92:{mountpoint:/var/lib/containers/storage/overlay/748b134191a1db2a96e23eff19d200ae36fc156b8d53e77d57540186ed4a30e4/merged major:0 minor:92 fsType:overlay blockSize:0} overlay_0-925:{mountpoint:/var/lib/containers/storage/overlay/172e8a16e1dc33402bc63412ce79679a92563b2bebc93f554bb18f661ea08b95/merged major:0 minor:925 fsType:overlay blockSize:0} overlay_0-932:{mountpoint:/var/lib/containers/storage/overlay/4c4d5f36ffc1dccff8890bf976b37aec65b2e7ddea941fced6e0d9fc94f59fff/merged major:0 minor:932 fsType:overlay blockSize:0} overlay_0-940:{mountpoint:/var/lib/containers/storage/overlay/5eac8b06aa19b4df9f47c534e1a75d798255dbd58cf351cfc4fd0dbec163a902/merged major:0 minor:940 fsType:overlay blockSize:0} overlay_0-95:{mountpoint:/var/lib/containers/storage/overlay/542060b4d19a6edc2413073681e66491da93b3ffd6f26d21e574eb0ad01e69d2/merged major:0 minor:95 fsType:overlay blockSize:0} overlay_0-951:{mountpoint:/var/lib/containers/storage/overlay/8fabb90a669a6adcce25ea8514a54cd0b53649834710ac165d21491f6e65a2f8/merged major:0 minor:951 fsType:overlay blockSize:0} overlay_0-964:{mountpoint:/var/lib/containers/storage/overlay/21f6c850dc6764b03655b4773fdfdaaa53327bf050ffa61abb99ed84f88ca636/merged major:0 minor:964 fsType:overlay blockSize:0} overlay_0-97:{mountpoint:/var/lib/containers/storage/overlay/601ba13153e38ee88036f5c8c2c0554f25b3199b10ba3b6fe94b25b36eb327ff/merged major:0 minor:97 fsType:overlay blockSize:0} overlay_0-970:{mountpoint:/var/lib/containers/storage/overlay/c1a0d6d7011ea67a15b3474ec317e3de1b144156a36fab247714c0bf11441b8c/merged major:0 minor:970 fsType:overlay blockSize:0} overlay_0-971:{mountpoint:/var/lib/containers/storage/overlay/e730b6631c9f442609afa866a135dc634425531a7dc1c37b37cdbdd4a1f8dae2/merged major:0 minor:971 fsType:overlay blockSize:0} overlay_0-977:{mountpoint:/var/lib/containers/storage/overlay/4ab3cd2e2197c70c4b030c60da7c162b7c6fe4bcedffd54693d86771b75a5a34/merged major:0 minor:977 fsType:overlay blockSize:0} overlay_0-979:{mountpoint:/var/lib/containers/storage/overlay/8cff31d152a430e01dac145edc57c143a4d577baa9c3f3bc27da1310ccd880b6/merged major:0 minor:979 fsType:overlay blockSize:0} overlay_0-99:{mountpoint:/var/lib/containers/storage/overlay/5607917806a9675d25bb1888b95373a4fb431dd04db48c693c08ae9e98db3a83/merged major:0 
minor:99 fsType:overlay blockSize:0}] Dec 03 22:10:35.008275 master-0 kubenswrapper[36504]: I1203 22:10:35.005027 36504 manager.go:217] Machine: {Timestamp:2025-12-03 22:10:35.001429136 +0000 UTC m=+0.221201163 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2800000 MemoryCapacity:50514153472 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:4df8bb8220024c58895f94eb9cdab694 SystemUUID:4df8bb82-2002-4c58-895f-94eb9cdab694 BootID:a203903b-0841-4def-8d3c-caca39fd1aed Filesystems:[{Device:overlay_0-1197 DeviceMajor:0 DeviceMinor:1197 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-686 DeviceMajor:0 DeviceMinor:686 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1126 DeviceMajor:0 DeviceMinor:1126 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1210 DeviceMajor:0 DeviceMinor:1210 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-393 DeviceMajor:0 DeviceMinor:393 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-520 DeviceMajor:0 DeviceMinor:520 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-72 DeviceMajor:0 DeviceMinor:72 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3446e10d150ba6b0a0d4376dd59be02274fd256ae9b67434db5d0c00d0a96a36/userdata/shm DeviceMajor:0 DeviceMinor:1471 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1501 DeviceMajor:0 DeviceMinor:1501 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-416 DeviceMajor:0 DeviceMinor:416 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-99 DeviceMajor:0 DeviceMinor:99 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-369 DeviceMajor:0 DeviceMinor:369 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2b014bee-5931-4856-b9e8-e38a134a1b6b/volumes/kubernetes.io~projected/kube-api-access-ntg2z DeviceMajor:0 DeviceMinor:470 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-514 DeviceMajor:0 DeviceMinor:514 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1200 DeviceMajor:0 DeviceMinor:1200 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1369 DeviceMajor:0 DeviceMinor:1369 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-870 DeviceMajor:0 DeviceMinor:870 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1635 DeviceMajor:0 DeviceMinor:1635 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/kubelet/pods/ac3d3235-531e-4c7d-9fc9-e65c97970d0f/volumes/kubernetes.io~projected/kube-api-access-hzjtq DeviceMajor:0 DeviceMinor:993 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/6e96335e-1866-41c8-b128-b95e783a9be4/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:1003 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1091 DeviceMajor:0 DeviceMinor:1091 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bcbec7ef-0b98-4346-8c6b-c5fa37e90286/volumes/kubernetes.io~secret/node-exporter-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1465 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/7c8ec36d-9179-40ab-a448-440b4501b3e0/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:599 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-234 DeviceMajor:0 DeviceMinor:234 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5a958a4641ec8ffd3efa6fee5b0f524a6a79e473a45d1d057045bc0b833ac5d9/userdata/shm DeviceMajor:0 DeviceMinor:111 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-79 DeviceMajor:0 DeviceMinor:79 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1401 DeviceMajor:0 DeviceMinor:1401 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1308 DeviceMajor:0 DeviceMinor:1308 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:295 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:296 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f/volumes/kubernetes.io~projected/kube-api-access-v28xw DeviceMajor:0 DeviceMinor:929 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/86b3974531c12f65b34f84b67c60273ab31a4569e9157b7dd04d59eef5e8591d/userdata/shm DeviceMajor:0 DeviceMinor:783 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/54767c36-ca29-4c91-9a8a-9699ecfa4afb/volumes/kubernetes.io~projected/kube-api-access-bl7tn DeviceMajor:0 DeviceMinor:692 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fa9b5917-d4f3-4372-a200-45b57412f92f/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:840 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ac3d3235-531e-4c7d-9fc9-e65c97970d0f/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:988 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1478 DeviceMajor:0 DeviceMinor:1478 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/111d6b874dbe9030949b3d244afae39cb8fd431d88d3f5725c0ae2c79e636977/userdata/shm DeviceMajor:0 DeviceMinor:165 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ead64d1f04c9817f87eca391b24ce53dab18082c97baff6cee6cb8262295728f/userdata/shm DeviceMajor:0 DeviceMinor:360 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1497 DeviceMajor:0 DeviceMinor:1497 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/452012bf-eae1-4e69-9ba1-034309e9f2c8/volumes/kubernetes.io~projected/kube-api-access-b9q7k DeviceMajor:0 DeviceMinor:1618 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1290 DeviceMajor:0 DeviceMinor:1290 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/302f177926043f4c13db47cefcd709d813569fccfeda81cb1d3c813d238c6d9a/userdata/shm DeviceMajor:0 DeviceMinor:1347 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-167 DeviceMajor:0 DeviceMinor:167 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~projected/kube-api-access-b7zsw DeviceMajor:0 DeviceMinor:592 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-690 DeviceMajor:0 DeviceMinor:690 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/09da87e2dab0559242037544c26bbf09555449a73117b70108fdb02f60b3cce2/userdata/shm DeviceMajor:0 DeviceMinor:960 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4b5dd9433f686e30d70877cae31ac505a20b05bd1de2322e270f022f7fa31aa9/userdata/shm DeviceMajor:0 DeviceMinor:1017 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/8b56c318-09b7-47f0-a7bf-32eb96e836ca/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:1101 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/77e36f4e-845b-4b82-8abc-b634636c087a/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:1123 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-81 DeviceMajor:0 DeviceMinor:81 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-373 DeviceMajor:0 DeviceMinor:373 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ba624ed0-32cc-4c87-81a5-708a8a8a7f88/volumes/kubernetes.io~projected/kube-api-access-n4gds DeviceMajor:0 DeviceMinor:803 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1315 DeviceMajor:0 DeviceMinor:1315 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:171 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~projected/kube-api-access-t4c4r DeviceMajor:0 DeviceMinor:299 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a4399d20-f9a6-4ab1-86be-e2845394eaba/volumes/kubernetes.io~projected/kube-api-access-2krlg DeviceMajor:0 DeviceMinor:310 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e50b85a6-7767-4fca-8133-8243bdd85e5d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:989 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/8b56c318-09b7-47f0-a7bf-32eb96e836ca/volumes/kubernetes.io~projected/kube-api-access-qtx6m DeviceMajor:0 DeviceMinor:1102 Capacity:49335554048 Type:vfs Inodes:6166278 
HasInodes:true} {Device:overlay_0-1226 DeviceMajor:0 DeviceMinor:1226 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a9940ff5-36a6-4c04-a51d-66f7d83bea7c/volumes/kubernetes.io~projected/kube-api-access-thgv2 DeviceMajor:0 DeviceMinor:832 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-119 DeviceMajor:0 DeviceMinor:119 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-325 DeviceMajor:0 DeviceMinor:325 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5d92800ba121e99986d7a48bba9d36e70382652164434a33df53407af047f1d8/userdata/shm DeviceMajor:0 DeviceMinor:359 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-755 DeviceMajor:0 DeviceMinor:755 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bd18a700-53b2-430c-a34f-dbb6331cfbe5/volumes/kubernetes.io~projected/kube-api-access-ll9bs DeviceMajor:0 DeviceMinor:697 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-175 DeviceMajor:0 DeviceMinor:175 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e6d5d61a-c5de-4619-9afb-7fad63ba0525/volumes/kubernetes.io~projected/kube-api-access-s68fd DeviceMajor:0 DeviceMinor:281 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/04f5fc52-4ec2-48c3-8441-2b15ad632233/volumes/kubernetes.io~projected/kube-api-access-tgddm DeviceMajor:0 DeviceMinor:332 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-777 DeviceMajor:0 DeviceMinor:777 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f59094ec-47dd-4547-ad41-b15a7933f461/volumes/kubernetes.io~projected/kube-api-access-mzklx DeviceMajor:0 DeviceMinor:313 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-970 DeviceMajor:0 DeviceMinor:970 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1503 DeviceMajor:0 DeviceMinor:1503 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e3f5f9ddf501f4d5ccacebd325174974a148fba4c7c203f9e2de430d8e8e4795/userdata/shm DeviceMajor:0 DeviceMinor:479 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:565 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b0c53893b7d584ad782050fd24c3bd810082c269ba863ea58500a5c74b322c5a/userdata/shm DeviceMajor:0 DeviceMinor:739 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/42c8513568caf77d747657eb0bb6e5e7f16c48cd244adc21c473c785507cff41/userdata/shm DeviceMajor:0 DeviceMinor:1107 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/bd18a700-53b2-430c-a34f-dbb6331cfbe5/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:696 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1166 DeviceMajor:0 DeviceMinor:1166 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257078784 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:525 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-656 DeviceMajor:0 DeviceMinor:656 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/54de27477181cce7acb3d618802b3284f638cd785e8880f9dfe2503277e5076e/userdata/shm DeviceMajor:0 DeviceMinor:737 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-729 DeviceMajor:0 DeviceMinor:729 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-387 DeviceMajor:0 DeviceMinor:387 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:548 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1359 DeviceMajor:0 DeviceMinor:1359 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-475 DeviceMajor:0 DeviceMinor:475 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-757 DeviceMajor:0 DeviceMinor:757 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c807d487-5b8f-4747-87ee-df0637e2e11f/volumes/kubernetes.io~projected/kube-api-access-nlw7s DeviceMajor:0 DeviceMinor:104 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/a9940ff5-36a6-4c04-a51d-66f7d83bea7c/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:1007 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-932 DeviceMajor:0 DeviceMinor:932 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~projected/kube-api-access-97kqz DeviceMajor:0 DeviceMinor:1341 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d8495441bff4f990cf5fc3b05bed536e0b798949d41f4292acbe3e4812a6870c/userdata/shm DeviceMajor:0 DeviceMinor:354 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f6498ac1-7d07-4a5f-a968-d8bda72d1002/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:392 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c899fa45611ca8adf80b1957f02ef9824a27d5c29a283564fa77a53c34bc01e4/userdata/shm DeviceMajor:0 DeviceMinor:595 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b3ab62262f6fcfe451dadb0b4353828f4d4962cfc03eadf6453b600925623b4d/userdata/shm DeviceMajor:0 DeviceMinor:78 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1473 DeviceMajor:0 DeviceMinor:1473 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1419 DeviceMajor:0 DeviceMinor:1419 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-86 DeviceMajor:0 DeviceMinor:86 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-484 DeviceMajor:0 DeviceMinor:484 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1379 DeviceMajor:0 DeviceMinor:1379 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/865783a22ac5862e11b6d307c10f02810e48cdbea9a1b3078ae19f77014ddb00/userdata/shm DeviceMajor:0 DeviceMinor:127 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-477 DeviceMajor:0 DeviceMinor:477 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-726 DeviceMajor:0 DeviceMinor:726 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-226 DeviceMajor:0 DeviceMinor:226 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:293 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1150 DeviceMajor:0 DeviceMinor:1150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-50 DeviceMajor:0 DeviceMinor:50 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5225581315ea5da37a467d8053b6d7cca0c7b3fadc88ca04e9c3e4145300fad4/userdata/shm DeviceMajor:0 DeviceMinor:930 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-861 DeviceMajor:0 DeviceMinor:861 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-120 DeviceMajor:0 DeviceMinor:120 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a124c14f-20c6-4df3-956f-a858de0c73c9/volumes/kubernetes.io~projected/kube-api-access-fq55c DeviceMajor:0 DeviceMinor:1343 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~secret/secret-metrics-client-certs DeviceMajor:0 DeviceMinor:1551 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1611 DeviceMajor:0 DeviceMinor:1611 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1059 DeviceMajor:0 DeviceMinor:1059 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1239 DeviceMajor:0 DeviceMinor:1239 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-940 DeviceMajor:0 DeviceMinor:940 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1598 DeviceMajor:0 DeviceMinor:1598 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1146 DeviceMajor:0 DeviceMinor:1146 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:289 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/e403ab42-1840-4292-a37c-a8d4feeb54ca/volumes/kubernetes.io~projected/kube-api-access-tkg95 DeviceMajor:0 DeviceMinor:1120 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-112 DeviceMajor:0 DeviceMinor:112 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-671 DeviceMajor:0 DeviceMinor:671 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1015 DeviceMajor:0 DeviceMinor:1015 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1192 DeviceMajor:0 DeviceMinor:1192 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:1338 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1433 DeviceMajor:0 DeviceMinor:1433 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6cee457a91036b49aea08a36be55afac46ecdcc7958eeb7e1aefc04a6a5aeae8/userdata/shm DeviceMajor:0 DeviceMinor:665 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1012 DeviceMajor:0 DeviceMinor:1012 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1371 DeviceMajor:0 DeviceMinor:1371 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-684 DeviceMajor:0 DeviceMinor:684 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bebd69d2-5b0f-4b66-8722-d6861eba3e12/volumes/kubernetes.io~projected/kube-api-access-n7qqf DeviceMajor:0 DeviceMinor:303 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5b457e4b5344867e1c07054947dfb70a261a415f7e6f2a317dd28940b878f21e/userdata/shm DeviceMajor:0 DeviceMinor:996 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1111 DeviceMajor:0 DeviceMinor:1111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/27cf3d4301968602620f0710474b0cd1874a47ae80ca26e646bde5b1b38a2e9d/userdata/shm DeviceMajor:0 DeviceMinor:67 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1489 DeviceMajor:0 DeviceMinor:1489 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1592 DeviceMajor:0 DeviceMinor:1592 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-542 DeviceMajor:0 DeviceMinor:542 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ffad8fc8-4378-44de-8864-dd2f666ade68/volumes/kubernetes.io~projected/kube-api-access-xcq9j DeviceMajor:0 DeviceMinor:146 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/ba624ed0-32cc-4c87-81a5-708a8a8a7f88/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:802 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1148 DeviceMajor:0 DeviceMinor:1148 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f6498ac1-7d07-4a5f-a968-d8bda72d1002/volumes/kubernetes.io~projected/kube-api-access-bt8l5 DeviceMajor:0 DeviceMinor:426 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1381 DeviceMajor:0 DeviceMinor:1381 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e81517a49f1ce4beaf68d43f0efeea95ff0f8821517abab2c4989ee77cd41248/userdata/shm DeviceMajor:0 DeviceMinor:600 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-250 DeviceMajor:0 DeviceMinor:250 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/33442f434ca7c4c52187f1b4d1183975995d4960e8685cc35c4e0acc9b058c70/userdata/shm DeviceMajor:0 DeviceMinor:468 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/66aa2598-f4b6-4d3a-9623-aeb707e4912b/volumes/kubernetes.io~projected/kube-api-access-mw7l6 DeviceMajor:0 DeviceMinor:703 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a7ee3d19bf23284e7bcdb3c58da723e7cde6ebcface7f54320c36b317f73830b/userdata/shm DeviceMajor:0 DeviceMinor:959 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1022 DeviceMajor:0 DeviceMinor:1022 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-97 DeviceMajor:0 DeviceMinor:97 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102833152 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~projected/kube-api-access-4clxk DeviceMajor:0 DeviceMinor:306 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-553 DeviceMajor:0 DeviceMinor:553 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-588 DeviceMajor:0 DeviceMinor:588 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1272 DeviceMajor:0 DeviceMinor:1272 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:309 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fa9b5917-d4f3-4372-a200-45b57412f92f/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:839 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0d649d9574e5890c562a660d0fd6b1f641cfe7fa0750549ecdfb69ce6e896311/userdata/shm DeviceMajor:0 DeviceMinor:324 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1599 DeviceMajor:0 DeviceMinor:1599 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b8194009-3743-4da7-baf1-f9bb0afd6187/volumes/kubernetes.io~projected/kube-api-access-rd9vn DeviceMajor:0 DeviceMinor:122 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/04f5fc52-4ec2-48c3-8441-2b15ad632233/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:731 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-105 DeviceMajor:0 DeviceMinor:105 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1582 DeviceMajor:0 DeviceMinor:1582 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/28c42112-a09e-4b7a-b23b-c06bef69cbfb/volumes/kubernetes.io~projected/kube-api-access-89p9d DeviceMajor:0 DeviceMinor:466 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/da949cf7-ab12-43ff-8e45-da1c2fd46e20/volumes/kubernetes.io~projected/ca-certs 
DeviceMajor:0 DeviceMinor:590 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1453 DeviceMajor:0 DeviceMinor:1453 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-95 DeviceMajor:0 DeviceMinor:95 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-185 DeviceMajor:0 DeviceMinor:185 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-197 DeviceMajor:0 DeviceMinor:197 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-282 DeviceMajor:0 DeviceMinor:282 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:576 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/6e96335e-1866-41c8-b128-b95e783a9be4/volumes/kubernetes.io~projected/kube-api-access-v8rjd DeviceMajor:0 DeviceMinor:1005 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d6fafa97-812d-4588-95f8-7c4d85f53098/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:1443 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/512e0124a102c710ff4b5f6b30983def3e7ae3d4ea65e0ce10b5f791e3c6d0c9/userdata/shm DeviceMajor:0 DeviceMinor:173 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-760 DeviceMajor:0 DeviceMinor:760 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1278 DeviceMajor:0 DeviceMinor:1278 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:294 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-338 DeviceMajor:0 DeviceMinor:338 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-604 DeviceMajor:0 DeviceMinor:604 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/963f097bb66f564ca2cfab23207306eb31cee167ee206a1a05491adca1aa31a9/userdata/shm DeviceMajor:0 DeviceMinor:772 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-951 DeviceMajor:0 DeviceMinor:951 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/814c8acf-fb8d-4f57-b8db-21304402c1f1/volumes/kubernetes.io~projected/kube-api-access-x7fsz DeviceMajor:0 DeviceMinor:333 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-900 DeviceMajor:0 DeviceMinor:900 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-420 DeviceMajor:0 DeviceMinor:420 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/922419d4-b528-472e-8215-4a55a96dab08/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:1337 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1188 DeviceMajor:0 DeviceMinor:1188 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-835 DeviceMajor:0 DeviceMinor:835 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ee51a7c7f756f465a21b62d37220086693d4cf6214b3eb70fad749bc4447332b/userdata/shm DeviceMajor:0 DeviceMinor:1018 
Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1063 DeviceMajor:0 DeviceMinor:1063 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1528 DeviceMajor:0 DeviceMinor:1528 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7f5e855d5ea32f73759be5b29eddccce01f7fec70b6614f70377310dbf597215/userdata/shm DeviceMajor:0 DeviceMinor:997 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1385 DeviceMajor:0 DeviceMinor:1385 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f3df99b2206cbdec4d3c14a3658db30c624bd0d8a8285e540bec8d439783f20e/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6140791f4e5979c7be1bc34a3f290363c6ce9b4c0931d1ef70bec66f9fb74445/userdata/shm DeviceMajor:0 DeviceMinor:107 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/39570e307c085c431159acfdbb22b12b2a8620889b6ed8e3fd6e335c0088192a/userdata/shm DeviceMajor:0 DeviceMinor:526 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1055 DeviceMajor:0 DeviceMinor:1055 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1283 DeviceMajor:0 DeviceMinor:1283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1505 DeviceMajor:0 DeviceMinor:1505 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/64856d96-023f-46db-819c-02f1adea5aab/volumes/kubernetes.io~projected/kube-api-access-h2tqg DeviceMajor:0 DeviceMinor:699 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-872 DeviceMajor:0 DeviceMinor:872 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0b613104230c7facfafaae05d2e6f8b12309442292c7644b3476f2df0ca9f688/userdata/shm DeviceMajor:0 DeviceMinor:335 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1106 DeviceMajor:0 DeviceMinor:1106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/263e27d05b8311eaef7fd597428646a031a70345e2468d3a197a3e76a71409ad/userdata/shm DeviceMajor:0 DeviceMinor:61 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/add88bf0-c88d-427d-94bb-897e088a1378/volumes/kubernetes.io~projected/kube-api-access-hkhtr DeviceMajor:0 DeviceMinor:305 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2ec578e3da37e24a4ab2c27a8d268228dd8201d926f60aaadf8b617ef377a21b/userdata/shm DeviceMajor:0 DeviceMinor:322 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/def52ba3-77c1-4e0c-8a0d-44ff4d677607/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:1187 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1664 DeviceMajor:0 DeviceMinor:1664 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-693 DeviceMajor:0 DeviceMinor:693 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e50b85a6-7767-4fca-8133-8243bdd85e5d/volumes/kubernetes.io~projected/kube-api-access-z5q4k DeviceMajor:0 DeviceMinor:992 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1061 DeviceMajor:0 DeviceMinor:1061 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/10fc6516-cd4d-4291-a26d-8376ba0affef/volumes/kubernetes.io~projected/kube-api-access-h9pcw DeviceMajor:0 DeviceMinor:94 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/787c50e1-35b5-43d7-9c26-8dd5399693d3/volumes/kubernetes.io~projected/kube-api-access-jzl8x DeviceMajor:0 DeviceMinor:1339 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-418 DeviceMajor:0 DeviceMinor:418 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1646 DeviceMajor:0 DeviceMinor:1646 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dfe68018d93ee1e8ecdb8139e17ee4db21f7dabb8af64d6ea19c76db0b09dab1/userdata/shm DeviceMajor:0 DeviceMinor:189 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:315 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/volumes/kubernetes.io~projected/kube-api-access-4z9vv DeviceMajor:0 DeviceMinor:563 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-751 DeviceMajor:0 DeviceMinor:751 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-979 DeviceMajor:0 DeviceMinor:979 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1218 DeviceMajor:0 DeviceMinor:1218 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1383 DeviceMajor:0 DeviceMinor:1383 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-344 DeviceMajor:0 DeviceMinor:344 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1168 DeviceMajor:0 DeviceMinor:1168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0869de9b-6f5b-4c31-81ad-02a9c8888193/volumes/kubernetes.io~projected/kube-api-access-2kfg5 DeviceMajor:0 DeviceMinor:304 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8d88d9135730dcd89e7695b4c5e7eac40a1af6c6d348f3b0618123e49a4338e5/userdata/shm DeviceMajor:0 DeviceMinor:334 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-597 DeviceMajor:0 DeviceMinor:597 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1435 DeviceMajor:0 DeviceMinor:1435 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-740 DeviceMajor:0 DeviceMinor:740 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1294 DeviceMajor:0 DeviceMinor:1294 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-340 DeviceMajor:0 DeviceMinor:340 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-383 DeviceMajor:0 DeviceMinor:383 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-512 DeviceMajor:0 
DeviceMinor:512 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-568 DeviceMajor:0 DeviceMinor:568 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0e9427b8-d62c-45f7-97d0-1f7667ff27aa/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:713 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1024 DeviceMajor:0 DeviceMinor:1024 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-482 DeviceMajor:0 DeviceMinor:482 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/670cc7960481086b8965fb2da3afeafd568cc7eac708174b6ff365eec4bad5b9/userdata/shm DeviceMajor:0 DeviceMinor:968 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1312 DeviceMajor:0 DeviceMinor:1312 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1324 DeviceMajor:0 DeviceMinor:1324 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-371 DeviceMajor:0 DeviceMinor:371 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d6fafa97-812d-4588-95f8-7c4d85f53098/volumes/kubernetes.io~projected/kube-api-access-gcb88 DeviceMajor:0 DeviceMinor:1445 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fdfbaebe-d655-4c1e-a039-08802c5c35c5/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:298 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/54767c36-ca29-4c91-9a8a-9699ecfa4afb/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:704 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-88 DeviceMajor:0 DeviceMinor:88 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2d592f19-c7b9-4b29-9ca2-848572067908/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1461 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:163 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8a7e6b82c1580d9d7cf8afc8bf7e406625525f7c96b5b72cc9bd2e8e78541ba8/userdata/shm DeviceMajor:0 DeviceMinor:714 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-609 DeviceMajor:0 DeviceMinor:609 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1089 DeviceMajor:0 DeviceMinor:1089 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-480 DeviceMajor:0 DeviceMinor:480 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/82b2e653c93ba4a69b0223c8268dab1e869db2d0513ffbaf110dba2876358bfa/userdata/shm DeviceMajor:0 DeviceMinor:635 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1109 DeviceMajor:0 DeviceMinor:1109 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-523 DeviceMajor:0 DeviceMinor:523 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1212 DeviceMajor:0 
DeviceMinor:1212 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/812401c0-d1ac-4857-b939-217b7b07f8bc/volumes/kubernetes.io~projected/kube-api-access-mxl5r DeviceMajor:0 DeviceMinor:151 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d4dcd3a3c921a750603eee714d78acc1c3287857fa9a3fa51cbb8e99aa49ea09/userdata/shm DeviceMajor:0 DeviceMinor:318 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f8227287562fdcce3f6bc21dbb0cca3acb31c35d8825110980ae93ba96b9894f/userdata/shm DeviceMajor:0 DeviceMinor:705 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/50076985-bbaa-4bcf-9d1a-cc25bed016a7/volumes/kubernetes.io~projected/kube-api-access-jvw7p DeviceMajor:0 DeviceMinor:301 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:575 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e10b9a0a37f16109552ec2b60e216265154523259956cbf4150a13cac8e5cf2e/userdata/shm DeviceMajor:0 DeviceMinor:546 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-837 DeviceMajor:0 DeviceMinor:837 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3a7e0eea-3da8-43de-87bc-d10231e7c239/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:955 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1159 DeviceMajor:0 DeviceMinor:1159 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-753 DeviceMajor:0 DeviceMinor:753 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d0be52f3-b318-4630-b4da-f3c4a57d5818/volumes/kubernetes.io~secret/openshift-state-metrics-tls DeviceMajor:0 DeviceMinor:1463 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-914 DeviceMajor:0 DeviceMinor:914 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-883 DeviceMajor:0 DeviceMinor:883 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/dd3ce2cf375c8b287635e37da192dc2c907d3dab5e23c6437d445a1478d56fac/userdata/shm DeviceMajor:0 DeviceMinor:1000 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-913 DeviceMajor:0 DeviceMinor:913 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/29ac4a9d-1228-49c7-9051-338e7dc98a38/volumes/kubernetes.io~projected/kube-api-access-p4mbz DeviceMajor:0 DeviceMinor:164 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-193 DeviceMajor:0 DeviceMinor:193 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-207 DeviceMajor:0 DeviceMinor:207 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a9a3f403-a742-4977-901a-cf4a8eb7df5a/volumes/kubernetes.io~projected/kube-api-access-575vn DeviceMajor:0 DeviceMinor:300 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/785612fc-3f78-4f1a-bc83-7afe5d3b8056/volumes/kubernetes.io~projected/kube-api-access-j6m8f DeviceMajor:0 DeviceMinor:308 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-584 DeviceMajor:0 DeviceMinor:584 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-219 DeviceMajor:0 DeviceMinor:219 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3d9073cce7422ee3531e5bcb005037bb8c3536325d2ddba91234eef2010050ed/userdata/shm DeviceMajor:0 DeviceMinor:578 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/da949cf7-ab12-43ff-8e45-da1c2fd46e20/volumes/kubernetes.io~projected/kube-api-access-dtxdk DeviceMajor:0 DeviceMinor:593 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/62b43fe1-63f5-4d29-90a2-f36cb9e880ff/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:957 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/2d592f19-c7b9-4b29-9ca2-848572067908/volumes/kubernetes.io~secret/kube-state-metrics-tls DeviceMajor:0 DeviceMinor:1475 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1499 DeviceMajor:0 DeviceMinor:1499 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/64856d96-023f-46db-819c-02f1adea5aab/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:427 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/3a7e0eea-3da8-43de-87bc-d10231e7c239/volumes/kubernetes.io~projected/kube-api-access-6kvxd DeviceMajor:0 DeviceMinor:956 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a5901bdc085b43cef6161fde550ec83afa0dc23a3c44d303a53b3a2d9164d54f/userdata/shm DeviceMajor:0 DeviceMinor:486 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1093 DeviceMajor:0 DeviceMinor:1093 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1542 DeviceMajor:0 DeviceMinor:1542 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-533 DeviceMajor:0 DeviceMinor:533 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1147 DeviceMajor:0 DeviceMinor:1147 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1151 DeviceMajor:0 DeviceMinor:1151 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1351 DeviceMajor:0 DeviceMinor:1351 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-844 DeviceMajor:0 DeviceMinor:844 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-375 DeviceMajor:0 DeviceMinor:375 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-442 DeviceMajor:0 DeviceMinor:442 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-653 DeviceMajor:0 DeviceMinor:653 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1250 DeviceMajor:0 DeviceMinor:1250 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-818 DeviceMajor:0 DeviceMinor:818 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-850 DeviceMajor:0 DeviceMinor:850 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-529 DeviceMajor:0 DeviceMinor:529 Capacity:214143315968 Type:vfs
Inodes:104594880 HasInodes:true} {Device:overlay_0-347 DeviceMajor:0 DeviceMinor:347 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a124c14f-20c6-4df3-956f-a858de0c73c9/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:1342 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-92 DeviceMajor:0 DeviceMinor:92 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1613 DeviceMajor:0 DeviceMinor:1613 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:314 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/def52ba3-77c1-4e0c-8a0d-44ff4d677607/volumes/kubernetes.io~projected/kube-api-access-dcvkk DeviceMajor:0 DeviceMinor:1346 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/66f7b08c-52e8-4795-9cf0-74402a9cc0bb/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:1429 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1296 DeviceMajor:0 DeviceMinor:1296 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-617 DeviceMajor:0 DeviceMinor:617 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1044 DeviceMajor:0 DeviceMinor:1044 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:297 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/volumes/kubernetes.io~projected/kube-api-access-df8nl DeviceMajor:0 DeviceMinor:567 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:591 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3cbd73f86ab3e16fe0a3db2c1ea58704181c81848bd40dec7b7bf4aa0ba51d20/userdata/shm DeviceMajor:0 DeviceMinor:1487 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1587 DeviceMajor:0 DeviceMinor:1587 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-42 DeviceMajor:0 DeviceMinor:42 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-191 DeviceMajor:0 DeviceMinor:191 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:292 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bc976a2c91b624ecb3225ef5481f2ee770ab314872bf257e672f9145ca896171/userdata/shm DeviceMajor:0 DeviceMinor:581 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-718 DeviceMajor:0 DeviceMinor:718 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~projected/kube-api-access-npkww DeviceMajor:0 DeviceMinor:1553 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/941d02fc970e3b112abf225c70ef6df0a3f8ad0cb493409725bb2eddfc0e94f7/userdata/shm DeviceMajor:0 DeviceMinor:361 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/452012bf-eae1-4e69-9ba1-034309e9f2c8/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:1617 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257074688 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:307 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-365 DeviceMajor:0 DeviceMinor:365 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:566 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/77e36f4e-845b-4b82-8abc-b634636c087a/volumes/kubernetes.io~projected/kube-api-access-s87hj DeviceMajor:0 DeviceMinor:1124 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-414 DeviceMajor:0 DeviceMinor:414 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-90 DeviceMajor:0 DeviceMinor:90 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1333 DeviceMajor:0 DeviceMinor:1333 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-142 DeviceMajor:0 DeviceMinor:142 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-451 DeviceMajor:0 DeviceMinor:451 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:549 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/578b2d03-b8b3-4c75-adde-73899c472ad7/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:990 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/f9a3f900-60e4-49c2-85ec-88d19852d1b9/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:1004 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1306 DeviceMajor:0 DeviceMinor:1306 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1476 DeviceMajor:0 DeviceMinor:1476 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/08432be8-0086-48d2-a93d-7a474e96749d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:287 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-808 DeviceMajor:0 DeviceMinor:808 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1245 DeviceMajor:0 DeviceMinor:1245 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~projected/kube-api-access-5h2wx DeviceMajor:0 DeviceMinor:545 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/bebd69d2-5b0f-4b66-8722-d6861eba3e12/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:730 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1057 DeviceMajor:0 DeviceMinor:1057 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-41 DeviceMajor:0 DeviceMinor:41 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-425 DeviceMajor:0 DeviceMinor:425 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/23abcfb0261fbbab58bbff5305c7ff7c6aa8a2b495f632704ea4c9372088e68e/userdata/shm DeviceMajor:0 DeviceMinor:93 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-331 DeviceMajor:0 DeviceMinor:331 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-787 DeviceMajor:0 DeviceMinor:787 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/578b2d03-b8b3-4c75-adde-73899c472ad7/volumes/kubernetes.io~projected/kube-api-access-gfd7g DeviceMajor:0 DeviceMinor:991 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-115 DeviceMajor:0 DeviceMinor:115 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/df76bbc30883b6ae690248727c938b1b6c8135d395feea44ea56d089180ba91e/userdata/shm DeviceMajor:0 DeviceMinor:147 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-258 DeviceMajor:0 DeviceMinor:258 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ac3d3235-531e-4c7d-9fc9-e65c97970d0f/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:911 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-607 DeviceMajor:0 DeviceMinor:607 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/66f7b08c-52e8-4795-9cf0-74402a9cc0bb/volumes/kubernetes.io~projected/kube-api-access-b658f DeviceMajor:0 DeviceMinor:1430 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1623 DeviceMajor:0 DeviceMinor:1623 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-643 DeviceMajor:0 DeviceMinor:643 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f9a3f900-60e4-49c2-85ec-88d19852d1b9/volumes/kubernetes.io~secret/profile-collector-cert DeviceMajor:0 DeviceMinor:1002 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/0a49c320-f31d-4f6d-98c3-48d24346b873/volumes/kubernetes.io~projected/kube-api-access-s7dfd DeviceMajor:0 DeviceMinor:1121 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1353 DeviceMajor:0 DeviceMinor:1353 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a9a3f403-a742-4977-901a-cf4a8eb7df5a/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:577 Capacity:49335554048 Type:vfs Inodes:6166278 
HasInodes:true} {Device:overlay_0-669 DeviceMajor:0 DeviceMinor:669 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9c6d88ab184c0e7fbb3e1e5fe4f3291f9c0461b4a5747d2c7087ffe16e8ba75d/userdata/shm DeviceMajor:0 DeviceMinor:76 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1228 DeviceMajor:0 DeviceMinor:1228 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/66e225cef033a22130cb863563d5cf26aea983f2cbfcbc0547f78c06ef7d3704/userdata/shm DeviceMajor:0 DeviceMinor:1349 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9a12bcf4fc9a3a0e9cde90a88b7956f38f3ab4e2e145a1f0eb84cd99548fc153/userdata/shm DeviceMajor:0 DeviceMinor:1469 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/892d5611-debf-402f-abc5-3f99aa080159/volumes/kubernetes.io~projected/kube-api-access-bvmxp DeviceMajor:0 DeviceMinor:66 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-125 DeviceMajor:0 DeviceMinor:125 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-707 DeviceMajor:0 DeviceMinor:707 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5db32eafb49c5ea2131a9ee6fac6a97f01a676cfb60265016c5dae78b38833fb/userdata/shm DeviceMajor:0 DeviceMinor:741 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-764 DeviceMajor:0 DeviceMinor:764 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-218 DeviceMajor:0 DeviceMinor:218 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-508 DeviceMajor:0 DeviceMinor:508 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:288 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:562 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ceb2ba2506d4333662e8bd2c848d863fe6f93e5f0c5666caa6e24d3ad37cad8a/userdata/shm DeviceMajor:0 DeviceMinor:1344 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:1335 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3bea9a551a545f4989f29c1018afc27685829e9f353eed2bd20ae41b3e512f7f/userdata/shm DeviceMajor:0 DeviceMinor:1355 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/77e36f4e-845b-4b82-8abc-b634636c087a/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:1122 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1526 DeviceMajor:0 DeviceMinor:1526 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~secret/secret-metrics-server-tls DeviceMajor:0 DeviceMinor:1552 
Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1160 DeviceMajor:0 DeviceMinor:1160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/19084c46f9e406b8c5bea90a33fad5659fbc69d12b59150d714d618ccc77cc8a/userdata/shm DeviceMajor:0 DeviceMinor:319 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/1b47f2ef-9923-411f-9f2f-ddaea8bc7053/volumes/kubernetes.io~projected/kube-api-access-dx9sj DeviceMajor:0 DeviceMinor:501 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-785 DeviceMajor:0 DeviceMinor:785 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d6fafa97-812d-4588-95f8-7c4d85f53098/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1444 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-762 DeviceMajor:0 DeviceMinor:762 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/27e1c0f49ae3c796e84e3254dcdc228d57bf21685fbfb51038f19f9e52199432/userdata/shm DeviceMajor:0 DeviceMinor:1431 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ab4d3d9867608a716d1a07ef53b79868ef1de9f69ebee20baa3f154072a5867f/userdata/shm DeviceMajor:0 DeviceMinor:859 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-516 DeviceMajor:0 DeviceMinor:516 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/82055cfc-b4ce-4a00-a51d-141059947693/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:291 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c04e4f1ef981dcb988be8443321f87d7be92340901ebbfb1c1010286f70b76e0/userdata/shm DeviceMajor:0 DeviceMinor:355 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/1b47f2ef-9923-411f-9f2f-ddaea8bc7053/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:500 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-570 DeviceMajor:0 DeviceMinor:570 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2ae62ce3b48419372763c717979fe81f4210258cf21652aeea2900df29d7ef00/userdata/shm DeviceMajor:0 DeviceMinor:580 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5703a6e2ab5e982bbe6de5731742c9b48a97f1ee525f754cfdb8e3ab5ff893fc/userdata/shm DeviceMajor:0 DeviceMinor:994 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-342 DeviceMajor:0 DeviceMinor:342 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d0be52f3-b318-4630-b4da-f3c4a57d5818/volumes/kubernetes.io~secret/openshift-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:1464 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-183 DeviceMajor:0 DeviceMinor:183 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6976b503-87da-48fc-b097-d1b315fbee3f/volumes/kubernetes.io~projected/kube-api-access-vx6rt DeviceMajor:0 DeviceMinor:311 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-389 DeviceMajor:0 DeviceMinor:389 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-381 DeviceMajor:0 DeviceMinor:381 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1621 DeviceMajor:0 DeviceMinor:1621 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e408024baf734d8341c58053ea73c2e166f19fbabf219b13f76d3497ac95de95/userdata/shm DeviceMajor:0 DeviceMinor:123 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-117 DeviceMajor:0 DeviceMinor:117 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-709 DeviceMajor:0 DeviceMinor:709 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2e64dfdf0acb39b47217a06b11c64af7640d431c0b9e7c2bcb9ff0068e0337e8/userdata/shm DeviceMajor:0 DeviceMinor:1357 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/51af5c76c79fbb742ccd27b8ec8e1df376daaccc120618135f6c44b3a44f3b5f/userdata/shm DeviceMajor:0 DeviceMinor:1361 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1265 DeviceMajor:0 DeviceMinor:1265 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-215 DeviceMajor:0 DeviceMinor:215 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-157 DeviceMajor:0 DeviceMinor:157 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-518 DeviceMajor:0 DeviceMinor:518 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1597 DeviceMajor:0 DeviceMinor:1597 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/246b7846-0dfd-43a8-bcfa-81e7435060dc/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:547 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1010 DeviceMajor:0 DeviceMinor:1010 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1139 DeviceMajor:0 DeviceMinor:1139 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-221 DeviceMajor:0 DeviceMinor:221 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-268 DeviceMajor:0 DeviceMinor:268 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:664 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/858384f3-5741-4e67-8669-2eb2b2dcaf7f/volumes/kubernetes.io~projected/kube-api-access-qgrbd DeviceMajor:0 DeviceMinor:849 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/7c8ec36d-9179-40ab-a448-440b4501b3e0/volumes/kubernetes.io~projected/kube-api-access-t2xtm DeviceMajor:0 DeviceMinor:391 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7dded9f66e38346f58eab7668d020f179ca7449f5ac3538a2108b54cbabe523d/userdata/shm DeviceMajor:0 DeviceMinor:68 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-144 DeviceMajor:0 DeviceMinor:144 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-149 DeviceMajor:0 DeviceMinor:149 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/39f0e973-7864-4842-af8e-47718ab1804c/volumes/kubernetes.io~projected/kube-api-access-4fsxc DeviceMajor:0 DeviceMinor:317 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-806 DeviceMajor:0 DeviceMinor:806 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-893 DeviceMajor:0 DeviceMinor:893 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-351 DeviceMajor:0 DeviceMinor:351 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bff07e162c1a61e585cb91d0f10b7d6f531c3b6d280120c518b63384810a9b36/userdata/shm DeviceMajor:0 DeviceMinor:353 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-964 DeviceMajor:0 DeviceMinor:964 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bcbec7ef-0b98-4346-8c6b-c5fa37e90286/volumes/kubernetes.io~projected/kube-api-access-dwdzk DeviceMajor:0 DeviceMinor:1468 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-412 DeviceMajor:0 DeviceMinor:412 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1288 DeviceMajor:0 DeviceMinor:1288 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-716 DeviceMajor:0 DeviceMinor:716 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-380 DeviceMajor:0 DeviceMinor:380 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/812401c0-d1ac-4857-b939-217b7b07f8bc/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:728 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1775a3bb94266b5fadb3db6f198c4d7c599f8a235c13e9243ec7e800b61882bd/userdata/shm DeviceMajor:0 DeviceMinor:743 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d8be7a677acefc6a4ee45682e61c271ff822b15dc97b6ca66fdfa8ca84749787/userdata/shm DeviceMajor:0 DeviceMinor:828 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-70 DeviceMajor:0 DeviceMinor:70 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:187 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/5f088999-ec66-402e-9634-8c762206d6b4/volumes/kubernetes.io~projected/kube-api-access-94vvg DeviceMajor:0 DeviceMinor:316 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:564 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-645 DeviceMajor:0 DeviceMinor:645 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bcbec7ef-0b98-4346-8c6b-c5fa37e90286/volumes/kubernetes.io~secret/node-exporter-tls DeviceMajor:0 DeviceMinor:1462 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/a4399d20-f9a6-4ab1-86be-e2845394eaba/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:733 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f9a3f900-60e4-49c2-85ec-88d19852d1b9/volumes/kubernetes.io~projected/kube-api-access-jvzqm DeviceMajor:0 DeviceMinor:1006 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1052 DeviceMajor:0 DeviceMinor:1052 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53713eab-c920-4d5a-ae05-7cdb59ace852/volumes/kubernetes.io~projected/kube-api-access-nvkz7 DeviceMajor:0 DeviceMinor:172 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-977 DeviceMajor:0 DeviceMinor:977 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2d592f19-c7b9-4b29-9ca2-848572067908/volumes/kubernetes.io~projected/kube-api-access-8hfrr DeviceMajor:0 DeviceMinor:1467 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/01271153cce146a7d369c7e83af60def6d65f55d525d13eb3b7d571a58f0b254/userdata/shm DeviceMajor:0 DeviceMinor:1619 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-601 DeviceMajor:0 DeviceMinor:601 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b522af85-394e-4965-9bf4-83f48fb8ad94/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:632 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-925 DeviceMajor:0 DeviceMinor:925 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:928 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1387 DeviceMajor:0 DeviceMinor:1387 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/66f7b08c-52e8-4795-9cf0-74402a9cc0bb/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:1428 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-457 DeviceMajor:0 DeviceMinor:457 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b7f68d19-71d4-4129-a575-3ee57fa53493/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:524 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-804 DeviceMajor:0 DeviceMinor:804 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5c77db9cd763bf167454cf085e8ce314f6fbd68437892b13394158027937a983/userdata/shm DeviceMajor:0 DeviceMinor:1125 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/698e6d87-1a58-493c-8b69-d22c89d26ac5/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:1336 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1199 DeviceMajor:0 DeviceMinor:1199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c8da5d44-680e-4169-abc6-607bdc37a64d/volumes/kubernetes.io~projected/kube-api-access-pm2l8 DeviceMajor:0 DeviceMinor:312 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-971 DeviceMajor:0 DeviceMinor:971 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1008 DeviceMajor:0 DeviceMinor:1008 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/01563ec1e9dc14e44ec2552850bf0825026fe0a03259d8a4f99d746be494b365/userdata/shm DeviceMajor:0 DeviceMinor:1554 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/788f405f215c107f0aaae844e0357bd2af8b176524b9b27b9d45876c6c07c516/userdata/shm DeviceMajor:0 DeviceMinor:473 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1389 DeviceMajor:0 DeviceMinor:1389 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-779 DeviceMajor:0 DeviceMinor:779 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1194 DeviceMajor:0 DeviceMinor:1194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0e9427b8-d62c-45f7-97d0-1f7667ff27aa/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:712 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/62b43fe1-63f5-4d29-90a2-f36cb9e880ff/volumes/kubernetes.io~projected/kube-api-access-wgwxt DeviceMajor:0 DeviceMinor:958 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/fa9b5917-d4f3-4372-a200-45b57412f92f/volumes/kubernetes.io~projected/kube-api-access-pj79k DeviceMajor:0 DeviceMinor:910 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-216 DeviceMajor:0 DeviceMinor:216 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-169 DeviceMajor:0 DeviceMinor:169 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-367 DeviceMajor:0 DeviceMinor:367 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-633 DeviceMajor:0 DeviceMinor:633 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1224 DeviceMajor:0 DeviceMinor:1224 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-266 DeviceMajor:0 DeviceMinor:266 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/858384f3-5741-4e67-8669-2eb2b2dcaf7f/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:848 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/d0be52f3-b318-4630-b4da-f3c4a57d5818/volumes/kubernetes.io~projected/kube-api-access-qnfgr DeviceMajor:0 DeviceMinor:1466 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-129 DeviceMajor:0 DeviceMinor:129 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-586 DeviceMajor:0 DeviceMinor:586 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1610 DeviceMajor:0 DeviceMinor:1610 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-242 DeviceMajor:0 DeviceMinor:242 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c1ee4db7-f2d3-4064-a189-f66fd0a021eb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:290 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b8716994da0d4c9873141d2aa21edd03e4ae031b26b1f8173e339c8acb9d84d8/userdata/shm DeviceMajor:0 
DeviceMinor:829 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1132 DeviceMajor:0 DeviceMinor:1132 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1a0f647a-0260-4737-8ae2-cc90d01d33d1/volumes/kubernetes.io~projected/kube-api-access-lsr8k DeviceMajor:0 DeviceMinor:188 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-863 DeviceMajor:0 DeviceMinor:863 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1053 DeviceMajor:0 DeviceMinor:1053 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1558 DeviceMajor:0 DeviceMinor:1558 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4ffb33ed0c17e44b8ef16a80bc6cabdd86e0fa10c888372300239522e481a90e/userdata/shm DeviceMajor:0 DeviceMinor:1014 Capacity:67108864 Type:vfs Inodes:6166278 HasInodes:true} {Device:/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes/kubernetes.io~secret/client-ca-bundle DeviceMajor:0 DeviceMinor:1550 Capacity:49335554048 Type:vfs Inodes:6166278 HasInodes:true} {Device:overlay_0-1556 DeviceMajor:0 DeviceMinor:1556 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1257 DeviceMajor:0 DeviceMinor:1257 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:01271153cce146a MacAddress:a6:4f:18:82:6a:36 Speed:10000 Mtu:8900} {Name:01563ec1e9dc14e MacAddress:32:52:a8:92:39:27 Speed:10000 Mtu:8900} {Name:09da87e2dab0559 MacAddress:66:7a:53:64:75:fb Speed:10000 Mtu:8900} {Name:0b613104230c7fa MacAddress:0e:aa:70:50:21:17 Speed:10000 Mtu:8900} {Name:1775a3bb94266b5 MacAddress:56:39:df:09:a6:78 Speed:10000 Mtu:8900} {Name:19084c46f9e406b MacAddress:c2:b1:72:86:e1:0d Speed:10000 Mtu:8900} {Name:23abcfb0261fbba MacAddress:82:65:09:dc:d8:6a Speed:10000 Mtu:8900} {Name:27cf3d430196860 MacAddress:86:1f:da:36:22:1c Speed:10000 Mtu:8900} {Name:2ae62ce3b484193 MacAddress:9a:ca:13:a3:fb:bd Speed:10000 Mtu:8900} {Name:2ec578e3da37e24 MacAddress:52:f4:84:84:77:ae Speed:10000 Mtu:8900} {Name:302f177926043f4 MacAddress:92:b1:2e:f5:70:6c Speed:10000 Mtu:8900} {Name:33442f434ca7c4c MacAddress:ce:9e:93:0e:aa:26 Speed:10000 Mtu:8900} {Name:39570e307c085c4 MacAddress:be:c9:d7:48:56:ae Speed:10000 Mtu:8900} {Name:3bea9a551a545f4 MacAddress:26:3b:10:65:be:39 Speed:10000 Mtu:8900} {Name:3cbd73f86ab3e16 MacAddress:9e:77:74:56:21:18 Speed:10000 Mtu:8900} {Name:3d9073cce7422ee MacAddress:3e:02:be:11:0c:e4 Speed:10000 Mtu:8900} {Name:42c8513568caf77 MacAddress:26:a5:62:2f:32:03 Speed:10000 Mtu:8900} {Name:4b5dd9433f686e3 MacAddress:ea:c0:80:63:ac:f8 Speed:10000 Mtu:8900} {Name:4ffb33ed0c17e44 MacAddress:be:fd:af:02:03:03 Speed:10000 Mtu:8900} {Name:51af5c76c79fbb7 MacAddress:9e:ed:39:e5:da:09 Speed:10000 Mtu:8900} {Name:54de27477181cce MacAddress:a2:aa:70:84:95:39 Speed:10000 Mtu:8900} {Name:5703a6e2ab5e982 MacAddress:da:8c:e9:d3:86:9b Speed:10000 Mtu:8900} {Name:5a958a4641ec8ff MacAddress:4e:7f:a3:e3:27:0b Speed:10000 Mtu:8900} {Name:5b457e4b5344867 MacAddress:6a:89:40:f5:e9:1d Speed:10000 Mtu:8900} 
{Name:5c77db9cd763bf1 MacAddress:0a:76:69:35:20:80 Speed:10000 Mtu:8900} {Name:5d92800ba121e99 MacAddress:0e:8b:ef:d8:55:3b Speed:10000 Mtu:8900} {Name:5db32eafb49c5ea MacAddress:62:0d:ec:f5:88:1c Speed:10000 Mtu:8900} {Name:670cc7960481086 MacAddress:52:47:76:9d:4b:ba Speed:10000 Mtu:8900} {Name:6cee457a91036b4 MacAddress:a2:f0:ba:9f:92:d4 Speed:10000 Mtu:8900} {Name:788f405f215c107 MacAddress:c2:ad:dd:41:5f:cc Speed:10000 Mtu:8900} {Name:7f5e855d5ea32f7 MacAddress:3e:dd:dd:dc:ed:26 Speed:10000 Mtu:8900} {Name:82b2e653c93ba4a MacAddress:4a:90:f1:fa:a5:8c Speed:10000 Mtu:8900} {Name:86b3974531c12f6 MacAddress:86:36:32:77:5d:5c Speed:10000 Mtu:8900} {Name:8d88d9135730dcd MacAddress:0e:ef:cc:f3:02:26 Speed:10000 Mtu:8900} {Name:941d02fc970e3b1 MacAddress:32:78:a4:0f:dd:51 Speed:10000 Mtu:8900} {Name:9a12bcf4fc9a3a0 MacAddress:22:fb:26:ae:60:3f Speed:10000 Mtu:8900} {Name:9c6d88ab184c0e7 MacAddress:2a:0d:95:12:b8:a3 Speed:10000 Mtu:8900} {Name:a5901bdc085b43c MacAddress:f6:d9:45:1b:0e:d8 Speed:10000 Mtu:8900} {Name:a7ee3d19bf23284 MacAddress:9a:d7:88:09:f6:17 Speed:10000 Mtu:8900} {Name:ab4d3d9867608a7 MacAddress:92:cb:45:5f:25:5c Speed:10000 Mtu:8900} {Name:b0c53893b7d584a MacAddress:36:a6:71:73:5f:56 Speed:10000 Mtu:8900} {Name:b8716994da0d4c9 MacAddress:72:de:a5:28:85:9a Speed:10000 Mtu:8900} {Name:bc976a2c91b624e MacAddress:3e:e4:16:3d:55:a4 Speed:10000 Mtu:8900} {Name:bff07e162c1a61e MacAddress:d2:18:de:d8:44:9f Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:ae:f3:08:a3:10:cf Speed:0 Mtu:8900} {Name:c04e4f1ef981dcb MacAddress:3a:d4:3a:7c:9f:67 Speed:10000 Mtu:8900} {Name:c899fa45611ca8a MacAddress:1a:f0:44:c2:72:36 Speed:10000 Mtu:8900} {Name:ceb2ba2506d4333 MacAddress:32:4f:c5:5f:60:06 Speed:10000 Mtu:8900} {Name:d4dcd3a3c921a75 MacAddress:9e:5b:ac:fc:3e:70 Speed:10000 Mtu:8900} {Name:d8495441bff4f99 MacAddress:e6:ac:fa:ec:4f:b4 Speed:10000 Mtu:8900} {Name:d8be7a677acefc6 MacAddress:e2:ba:47:25:d5:4c Speed:10000 Mtu:8900} {Name:dd3ce2cf375c8b2 MacAddress:a6:d7:dd:d1:14:6f Speed:10000 Mtu:8900} {Name:e10b9a0a37f1610 MacAddress:52:d3:87:9a:19:7e Speed:10000 Mtu:8900} {Name:e3f5f9ddf501f4d MacAddress:36:0a:3f:fe:ec:1c Speed:10000 Mtu:8900} {Name:e81517a49f1ce4b MacAddress:d2:7d:75:10:1b:c2 Speed:10000 Mtu:8900} {Name:ead64d1f04c9817 MacAddress:ce:96:d4:f5:00:44 Speed:10000 Mtu:8900} {Name:ee51a7c7f756f46 MacAddress:62:ab:31:5b:80:46 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:95:f0:47 Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:56:de:4c:1d:30:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514153472 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 
Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 22:10:35.009436 master-0 kubenswrapper[36504]: I1203 22:10:35.008261 36504 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
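The record ending above is cAdvisor's MachineInfo dump emitted once at kubelet startup: one filesystem entry per container overlay mount, per-pod secret/projected/empty-dir volume mount, and per-container shm mount, followed by the disk map, the network devices, and the CPU topology. As a rough way to make that filesystem list readable, the following is a minimal Go sketch, not part of the kubelet or cAdvisor tooling; it assumes the record has been saved to a plain-text file named machineinfo.log (a hypothetical name) and tallies the entries by device class using only the standard library.

package main

import (
	"fmt"
	"os"
	"regexp"
	"strings"
)

func main() {
	// Hypothetical capture of the MachineInfo record above; the file name is an
	// assumption made for this sketch.
	data, err := os.ReadFile("machineinfo.log")
	if err != nil {
		panic(err)
	}
	// Each filesystem entry in the record has the shape
	//   {Device:<name> DeviceMajor:<n> DeviceMinor:<n> Capacity:<bytes> Type:vfs Inodes:<n> HasInodes:true}
	re := regexp.MustCompile(`\{Device:(\S+) DeviceMajor:`)
	counts := map[string]int{}
	for _, m := range re.FindAllStringSubmatch(string(data), -1) {
		device := m[1]
		switch {
		case strings.HasPrefix(device, "overlay_"):
			counts["container overlay mounts"]++
		case strings.Contains(device, "/userdata/shm"):
			counts["per-container shm mounts"]++
		case strings.Contains(device, "/volumes/kubernetes.io~"):
			counts["kubelet pod volume mounts"]++
		default:
			counts["other devices"]++
		}
	}
	for class, n := range counts {
		fmt.Printf("%-28s %d\n", class, n)
	}
}

Run against this excerpt, such a tally separates the 214143315968-byte overlay mounts backed by /dev/vda4 from the 67108864-byte per-container shm mounts and the per-pod volume mounts that share the 49335554048-byte kubelet filesystem, which is the quickest way to see why the device list is as long as it is.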
Dec 03 22:10:35.009436 master-0 kubenswrapper[36504]: I1203 22:10:35.008438 36504 manager.go:233] Version: {KernelVersion:5.14.0-427.97.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202511041748-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 03 22:10:35.009436 master-0 kubenswrapper[36504]: I1203 22:10:35.008885 36504 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 03 22:10:35.009436 master-0 kubenswrapper[36504]: I1203 22:10:35.009216 36504 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.009267 36504 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.009634 36504 topology_manager.go:138] "Creating topology manager with none policy" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.009655 36504 container_manager_linux.go:303] "Creating device plugin manager" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.009671 36504 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.009713 36504 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.009814 36504 state_mem.go:36] "Initialized new in-memory state store" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.009966 36504 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.010117 36504 kubelet.go:418] "Attempting to sync node 
with API server" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.010149 36504 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.010180 36504 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.010210 36504 kubelet.go:324] "Adding apiserver pod source" Dec 03 22:10:35.010391 master-0 kubenswrapper[36504]: I1203 22:10:35.010248 36504 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 03 22:10:35.013463 master-0 kubenswrapper[36504]: I1203 22:10:35.013351 36504 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-2.rhaos4.18.git15789b8.el9" apiVersion="v1" Dec 03 22:10:35.013969 master-0 kubenswrapper[36504]: I1203 22:10:35.013913 36504 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 03 22:10:35.014951 master-0 kubenswrapper[36504]: I1203 22:10:35.014893 36504 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 03 22:10:35.015337 master-0 kubenswrapper[36504]: I1203 22:10:35.015286 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 22:10:35.015469 master-0 kubenswrapper[36504]: I1203 22:10:35.015368 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 22:10:35.015469 master-0 kubenswrapper[36504]: I1203 22:10:35.015386 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 22:10:35.015469 master-0 kubenswrapper[36504]: I1203 22:10:35.015437 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 22:10:35.015469 master-0 kubenswrapper[36504]: I1203 22:10:35.015455 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 03 22:10:35.015469 master-0 kubenswrapper[36504]: I1203 22:10:35.015469 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 03 22:10:35.016091 master-0 kubenswrapper[36504]: I1203 22:10:35.015483 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 22:10:35.016091 master-0 kubenswrapper[36504]: I1203 22:10:35.015538 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 22:10:35.016091 master-0 kubenswrapper[36504]: I1203 22:10:35.015565 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 22:10:35.016091 master-0 kubenswrapper[36504]: I1203 22:10:35.015584 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 22:10:35.016091 master-0 kubenswrapper[36504]: I1203 22:10:35.015662 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 22:10:35.016091 master-0 kubenswrapper[36504]: I1203 22:10:35.015737 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 22:10:35.016091 master-0 kubenswrapper[36504]: I1203 22:10:35.015865 36504 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 03 22:10:35.017420 master-0 kubenswrapper[36504]: I1203 22:10:35.017362 36504 server.go:1280] "Started kubelet" Dec 03 22:10:35.019019 master-0 kubenswrapper[36504]: I1203 22:10:35.018902 36504 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 22:10:35.019035 master-0 systemd[1]: Started 
Kubernetes Kubelet. Dec 03 22:10:35.022462 master-0 kubenswrapper[36504]: I1203 22:10:35.022401 36504 server.go:449] "Adding debug handlers to kubelet server" Dec 03 22:10:35.023879 master-0 kubenswrapper[36504]: I1203 22:10:35.023737 36504 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 22:10:35.023992 master-0 kubenswrapper[36504]: I1203 22:10:35.023901 36504 server_v1.go:47] "podresources" method="list" useActivePods=true Dec 03 22:10:35.026851 master-0 kubenswrapper[36504]: I1203 22:10:35.026747 36504 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 22:10:35.035990 master-0 kubenswrapper[36504]: I1203 22:10:35.035913 36504 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 22:10:35.038844 master-0 kubenswrapper[36504]: I1203 22:10:35.038790 36504 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 22:10:35.048145 master-0 kubenswrapper[36504]: E1203 22:10:35.048057 36504 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Dec 03 22:10:35.058548 master-0 kubenswrapper[36504]: I1203 22:10:35.053142 36504 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 22:10:35.058548 master-0 kubenswrapper[36504]: I1203 22:10:35.058514 36504 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 22:10:35.059345 master-0 kubenswrapper[36504]: I1203 22:10:35.059282 36504 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2025-12-04 21:39:22 +0000 UTC, rotation deadline is 2025-12-04 16:16:31.793780265 +0000 UTC Dec 03 22:10:35.059345 master-0 kubenswrapper[36504]: I1203 22:10:35.059341 36504 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h5m56.73444564s for next certificate rotation Dec 03 22:10:35.062147 master-0 kubenswrapper[36504]: I1203 22:10:35.062090 36504 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 22:10:35.062147 master-0 kubenswrapper[36504]: I1203 22:10:35.062144 36504 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 22:10:35.062489 master-0 kubenswrapper[36504]: I1203 22:10:35.062446 36504 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Dec 03 22:10:35.064606 master-0 kubenswrapper[36504]: I1203 22:10:35.064539 36504 factory.go:153] Registering CRI-O factory Dec 03 22:10:35.064606 master-0 kubenswrapper[36504]: I1203 22:10:35.064603 36504 factory.go:221] Registration of the crio container factory successfully Dec 03 22:10:35.064851 master-0 kubenswrapper[36504]: I1203 22:10:35.064764 36504 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 22:10:35.064851 master-0 kubenswrapper[36504]: I1203 22:10:35.064847 36504 factory.go:55] Registering systemd factory Dec 03 22:10:35.064958 master-0 kubenswrapper[36504]: I1203 22:10:35.064867 36504 factory.go:221] Registration of the systemd container factory successfully Dec 03 22:10:35.064958 master-0 kubenswrapper[36504]: I1203 22:10:35.064915 36504 factory.go:103] Registering Raw factory Dec 03 22:10:35.064958 
master-0 kubenswrapper[36504]: I1203 22:10:35.064948 36504 manager.go:1196] Started watching for new ooms in manager Dec 03 22:10:35.066230 master-0 kubenswrapper[36504]: I1203 22:10:35.066161 36504 manager.go:319] Starting recovery of all containers Dec 03 22:10:35.066345 master-0 kubenswrapper[36504]: I1203 22:10:35.066293 36504 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 22:10:35.075955 master-0 kubenswrapper[36504]: I1203 22:10:35.075845 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f088999-ec66-402e-9634-8c762206d6b4" volumeName="kubernetes.io/secret/5f088999-ec66-402e-9634-8c762206d6b4-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.075955 master-0 kubenswrapper[36504]: I1203 22:10:35.075950 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba624ed0-32cc-4c87-81a5-708a8a8a7f88" volumeName="kubernetes.io/secret/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 22:10:35.076141 master-0 kubenswrapper[36504]: I1203 22:10:35.075972 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="40f8e70d-5f98-47f1-afa8-ea67242981fc" volumeName="kubernetes.io/projected/40f8e70d-5f98-47f1-afa8-ea67242981fc-kube-api-access-npkww" seLinuxMountContext="" Dec 03 22:10:35.076141 master-0 kubenswrapper[36504]: I1203 22:10:35.075988 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54767c36-ca29-4c91-9a8a-9699ecfa4afb" volumeName="kubernetes.io/secret/54767c36-ca29-4c91-9a8a-9699ecfa4afb-metrics-tls" seLinuxMountContext="" Dec 03 22:10:35.076141 master-0 kubenswrapper[36504]: I1203 22:10:35.076006 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-service-ca" seLinuxMountContext="" Dec 03 22:10:35.076141 master-0 kubenswrapper[36504]: I1203 22:10:35.076040 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="246b7846-0dfd-43a8-bcfa-81e7435060dc" volumeName="kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-policies" seLinuxMountContext="" Dec 03 22:10:35.076141 master-0 kubenswrapper[36504]: I1203 22:10:35.076060 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d6fafa97-812d-4588-95f8-7c4d85f53098" volumeName="kubernetes.io/projected/d6fafa97-812d-4588-95f8-7c4d85f53098-kube-api-access-gcb88" seLinuxMountContext="" Dec 03 22:10:35.076141 master-0 kubenswrapper[36504]: I1203 22:10:35.076079 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d592f19-c7b9-4b29-9ca2-848572067908" volumeName="kubernetes.io/projected/2d592f19-c7b9-4b29-9ca2-848572067908-kube-api-access-8hfrr" seLinuxMountContext="" Dec 03 22:10:35.076141 master-0 kubenswrapper[36504]: I1203 22:10:35.076095 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f0e973-7864-4842-af8e-47718ab1804c" volumeName="kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-bound-sa-token" seLinuxMountContext="" Dec 03 22:10:35.076141 master-0 kubenswrapper[36504]: I1203 22:10:35.076138 36504 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="40f8e70d-5f98-47f1-afa8-ea67242981fc" volumeName="kubernetes.io/empty-dir/40f8e70d-5f98-47f1-afa8-ea67242981fc-audit-log" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076156 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a4399d20-f9a6-4ab1-86be-e2845394eaba" volumeName="kubernetes.io/configmap/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076204 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" volumeName="kubernetes.io/secret/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076218 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ffad8fc8-4378-44de-8864-dd2f666ade68" volumeName="kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-binary-copy" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076232 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d6fafa97-812d-4588-95f8-7c4d85f53098" volumeName="kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076244 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e6d5d61a-c5de-4619-9afb-7fad63ba0525" volumeName="kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076255 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66aa2598-f4b6-4d3a-9623-aeb707e4912b" volumeName="kubernetes.io/projected/66aa2598-f4b6-4d3a-9623-aeb707e4912b-kube-api-access-mw7l6" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076266 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e96335e-1866-41c8-b128-b95e783a9be4" volumeName="kubernetes.io/projected/6e96335e-1866-41c8-b128-b95e783a9be4-kube-api-access-v8rjd" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076281 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b522af85-394e-4965-9bf4-83f48fb8ad94" volumeName="kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-serving-ca" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076293 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="def52ba3-77c1-4e0c-8a0d-44ff4d677607" volumeName="kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-images" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076305 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f088999-ec66-402e-9634-8c762206d6b4" 
volumeName="kubernetes.io/projected/5f088999-ec66-402e-9634-8c762206d6b4-kube-api-access-94vvg" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076316 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8194009-3743-4da7-baf1-f9bb0afd6187" volumeName="kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-cni-binary-copy" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076333 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c807d487-5b8f-4747-87ee-df0637e2e11f" volumeName="kubernetes.io/projected/c807d487-5b8f-4747-87ee-df0637e2e11f-kube-api-access-nlw7s" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076350 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d6fafa97-812d-4588-95f8-7c4d85f53098" volumeName="kubernetes.io/configmap/d6fafa97-812d-4588-95f8-7c4d85f53098-metrics-client-ca" seLinuxMountContext="" Dec 03 22:10:35.076461 master-0 kubenswrapper[36504]: I1203 22:10:35.076378 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e403ab42-1840-4292-a37c-a8d4feeb54ca" volumeName="kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-catalog-content" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076459 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f59094ec-47dd-4547-ad41-b15a7933f461" volumeName="kubernetes.io/secret/f59094ec-47dd-4547-ad41-b15a7933f461-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076513 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="28c42112-a09e-4b7a-b23b-c06bef69cbfb" volumeName="kubernetes.io/projected/28c42112-a09e-4b7a-b23b-c06bef69cbfb-kube-api-access-89p9d" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076531 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="40f8e70d-5f98-47f1-afa8-ea67242981fc" volumeName="kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076545 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="62b43fe1-63f5-4d29-90a2-f36cb9e880ff" volumeName="kubernetes.io/secret/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-samples-operator-tls" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076587 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="812401c0-d1ac-4857-b939-217b7b07f8bc" volumeName="kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076601 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="add88bf0-c88d-427d-94bb-897e088a1378" volumeName="kubernetes.io/projected/add88bf0-c88d-427d-94bb-897e088a1378-kube-api-access-hkhtr" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076614 36504 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e403ab42-1840-4292-a37c-a8d4feeb54ca" volumeName="kubernetes.io/projected/e403ab42-1840-4292-a37c-a8d4feeb54ca-kube-api-access-tkg95" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076627 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa9b5917-d4f3-4372-a200-45b57412f92f" volumeName="kubernetes.io/projected/fa9b5917-d4f3-4372-a200-45b57412f92f-kube-api-access-pj79k" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076639 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="246b7846-0dfd-43a8-bcfa-81e7435060dc" volumeName="kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-serving-ca" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076652 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d592f19-c7b9-4b29-9ca2-848572067908" volumeName="kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076665 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66f7b08c-52e8-4795-9cf0-74402a9cc0bb" volumeName="kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-certs" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076679 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a4399d20-f9a6-4ab1-86be-e2845394eaba" volumeName="kubernetes.io/projected/a4399d20-f9a6-4ab1-86be-e2845394eaba-kube-api-access-2krlg" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076691 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9940ff5-36a6-4c04-a51d-66f7d83bea7c" volumeName="kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-auth-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076722 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8da5d44-680e-4169-abc6-607bdc37a64d" volumeName="kubernetes.io/empty-dir/c8da5d44-680e-4169-abc6-607bdc37a64d-operand-assets" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076735 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" volumeName="kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-service-ca-bundle" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076766 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9940ff5-36a6-4c04-a51d-66f7d83bea7c" volumeName="kubernetes.io/secret/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-proxy-tls" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076883 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d592f19-c7b9-4b29-9ca2-848572067908" 
volumeName="kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076895 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="787c50e1-35b5-43d7-9c26-8dd5399693d3" volumeName="kubernetes.io/projected/787c50e1-35b5-43d7-9c26-8dd5399693d3-kube-api-access-jzl8x" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076908 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-config" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076921 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a0f647a-0260-4737-8ae2-cc90d01d33d1" volumeName="kubernetes.io/secret/1a0f647a-0260-4737-8ae2-cc90d01d33d1-webhook-cert" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076943 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54767c36-ca29-4c91-9a8a-9699ecfa4afb" volumeName="kubernetes.io/projected/54767c36-ca29-4c91-9a8a-9699ecfa4afb-kube-api-access-bl7tn" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076970 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7f68d19-71d4-4129-a575-3ee57fa53493" volumeName="kubernetes.io/projected/b7f68d19-71d4-4129-a575-3ee57fa53493-kube-api-access-t4c4r" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.076994 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-config" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077011 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23" volumeName="kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-tmp" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077026 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b522af85-394e-4965-9bf4-83f48fb8ad94" volumeName="kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-audit" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077044 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8da5d44-680e-4169-abc6-607bdc37a64d" volumeName="kubernetes.io/secret/c8da5d44-680e-4169-abc6-607bdc37a64d-cluster-olm-operator-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077059 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f6498ac1-7d07-4a5f-a968-d8bda72d1002" volumeName="kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077072 
36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a124c14f-20c6-4df3-956f-a858de0c73c9" volumeName="kubernetes.io/secret/a124c14f-20c6-4df3-956f-a858de0c73c9-proxy-tls" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077091 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdfbaebe-d655-4c1e-a039-08802c5c35c5" volumeName="kubernetes.io/projected/fdfbaebe-d655-4c1e-a039-08802c5c35c5-kube-api-access" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077104 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ffad8fc8-4378-44de-8864-dd2f666ade68" volumeName="kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077121 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0a49c320-f31d-4f6d-98c3-48d24346b873" volumeName="kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-utilities" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077136 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a7e0eea-3da8-43de-87bc-d10231e7c239" volumeName="kubernetes.io/projected/3a7e0eea-3da8-43de-87bc-d10231e7c239-kube-api-access-6kvxd" seLinuxMountContext="" Dec 03 22:10:35.077103 master-0 kubenswrapper[36504]: I1203 22:10:35.077150 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-env-overrides" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077165 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64856d96-023f-46db-819c-02f1adea5aab" volumeName="kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077179 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="698e6d87-1a58-493c-8b69-d22c89d26ac5" volumeName="kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-metrics-certs" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077192 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f59094ec-47dd-4547-ad41-b15a7933f461" volumeName="kubernetes.io/projected/f59094ec-47dd-4547-ad41-b15a7933f461-kube-api-access-mzklx" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077205 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9940ff5-36a6-4c04-a51d-66f7d83bea7c" volumeName="kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-images" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077218 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7f68d19-71d4-4129-a575-3ee57fa53493" volumeName="kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert" 
seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077232 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10fc6516-cd4d-4291-a26d-8376ba0affef" volumeName="kubernetes.io/projected/10fc6516-cd4d-4291-a26d-8376ba0affef-kube-api-access-h9pcw" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077244 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a0f647a-0260-4737-8ae2-cc90d01d33d1" volumeName="kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-env-overrides" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077256 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29ac4a9d-1228-49c7-9051-338e7dc98a38" volumeName="kubernetes.io/projected/29ac4a9d-1228-49c7-9051-338e7dc98a38-kube-api-access-p4mbz" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077271 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="77e36f4e-845b-4b82-8abc-b634636c087a" volumeName="kubernetes.io/empty-dir/77e36f4e-845b-4b82-8abc-b634636c087a-tmpfs" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077287 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077305 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077372 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="da949cf7-ab12-43ff-8e45-da1c2fd46e20" volumeName="kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-kube-api-access-dtxdk" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077395 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="def52ba3-77c1-4e0c-8a0d-44ff4d677607" volumeName="kubernetes.io/secret/def52ba3-77c1-4e0c-8a0d-44ff4d677607-cloud-controller-manager-operator-tls" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077409 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01b80ad5-7d7c-4ecd-90b0-2913d4559b5f" volumeName="kubernetes.io/projected/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-kube-api-access-v28xw" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077425 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0e9427b8-d62c-45f7-97d0-1f7667ff27aa" volumeName="kubernetes.io/configmap/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-service-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077440 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1b47f2ef-9923-411f-9f2f-ddaea8bc7053" volumeName="kubernetes.io/configmap/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-cabundle" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077455 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2b014bee-5931-4856-b9e8-e38a134a1b6b" volumeName="kubernetes.io/projected/2b014bee-5931-4856-b9e8-e38a134a1b6b-kube-api-access-ntg2z" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077469 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="698e6d87-1a58-493c-8b69-d22c89d26ac5" volumeName="kubernetes.io/projected/698e6d87-1a58-493c-8b69-d22c89d26ac5-kube-api-access-97kqz" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077482 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c8ec36d-9179-40ab-a448-440b4501b3e0" volumeName="kubernetes.io/projected/7c8ec36d-9179-40ab-a448-440b4501b3e0-kube-api-access-t2xtm" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077495 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0869de9b-6f5b-4c31-81ad-02a9c8888193" volumeName="kubernetes.io/configmap/0869de9b-6f5b-4c31-81ad-02a9c8888193-trusted-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077508 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="62b43fe1-63f5-4d29-90a2-f36cb9e880ff" volumeName="kubernetes.io/projected/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-kube-api-access-wgwxt" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077522 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b522af85-394e-4965-9bf4-83f48fb8ad94" volumeName="kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-trusted-ca-bundle" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077536 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bcbec7ef-0b98-4346-8c6b-c5fa37e90286" volumeName="kubernetes.io/configmap/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-metrics-client-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077549 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bcbec7ef-0b98-4346-8c6b-c5fa37e90286" volumeName="kubernetes.io/empty-dir/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-textfile" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077568 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d0be52f3-b318-4630-b4da-f3c4a57d5818" volumeName="kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-tls" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077584 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d0be52f3-b318-4630-b4da-f3c4a57d5818" volumeName="kubernetes.io/configmap/d0be52f3-b318-4630-b4da-f3c4a57d5818-metrics-client-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 
master-0 kubenswrapper[36504]: I1203 22:10:35.077596 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9a3f403-a742-4977-901a-cf4a8eb7df5a" volumeName="kubernetes.io/projected/a9a3f403-a742-4977-901a-cf4a8eb7df5a-kube-api-access-575vn" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077612 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b522af85-394e-4965-9bf4-83f48fb8ad94" volumeName="kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-image-import-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077626 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0e9427b8-d62c-45f7-97d0-1f7667ff27aa" volumeName="kubernetes.io/secret/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077640 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10fc6516-cd4d-4291-a26d-8376ba0affef" volumeName="kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-utilities" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077653 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="698e6d87-1a58-493c-8b69-d22c89d26ac5" volumeName="kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-stats-auth" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077666 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa9b5917-d4f3-4372-a200-45b57412f92f" volumeName="kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cert" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077681 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a0f647a-0260-4737-8ae2-cc90d01d33d1" volumeName="kubernetes.io/projected/1a0f647a-0260-4737-8ae2-cc90d01d33d1-kube-api-access-lsr8k" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077696 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b522af85-394e-4965-9bf4-83f48fb8ad94" volumeName="kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-encryption-config" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077710 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e403ab42-1840-4292-a37c-a8d4feeb54ca" volumeName="kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-utilities" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077722 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f0e973-7864-4842-af8e-47718ab1804c" volumeName="kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077735 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ffad8fc8-4378-44de-8864-dd2f666ade68" 
volumeName="kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-whereabouts-configmap" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077748 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0869de9b-6f5b-4c31-81ad-02a9c8888193" volumeName="kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077761 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50076985-bbaa-4bcf-9d1a-cc25bed016a7" volumeName="kubernetes.io/projected/50076985-bbaa-4bcf-9d1a-cc25bed016a7-kube-api-access-jvw7p" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077793 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/projected/53713eab-c920-4d5a-ae05-7cdb59ace852-kube-api-access-nvkz7" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077810 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6976b503-87da-48fc-b097-d1b315fbee3f" volumeName="kubernetes.io/configmap/6976b503-87da-48fc-b097-d1b315fbee3f-config" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077821 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="922419d4-b528-472e-8215-4a55a96dab08" volumeName="kubernetes.io/secret/922419d4-b528-472e-8215-4a55a96dab08-tls-certificates" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077835 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" volumeName="kubernetes.io/projected/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-kube-api-access" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077847 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f0e973-7864-4842-af8e-47718ab1804c" volumeName="kubernetes.io/configmap/39f0e973-7864-4842-af8e-47718ab1804c-trusted-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077858 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="40f8e70d-5f98-47f1-afa8-ea67242981fc" volumeName="kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077871 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0869de9b-6f5b-4c31-81ad-02a9c8888193" volumeName="kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-kube-api-access-2kfg5" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077885 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/secret/53713eab-c920-4d5a-ae05-7cdb59ace852-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077904 36504 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="858384f3-5741-4e67-8669-2eb2b2dcaf7f" volumeName="kubernetes.io/configmap/858384f3-5741-4e67-8669-2eb2b2dcaf7f-auth-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077917 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b47f2ef-9923-411f-9f2f-ddaea8bc7053" volumeName="kubernetes.io/secret/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-key" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077933 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e96335e-1866-41c8-b128-b95e783a9be4" volumeName="kubernetes.io/secret/6e96335e-1866-41c8-b128-b95e783a9be4-cluster-storage-operator-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077946 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b56c318-09b7-47f0-a7bf-32eb96e836ca" volumeName="kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-config" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077961 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f6498ac1-7d07-4a5f-a968-d8bda72d1002" volumeName="kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077974 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a7e0eea-3da8-43de-87bc-d10231e7c239" volumeName="kubernetes.io/configmap/3a7e0eea-3da8-43de-87bc-d10231e7c239-cco-trusted-ca" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077987 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="814c8acf-fb8d-4f57-b8db-21304402c1f1" volumeName="kubernetes.io/projected/814c8acf-fb8d-4f57-b8db-21304402c1f1-kube-api-access-x7fsz" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.077999 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b56c318-09b7-47f0-a7bf-32eb96e836ca" volumeName="kubernetes.io/projected/8b56c318-09b7-47f0-a7bf-32eb96e836ca-kube-api-access-qtx6m" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078012 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f59094ec-47dd-4547-ad41-b15a7933f461" volumeName="kubernetes.io/configmap/f59094ec-47dd-4547-ad41-b15a7933f461-config" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078028 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa9b5917-d4f3-4372-a200-45b57412f92f" volumeName="kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-config" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078044 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c8ec36d-9179-40ab-a448-440b4501b3e0" volumeName="kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert" 
seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078063 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bebd69d2-5b0f-4b66-8722-d6861eba3e12" volumeName="kubernetes.io/configmap/bebd69d2-5b0f-4b66-8722-d6861eba3e12-telemetry-config" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078080 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-etcd-client" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078097 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c8da5d44-680e-4169-abc6-607bdc37a64d" volumeName="kubernetes.io/projected/c8da5d44-680e-4169-abc6-607bdc37a64d-kube-api-access-pm2l8" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078117 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ffad8fc8-4378-44de-8864-dd2f666ade68" volumeName="kubernetes.io/projected/ffad8fc8-4378-44de-8864-dd2f666ade68-kube-api-access-xcq9j" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078132 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9a3f900-60e4-49c2-85ec-88d19852d1b9" volumeName="kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-profile-collector-cert" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078148 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="578b2d03-b8b3-4c75-adde-73899c472ad7" volumeName="kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-service-ca-bundle" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078161 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd18a700-53b2-430c-a34f-dbb6331cfbe5" volumeName="kubernetes.io/configmap/bd18a700-53b2-430c-a34f-dbb6331cfbe5-mcd-auth-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078175 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="578b2d03-b8b3-4c75-adde-73899c472ad7" volumeName="kubernetes.io/projected/578b2d03-b8b3-4c75-adde-73899c472ad7-kube-api-access-gfd7g" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078189 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bcbec7ef-0b98-4346-8c6b-c5fa37e90286" volumeName="kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-tls" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078201 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bcbec7ef-0b98-4346-8c6b-c5fa37e90286" volumeName="kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-kube-rbac-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078213 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="e50b85a6-7767-4fca-8133-8243bdd85e5d" volumeName="kubernetes.io/secret/e50b85a6-7767-4fca-8133-8243bdd85e5d-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078224 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa9b5917-d4f3-4372-a200-45b57412f92f" volumeName="kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cluster-baremetal-operator-tls" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078243 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08432be8-0086-48d2-a93d-7a474e96749d" volumeName="kubernetes.io/projected/08432be8-0086-48d2-a93d-7a474e96749d-kube-api-access" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078256 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6976b503-87da-48fc-b097-d1b315fbee3f" volumeName="kubernetes.io/projected/6976b503-87da-48fc-b097-d1b315fbee3f-kube-api-access-vx6rt" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078268 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" volumeName="kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-kube-api-access-4z9vv" seLinuxMountContext="" Dec 03 22:10:35.078469 master-0 kubenswrapper[36504]: I1203 22:10:35.078280 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a4399d20-f9a6-4ab1-86be-e2845394eaba" volumeName="kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078719 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0869de9b-6f5b-4c31-81ad-02a9c8888193" volumeName="kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-bound-sa-token" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078757 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29ac4a9d-1228-49c7-9051-338e7dc98a38" volumeName="kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovnkube-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078815 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66f7b08c-52e8-4795-9cf0-74402a9cc0bb" volumeName="kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-node-bootstrap-token" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078831 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="82055cfc-b4ce-4a00-a51d-141059947693" volumeName="kubernetes.io/projected/82055cfc-b4ce-4a00-a51d-141059947693-kube-api-access-4clxk" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078848 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f088999-ec66-402e-9634-8c762206d6b4" volumeName="kubernetes.io/configmap/5f088999-ec66-402e-9634-8c762206d6b4-config" seLinuxMountContext="" Dec 03 22:10:35.082229 
master-0 kubenswrapper[36504]: I1203 22:10:35.078873 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" volumeName="kubernetes.io/secret/785612fc-3f78-4f1a-bc83-7afe5d3b8056-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078887 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b522af85-394e-4965-9bf4-83f48fb8ad94" volumeName="kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078908 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0a49c320-f31d-4f6d-98c3-48d24346b873" volumeName="kubernetes.io/projected/0a49c320-f31d-4f6d-98c3-48d24346b873-kube-api-access-s7dfd" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078925 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64856d96-023f-46db-819c-02f1adea5aab" volumeName="kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078948 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9a3f403-a742-4977-901a-cf4a8eb7df5a" volumeName="kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078978 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7f68d19-71d4-4129-a575-3ee57fa53493" volumeName="kubernetes.io/configmap/b7f68d19-71d4-4129-a575-3ee57fa53493-trusted-ca" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.078996 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d0be52f3-b318-4630-b4da-f3c4a57d5818" volumeName="kubernetes.io/projected/d0be52f3-b318-4630-b4da-f3c4a57d5818-kube-api-access-qnfgr" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079017 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="04f5fc52-4ec2-48c3-8441-2b15ad632233" volumeName="kubernetes.io/projected/04f5fc52-4ec2-48c3-8441-2b15ad632233-kube-api-access-tgddm" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079083 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="452012bf-eae1-4e69-9ba1-034309e9f2c8" volumeName="kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079180 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="858384f3-5741-4e67-8669-2eb2b2dcaf7f" volumeName="kubernetes.io/secret/858384f3-5741-4e67-8669-2eb2b2dcaf7f-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079205 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9940ff5-36a6-4c04-a51d-66f7d83bea7c" 
volumeName="kubernetes.io/projected/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-kube-api-access-thgv2" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079240 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d592f19-c7b9-4b29-9ca2-848572067908" volumeName="kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079262 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" volumeName="kubernetes.io/projected/785612fc-3f78-4f1a-bc83-7afe5d3b8056-kube-api-access-j6m8f" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079296 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b56c318-09b7-47f0-a7bf-32eb96e836ca" volumeName="kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-images" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079353 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="40f8e70d-5f98-47f1-afa8-ea67242981fc" volumeName="kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079453 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="698e6d87-1a58-493c-8b69-d22c89d26ac5" volumeName="kubernetes.io/configmap/698e6d87-1a58-493c-8b69-d22c89d26ac5-service-ca-bundle" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079484 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b56c318-09b7-47f0-a7bf-32eb96e836ca" volumeName="kubernetes.io/secret/8b56c318-09b7-47f0-a7bf-32eb96e836ca-machine-api-operator-tls" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079537 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29ac4a9d-1228-49c7-9051-338e7dc98a38" volumeName="kubernetes.io/secret/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079558 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="698e6d87-1a58-493c-8b69-d22c89d26ac5" volumeName="kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-default-certificate" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079585 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="892d5611-debf-402f-abc5-3f99aa080159" volumeName="kubernetes.io/projected/892d5611-debf-402f-abc5-3f99aa080159-kube-api-access-bvmxp" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079601 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e50b85a6-7767-4fca-8133-8243bdd85e5d" volumeName="kubernetes.io/projected/e50b85a6-7767-4fca-8133-8243bdd85e5d-kube-api-access-z5q4k" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 
22:10:35.079628 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10fc6516-cd4d-4291-a26d-8376ba0affef" volumeName="kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-catalog-content" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079712 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bebd69d2-5b0f-4b66-8722-d6861eba3e12" volumeName="kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079786 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d6fafa97-812d-4588-95f8-7c4d85f53098" volumeName="kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-tls" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079821 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f6498ac1-7d07-4a5f-a968-d8bda72d1002" volumeName="kubernetes.io/projected/f6498ac1-7d07-4a5f-a968-d8bda72d1002-kube-api-access-bt8l5" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079842 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9a3f900-60e4-49c2-85ec-88d19852d1b9" volumeName="kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-srv-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079867 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="246b7846-0dfd-43a8-bcfa-81e7435060dc" volumeName="kubernetes.io/projected/246b7846-0dfd-43a8-bcfa-81e7435060dc-kube-api-access-5h2wx" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079884 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="246b7846-0dfd-43a8-bcfa-81e7435060dc" volumeName="kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-client" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079924 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29ac4a9d-1228-49c7-9051-338e7dc98a38" volumeName="kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-env-overrides" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079956 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a124c14f-20c6-4df3-956f-a858de0c73c9" volumeName="kubernetes.io/projected/a124c14f-20c6-4df3-956f-a858de0c73c9-kube-api-access-fq55c" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079974 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8194009-3743-4da7-baf1-f9bb0afd6187" volumeName="kubernetes.io/projected/b8194009-3743-4da7-baf1-f9bb0afd6187-kube-api-access-rd9vn" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.079994 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba624ed0-32cc-4c87-81a5-708a8a8a7f88" 
volumeName="kubernetes.io/projected/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-kube-api-access-n4gds" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080012 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="578b2d03-b8b3-4c75-adde-73899c472ad7" volumeName="kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-trusted-ca-bundle" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080031 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c807d487-5b8f-4747-87ee-df0637e2e11f" volumeName="kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-catalog-content" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080054 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d0be52f3-b318-4630-b4da-f3c4a57d5818" volumeName="kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-kube-rbac-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080070 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f6498ac1-7d07-4a5f-a968-d8bda72d1002" volumeName="kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080094 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="04f5fc52-4ec2-48c3-8441-2b15ad632233" volumeName="kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080114 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="246b7846-0dfd-43a8-bcfa-81e7435060dc" volumeName="kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080133 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d592f19-c7b9-4b29-9ca2-848572067908" volumeName="kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-metrics-client-ca" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080154 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="578b2d03-b8b3-4c75-adde-73899c472ad7" volumeName="kubernetes.io/secret/578b2d03-b8b3-4c75-adde-73899c472ad7-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080185 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" volumeName="kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-trusted-ca-bundle" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080229 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="def52ba3-77c1-4e0c-8a0d-44ff4d677607" volumeName="kubernetes.io/projected/def52ba3-77c1-4e0c-8a0d-44ff4d677607-kube-api-access-dcvkk" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 
22:10:35.080250 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="def52ba3-77c1-4e0c-8a0d-44ff4d677607" volumeName="kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-auth-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080274 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fa9b5917-d4f3-4372-a200-45b57412f92f" volumeName="kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-images" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080293 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01b80ad5-7d7c-4ecd-90b0-2913d4559b5f" volumeName="kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080314 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="77e36f4e-845b-4b82-8abc-b634636c087a" volumeName="kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-webhook-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080331 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a124c14f-20c6-4df3-956f-a858de0c73c9" volumeName="kubernetes.io/configmap/a124c14f-20c6-4df3-956f-a858de0c73c9-mcc-auth-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080348 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b8194009-3743-4da7-baf1-f9bb0afd6187" volumeName="kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-daemon-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080371 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" volumeName="kubernetes.io/configmap/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080387 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="da949cf7-ab12-43ff-8e45-da1c2fd46e20" volumeName="kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-ca-certs" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080410 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0a49c320-f31d-4f6d-98c3-48d24346b873" volumeName="kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-catalog-content" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080464 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0e9427b8-d62c-45f7-97d0-1f7667ff27aa" volumeName="kubernetes.io/projected/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-kube-api-access" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080479 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="66f7b08c-52e8-4795-9cf0-74402a9cc0bb" 
volumeName="kubernetes.io/projected/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-kube-api-access-b658f" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080502 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="785612fc-3f78-4f1a-bc83-7afe5d3b8056" volumeName="kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080518 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23" volumeName="kubernetes.io/projected/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-kube-api-access-df8nl" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080534 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c807d487-5b8f-4747-87ee-df0637e2e11f" volumeName="kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-utilities" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080554 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2d592f19-c7b9-4b29-9ca2-848572067908" volumeName="kubernetes.io/empty-dir/2d592f19-c7b9-4b29-9ca2-848572067908-volume-directive-shadow" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080568 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="77e36f4e-845b-4b82-8abc-b634636c087a" volumeName="kubernetes.io/projected/77e36f4e-845b-4b82-8abc-b634636c087a-kube-api-access-s87hj" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080590 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="814c8acf-fb8d-4f57-b8db-21304402c1f1" volumeName="kubernetes.io/configmap/814c8acf-fb8d-4f57-b8db-21304402c1f1-iptables-alerter-script" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080607 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="40f8e70d-5f98-47f1-afa8-ea67242981fc" volumeName="kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080624 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac3d3235-531e-4c7d-9fc9-e65c97970d0f" volumeName="kubernetes.io/projected/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-kube-api-access-hzjtq" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080643 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f9a3f900-60e4-49c2-85ec-88d19852d1b9" volumeName="kubernetes.io/projected/f9a3f900-60e4-49c2-85ec-88d19852d1b9-kube-api-access-jvzqm" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080657 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="246b7846-0dfd-43a8-bcfa-81e7435060dc" volumeName="kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-trusted-ca-bundle" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 
22:10:35.080678 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="452012bf-eae1-4e69-9ba1-034309e9f2c8" volumeName="kubernetes.io/projected/452012bf-eae1-4e69-9ba1-034309e9f2c8-kube-api-access-b9q7k" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080692 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" volumeName="kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-ca-certs" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080706 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7f68d19-71d4-4129-a575-3ee57fa53493" volumeName="kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080725 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="578b2d03-b8b3-4c75-adde-73899c472ad7" volumeName="kubernetes.io/empty-dir/578b2d03-b8b3-4c75-adde-73899c472ad7-snapshots" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080740 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac3d3235-531e-4c7d-9fc9-e65c97970d0f" volumeName="kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-profile-collector-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080761 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b522af85-394e-4965-9bf4-83f48fb8ad94" volumeName="kubernetes.io/projected/b522af85-394e-4965-9bf4-83f48fb8ad94-kube-api-access-b7zsw" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080820 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b522af85-394e-4965-9bf4-83f48fb8ad94" volumeName="kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-client" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080866 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bcbec7ef-0b98-4346-8c6b-c5fa37e90286" volumeName="kubernetes.io/projected/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-kube-api-access-dwdzk" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080902 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="246b7846-0dfd-43a8-bcfa-81e7435060dc" volumeName="kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-encryption-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080967 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="858384f3-5741-4e67-8669-2eb2b2dcaf7f" volumeName="kubernetes.io/projected/858384f3-5741-4e67-8669-2eb2b2dcaf7f-kube-api-access-qgrbd" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.080996 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64856d96-023f-46db-819c-02f1adea5aab" 
volumeName="kubernetes.io/projected/64856d96-023f-46db-819c-02f1adea5aab-kube-api-access-h2tqg" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081024 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08432be8-0086-48d2-a93d-7a474e96749d" volumeName="kubernetes.io/secret/08432be8-0086-48d2-a93d-7a474e96749d-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081056 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="40f8e70d-5f98-47f1-afa8-ea67242981fc" volumeName="kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081080 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="77e36f4e-845b-4b82-8abc-b634636c087a" volumeName="kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-apiservice-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081108 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="892d5611-debf-402f-abc5-3f99aa080159" volumeName="kubernetes.io/secret/892d5611-debf-402f-abc5-3f99aa080159-metrics-tls" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081144 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50076985-bbaa-4bcf-9d1a-cc25bed016a7" volumeName="kubernetes.io/secret/50076985-bbaa-4bcf-9d1a-cc25bed016a7-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081209 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" volumeName="kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081348 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd18a700-53b2-430c-a34f-dbb6331cfbe5" volumeName="kubernetes.io/secret/bd18a700-53b2-430c-a34f-dbb6331cfbe5-proxy-tls" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081680 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bebd69d2-5b0f-4b66-8722-d6861eba3e12" volumeName="kubernetes.io/projected/bebd69d2-5b0f-4b66-8722-d6861eba3e12-kube-api-access-n7qqf" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081792 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1a0f647a-0260-4737-8ae2-cc90d01d33d1" volumeName="kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081858 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="50076985-bbaa-4bcf-9d1a-cc25bed016a7" volumeName="kubernetes.io/configmap/50076985-bbaa-4bcf-9d1a-cc25bed016a7-config" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081882 36504 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" volumeName="kubernetes.io/empty-dir/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-cache" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081905 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e50b85a6-7767-4fca-8133-8243bdd85e5d" volumeName="kubernetes.io/empty-dir/e50b85a6-7767-4fca-8133-8243bdd85e5d-available-featuregates" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081921 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="39f0e973-7864-4842-af8e-47718ab1804c" volumeName="kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-kube-api-access-4fsxc" seLinuxMountContext="" Dec 03 22:10:35.082229 master-0 kubenswrapper[36504]: I1203 22:10:35.081940 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54767c36-ca29-4c91-9a8a-9699ecfa4afb" volumeName="kubernetes.io/configmap/54767c36-ca29-4c91-9a8a-9699ecfa4afb-config-volume" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.082875 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="da949cf7-ab12-43ff-8e45-da1c2fd46e20" volumeName="kubernetes.io/empty-dir/da949cf7-ab12-43ff-8e45-da1c2fd46e20-cache" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.082983 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="08432be8-0086-48d2-a93d-7a474e96749d" volumeName="kubernetes.io/configmap/08432be8-0086-48d2-a93d-7a474e96749d-config" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083018 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6976b503-87da-48fc-b097-d1b315fbee3f" volumeName="kubernetes.io/secret/6976b503-87da-48fc-b097-d1b315fbee3f-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083047 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b522af85-394e-4965-9bf4-83f48fb8ad94" volumeName="kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-config" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083075 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01b80ad5-7d7c-4ecd-90b0-2913d4559b5f" volumeName="kubernetes.io/secret/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-machine-approver-tls" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083097 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53713eab-c920-4d5a-ae05-7cdb59ace852" volumeName="kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-script-lib" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083122 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23" volumeName="kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-tuned" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 
kubenswrapper[36504]: I1203 22:10:35.083144 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd18a700-53b2-430c-a34f-dbb6331cfbe5" volumeName="kubernetes.io/projected/bd18a700-53b2-430c-a34f-dbb6331cfbe5-kube-api-access-ll9bs" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083166 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdfbaebe-d655-4c1e-a039-08802c5c35c5" volumeName="kubernetes.io/secret/fdfbaebe-d655-4c1e-a039-08802c5c35c5-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083192 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="64856d96-023f-46db-819c-02f1adea5aab" volumeName="kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083215 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fdfbaebe-d655-4c1e-a039-08802c5c35c5" volumeName="kubernetes.io/configmap/fdfbaebe-d655-4c1e-a039-08802c5c35c5-config" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083239 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01b80ad5-7d7c-4ecd-90b0-2913d4559b5f" volumeName="kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-auth-proxy-config" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083262 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1b47f2ef-9923-411f-9f2f-ddaea8bc7053" volumeName="kubernetes.io/projected/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-kube-api-access-dx9sj" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083283 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a7e0eea-3da8-43de-87bc-d10231e7c239" volumeName="kubernetes.io/secret/3a7e0eea-3da8-43de-87bc-d10231e7c239-cloud-credential-operator-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083305 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="812401c0-d1ac-4857-b939-217b7b07f8bc" volumeName="kubernetes.io/projected/812401c0-d1ac-4857-b939-217b7b07f8bc-kube-api-access-mxl5r" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083326 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ac3d3235-531e-4c7d-9fc9-e65c97970d0f" volumeName="kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-srv-cert" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083351 36504 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f6498ac1-7d07-4a5f-a968-d8bda72d1002" volumeName="kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert" seLinuxMountContext="" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083372 36504 reconstruct.go:97] "Volume reconstruction finished" Dec 03 22:10:35.087185 master-0 kubenswrapper[36504]: I1203 22:10:35.083387 36504 reconciler.go:26] "Reconciler: start 
to sync state" Dec 03 22:10:35.088923 master-0 kubenswrapper[36504]: I1203 22:10:35.087238 36504 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 22:10:35.091959 master-0 kubenswrapper[36504]: I1203 22:10:35.091862 36504 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 22:10:35.094065 master-0 kubenswrapper[36504]: I1203 22:10:35.094023 36504 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 03 22:10:35.094157 master-0 kubenswrapper[36504]: I1203 22:10:35.094073 36504 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 22:10:35.094157 master-0 kubenswrapper[36504]: I1203 22:10:35.094100 36504 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 22:10:35.094290 master-0 kubenswrapper[36504]: E1203 22:10:35.094193 36504 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 22:10:35.096928 master-0 kubenswrapper[36504]: I1203 22:10:35.096723 36504 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 22:10:35.104490 master-0 kubenswrapper[36504]: I1203 22:10:35.104378 36504 generic.go:334] "Generic (PLEG): container finished" podID="2dcf886d-2028-4acd-83ac-850c4b278810" containerID="8e98bd7632e10184a3905755491b0bf5b7f9601daae4ea74b3820228d382dac8" exitCode=0 Dec 03 22:10:35.106595 master-0 kubenswrapper[36504]: I1203 22:10:35.106551 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5f574c6c79-6mvwr_c1ee4db7-f2d3-4064-a189-f66fd0a021eb/kube-scheduler-operator-container/3.log" Dec 03 22:10:35.106677 master-0 kubenswrapper[36504]: I1203 22:10:35.106617 36504 generic.go:334] "Generic (PLEG): container finished" podID="c1ee4db7-f2d3-4064-a189-f66fd0a021eb" containerID="8ef60a565e77b47e29c85d100f4aaafe8ce0754e92c6f0f4b921e8ac07f1fea6" exitCode=255 Dec 03 22:10:35.110952 master-0 kubenswrapper[36504]: I1203 22:10:35.110896 36504 generic.go:334] "Generic (PLEG): container finished" podID="8a00233b22d19df39b2e1c8ba133b3c2" containerID="d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402" exitCode=0 Dec 03 22:10:35.113632 master-0 kubenswrapper[36504]: I1203 22:10:35.113597 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/4.log" Dec 03 22:10:35.114130 master-0 kubenswrapper[36504]: I1203 22:10:35.114094 36504 generic.go:334] "Generic (PLEG): container finished" podID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerID="f4848bfd59e369f5ce71f4c1cac3394e313afe940a2755c2f4d6e03b1f962233" exitCode=255 Dec 03 22:10:35.114130 master-0 kubenswrapper[36504]: I1203 22:10:35.114122 36504 generic.go:334] "Generic (PLEG): container finished" podID="e50b85a6-7767-4fca-8133-8243bdd85e5d" containerID="08f0994ce641aa8480f422d38c06dab9bf39acafe61488448bfa2be6e9c006dc" exitCode=0 Dec 03 22:10:35.117432 master-0 kubenswrapper[36504]: I1203 22:10:35.117392 36504 generic.go:334] "Generic (PLEG): container finished" podID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerID="a2582bc7c1924ecfbd971e3c3eba302e8128d7abf1b76de440d5c3eb71052837" exitCode=0 Dec 03 22:10:35.122356 master-0 kubenswrapper[36504]: I1203 22:10:35.122292 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm_82055cfc-b4ce-4a00-a51d-141059947693/etcd-operator/3.log" Dec 03 22:10:35.122443 master-0 kubenswrapper[36504]: I1203 22:10:35.122369 36504 generic.go:334] "Generic (PLEG): container finished" podID="82055cfc-b4ce-4a00-a51d-141059947693" containerID="857211a86edae02d3593eaf3bb4ea9872abdabe69c74ef883ca26b0a2970ff26" exitCode=255 Dec 03 22:10:35.125331 master-0 kubenswrapper[36504]: I1203 22:10:35.125263 36504 generic.go:334] "Generic (PLEG): container finished" podID="29ac4a9d-1228-49c7-9051-338e7dc98a38" containerID="5c6cff0db5f54508702a0378b5a2fcfe25e7d02bb8251b1a5dc1e5b0d7ac5a5a" exitCode=0 Dec 03 22:10:35.131635 master-0 kubenswrapper[36504]: I1203 22:10:35.131477 36504 generic.go:334] "Generic (PLEG): container finished" podID="fd2fa610bb2a39c39fcdd00db03a511a" containerID="8c83e2edc2c0d9a9e848a5cf55074f9b40879a47fa830dc6cab1377b18fd6f6a" exitCode=0 Dec 03 22:10:35.137901 master-0 kubenswrapper[36504]: I1203 22:10:35.137833 36504 generic.go:334] "Generic (PLEG): container finished" podID="bcbec7ef-0b98-4346-8c6b-c5fa37e90286" containerID="968964eedf27b3e40df0428a52df5b822f50c33a2dfb85064c918086871bd63f" exitCode=0 Dec 03 22:10:35.142464 master-0 kubenswrapper[36504]: I1203 22:10:35.142415 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-fqnsm_785612fc-3f78-4f1a-bc83-7afe5d3b8056/authentication-operator/3.log" Dec 03 22:10:35.142464 master-0 kubenswrapper[36504]: I1203 22:10:35.142462 36504 generic.go:334] "Generic (PLEG): container finished" podID="785612fc-3f78-4f1a-bc83-7afe5d3b8056" containerID="3f938a79f0b6c6611be3fa1d23328a9984e35cf9a2e94086de4076ebe8bf8a36" exitCode=255 Dec 03 22:10:35.144094 master-0 kubenswrapper[36504]: I1203 22:10:35.144052 36504 generic.go:334] "Generic (PLEG): container finished" podID="70e52a8c-7f9e-47fa-85ca-41f90dcb9747" containerID="888690db481ab6164f73ea7eec62997e93d90f2abdb333711d2e3c24534b02e9" exitCode=0 Dec 03 22:10:35.146847 master-0 kubenswrapper[36504]: I1203 22:10:35.146760 36504 generic.go:334] "Generic (PLEG): container finished" podID="05dd6e8e0dea56089da96190349dd4c1" containerID="e6e266ea8a0d197e496699cd337eabb6f247b692621f965213e4454b3e59b018" exitCode=0 Dec 03 22:10:35.150078 master-0 kubenswrapper[36504]: I1203 22:10:35.150026 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/5.log" Dec 03 22:10:35.150455 master-0 kubenswrapper[36504]: I1203 22:10:35.150405 36504 generic.go:334] "Generic (PLEG): container finished" podID="0869de9b-6f5b-4c31-81ad-02a9c8888193" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" exitCode=1 Dec 03 22:10:35.152339 master-0 kubenswrapper[36504]: I1203 22:10:35.152285 36504 generic.go:334] "Generic (PLEG): container finished" podID="add88bf0-c88d-427d-94bb-897e088a1378" containerID="50ea9b3d8d8a684066d6791cddb4680be7db2c4667be5e468a6d5e22cffb259f" exitCode=0 Dec 03 22:10:35.157271 master-0 kubenswrapper[36504]: I1203 22:10:35.157224 36504 generic.go:334] "Generic (PLEG): container finished" podID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerID="cfa4dcf215847e24aecbc2956ffb90a60bdcdbbe3d57e1f15a1b92587d1579de" exitCode=0 Dec 03 22:10:35.159755 master-0 kubenswrapper[36504]: I1203 22:10:35.159675 36504 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" 
containerID="6c639bc69c380d96355863b219dedf37b52fea28072ae5913704ecc349c5d8c1" exitCode=0 Dec 03 22:10:35.159755 master-0 kubenswrapper[36504]: I1203 22:10:35.159706 36504 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="70791961c08656ffa1017f6f966c1f2ba603d16610929ed1a69c881343d1bfec" exitCode=0 Dec 03 22:10:35.159755 master-0 kubenswrapper[36504]: I1203 22:10:35.159717 36504 generic.go:334] "Generic (PLEG): container finished" podID="13238af3704fe583f617f61e755cf4c2" containerID="ebd2204fe249e4ff0cd7edb5edf639c7a4971288017457fe2c22236d90e0112a" exitCode=0 Dec 03 22:10:35.161939 master-0 kubenswrapper[36504]: I1203 22:10:35.161438 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/4.log" Dec 03 22:10:35.161939 master-0 kubenswrapper[36504]: I1203 22:10:35.161480 36504 generic.go:334] "Generic (PLEG): container finished" podID="28c42112-a09e-4b7a-b23b-c06bef69cbfb" containerID="416cf014b256f3c13d903afd6395b3370d0676e44137bb0a7e4541716f9e2252" exitCode=1 Dec 03 22:10:35.163542 master-0 kubenswrapper[36504]: I1203 22:10:35.163489 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-jxw8c_08432be8-0086-48d2-a93d-7a474e96749d/kube-apiserver-operator/2.log" Dec 03 22:10:35.163606 master-0 kubenswrapper[36504]: I1203 22:10:35.163552 36504 generic.go:334] "Generic (PLEG): container finished" podID="08432be8-0086-48d2-a93d-7a474e96749d" containerID="53a60280703adaa150a5f659528627f228a9f7e4e33c25a05a6a83bbcae2a4c4" exitCode=255 Dec 03 22:10:35.165438 master-0 kubenswrapper[36504]: I1203 22:10:35.165392 36504 generic.go:334] "Generic (PLEG): container finished" podID="3497f5dd-4c6f-4108-a948-481cef475ba9" containerID="e15ea152ef1a26d4cb376f11826bf4ceac9e8245aad4cfc2e16ac02f57a9e91c" exitCode=0 Dec 03 22:10:35.167782 master-0 kubenswrapper[36504]: I1203 22:10:35.167710 36504 generic.go:334] "Generic (PLEG): container finished" podID="39f0e973-7864-4842-af8e-47718ab1804c" containerID="0eed9d981ff70eac9619ddc620b8f9e1ce7420952f2e8fb539ac72d9a0cb037f" exitCode=0 Dec 03 22:10:35.170314 master-0 kubenswrapper[36504]: I1203 22:10:35.169985 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-56f5898f45-mjdfr_5f088999-ec66-402e-9634-8c762206d6b4/service-ca-operator/2.log" Dec 03 22:10:35.170314 master-0 kubenswrapper[36504]: I1203 22:10:35.170032 36504 generic.go:334] "Generic (PLEG): container finished" podID="5f088999-ec66-402e-9634-8c762206d6b4" containerID="33a7bb5f5e679144cd91e9f7be121f1b4387ca8c698a44bc1d39de4b8dcc3580" exitCode=255 Dec 03 22:10:35.182606 master-0 kubenswrapper[36504]: I1203 22:10:35.181491 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/config-sync-controllers/0.log" Dec 03 22:10:35.182886 master-0 kubenswrapper[36504]: I1203 22:10:35.182636 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/cluster-cloud-controller-manager/0.log" Dec 03 22:10:35.182886 master-0 kubenswrapper[36504]: I1203 22:10:35.182724 36504 generic.go:334] "Generic 
(PLEG): container finished" podID="def52ba3-77c1-4e0c-8a0d-44ff4d677607" containerID="19a9bc6e0f1f8798f87d4882e3c42bf8b1af00134db1a21061f75b7ece3744fd" exitCode=1 Dec 03 22:10:35.182886 master-0 kubenswrapper[36504]: I1203 22:10:35.182813 36504 generic.go:334] "Generic (PLEG): container finished" podID="def52ba3-77c1-4e0c-8a0d-44ff4d677607" containerID="e063c475fe48713685c7da56849e11be35ce10c982e2192d9447df0278644182" exitCode=1 Dec 03 22:10:35.186203 master-0 kubenswrapper[36504]: I1203 22:10:35.186151 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-hv5z8_6e96335e-1866-41c8-b128-b95e783a9be4/cluster-storage-operator/2.log" Dec 03 22:10:35.186277 master-0 kubenswrapper[36504]: I1203 22:10:35.186219 36504 generic.go:334] "Generic (PLEG): container finished" podID="6e96335e-1866-41c8-b128-b95e783a9be4" containerID="16ea4a93de4863ed15876a3040be9a6845e308ffe81f55c8cf6208378385e0da" exitCode=255 Dec 03 22:10:35.191445 master-0 kubenswrapper[36504]: I1203 22:10:35.191408 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-retry-1-master-0_25602d69-3aec-487d-8d62-c2c21f27e2b7/installer/0.log" Dec 03 22:10:35.191497 master-0 kubenswrapper[36504]: I1203 22:10:35.191445 36504 generic.go:334] "Generic (PLEG): container finished" podID="25602d69-3aec-487d-8d62-c2c21f27e2b7" containerID="8bd5db9bea83f9fc4e7c948b8e5ecf2d5c83b59963dd6e82975811d819eaa07a" exitCode=1 Dec 03 22:10:35.193341 master-0 kubenswrapper[36504]: I1203 22:10:35.193300 36504 generic.go:334] "Generic (PLEG): container finished" podID="31ee3291-2979-4903-98a2-355855cedd55" containerID="18687aca035ea2cd91fda30a36ce8489d00076f5b5f568339084de2e1d467700" exitCode=0 Dec 03 22:10:35.194336 master-0 kubenswrapper[36504]: E1203 22:10:35.194290 36504 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 03 22:10:35.201804 master-0 kubenswrapper[36504]: I1203 22:10:35.201721 36504 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="ff61a256b05f04a3b13b90f6bdc5da43d7f265eb788e86c58f6473565825003c" exitCode=0 Dec 03 22:10:35.201804 master-0 kubenswrapper[36504]: I1203 22:10:35.201803 36504 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="919b8027f8f33ff55f195fb5965cea0049cac7097943867a77eb3961b87644ee" exitCode=0 Dec 03 22:10:35.201804 master-0 kubenswrapper[36504]: I1203 22:10:35.201815 36504 generic.go:334] "Generic (PLEG): container finished" podID="4dd8b778e190b1975a0a8fad534da6dd" containerID="909ec871125d6c5f469945361701b23d979d4f7f33b16129f5238c7a2207ec30" exitCode=0 Dec 03 22:10:35.205968 master-0 kubenswrapper[36504]: I1203 22:10:35.205744 36504 generic.go:334] "Generic (PLEG): container finished" podID="bd18a700-53b2-430c-a34f-dbb6331cfbe5" containerID="70f67b41dc589699749acb86409088e137724e0bbf05196ad9f606340dc1c95e" exitCode=0 Dec 03 22:10:35.209805 master-0 kubenswrapper[36504]: I1203 22:10:35.209747 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-5f78c89466-kz8nk_da949cf7-ab12-43ff-8e45-da1c2fd46e20/manager/1.log" Dec 03 22:10:35.210338 master-0 kubenswrapper[36504]: I1203 22:10:35.210265 36504 generic.go:334] "Generic (PLEG): container finished" podID="da949cf7-ab12-43ff-8e45-da1c2fd46e20" 
containerID="b0ed7d6d004009e716a466752b32cae3db21211bdd49340bde3ae4a9d649b34e" exitCode=1 Dec 03 22:10:35.215523 master-0 kubenswrapper[36504]: I1203 22:10:35.215479 36504 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="fcf20cc5fbcb1ad49ee0d23a982171e11d3207eebc2515ce8bb2b5502de1eab0" exitCode=0 Dec 03 22:10:35.215523 master-0 kubenswrapper[36504]: I1203 22:10:35.215517 36504 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="e085324cd58339aaa2ff69b985958fab513139997fb20fc1324a7c3c052fa89d" exitCode=0 Dec 03 22:10:35.215654 master-0 kubenswrapper[36504]: I1203 22:10:35.215529 36504 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="742bc32f3346ea0ae584753ab21aedef0a29aac40cae443f406076aeae30033d" exitCode=0 Dec 03 22:10:35.215654 master-0 kubenswrapper[36504]: I1203 22:10:35.215541 36504 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="684c7729ba85fd8ac6a78b633cc41d781048493dac0127949a3ebcf247c67f5b" exitCode=0 Dec 03 22:10:35.215654 master-0 kubenswrapper[36504]: I1203 22:10:35.215552 36504 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="c4e06171c55f7e5fa99ea1abceeee733b284cf3a34d0a86bf2f243d3f655f7a5" exitCode=0 Dec 03 22:10:35.215654 master-0 kubenswrapper[36504]: I1203 22:10:35.215562 36504 generic.go:334] "Generic (PLEG): container finished" podID="ffad8fc8-4378-44de-8864-dd2f666ade68" containerID="2378e0ccca63367b0b1f9fee7b2b6b1c87516db57078ddfc7d8d7140f0467b96" exitCode=0 Dec 03 22:10:35.218036 master-0 kubenswrapper[36504]: I1203 22:10:35.218001 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-6cbf58c977-zk7jw_892d5611-debf-402f-abc5-3f99aa080159/network-operator/3.log" Dec 03 22:10:35.218103 master-0 kubenswrapper[36504]: I1203 22:10:35.218041 36504 generic.go:334] "Generic (PLEG): container finished" podID="892d5611-debf-402f-abc5-3f99aa080159" containerID="a0d8b8091c75c8d084f0ab3c649dec1f42114979d34c4d430f5908359e9d65f4" exitCode=255 Dec 03 22:10:35.220346 master-0 kubenswrapper[36504]: I1203 22:10:35.220274 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_0269ada6-cb6e-4c98-bd24-752ae0286498/installer/0.log" Dec 03 22:10:35.220346 master-0 kubenswrapper[36504]: I1203 22:10:35.220329 36504 generic.go:334] "Generic (PLEG): container finished" podID="0269ada6-cb6e-4c98-bd24-752ae0286498" containerID="828e1c847e23b1857152d987d6c054b5de0c2ac4dd44ef540ab230ae90795ebb" exitCode=1 Dec 03 22:10:35.222735 master-0 kubenswrapper[36504]: I1203 22:10:35.222696 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-jlq49_ba624ed0-32cc-4c87-81a5-708a8a8a7f88/control-plane-machine-set-operator/1.log" Dec 03 22:10:35.222735 master-0 kubenswrapper[36504]: I1203 22:10:35.222732 36504 generic.go:334] "Generic (PLEG): container finished" podID="ba624ed0-32cc-4c87-81a5-708a8a8a7f88" containerID="3be403144fdd17b31c23c14e8eecf3d61113ddb620955d19d68fd287cf34269c" exitCode=1 Dec 03 22:10:35.230264 master-0 kubenswrapper[36504]: I1203 22:10:35.230216 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-75b4d49d4c-psjj5_04f5fc52-4ec2-48c3-8441-2b15ad632233/package-server-manager/0.log" Dec 03 22:10:35.231360 master-0 kubenswrapper[36504]: I1203 22:10:35.231293 36504 generic.go:334] "Generic (PLEG): container finished" podID="04f5fc52-4ec2-48c3-8441-2b15ad632233" containerID="957fee7d4699a08d2cc2951a06021265bc26437f145867b0dd5f82dd49642db5" exitCode=1 Dec 03 22:10:35.236907 master-0 kubenswrapper[36504]: I1203 22:10:35.236863 36504 generic.go:334] "Generic (PLEG): container finished" podID="578b2d03-b8b3-4c75-adde-73899c472ad7" containerID="3445ebe854f42ae0ad53b657352b66491efe14d790f93fa83d0efffb68d3546d" exitCode=0 Dec 03 22:10:35.240201 master-0 kubenswrapper[36504]: I1203 22:10:35.240150 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-7f88444875-kb5rx_858384f3-5741-4e67-8669-2eb2b2dcaf7f/cluster-autoscaler-operator/0.log" Dec 03 22:10:35.240614 master-0 kubenswrapper[36504]: I1203 22:10:35.240572 36504 generic.go:334] "Generic (PLEG): container finished" podID="858384f3-5741-4e67-8669-2eb2b2dcaf7f" containerID="4c2adc08380436f319ebce0bff4387679c426ba1840c8e9241539270a64e7dab" exitCode=255 Dec 03 22:10:35.242632 master-0 kubenswrapper[36504]: I1203 22:10:35.242564 36504 generic.go:334] "Generic (PLEG): container finished" podID="5dada903-4b2b-450a-a55f-502ff892fd9f" containerID="19cd297510bc844f8288365a1588d651ec674ded0636669c86f19116e03ce004" exitCode=0 Dec 03 22:10:35.259351 master-0 kubenswrapper[36504]: I1203 22:10:35.259287 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/3.log" Dec 03 22:10:35.260026 master-0 kubenswrapper[36504]: I1203 22:10:35.259977 36504 generic.go:334] "Generic (PLEG): container finished" podID="fa9b5917-d4f3-4372-a200-45b57412f92f" containerID="2c8f1554a8a69b7abb2ac2fb40ac35726b3a427f042bc6ab03c716500f2824ff" exitCode=1 Dec 03 22:10:35.265462 master-0 kubenswrapper[36504]: I1203 22:10:35.265416 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4c2364d3-47b2-4784-9c42-76bf2547b797/installer/0.log" Dec 03 22:10:35.265576 master-0 kubenswrapper[36504]: I1203 22:10:35.265468 36504 generic.go:334] "Generic (PLEG): container finished" podID="4c2364d3-47b2-4784-9c42-76bf2547b797" containerID="5850f56760e2a61c527c84ff5f17d82ffc3198cbe6b6606b3d9f5f38cfe114d6" exitCode=1 Dec 03 22:10:35.268305 master-0 kubenswrapper[36504]: I1203 22:10:35.268227 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/1.log" Dec 03 22:10:35.268736 master-0 kubenswrapper[36504]: I1203 22:10:35.268692 36504 generic.go:334] "Generic (PLEG): container finished" podID="9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6" containerID="9902d273d5ff7cb502b4442eacaaf41f2951f19d1c2e446bfccd73ed96a23e7a" exitCode=1 Dec 03 22:10:35.270908 master-0 kubenswrapper[36504]: I1203 22:10:35.270881 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-589f5cdc9d-25qxh_c8da5d44-680e-4169-abc6-607bdc37a64d/cluster-olm-operator/3.log" Dec 03 22:10:35.273064 master-0 kubenswrapper[36504]: I1203 22:10:35.273005 36504 generic.go:334] "Generic (PLEG): container finished" 
podID="c8da5d44-680e-4169-abc6-607bdc37a64d" containerID="a88cd4d204a69ad50a5388826a51d98da82a35275f10e3130a3f06ed4ccdea64" exitCode=255 Dec 03 22:10:35.273168 master-0 kubenswrapper[36504]: I1203 22:10:35.273070 36504 generic.go:334] "Generic (PLEG): container finished" podID="c8da5d44-680e-4169-abc6-607bdc37a64d" containerID="cdd730b5e809fc70ef1f6320b094ee33892a49867b7b0591e6a94822a48a578e" exitCode=0 Dec 03 22:10:35.273168 master-0 kubenswrapper[36504]: I1203 22:10:35.273082 36504 generic.go:334] "Generic (PLEG): container finished" podID="c8da5d44-680e-4169-abc6-607bdc37a64d" containerID="fa08d3f519d75f25a074a139ff466f2e3733733b2aac82151c4b8dd0843b2a0d" exitCode=0 Dec 03 22:10:35.280080 master-0 kubenswrapper[36504]: I1203 22:10:35.279968 36504 generic.go:334] "Generic (PLEG): container finished" podID="0e9427b8-d62c-45f7-97d0-1f7667ff27aa" containerID="3ad0afa8f21e830ac9c2172cd6306aca03770c2953877a91f4130533970ae228" exitCode=0 Dec 03 22:10:35.282149 master-0 kubenswrapper[36504]: I1203 22:10:35.282106 36504 generic.go:334] "Generic (PLEG): container finished" podID="1b47f2ef-9923-411f-9f2f-ddaea8bc7053" containerID="3d0456f7cf5468e13600eb8bcd1915c3323422019a6601c2df115d88e550552b" exitCode=0 Dec 03 22:10:35.289722 master-0 kubenswrapper[36504]: I1203 22:10:35.289689 36504 generic.go:334] "Generic (PLEG): container finished" podID="246b7846-0dfd-43a8-bcfa-81e7435060dc" containerID="13495d8d6fbe117fe450e36fc69d037193ff935ef6fdc0f5ba92833ef4c5c160" exitCode=0 Dec 03 22:10:35.293796 master-0 kubenswrapper[36504]: I1203 22:10:35.293308 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-cb84b9cdf-wkcnd_01b80ad5-7d7c-4ecd-90b0-2913d4559b5f/machine-approver-controller/0.log" Dec 03 22:10:35.293796 master-0 kubenswrapper[36504]: I1203 22:10:35.293762 36504 generic.go:334] "Generic (PLEG): container finished" podID="01b80ad5-7d7c-4ecd-90b0-2913d4559b5f" containerID="627c6aedb1aa983e6e6dc6c1e7265386ba60120ffcae5ff232be95dd9b9911e5" exitCode=255 Dec 03 22:10:35.296219 master-0 kubenswrapper[36504]: I1203 22:10:35.296167 36504 generic.go:334] "Generic (PLEG): container finished" podID="c807d487-5b8f-4747-87ee-df0637e2e11f" containerID="a2eed5b6aacd84387e58a76d08c7f60cf4fc01bda67cb81b8e95dd99934d41f2" exitCode=0 Dec 03 22:10:35.296219 master-0 kubenswrapper[36504]: I1203 22:10:35.296205 36504 generic.go:334] "Generic (PLEG): container finished" podID="c807d487-5b8f-4747-87ee-df0637e2e11f" containerID="c36651d25828b26f062b98a2d24259d3a418e128fa8528be38ba41a00ecbed18" exitCode=0 Dec 03 22:10:35.299084 master-0 kubenswrapper[36504]: I1203 22:10:35.299055 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_8d60f02e-1803-461e-9606-667d91fcae14/installer/0.log" Dec 03 22:10:35.299182 master-0 kubenswrapper[36504]: I1203 22:10:35.299098 36504 generic.go:334] "Generic (PLEG): container finished" podID="8d60f02e-1803-461e-9606-667d91fcae14" containerID="a31f9ca2a3872e9cae07acaea514d81ec85d80124c14419e3e1663f38e942380" exitCode=1 Dec 03 22:10:35.304512 master-0 kubenswrapper[36504]: I1203 22:10:35.304460 36504 generic.go:334] "Generic (PLEG): container finished" podID="10fc6516-cd4d-4291-a26d-8376ba0affef" containerID="508b98fabad3f511f41e1f4fccc27dd67c19dfb8a3b5dfe228e24e0f17fa5a05" exitCode=0 Dec 03 22:10:35.304658 master-0 kubenswrapper[36504]: I1203 22:10:35.304642 36504 generic.go:334] "Generic (PLEG): container finished" podID="10fc6516-cd4d-4291-a26d-8376ba0affef" 
containerID="824ae9e8313e0e1638464c8629a2eb0b5c0bd9befe3d9076d5a3df5161cfa30c" exitCode=0 Dec 03 22:10:35.309701 master-0 kubenswrapper[36504]: I1203 22:10:35.309657 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-96glt_b7f68d19-71d4-4129-a575-3ee57fa53493/cluster-node-tuning-operator/0.log" Dec 03 22:10:35.309852 master-0 kubenswrapper[36504]: I1203 22:10:35.309707 36504 generic.go:334] "Generic (PLEG): container finished" podID="b7f68d19-71d4-4129-a575-3ee57fa53493" containerID="ddc4b2406902641b1427a12cb2394dec3bff8dffab1d17cd293d7a2306efb279" exitCode=1 Dec 03 22:10:35.312629 master-0 kubenswrapper[36504]: I1203 22:10:35.312584 36504 generic.go:334] "Generic (PLEG): container finished" podID="a4399d20-f9a6-4ab1-86be-e2845394eaba" containerID="8d502fe6a4b80a8d86ec823cb141773d9b14b3a8b0f0a224e12fcddbb0204483" exitCode=0 Dec 03 22:10:35.320050 master-0 kubenswrapper[36504]: I1203 22:10:35.320013 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-r24k4_1a0f647a-0260-4737-8ae2-cc90d01d33d1/approver/1.log" Dec 03 22:10:35.320535 master-0 kubenswrapper[36504]: I1203 22:10:35.320504 36504 generic.go:334] "Generic (PLEG): container finished" podID="1a0f647a-0260-4737-8ae2-cc90d01d33d1" containerID="afb5b53ec6fc0fe7cc9eca316bae9563bccc04673dbe4d97d1079ce470fa5a03" exitCode=1 Dec 03 22:10:35.323331 master-0 kubenswrapper[36504]: I1203 22:10:35.323087 36504 generic.go:334] "Generic (PLEG): container finished" podID="0a49c320-f31d-4f6d-98c3-48d24346b873" containerID="c96d65a2700e3dd19239b4928664815aa3051ca2f5ecf0c1ae3c50129c0815b7" exitCode=0 Dec 03 22:10:35.323331 master-0 kubenswrapper[36504]: I1203 22:10:35.323128 36504 generic.go:334] "Generic (PLEG): container finished" podID="0a49c320-f31d-4f6d-98c3-48d24346b873" containerID="3a30bbf848145aedbf5309536c3bdd4398da29c24efe84cdd51cfed325666388" exitCode=0 Dec 03 22:10:35.337946 master-0 kubenswrapper[36504]: I1203 22:10:35.337906 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-67c4cff67d-vcd7b_50076985-bbaa-4bcf-9d1a-cc25bed016a7/kube-storage-version-migrator-operator/3.log" Dec 03 22:10:35.338229 master-0 kubenswrapper[36504]: I1203 22:10:35.338196 36504 generic.go:334] "Generic (PLEG): container finished" podID="50076985-bbaa-4bcf-9d1a-cc25bed016a7" containerID="2dd13e4ab397a03d92f03a68ff67d5ce97bced0fc31e3d5a401542285c8badd1" exitCode=255 Dec 03 22:10:35.341443 master-0 kubenswrapper[36504]: I1203 22:10:35.341361 36504 generic.go:334] "Generic (PLEG): container finished" podID="5c8c7291-3150-46a5-9d14-57a23bb51cc0" containerID="404a35f8de80b669ffc0285b3b2f8a18d00355d93f5f0568bb71370a69a93877" exitCode=0 Dec 03 22:10:35.344040 master-0 kubenswrapper[36504]: I1203 22:10:35.344003 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-667484ff5-st2db_f59094ec-47dd-4547-ad41-b15a7933f461/openshift-apiserver-operator/2.log" Dec 03 22:10:35.344103 master-0 kubenswrapper[36504]: I1203 22:10:35.344069 36504 generic.go:334] "Generic (PLEG): container finished" podID="f59094ec-47dd-4547-ad41-b15a7933f461" containerID="694dfb3a5405c23ea842bfdfc58ff9ae7e4ed9b528fa489e10e2cc964b4f43d4" exitCode=255 Dec 03 22:10:35.351540 master-0 kubenswrapper[36504]: I1203 22:10:35.351520 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_b495b0c38f2c54e7cc46282c5f92aab5/kube-rbac-proxy-crio/2.log" Dec 03 22:10:35.352214 master-0 kubenswrapper[36504]: I1203 22:10:35.352167 36504 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864" exitCode=1 Dec 03 22:10:35.352274 master-0 kubenswrapper[36504]: I1203 22:10:35.352246 36504 generic.go:334] "Generic (PLEG): container finished" podID="b495b0c38f2c54e7cc46282c5f92aab5" containerID="4fd6e7beed435c2febdb58b61912386340be61628332e2e84d31efc5c9a5d3f4" exitCode=0 Dec 03 22:10:35.354991 master-0 kubenswrapper[36504]: I1203 22:10:35.354949 36504 generic.go:334] "Generic (PLEG): container finished" podID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerID="7458eacc3a5edc54c5cf843060c75af4d4324f599075c78c8fcbd5a674afd301" exitCode=0 Dec 03 22:10:35.354991 master-0 kubenswrapper[36504]: I1203 22:10:35.354983 36504 generic.go:334] "Generic (PLEG): container finished" podID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerID="60c50abe2ec8c7459d390c08606126e779403c662dcf37b0171073aa9b774934" exitCode=0 Dec 03 22:10:35.356646 master-0 kubenswrapper[36504]: I1203 22:10:35.356618 36504 generic.go:334] "Generic (PLEG): container finished" podID="4ea59f59-8970-4eea-994d-9763792ee704" containerID="e80985ff027f59006e6b5b28f1d88811cb0ea27e878e3be4ff3dc70b657dbfd8" exitCode=0 Dec 03 22:10:35.370299 master-0 kubenswrapper[36504]: I1203 22:10:35.367853 36504 generic.go:334] "Generic (PLEG): container finished" podID="a124c14f-20c6-4df3-956f-a858de0c73c9" containerID="af6236462e324978e8cd817baf06fb75710668886258289a9caf50532bbe9141" exitCode=0 Dec 03 22:10:35.372269 master-0 kubenswrapper[36504]: I1203 22:10:35.372221 36504 generic.go:334] "Generic (PLEG): container finished" podID="237bf861-d24e-4fd7-9aee-24b6a79cd6c2" containerID="18479610f2526d3c7733e22ac3dd7ab3ea47a3f6618f2f508eccebdebab45285" exitCode=0 Dec 03 22:10:35.405125 master-0 kubenswrapper[36504]: E1203 22:10:35.395505 36504 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 03 22:10:35.409160 master-0 kubenswrapper[36504]: I1203 22:10:35.408376 36504 generic.go:334] "Generic (PLEG): container finished" podID="53713eab-c920-4d5a-ae05-7cdb59ace852" containerID="e2f2192c61ed1d621d2aff90353ee42f69de43d1e563bba5ffc2fd9223a2ba8a" exitCode=0 Dec 03 22:10:35.411267 master-0 kubenswrapper[36504]: I1203 22:10:35.411188 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-7c4697b5f5-458zh_6976b503-87da-48fc-b097-d1b315fbee3f/openshift-controller-manager-operator/3.log" Dec 03 22:10:35.411327 master-0 kubenswrapper[36504]: I1203 22:10:35.411265 36504 generic.go:334] "Generic (PLEG): container finished" podID="6976b503-87da-48fc-b097-d1b315fbee3f" containerID="3ef5d22510c8de2958749a344d7b815a72b238bc55095fec3564929b0d79773a" exitCode=255 Dec 03 22:10:35.414578 master-0 kubenswrapper[36504]: I1203 22:10:35.414533 36504 generic.go:334] "Generic (PLEG): container finished" podID="b522af85-394e-4965-9bf4-83f48fb8ad94" containerID="a2b603babc41baf7369a41e8e1bd12d0ecf8378163120db7862d41d41fe4536e" exitCode=0 Dec 03 22:10:35.420030 master-0 kubenswrapper[36504]: I1203 22:10:35.419977 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-b5dddf8f5-llvrh_fdfbaebe-d655-4c1e-a039-08802c5c35c5/kube-controller-manager-operator/3.log" Dec 03 22:10:35.420134 master-0 kubenswrapper[36504]: I1203 22:10:35.420089 36504 generic.go:334] "Generic (PLEG): container finished" podID="fdfbaebe-d655-4c1e-a039-08802c5c35c5" containerID="964b789870e41d10c8d066861125f17066ab2d12be1bd7a27c3c121f0ff84160" exitCode=255 Dec 03 22:10:35.429605 master-0 kubenswrapper[36504]: I1203 22:10:35.429529 36504 generic.go:334] "Generic (PLEG): container finished" podID="64856d96-023f-46db-819c-02f1adea5aab" containerID="5af03b1fc8b9d3242cc113182968101569dfe657f33d984d37c023bd205c8309" exitCode=0 Dec 03 22:10:35.432782 master-0 kubenswrapper[36504]: I1203 22:10:35.432723 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-7486ff55f-w9xk2_8b56c318-09b7-47f0-a7bf-32eb96e836ca/machine-api-operator/0.log" Dec 03 22:10:35.433125 master-0 kubenswrapper[36504]: I1203 22:10:35.433083 36504 generic.go:334] "Generic (PLEG): container finished" podID="8b56c318-09b7-47f0-a7bf-32eb96e836ca" containerID="dc9a686d307f5dc3dced989067ef4ced2ce1f8af42bcaa948a56153b4e7018c7" exitCode=255 Dec 03 22:10:35.474223 master-0 kubenswrapper[36504]: E1203 22:10:35.474174 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:10:35.562455 master-0 kubenswrapper[36504]: I1203 22:10:35.562407 36504 manager.go:324] Recovery completed Dec 03 22:10:35.648104 master-0 kubenswrapper[36504]: E1203 22:10:35.648035 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:10:35.649469 master-0 kubenswrapper[36504]: W1203 22:10:35.649410 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 WatchSource:0}: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:10:35.687266 master-0 kubenswrapper[36504]: I1203 22:10:35.687154 36504 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 22:10:35.687266 master-0 kubenswrapper[36504]: I1203 22:10:35.687184 36504 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 22:10:35.687266 master-0 kubenswrapper[36504]: I1203 22:10:35.687210 36504 state_mem.go:36] "Initialized new in-memory state store" Dec 03 22:10:35.687486 master-0 kubenswrapper[36504]: I1203 22:10:35.687388 36504 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 03 
22:10:35.687486 master-0 kubenswrapper[36504]: I1203 22:10:35.687399 36504 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 03 22:10:35.687486 master-0 kubenswrapper[36504]: I1203 22:10:35.687416 36504 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Dec 03 22:10:35.687486 master-0 kubenswrapper[36504]: I1203 22:10:35.687422 36504 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Dec 03 22:10:35.687486 master-0 kubenswrapper[36504]: I1203 22:10:35.687428 36504 policy_none.go:49] "None policy: Start" Dec 03 22:10:35.690664 master-0 kubenswrapper[36504]: I1203 22:10:35.690643 36504 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 22:10:35.690816 master-0 kubenswrapper[36504]: I1203 22:10:35.690803 36504 state_mem.go:35] "Initializing new in-memory state store" Dec 03 22:10:35.691082 master-0 kubenswrapper[36504]: I1203 22:10:35.691069 36504 state_mem.go:75] "Updated machine memory state" Dec 03 22:10:35.691164 master-0 kubenswrapper[36504]: I1203 22:10:35.691153 36504 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Dec 03 22:10:35.711090 master-0 kubenswrapper[36504]: I1203 22:10:35.711023 36504 manager.go:334] "Starting Device Plugin manager" Dec 03 22:10:35.711365 master-0 kubenswrapper[36504]: I1203 22:10:35.711148 36504 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 22:10:35.711365 master-0 kubenswrapper[36504]: I1203 22:10:35.711174 36504 server.go:79] "Starting device plugin registration server" Dec 03 22:10:35.713407 master-0 kubenswrapper[36504]: I1203 22:10:35.713338 36504 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 22:10:35.713489 master-0 kubenswrapper[36504]: I1203 22:10:35.713399 36504 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 22:10:35.713725 master-0 kubenswrapper[36504]: I1203 22:10:35.713689 36504 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 22:10:35.713892 master-0 kubenswrapper[36504]: I1203 22:10:35.713873 36504 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 22:10:35.713892 master-0 kubenswrapper[36504]: I1203 22:10:35.713887 36504 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 22:10:35.806034 master-0 kubenswrapper[36504]: I1203 22:10:35.805607 36504 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Dec 03 22:10:35.806352 master-0 kubenswrapper[36504]: I1203 22:10:35.806267 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54fabba34f6ed1546c826bd2346dfd527db4cdc2878650997141f56d5e70ee87" Dec 03 22:10:35.806352 master-0 kubenswrapper[36504]: I1203 22:10:35.806295 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerStarted","Data":"cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37"} Dec 03 22:10:35.806500 master-0 kubenswrapper[36504]: I1203 
22:10:35.806358 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerStarted","Data":"c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e"} Dec 03 22:10:35.806500 master-0 kubenswrapper[36504]: I1203 22:10:35.806373 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerStarted","Data":"cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef"} Dec 03 22:10:35.806500 master-0 kubenswrapper[36504]: I1203 22:10:35.806385 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerStarted","Data":"55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35"} Dec 03 22:10:35.806500 master-0 kubenswrapper[36504]: I1203 22:10:35.806414 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerStarted","Data":"0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b"} Dec 03 22:10:35.806500 master-0 kubenswrapper[36504]: I1203 22:10:35.806425 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerDied","Data":"d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402"} Dec 03 22:10:35.806500 master-0 kubenswrapper[36504]: I1203 22:10:35.806435 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"8a00233b22d19df39b2e1c8ba133b3c2","Type":"ContainerStarted","Data":"6140791f4e5979c7be1bc34a3f290363c6ce9b4c0931d1ef70bec66f9fb74445"} Dec 03 22:10:35.806500 master-0 kubenswrapper[36504]: I1203 22:10:35.806452 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd41923c7718b750c80c184d862535442462bd815561e9bbfb7bb52e77b97884" Dec 03 22:10:35.806500 master-0 kubenswrapper[36504]: I1203 22:10:35.806470 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"dcc6040d2844c3f3c3156ab1bc868c6f57b5cdffda04d188c4419d66f9b0084e"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806479 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"4469364ea09f6c46de0174b9f26041a598b86281f7be9ab5063b1bb3fbaf0cc1"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806522 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"2ef318f46b5f8577afb9290b7c23fb5a9f5de2346d418c5ad5d719a19e57967f"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806530 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerDied","Data":"8c83e2edc2c0d9a9e848a5cf55074f9b40879a47fa830dc6cab1377b18fd6f6a"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806542 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"fd2fa610bb2a39c39fcdd00db03a511a","Type":"ContainerStarted","Data":"7dded9f66e38346f58eab7668d020f179ca7449f5ac3538a2108b54cbabe523d"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806561 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f4b9a89514daa309f37b636ce10a58970922ba06493924e79533d702a051d8" Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806592 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf20ff99c26562f4b242d4f427025d07982e61e3333d9eba9727037ba6c8316" Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806624 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d6870e12c7cebea9481ca6d5d2722804205f20e81d572c38201523c901437c2" Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806642 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d372df0702f388fb4c029dbd78740e3df1431045d11f281419ebde9b5e3ba62" Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806679 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8804c8caf822254253605f55cc71b86af43d373e50704b9313038bda7b60d32d" Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806689 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b933c1402e8992ef5246dfe34ff5daed65d9df17d247f67abc5c136f20ffdb22" Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806700 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"cf0b0e4e1879f4685f9869a658ff0f8605dc42c65afb63586ce51981115ba251"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806711 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"02f0436e29c41d2aa03592d5036e113bab231655e9781237864df2d13f97fd4c"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806723 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"b7cb5e73f7418263a2c8c42bb52ae4c4bb2f0b2e8a5241c7e6ae5e58057ecc31"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806733 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"bfdc7006170b024bd74362f283ec12b682ef294c515e62ba9157cea3e95b4c90"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806743 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"82d2da1a02234717b8dd48730b93c0192a4b42b7f7379e1fc9653baa00df6c93"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806754 36504 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"ff61a256b05f04a3b13b90f6bdc5da43d7f265eb788e86c58f6473565825003c"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806781 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"919b8027f8f33ff55f195fb5965cea0049cac7097943867a77eb3961b87644ee"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806794 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerDied","Data":"909ec871125d6c5f469945361701b23d979d4f7f33b16129f5238c7a2207ec30"} Dec 03 22:10:35.806834 master-0 kubenswrapper[36504]: I1203 22:10:35.806805 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"4dd8b778e190b1975a0a8fad534da6dd","Type":"ContainerStarted","Data":"963f097bb66f564ca2cfab23207306eb31cee167ee206a1a05491adca1aa31a9"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.807788 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5aa221876e2ecf7e39b3a5aa7aae7fe9d5609e05e0b403c5225db362cc84b785" Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.807878 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93441462469b8b27bc3d666672828b80b42a37ea80417a77f840571890747f10" Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.807911 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e5c32836cb7467d081521f4c6877fa57e211ffc219e0b13d86a9ec21ed11c0c" Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.807957 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f08d11be0e2919664ff2ea4b2440d0e0","Type":"ContainerStarted","Data":"42ac76745cd48697a9d60b7a3008b7dd9f6c94eb9ad7c1bc7b99f348cd44c91a"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.807988 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f08d11be0e2919664ff2ea4b2440d0e0","Type":"ContainerStarted","Data":"f3df99b2206cbdec4d3c14a3658db30c624bd0d8a8285e540bec8d439783f20e"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808003 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"f632f4d3a35c98012f1cece56605d69139c5283e86fa145d2f6236cf3af716de"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808014 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"affb4bc279c4e26b0213bf26fa803d2a6b54fe054c87700ae68e278a97fca108"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808024 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"5ecce271c239c91c72a501424ca7e835ff72e1bf3e5847efbd0d8ee1120b7b78"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808034 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"7f76635edb60bf491a762b76e0a7b8207d158079fd706ba1977dae6cce9c45ed"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808047 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"263e27d05b8311eaef7fd597428646a031a70345e2468d3a197a3e76a71409ad"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808081 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="672f302ded736b1d33cea7a7bcc4fbbc3d883497b6e842feb770ecf241c92532" Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808094 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddfda6c4073bc139d7276df8107e2ee077a6d14642bd3a1ecd7361207cad43d9" Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808165 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b00c511cb2afcbb4dd9070f0c3fa6c145e466d21cddbb067c88db9c3040e7fce" Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808196 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"a66c1588269465b26d7305b3f03876e778d4729a5541123141a6ed0d5a7e9d38"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808214 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"a3ee0654ec8c746b3c22b0baaeb3976af9f6eb3d8adbaef4af98bf7a4ac7a864"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808227 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerDied","Data":"4fd6e7beed435c2febdb58b61912386340be61628332e2e84d31efc5c9a5d3f4"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808239 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"b495b0c38f2c54e7cc46282c5f92aab5","Type":"ContainerStarted","Data":"f876052ea631e48e5439c345d57d966548ec1d2bf09d0e6fdb6d1ebb5f87af57"} Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808267 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbf521b32f3f037a9fbdca13b211c55de4be4b2287d66d4fa6d98333ce221072" Dec 03 22:10:35.808480 master-0 kubenswrapper[36504]: I1203 22:10:35.808287 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e2a3b62358bd4a4bbb52a3d9085933220738b643e13a1ead30da4f7f3bef1f4" Dec 03 22:10:35.815239 master-0 kubenswrapper[36504]: I1203 22:10:35.814662 36504 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Dec 03 22:10:35.822801 master-0 kubenswrapper[36504]: I1203 22:10:35.821950 36504 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 22:10:35.822801 master-0 kubenswrapper[36504]: I1203 22:10:35.822012 36504 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 22:10:35.822801 master-0 kubenswrapper[36504]: I1203 22:10:35.822024 36504 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 22:10:35.822801 master-0 kubenswrapper[36504]: I1203 22:10:35.822209 36504 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 22:10:35.828500 master-0 kubenswrapper[36504]: E1203 22:10:35.828441 36504 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Dec 03 22:10:35.829223 master-0 kubenswrapper[36504]: E1203 22:10:35.829189 36504 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:35.837899 master-0 kubenswrapper[36504]: E1203 22:10:35.836852 36504 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.837899 master-0 kubenswrapper[36504]: E1203 22:10:35.837462 36504 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"openshift-kube-scheduler-master-0\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:35.890540 master-0 kubenswrapper[36504]: I1203 22:10:35.890477 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:35.891286 master-0 kubenswrapper[36504]: I1203 22:10:35.891265 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.891385 master-0 kubenswrapper[36504]: I1203 22:10:35.891369 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.891457 master-0 kubenswrapper[36504]: I1203 22:10:35.891445 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.891537 master-0 kubenswrapper[36504]: I1203 22:10:35.891525 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.891619 master-0 kubenswrapper[36504]: I1203 22:10:35.891607 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:35.891698 master-0 kubenswrapper[36504]: I1203 22:10:35.891686 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:35.891859 master-0 kubenswrapper[36504]: I1203 22:10:35.891842 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.891971 master-0 kubenswrapper[36504]: I1203 22:10:35.891951 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:35.892072 master-0 kubenswrapper[36504]: I1203 22:10:35.892053 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.892173 master-0 kubenswrapper[36504]: I1203 22:10:35.892159 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.893481 master-0 kubenswrapper[36504]: I1203 22:10:35.893410 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:35.893579 master-0 kubenswrapper[36504]: I1203 22:10:35.893505 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 22:10:35.893579 master-0 kubenswrapper[36504]: I1203 22:10:35.893529 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.893579 master-0 kubenswrapper[36504]: I1203 22:10:35.893549 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.893579 master-0 kubenswrapper[36504]: I1203 22:10:35.893566 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.893815 master-0 kubenswrapper[36504]: I1203 22:10:35.893583 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:35.893815 master-0 kubenswrapper[36504]: I1203 22:10:35.893601 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.893815 master-0 kubenswrapper[36504]: I1203 22:10:35.893620 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:35.893815 master-0 kubenswrapper[36504]: I1203 22:10:35.893639 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 22:10:35.994782 master-0 kubenswrapper[36504]: I1203 22:10:35.994638 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:35.994782 master-0 
kubenswrapper[36504]: I1203 22:10:35.994689 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 22:10:35.994782 master-0 kubenswrapper[36504]: I1203 22:10:35.994718 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.994782 master-0 kubenswrapper[36504]: I1203 22:10:35.994737 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.994782 master-0 kubenswrapper[36504]: I1203 22:10:35.994756 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994790 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994808 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994828 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994847 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994866 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994885 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994902 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994919 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994936 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994954 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.994976 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.995007 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.995029 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.995050 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 
22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.995070 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:35.995134 master-0 kubenswrapper[36504]: I1203 22:10:35.995123 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995194 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995228 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995256 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-static-pod-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995283 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-resource-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995311 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-cert-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995349 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995377 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.995837 master-0 
kubenswrapper[36504]: I1203 22:10:35.995404 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-usr-local-bin\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995432 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995460 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995497 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-data-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995523 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/4dd8b778e190b1975a0a8fad534da6dd-log-dir\") pod \"etcd-master-0\" (UID: \"4dd8b778e190b1975a0a8fad534da6dd\") " pod="openshift-etcd/etcd-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995550 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995576 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995611 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/fd2fa610bb2a39c39fcdd00db03a511a-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"fd2fa610bb2a39c39fcdd00db03a511a\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995637 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b495b0c38f2c54e7cc46282c5f92aab5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"b495b0c38f2c54e7cc46282c5f92aab5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995668 
36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995728 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:35.995837 master-0 kubenswrapper[36504]: I1203 22:10:35.995757 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:36.011031 master-0 kubenswrapper[36504]: I1203 22:10:36.010970 36504 apiserver.go:52] "Watching apiserver" Dec 03 22:10:36.034129 master-0 kubenswrapper[36504]: I1203 22:10:36.034059 36504 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:10:36.034371 master-0 kubenswrapper[36504]: I1203 22:10:36.034275 36504 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 22:10:36.038499 master-0 kubenswrapper[36504]: I1203 22:10:36.038438 36504 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 22:10:36.038499 master-0 kubenswrapper[36504]: I1203 22:10:36.038494 36504 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 22:10:36.038499 master-0 kubenswrapper[36504]: I1203 22:10:36.038507 36504 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 22:10:36.038922 master-0 kubenswrapper[36504]: I1203 22:10:36.038648 36504 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 22:10:36.039321 master-0 kubenswrapper[36504]: I1203 22:10:36.039216 36504 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-apiserver/apiserver-64554dd846-6vfz6","openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b","openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6","openshift-network-diagnostics/network-check-target-78hts","openshift-network-node-identity/network-node-identity-r24k4","openshift-oauth-apiserver/apiserver-67d47fb995-88vr2","openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh","openshift-etcd/etcd-master-0","openshift-machine-config-operator/machine-config-server-vgm8c","openshift-monitoring/metrics-server-b9f5dccb6-4h4jv","openshift-network-operator/network-operator-6cbf58c977-zk7jw","openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5","openshift-ovn-kubernetes/ovnkube-node-k2j45","openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl","openshift-ingress-canary/ingress-canary-qsfnw","openshift-kube-apiserver/installer-2-master-0","openshift-kube-controller-manager/installer-2-master-0","openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5","openshift-marketplace/redhat-marketplace-tcqzq","openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9","openshift-multus/network-metrics-daemon-h6569","assisted-installer/assisted-installer-controller-q7jjz","openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc","openshift-multus/multus-6jlh8","openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5","openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq","openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr","openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm","openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj","openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d","openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb","openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc","openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n","openshift-multus/multus-additional-cni-plugins-qz5vh","openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m","openshift-service-ca/service-ca-6b8bb995f7-69t6v","openshift-kube-apiserver/kube-apiserver-master-0","openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx","openshift-kube-scheduler/installer-4-master-0","openshift-kube-scheduler/installer-5-master-0","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt","openshift-cluster-node-tuning-operator/tuned-fvghq","openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d","openshift-dns/dns-default-9skcn","openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s","openshift-monitoring/node-exporter-nkjnl","openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p","openshift-controller-manager/controller-manager-77778bd57c-xdhvs","openshift-dns/node-resolver-4dx8h","openshift-insights/insights-operator-59d99f9b7b-x4tfh","openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56","openshift-marketplace/redhat-operators-qht46","openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w","openshift-apiserver-ope
rator/openshift-apiserver-operator-667484ff5-st2db","openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv","openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8","openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh","openshift-etcd/installer-1-master-0","openshift-etcd/installer-2-master-0","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr","openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx","openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm","openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp","openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j","openshift-ingress/router-default-54f97f57-xq6ch","openshift-marketplace/community-operators-k98b2","openshift-network-operator/iptables-alerter-clt4v","openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-scheduler/installer-4-retry-1-master-0","openshift-monitoring/prometheus-operator-565bdcb8-7s9vg","openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw","openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x","openshift-kube-apiserver/installer-1-master-0","openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49","openshift-machine-config-operator/machine-config-daemon-j9wwr","openshift-marketplace/certified-operators-kp794","openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk","openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc","openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c","openshift-kube-apiserver/installer-5-master-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh","openshift-kube-controller-manager/installer-3-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-machine-api/machine-api-operator-7486ff55f-w9xk2","openshift-multus/multus-admission-controller-5bdcc987c4-5cs48"] Dec 03 22:10:36.039703 master-0 kubenswrapper[36504]: I1203 22:10:36.039640 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-q7jjz" Dec 03 22:10:36.041658 master-0 kubenswrapper[36504]: I1203 22:10:36.041622 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv" Dec 03 22:10:36.042418 master-0 kubenswrapper[36504]: I1203 22:10:36.042381 36504 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="ea6b79e2-24a1-4ee2-8309-70660598fa75" Dec 03 22:10:36.044274 master-0 kubenswrapper[36504]: I1203 22:10:36.044227 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 22:10:36.044515 master-0 kubenswrapper[36504]: E1203 22:10:36.044493 36504 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Dec 03 22:10:36.045511 master-0 kubenswrapper[36504]: I1203 22:10:36.045491 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 22:10:36.045638 master-0 kubenswrapper[36504]: I1203 22:10:36.045607 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.045740 master-0 kubenswrapper[36504]: I1203 22:10:36.045715 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.045865 master-0 kubenswrapper[36504]: I1203 22:10:36.045514 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 22:10:36.045954 master-0 kubenswrapper[36504]: I1203 22:10:36.045930 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 22:10:36.046013 master-0 kubenswrapper[36504]: I1203 22:10:36.045965 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 22:10:36.046729 master-0 kubenswrapper[36504]: I1203 22:10:36.046186 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.046729 master-0 kubenswrapper[36504]: I1203 22:10:36.046269 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 22:10:36.047091 master-0 kubenswrapper[36504]: I1203 22:10:36.046689 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 22:10:36.047184 master-0 kubenswrapper[36504]: I1203 22:10:36.046798 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.047229 master-0 kubenswrapper[36504]: I1203 22:10:36.046806 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 03 22:10:36.047280 master-0 kubenswrapper[36504]: I1203 22:10:36.046808 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 22:10:36.048457 master-0 kubenswrapper[36504]: I1203 22:10:36.047550 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 22:10:36.048457 
master-0 kubenswrapper[36504]: I1203 22:10:36.048177 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 03 22:10:36.048457 master-0 kubenswrapper[36504]: I1203 22:10:36.048236 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 22:10:36.049184 master-0 kubenswrapper[36504]: I1203 22:10:36.049162 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 22:10:36.049345 master-0 kubenswrapper[36504]: I1203 22:10:36.049212 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 22:10:36.049403 master-0 kubenswrapper[36504]: I1203 22:10:36.049247 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 22:10:36.049580 master-0 kubenswrapper[36504]: I1203 22:10:36.049243 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.049690 master-0 kubenswrapper[36504]: I1203 22:10:36.049664 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 22:10:36.050068 master-0 kubenswrapper[36504]: I1203 22:10:36.050043 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Dec 03 22:10:36.050390 master-0 kubenswrapper[36504]: I1203 22:10:36.050361 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 22:10:36.050522 master-0 kubenswrapper[36504]: I1203 22:10:36.050494 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 22:10:36.050696 master-0 kubenswrapper[36504]: I1203 22:10:36.050668 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.050954 master-0 kubenswrapper[36504]: I1203 22:10:36.050929 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 22:10:36.051047 master-0 kubenswrapper[36504]: I1203 22:10:36.050966 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 22:10:36.051098 master-0 kubenswrapper[36504]: I1203 22:10:36.051083 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 03 22:10:36.051202 master-0 kubenswrapper[36504]: I1203 22:10:36.051181 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 22:10:36.051310 master-0 kubenswrapper[36504]: I1203 22:10:36.051289 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.052824 master-0 kubenswrapper[36504]: I1203 22:10:36.052741 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 22:10:36.053126 master-0 kubenswrapper[36504]: I1203 22:10:36.053100 36504 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 03 22:10:36.053261 master-0 kubenswrapper[36504]: I1203 22:10:36.053240 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 22:10:36.053322 master-0 kubenswrapper[36504]: I1203 22:10:36.053292 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 22:10:36.053365 master-0 kubenswrapper[36504]: I1203 22:10:36.053330 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053389 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053429 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053469 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053546 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053578 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053586 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053437 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053590 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053673 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 22:10:36.054106 master-0 kubenswrapper[36504]: I1203 22:10:36.053552 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 03 22:10:36.055141 master-0 kubenswrapper[36504]: I1203 22:10:36.055112 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Dec 03 22:10:36.055431 master-0 kubenswrapper[36504]: I1203 22:10:36.055340 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 03 22:10:36.055570 master-0 kubenswrapper[36504]: I1203 22:10:36.055539 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 22:10:36.055629 master-0 kubenswrapper[36504]: I1203 22:10:36.055586 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 22:10:36.055712 master-0 kubenswrapper[36504]: I1203 22:10:36.055545 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 22:10:36.055754 master-0 kubenswrapper[36504]: I1203 22:10:36.055724 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.055901 master-0 kubenswrapper[36504]: I1203 22:10:36.055880 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 22:10:36.055954 master-0 kubenswrapper[36504]: I1203 22:10:36.055936 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 22:10:36.055994 master-0 kubenswrapper[36504]: I1203 22:10:36.055965 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.056094 master-0 kubenswrapper[36504]: I1203 22:10:36.056074 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 22:10:36.056247 master-0 kubenswrapper[36504]: I1203 22:10:36.056228 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 22:10:36.056305 master-0 kubenswrapper[36504]: I1203 22:10:36.055889 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 22:10:36.056378 master-0 kubenswrapper[36504]: I1203 22:10:36.056359 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 03 22:10:36.056452 master-0 kubenswrapper[36504]: I1203 22:10:36.056434 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 22:10:36.056595 master-0 kubenswrapper[36504]: I1203 22:10:36.056543 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 22:10:36.056657 master-0 kubenswrapper[36504]: I1203 22:10:36.056599 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 22:10:36.056707 master-0 kubenswrapper[36504]: I1203 22:10:36.056676 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 22:10:36.056767 master-0 kubenswrapper[36504]: I1203 22:10:36.056747 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.057051 master-0 
kubenswrapper[36504]: I1203 22:10:36.057016 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 22:10:36.062056 master-0 kubenswrapper[36504]: I1203 22:10:36.062006 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.064235 master-0 kubenswrapper[36504]: I1203 22:10:36.064206 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 22:10:36.064235 master-0 kubenswrapper[36504]: I1203 22:10:36.064225 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 22:10:36.064401 master-0 kubenswrapper[36504]: I1203 22:10:36.064348 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 22:10:36.064597 master-0 kubenswrapper[36504]: I1203 22:10:36.064543 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 22:10:36.064679 master-0 kubenswrapper[36504]: I1203 22:10:36.057106 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 22:10:36.064679 master-0 kubenswrapper[36504]: I1203 22:10:36.064672 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 22:10:36.064793 master-0 kubenswrapper[36504]: I1203 22:10:36.064699 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 22:10:36.064793 master-0 kubenswrapper[36504]: I1203 22:10:36.057152 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 03 22:10:36.064870 master-0 kubenswrapper[36504]: I1203 22:10:36.057710 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 22:10:36.064950 master-0 kubenswrapper[36504]: I1203 22:10:36.057750 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 22:10:36.065000 master-0 kubenswrapper[36504]: I1203 22:10:36.057796 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 03 22:10:36.065145 master-0 kubenswrapper[36504]: I1203 22:10:36.065117 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 22:10:36.065656 master-0 kubenswrapper[36504]: I1203 22:10:36.065626 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 22:10:36.066105 master-0 kubenswrapper[36504]: I1203 22:10:36.066077 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 03 22:10:36.066232 master-0 kubenswrapper[36504]: I1203 22:10:36.066210 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 03 22:10:36.068426 master-0 kubenswrapper[36504]: I1203 22:10:36.068374 36504 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 22:10:36.070169 master-0 kubenswrapper[36504]: I1203 22:10:36.070048 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 22:10:36.070353 master-0 kubenswrapper[36504]: I1203 22:10:36.070321 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 22:10:36.070710 master-0 kubenswrapper[36504]: I1203 22:10:36.070657 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 03 22:10:36.070848 master-0 kubenswrapper[36504]: I1203 22:10:36.070819 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 22:10:36.070954 master-0 kubenswrapper[36504]: I1203 22:10:36.070928 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 03 22:10:36.071019 master-0 kubenswrapper[36504]: I1203 22:10:36.070981 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 22:10:36.071062 master-0 kubenswrapper[36504]: I1203 22:10:36.071030 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:10:36.071153 master-0 kubenswrapper[36504]: I1203 22:10:36.071131 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 22:10:36.071238 master-0 kubenswrapper[36504]: I1203 22:10:36.071201 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:10:36.071393 master-0 kubenswrapper[36504]: I1203 22:10:36.071373 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 22:10:36.072387 master-0 kubenswrapper[36504]: I1203 22:10:36.071394 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 22:10:36.078039 master-0 kubenswrapper[36504]: I1203 22:10:36.078000 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 22:10:36.078294 master-0 kubenswrapper[36504]: I1203 22:10:36.078264 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 22:10:36.081305 master-0 kubenswrapper[36504]: I1203 22:10:36.081257 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Dec 03 22:10:36.082874 master-0 kubenswrapper[36504]: I1203 22:10:36.082827 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 22:10:36.084755 master-0 kubenswrapper[36504]: I1203 22:10:36.084722 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 03 22:10:36.090016 master-0 kubenswrapper[36504]: I1203 22:10:36.089956 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 22:10:36.091080 master-0 kubenswrapper[36504]: I1203 22:10:36.091052 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 22:10:36.093285 master-0 kubenswrapper[36504]: I1203 22:10:36.091064 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Dec 03 22:10:36.098088 master-0 kubenswrapper[36504]: I1203 22:10:36.093506 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m" Dec 03 22:10:36.098393 master-0 kubenswrapper[36504]: I1203 22:10:36.093530 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Dec 03 22:10:36.098763 master-0 kubenswrapper[36504]: I1203 22:10:36.091765 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 22:10:36.098909 master-0 kubenswrapper[36504]: I1203 22:10:36.093630 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Dec 03 22:10:36.099045 master-0 kubenswrapper[36504]: I1203 22:10:36.095103 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Dec 03 22:10:36.099118 master-0 kubenswrapper[36504]: I1203 22:10:36.095388 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Dec 03 22:10:36.099304 master-0 kubenswrapper[36504]: I1203 22:10:36.095482 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Dec 03 22:10:36.099498 master-0 kubenswrapper[36504]: I1203 22:10:36.095734 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkhtr\" (UniqueName: \"kubernetes.io/projected/add88bf0-c88d-427d-94bb-897e088a1378-kube-api-access-hkhtr\") pod \"csi-snapshot-controller-operator-7b795784b8-l9q2j\" (UID: \"add88bf0-c88d-427d-94bb-897e088a1378\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 22:10:36.099498 master-0 kubenswrapper[36504]: I1203 22:10:36.096000 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.099664 master-0 kubenswrapper[36504]: I1203 22:10:36.091888 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 22:10:36.099664 master-0 kubenswrapper[36504]: I1203 22:10:36.099461 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfbaebe-d655-4c1e-a039-08802c5c35c5-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 22:10:36.099664 master-0 kubenswrapper[36504]: I1203 22:10:36.095725 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 03 22:10:36.099846 master-0 kubenswrapper[36504]: I1203 22:10:36.095724 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 03 22:10:36.099893 master-0 kubenswrapper[36504]: I1203 22:10:36.099857 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 22:10:36.099947 master-0 kubenswrapper[36504]: I1203 22:10:36.099928 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-k8s-cni-cncf-io\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.100042 master-0 kubenswrapper[36504]: I1203 22:10:36.100013 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdfbaebe-d655-4c1e-a039-08802c5c35c5-serving-cert\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 22:10:36.100097 master-0 kubenswrapper[36504]: I1203 22:10:36.100040 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50076985-bbaa-4bcf-9d1a-cc25bed016a7-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 22:10:36.100175 master-0 kubenswrapper[36504]: I1203 22:10:36.100150 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 22:10:36.101356 master-0 kubenswrapper[36504]: I1203 22:10:36.101119 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-serving-ca\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.101497 master-0 kubenswrapper[36504]: I1203 22:10:36.101460 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.101497 master-0 kubenswrapper[36504]: I1203 22:10:36.101462 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50076985-bbaa-4bcf-9d1a-cc25bed016a7-serving-cert\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 22:10:36.101602 master-0 kubenswrapper[36504]: I1203 22:10:36.101537 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/814c8acf-fb8d-4f57-b8db-21304402c1f1-iptables-alerter-script\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 22:10:36.101684 master-0 kubenswrapper[36504]: I1203 22:10:36.101629 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-cabundle\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 22:10:36.101895 master-0 kubenswrapper[36504]: I1203 22:10:36.101821 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4c4r\" (UniqueName: \"kubernetes.io/projected/b7f68d19-71d4-4129-a575-3ee57fa53493-kube-api-access-t4c4r\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 22:10:36.101979 master-0 kubenswrapper[36504]: I1203 22:10:36.101885 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59094ec-47dd-4547-ad41-b15a7933f461-config\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 22:10:36.101979 master-0 kubenswrapper[36504]: I1203 22:10:36.101926 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f0e973-7864-4842-af8e-47718ab1804c-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 22:10:36.101979 master-0 kubenswrapper[36504]: I1203 
22:10:36.101961 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx6rt\" (UniqueName: \"kubernetes.io/projected/6976b503-87da-48fc-b097-d1b315fbee3f-kube-api-access-vx6rt\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102176 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd9vn\" (UniqueName: \"kubernetes.io/projected/b8194009-3743-4da7-baf1-f9bb0afd6187-kube-api-access-rd9vn\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102185 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-cabundle\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102211 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102240 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/892d5611-debf-402f-abc5-3f99aa080159-host-etc-kube\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102264 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102283 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6976b503-87da-48fc-b097-d1b315fbee3f-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102310 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-kubelet\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102330 36504 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-system-cni-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102352 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-serving-cert\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102373 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgddm\" (UniqueName: \"kubernetes.io/projected/04f5fc52-4ec2-48c3-8441-2b15ad632233-kube-api-access-tgddm\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102394 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08432be8-0086-48d2-a93d-7a474e96749d-config\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102415 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785612fc-3f78-4f1a-bc83-7afe5d3b8056-serving-cert\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102549 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/858384f3-5741-4e67-8669-2eb2b2dcaf7f-cert\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102592 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-lib-modules\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102622 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/698e6d87-1a58-493c-8b69-d22c89d26ac5-service-ca-bundle\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102650 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-policies\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.102791 master-0 kubenswrapper[36504]: I1203 22:10:36.102687 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/785612fc-3f78-4f1a-bc83-7afe5d3b8056-serving-cert\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.102832 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54767c36-ca29-4c91-9a8a-9699ecfa4afb-metrics-tls\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.102869 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-multus\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.102894 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mbz\" (UniqueName: \"kubernetes.io/projected/29ac4a9d-1228-49c7-9051-338e7dc98a38-kube-api-access-p4mbz\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.102920 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7fsz\" (UniqueName: \"kubernetes.io/projected/814c8acf-fb8d-4f57-b8db-21304402c1f1-kube-api-access-x7fsz\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.102945 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/da949cf7-ab12-43ff-8e45-da1c2fd46e20-cache\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.102969 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/858384f3-5741-4e67-8669-2eb2b2dcaf7f-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.102991 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103008 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-client\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103012 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08432be8-0086-48d2-a93d-7a474e96749d-config\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103032 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-cnibin\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103057 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bebd69d2-5b0f-4b66-8722-d6861eba3e12-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103081 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103088 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-trusted-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103102 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fsxc\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-kube-api-access-4fsxc\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103160 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66aa2598-f4b6-4d3a-9623-aeb707e4912b-hosts-file\") pod \"node-resolver-4dx8h\" 
(UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103181 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50076985-bbaa-4bcf-9d1a-cc25bed016a7-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103226 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6976b503-87da-48fc-b097-d1b315fbee3f-serving-cert\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103236 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f0e973-7864-4842-af8e-47718ab1804c-trusted-ca\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103311 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zsw\" (UniqueName: \"kubernetes.io/projected/b522af85-394e-4965-9bf4-83f48fb8ad94-kube-api-access-b7zsw\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103476 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103477 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/bebd69d2-5b0f-4b66-8722-d6861eba3e12-telemetry-config\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103496 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/da949cf7-ab12-43ff-8e45-da1c2fd46e20-cache\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103518 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod 
\"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103561 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-trusted-ca\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103584 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50076985-bbaa-4bcf-9d1a-cc25bed016a7-config\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103679 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-serving-cert\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103710 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df8nl\" (UniqueName: \"kubernetes.io/projected/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-kube-api-access-df8nl\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103732 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvkz7\" (UniqueName: \"kubernetes.io/projected/53713eab-c920-4d5a-ae05-7cdb59ace852-kube-api-access-nvkz7\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103756 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e50b85a6-7767-4fca-8133-8243bdd85e5d-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103794 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-host\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103819 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-default-certificate\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " 
pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103830 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f59094ec-47dd-4547-ad41-b15a7933f461-config\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103848 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-hostroot\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103911 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-serving-cert\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.104133 master-0 kubenswrapper[36504]: I1203 22:10:36.103990 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104058 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104300 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-encryption-config\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104328 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-kubernetes\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104350 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-netd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104376 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5q4k\" (UniqueName: \"kubernetes.io/projected/e50b85a6-7767-4fca-8133-8243bdd85e5d-kube-api-access-z5q4k\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104413 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/04f5fc52-4ec2-48c3-8441-2b15ad632233-package-server-manager-serving-cert\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104407 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104511 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-env-overrides\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104098 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0869de9b-6f5b-4c31-81ad-02a9c8888193-metrics-tls\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104549 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a7e0eea-3da8-43de-87bc-d10231e7c239-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104582 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-trusted-ca-bundle\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104643 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 
22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104681 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104754 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4gds\" (UniqueName: \"kubernetes.io/projected/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-kube-api-access-n4gds\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104818 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7qqf\" (UniqueName: \"kubernetes.io/projected/bebd69d2-5b0f-4b66-8722-d6861eba3e12-kube-api-access-n7qqf\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104850 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfbaebe-d655-4c1e-a039-08802c5c35c5-config\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104903 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-config\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104921 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f59094ec-47dd-4547-ad41-b15a7933f461-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104958 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-socket-dir-parent\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.104983 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-netns\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 
22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105006 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-tuned\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105025 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-config\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105049 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kfg5\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-kube-api-access-2kfg5\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105070 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6m8f\" (UniqueName: \"kubernetes.io/projected/785612fc-3f78-4f1a-bc83-7afe5d3b8056-kube-api-access-j6m8f\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105091 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-node-pullsecrets\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105116 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-etcd-client\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105136 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-whereabouts-configmap\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105135 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdfbaebe-d655-4c1e-a039-08802c5c35c5-config\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105150 36504 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-tuned\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105158 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2krlg\" (UniqueName: \"kubernetes.io/projected/a4399d20-f9a6-4ab1-86be-e2845394eaba-kube-api-access-2krlg\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105273 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105314 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-var-lib-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105343 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-log-socket\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105435 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-script-lib\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105495 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a0f647a-0260-4737-8ae2-cc90d01d33d1-webhook-cert\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105540 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f59094ec-47dd-4547-ad41-b15a7933f461-serving-cert\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105554 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-config\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105586 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82055cfc-b4ce-4a00-a51d-141059947693-etcd-client\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105572 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/814c8acf-fb8d-4f57-b8db-21304402c1f1-host-slash\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105581 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-whereabouts-configmap\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105636 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e50b85a6-7767-4fca-8133-8243bdd85e5d-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105663 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105688 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105709 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-cni-binary-copy\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105728 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntg2z\" (UniqueName: \"kubernetes.io/projected/2b014bee-5931-4856-b9e8-e38a134a1b6b-kube-api-access-ntg2z\") pod \"migrator-5bcf58cf9c-qc9zc\" (UID: \"2b014bee-5931-4856-b9e8-e38a134a1b6b\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 
22:10:36.105757 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e50b85a6-7767-4fca-8133-8243bdd85e5d-available-featuregates\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105786 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105817 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysconfig\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105839 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-key\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105861 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-docker\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105883 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/892d5611-debf-402f-abc5-3f99aa080159-metrics-tls\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105906 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-ca-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.105966 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-service-ca-bundle\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.106003 36504 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-config\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.106053 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89p9d\" (UniqueName: \"kubernetes.io/projected/28c42112-a09e-4b7a-b23b-c06bef69cbfb-kube-api-access-89p9d\") pod \"csi-snapshot-controller-86897dd478-g4ldp\" (UID: \"28c42112-a09e-4b7a-b23b-c06bef69cbfb\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.106076 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-catalogserver-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.106151 master-0 kubenswrapper[36504]: I1203 22:10:36.106198 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a0f647a-0260-4737-8ae2-cc90d01d33d1-webhook-cert\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106198 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/c8da5d44-680e-4169-abc6-607bdc37a64d-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106229 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-cni-binary-copy\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106262 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/c8da5d44-680e-4169-abc6-607bdc37a64d-operand-assets\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106305 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-signing-key\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106262 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-serving-cert\") pod \"apiserver-64554dd846-6vfz6\" (UID: 
\"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106418 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/785612fc-3f78-4f1a-bc83-7afe5d3b8056-config\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106435 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f088999-ec66-402e-9634-8c762206d6b4-config\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106470 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-bin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106474 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/892d5611-debf-402f-abc5-3f99aa080159-metrics-tls\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 22:10:36.107920 master-0 kubenswrapper[36504]: I1203 22:10:36.106544 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-bin\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.108438 master-0 kubenswrapper[36504]: I1203 22:10:36.108411 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-ovnkube-identity-cm\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 22:10:36.108505 master-0 kubenswrapper[36504]: I1203 22:10:36.108446 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97kqz\" (UniqueName: \"kubernetes.io/projected/698e6d87-1a58-493c-8b69-d22c89d26ac5-kube-api-access-97kqz\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:36.108505 master-0 kubenswrapper[36504]: I1203 22:10:36.108469 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-config\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:36.108505 master-0 kubenswrapper[36504]: I1203 22:10:36.108486 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f68d19-71d4-4129-a575-3ee57fa53493-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 22:10:36.108638 master-0 kubenswrapper[36504]: I1203 22:10:36.108506 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfd7g\" (UniqueName: \"kubernetes.io/projected/578b2d03-b8b3-4c75-adde-73899c472ad7-kube-api-access-gfd7g\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:36.108638 master-0 kubenswrapper[36504]: I1203 22:10:36.108528 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-config\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.108638 master-0 kubenswrapper[36504]: I1203 22:10:36.108609 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 22:10:36.108638 master-0 kubenswrapper[36504]: I1203 22:10:36.108624 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f088999-ec66-402e-9634-8c762206d6b4-config\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 22:10:36.108893 master-0 kubenswrapper[36504]: I1203 22:10:36.108635 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-multus-certs\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.108893 master-0 kubenswrapper[36504]: I1203 22:10:36.108709 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-conf\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.108893 master-0 kubenswrapper[36504]: I1203 22:10:36.108730 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-os-release\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.108893 master-0 kubenswrapper[36504]: I1203 22:10:36.108781 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/08432be8-0086-48d2-a93d-7a474e96749d-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 22:10:36.108893 master-0 kubenswrapper[36504]: I1203 22:10:36.108803 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-trusted-ca-bundle\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.108893 master-0 kubenswrapper[36504]: I1203 22:10:36.108842 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578b2d03-b8b3-4c75-adde-73899c472ad7-serving-cert\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:36.108893 master-0 kubenswrapper[36504]: I1203 22:10:36.108862 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-systemd\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.108893 master-0 kubenswrapper[36504]: I1203 22:10:36.108881 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.108922 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.108942 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cert\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.108961 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/922419d4-b528-472e-8215-4a55a96dab08-tls-certificates\") pod \"prometheus-operator-admission-webhook-6d4cbfb4b-rqszb\" (UID: \"922419d4-b528-472e-8215-4a55a96dab08\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.108983 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-daemon-config\") pod 
\"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109004 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdfbaebe-d655-4c1e-a039-08802c5c35c5-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109022 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2l8\" (UniqueName: \"kubernetes.io/projected/c8da5d44-680e-4169-abc6-607bdc37a64d-kube-api-access-pm2l8\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109043 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7f68d19-71d4-4129-a575-3ee57fa53493-trusted-ca\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109059 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-netns\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109105 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-config\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109165 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-node-log\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109257 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-env-overrides\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109313 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " 
pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109325 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-daemon-config\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.109386 master-0 kubenswrapper[36504]: I1203 22:10:36.109360 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-ca-certs\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.110276 master-0 kubenswrapper[36504]: I1203 22:10:36.109416 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575vn\" (UniqueName: \"kubernetes.io/projected/a9a3f403-a742-4977-901a-cf4a8eb7df5a-kube-api-access-575vn\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 22:10:36.110276 master-0 kubenswrapper[36504]: I1203 22:10:36.109464 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kvxd\" (UniqueName: \"kubernetes.io/projected/3a7e0eea-3da8-43de-87bc-d10231e7c239-kube-api-access-6kvxd\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 22:10:36.110276 master-0 kubenswrapper[36504]: I1203 22:10:36.109524 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54767c36-ca29-4c91-9a8a-9699ecfa4afb-config-volume\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 22:10:36.110276 master-0 kubenswrapper[36504]: I1203 22:10:36.109541 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/bebd69d2-5b0f-4b66-8722-d6861eba3e12-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 22:10:36.110276 master-0 kubenswrapper[36504]: I1203 22:10:36.109653 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-slash\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.110276 master-0 kubenswrapper[36504]: I1203 22:10:36.109804 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgrbd\" (UniqueName: \"kubernetes.io/projected/858384f3-5741-4e67-8669-2eb2b2dcaf7f-kube-api-access-qgrbd\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 22:10:36.110276 master-0 kubenswrapper[36504]: 
I1203 22:10:36.109845 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z9vv\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-kube-api-access-4z9vv\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110315 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-systemd-units\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110334 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110384 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgwxt\" (UniqueName: \"kubernetes.io/projected/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-kube-api-access-wgwxt\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110406 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110445 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvmxp\" (UniqueName: \"kubernetes.io/projected/892d5611-debf-402f-abc5-3f99aa080159-kube-api-access-bvmxp\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110466 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110488 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-client\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110542 
36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4clxk\" (UniqueName: \"kubernetes.io/projected/82055cfc-b4ce-4a00-a51d-141059947693-kube-api-access-4clxk\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110595 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-metrics-certs\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:36.110629 master-0 kubenswrapper[36504]: I1203 22:10:36.110614 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-system-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110652 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110685 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b7f68d19-71d4-4129-a575-3ee57fa53493-apiservice-cert\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110701 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53713eab-c920-4d5a-ae05-7cdb59ace852-ovn-node-metrics-cert\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110724 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtxdk\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-kube-api-access-dtxdk\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110797 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110848 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-kubelet\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110868 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-bound-sa-token\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110885 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzklx\" (UniqueName: \"kubernetes.io/projected/f59094ec-47dd-4547-ad41-b15a7933f461-kube-api-access-mzklx\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110924 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110948 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94vvg\" (UniqueName: \"kubernetes.io/projected/5f088999-ec66-402e-9634-8c762206d6b4-kube-api-access-94vvg\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.110965 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8da5d44-680e-4169-abc6-607bdc37a64d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.111021 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53713eab-c920-4d5a-ae05-7cdb59ace852-ovn-node-metrics-cert\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.111004 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-images\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.111169 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/812401c0-d1ac-4857-b939-217b7b07f8bc-metrics-certs\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.111195 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8da5d44-680e-4169-abc6-607bdc37a64d-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.111123 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzl8x\" (UniqueName: \"kubernetes.io/projected/787c50e1-35b5-43d7-9c26-8dd5399693d3-kube-api-access-jzl8x\") pod \"network-check-source-6964bb78b7-lntt5\" (UID: \"787c50e1-35b5-43d7-9c26-8dd5399693d3\") " pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" Dec 03 22:10:36.111272 master-0 kubenswrapper[36504]: I1203 22:10:36.111274 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-containers\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111327 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111346 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-stats-auth\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111394 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-tmp\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111414 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111590 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-os-release\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: 
\"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111705 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-tmp\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111523 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111841 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0869de9b-6f5b-4c31-81ad-02a9c8888193-trusted-ca\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111941 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.111975 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-run\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.112005 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.112029 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxl5r\" (UniqueName: \"kubernetes.io/projected/812401c0-d1ac-4857-b939-217b7b07f8bc-kube-api-access-mxl5r\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 22:10:36.112129 master-0 kubenswrapper[36504]: I1203 22:10:36.112062 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsr8k\" (UniqueName: \"kubernetes.io/projected/1a0f647a-0260-4737-8ae2-cc90d01d33d1-kube-api-access-lsr8k\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 
22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112162 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0869de9b-6f5b-4c31-81ad-02a9c8888193-trusted-ca\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112171 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08432be8-0086-48d2-a93d-7a474e96749d-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112223 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112254 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6976b503-87da-48fc-b097-d1b315fbee3f-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112276 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-serving-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112296 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-cnibin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112317 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-env-overrides\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112336 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112355 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-audit\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112376 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-etc-kubernetes\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112397 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw7l6\" (UniqueName: \"kubernetes.io/projected/66aa2598-f4b6-4d3a-9623-aeb707e4912b-kube-api-access-mw7l6\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112415 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112432 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112457 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 22:10:36.112453 master-0 kubenswrapper[36504]: I1203 22:10:36.112462 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6976b503-87da-48fc-b097-d1b315fbee3f-config\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112482 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx9sj\" (UniqueName: \"kubernetes.io/projected/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-kube-api-access-dx9sj\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112376 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08432be8-0086-48d2-a93d-7a474e96749d-serving-cert\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: 
\"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112491 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-serving-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112507 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-env-overrides\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112505 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h2wx\" (UniqueName: \"kubernetes.io/projected/246b7846-0dfd-43a8-bcfa-81e7435060dc-kube-api-access-5h2wx\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112492 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-serving-cert\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112623 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/578b2d03-b8b3-4c75-adde-73899c472ad7-snapshots\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112703 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a7e0eea-3da8-43de-87bc-d10231e7c239-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112777 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-audit-dir\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112801 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-service-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.112888 master-0 
kubenswrapper[36504]: I1203 22:10:36.112816 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/578b2d03-b8b3-4c75-adde-73899c472ad7-snapshots\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112858 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-conf-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.112888 master-0 kubenswrapper[36504]: I1203 22:10:36.112878 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-dir\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.112909 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl7tn\" (UniqueName: \"kubernetes.io/projected/54767c36-ca29-4c91-9a8a-9699ecfa4afb-kube-api-access-bl7tn\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.112931 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82055cfc-b4ce-4a00-a51d-141059947693-etcd-service-ca\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.112947 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-ovn\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.112976 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-binary-copy\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.112998 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-encryption-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113018 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-d\") pod 
\"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113038 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcq9j\" (UniqueName: \"kubernetes.io/projected/ffad8fc8-4378-44de-8864-dd2f666ade68-kube-api-access-xcq9j\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113055 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113074 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113096 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113117 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113135 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-modprobe-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113142 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-binary-copy\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113155 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-var-lib-kubelet\") pod \"tuned-fvghq\" (UID: 
\"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113175 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-systemd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.113230 master-0 kubenswrapper[36504]: I1203 22:10:36.113194 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113243 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/29ac4a9d-1228-49c7-9051-338e7dc98a38-ovnkube-config\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113261 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvw7p\" (UniqueName: \"kubernetes.io/projected/50076985-bbaa-4bcf-9d1a-cc25bed016a7-kube-api-access-jvw7p\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113284 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113326 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/39f0e973-7864-4842-af8e-47718ab1804c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113346 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a4399d20-f9a6-4ab1-86be-e2845394eaba-marketplace-operator-metrics\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113328 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f088999-ec66-402e-9634-8c762206d6b4-serving-cert\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: 
\"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113413 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj79k\" (UniqueName: \"kubernetes.io/projected/fa9b5917-d4f3-4372-a200-45b57412f92f-kube-api-access-pj79k\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113425 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a9a3f403-a742-4977-901a-cf4a8eb7df5a-metrics-tls\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113430 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-etc-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113452 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113471 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-image-import-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113498 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-cache\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113519 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-sys\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113539 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " 
pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113518 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f088999-ec66-402e-9634-8c762206d6b4-serving-cert\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 22:10:36.113648 master-0 kubenswrapper[36504]: I1203 22:10:36.113573 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-cache\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.114069 master-0 kubenswrapper[36504]: I1203 22:10:36.113698 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ffad8fc8-4378-44de-8864-dd2f666ade68-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.114551 master-0 kubenswrapper[36504]: I1203 22:10:36.114386 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 03 22:10:36.120023 master-0 kubenswrapper[36504]: I1203 22:10:36.119988 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 22:10:36.120866 master-0 kubenswrapper[36504]: I1203 22:10:36.120836 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-etcd-client\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.140939 master-0 kubenswrapper[36504]: I1203 22:10:36.140884 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 22:10:36.143288 master-0 kubenswrapper[36504]: I1203 22:10:36.143254 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b522af85-394e-4965-9bf4-83f48fb8ad94-encryption-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.160521 master-0 kubenswrapper[36504]: I1203 22:10:36.160469 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 03 22:10:36.161808 master-0 kubenswrapper[36504]: I1203 22:10:36.161760 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-ca-certs\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.165354 master-0 kubenswrapper[36504]: I1203 22:10:36.165312 36504 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Dec 03 22:10:36.182158 
master-0 kubenswrapper[36504]: I1203 22:10:36.182108 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 22:10:36.187616 master-0 kubenswrapper[36504]: I1203 22:10:36.187564 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-audit\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.219792 master-0 kubenswrapper[36504]: I1203 22:10:36.219695 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-tls\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:36.220009 master-0 kubenswrapper[36504]: I1203 22:10:36.219803 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-containers\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.220009 master-0 kubenswrapper[36504]: I1203 22:10:36.219855 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-webhook-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:36.220009 master-0 kubenswrapper[36504]: I1203 22:10:36.219918 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-catalog-content\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:36.220009 master-0 kubenswrapper[36504]: I1203 22:10:36.219958 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.220009 master-0 kubenswrapper[36504]: I1203 22:10:36.219992 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-os-release\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.220243 master-0 kubenswrapper[36504]: I1203 22:10:36.220029 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " 
pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.220243 master-0 kubenswrapper[36504]: I1203 22:10:36.220069 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-run\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.220243 master-0 kubenswrapper[36504]: I1203 22:10:36.220131 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwdzk\" (UniqueName: \"kubernetes.io/projected/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-kube-api-access-dwdzk\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.220243 master-0 kubenswrapper[36504]: I1203 22:10:36.220164 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq55c\" (UniqueName: \"kubernetes.io/projected/a124c14f-20c6-4df3-956f-a858de0c73c9-kube-api-access-fq55c\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 22:10:36.220243 master-0 kubenswrapper[36504]: I1203 22:10:36.220203 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:36.220444 master-0 kubenswrapper[36504]: I1203 22:10:36.220250 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-tls\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.220444 master-0 kubenswrapper[36504]: I1203 22:10:36.220286 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a124c14f-20c6-4df3-956f-a858de0c73c9-mcc-auth-proxy-config\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 22:10:36.220444 master-0 kubenswrapper[36504]: I1203 22:10:36.220318 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 22:10:36.220444 master-0 kubenswrapper[36504]: I1203 22:10:36.220350 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-config\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:36.220444 master-0 kubenswrapper[36504]: I1203 22:10:36.220386 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v8rjd\" (UniqueName: \"kubernetes.io/projected/6e96335e-1866-41c8-b128-b95e783a9be4-kube-api-access-v8rjd\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 22:10:36.220444 master-0 kubenswrapper[36504]: I1203 22:10:36.220424 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-catalog-content\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:36.220668 master-0 kubenswrapper[36504]: I1203 22:10:36.220456 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-cnibin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.220668 master-0 kubenswrapper[36504]: I1203 22:10:36.220503 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvkk\" (UniqueName: \"kubernetes.io/projected/def52ba3-77c1-4e0c-8a0d-44ff4d677607-kube-api-access-dcvkk\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:36.220668 master-0 kubenswrapper[36504]: I1203 22:10:36.220538 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-etc-kubernetes\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.220668 master-0 kubenswrapper[36504]: I1203 22:10:36.220584 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-catalog-content\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:36.220668 master-0 kubenswrapper[36504]: I1203 22:10:36.220623 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:36.220668 master-0 kubenswrapper[36504]: I1203 22:10:36.220658 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvzqm\" (UniqueName: \"kubernetes.io/projected/f9a3f900-60e4-49c2-85ec-88d19852d1b9-kube-api-access-jvzqm\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:36.220919 master-0 kubenswrapper[36504]: I1203 22:10:36.220691 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-utilities\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:36.220919 master-0 kubenswrapper[36504]: I1203 22:10:36.220735 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.220919 master-0 kubenswrapper[36504]: I1203 22:10:36.220814 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-catalog-content\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:36.220919 master-0 kubenswrapper[36504]: I1203 22:10:36.220852 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-dir\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.220919 master-0 kubenswrapper[36504]: I1203 22:10:36.220915 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-audit-dir\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.221188 master-0 kubenswrapper[36504]: I1203 22:10:36.220948 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-conf-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.221188 master-0 kubenswrapper[36504]: I1203 22:10:36.220982 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dfd\" (UniqueName: \"kubernetes.io/projected/0a49c320-f31d-4f6d-98c3-48d24346b873-kube-api-access-s7dfd\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:36.221188 master-0 kubenswrapper[36504]: I1203 22:10:36.221015 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6fafa97-812d-4588-95f8-7c4d85f53098-metrics-client-ca\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:36.221188 master-0 kubenswrapper[36504]: I1203 22:10:36.221057 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-ovn\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.221188 master-0 kubenswrapper[36504]: I1203 22:10:36.221090 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b658f\" (UniqueName: \"kubernetes.io/projected/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-kube-api-access-b658f\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 22:10:36.221188 master-0 kubenswrapper[36504]: I1203 22:10:36.221126 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b56c318-09b7-47f0-a7bf-32eb96e836ca-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:36.221188 master-0 kubenswrapper[36504]: I1203 22:10:36.221161 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:10:36.221457 master-0 kubenswrapper[36504]: I1203 22:10:36.221192 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.221457 master-0 kubenswrapper[36504]: I1203 22:10:36.221240 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.221457 master-0 kubenswrapper[36504]: I1203 22:10:36.221275 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-metrics-client-ca\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.221457 master-0 kubenswrapper[36504]: I1203 22:10:36.221308 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9q7k\" (UniqueName: \"kubernetes.io/projected/452012bf-eae1-4e69-9ba1-034309e9f2c8-kube-api-access-b9q7k\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:10:36.221457 master-0 kubenswrapper[36504]: I1203 22:10:36.221353 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:36.221457 master-0 kubenswrapper[36504]: I1203 22:10:36.221388 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlw7s\" (UniqueName: 
\"kubernetes.io/projected/c807d487-5b8f-4747-87ee-df0637e2e11f-kube-api-access-nlw7s\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:36.221457 master-0 kubenswrapper[36504]: I1203 22:10:36.221418 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-modprobe-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.221714 master-0 kubenswrapper[36504]: I1203 22:10:36.221469 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-var-lib-kubelet\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.221714 master-0 kubenswrapper[36504]: I1203 22:10:36.221501 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-systemd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.221714 master-0 kubenswrapper[36504]: I1203 22:10:36.221549 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:36.221714 master-0 kubenswrapper[36504]: I1203 22:10:36.221593 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-etc-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.221714 master-0 kubenswrapper[36504]: I1203 22:10:36.221628 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-wtmp\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.221714 master-0 kubenswrapper[36504]: I1203 22:10:36.221664 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkg95\" (UniqueName: \"kubernetes.io/projected/e403ab42-1840-4292-a37c-a8d4feeb54ca-kube-api-access-tkg95\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:36.221714 master-0 kubenswrapper[36504]: I1203 22:10:36.221698 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 
22:10:36.221993 master-0 kubenswrapper[36504]: I1203 22:10:36.221736 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.221993 master-0 kubenswrapper[36504]: I1203 22:10:36.221802 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-sys\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.221993 master-0 kubenswrapper[36504]: I1203 22:10:36.221857 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt8l5\" (UniqueName: \"kubernetes.io/projected/f6498ac1-7d07-4a5f-a968-d8bda72d1002-kube-api-access-bt8l5\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:36.221993 master-0 kubenswrapper[36504]: I1203 22:10:36.221893 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e96335e-1866-41c8-b128-b95e783a9be4-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 22:10:36.221993 master-0 kubenswrapper[36504]: I1203 22:10:36.221929 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-k8s-cni-cncf-io\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.221993 master-0 kubenswrapper[36504]: I1203 22:10:36.221973 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtx6m\" (UniqueName: \"kubernetes.io/projected/8b56c318-09b7-47f0-a7bf-32eb96e836ca-kube-api-access-qtx6m\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:36.222211 master-0 kubenswrapper[36504]: I1203 22:10:36.222012 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:36.222211 master-0 kubenswrapper[36504]: I1203 22:10:36.222051 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " 
pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:36.222211 master-0 kubenswrapper[36504]: I1203 22:10:36.222087 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:36.222211 master-0 kubenswrapper[36504]: I1203 22:10:36.222119 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.222211 master-0 kubenswrapper[36504]: I1203 22:10:36.222172 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:36.222211 master-0 kubenswrapper[36504]: I1203 22:10:36.222209 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:36.222444 master-0 kubenswrapper[36504]: I1203 22:10:36.222266 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/892d5611-debf-402f-abc5-3f99aa080159-host-etc-kube\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 22:10:36.222444 master-0 kubenswrapper[36504]: I1203 22:10:36.222298 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-sys\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.222444 master-0 kubenswrapper[36504]: I1203 22:10:36.222333 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/40f8e70d-5f98-47f1-afa8-ea67242981fc-audit-log\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:36.222444 master-0 kubenswrapper[36504]: I1203 22:10:36.222376 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.222444 master-0 
kubenswrapper[36504]: I1203 22:10:36.222405 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.222648 master-0 kubenswrapper[36504]: I1203 22:10:36.222450 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:36.222648 master-0 kubenswrapper[36504]: I1203 22:10:36.222486 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tqg\" (UniqueName: \"kubernetes.io/projected/64856d96-023f-46db-819c-02f1adea5aab-kube-api-access-h2tqg\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:36.222648 master-0 kubenswrapper[36504]: I1203 22:10:36.222519 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-kubelet\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.222648 master-0 kubenswrapper[36504]: I1203 22:10:36.222554 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-system-cni-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.222648 master-0 kubenswrapper[36504]: I1203 22:10:36.222597 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v28xw\" (UniqueName: \"kubernetes.io/projected/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-kube-api-access-v28xw\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:36.222648 master-0 kubenswrapper[36504]: I1203 22:10:36.222644 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9pcw\" (UniqueName: \"kubernetes.io/projected/10fc6516-cd4d-4291-a26d-8376ba0affef-kube-api-access-h9pcw\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:36.222908 master-0 kubenswrapper[36504]: I1203 22:10:36.222689 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-apiservice-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:36.222908 master-0 kubenswrapper[36504]: I1203 22:10:36.222722 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:36.222908 master-0 kubenswrapper[36504]: I1203 22:10:36.222764 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-lib-modules\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.222908 master-0 kubenswrapper[36504]: I1203 22:10:36.222851 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-utilities\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:36.222908 master-0 kubenswrapper[36504]: I1203 22:10:36.222897 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-multus\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.223100 master-0 kubenswrapper[36504]: I1203 22:10:36.222961 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.223100 master-0 kubenswrapper[36504]: I1203 22:10:36.222999 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-utilities\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:36.223100 master-0 kubenswrapper[36504]: I1203 22:10:36.223045 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thgv2\" (UniqueName: \"kubernetes.io/projected/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-kube-api-access-thgv2\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:36.223100 master-0 kubenswrapper[36504]: I1203 22:10:36.223080 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-cnibin\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.223244 master-0 kubenswrapper[36504]: I1203 22:10:36.223126 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66aa2598-f4b6-4d3a-9623-aeb707e4912b-hosts-file\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 22:10:36.223244 master-0 
kubenswrapper[36504]: I1203 22:10:36.223159 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-srv-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:36.223244 master-0 kubenswrapper[36504]: I1203 22:10:36.223202 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-proxy-tls\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:36.223348 master-0 kubenswrapper[36504]: I1203 22:10:36.223267 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-host\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.223348 master-0 kubenswrapper[36504]: I1203 22:10:36.223311 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-textfile\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.223348 master-0 kubenswrapper[36504]: I1203 22:10:36.223342 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/77e36f4e-845b-4b82-8abc-b634636c087a-tmpfs\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:36.223465 master-0 kubenswrapper[36504]: I1203 22:10:36.223375 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:36.223465 master-0 kubenswrapper[36504]: I1203 22:10:36.223409 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-certs\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 22:10:36.223465 master-0 kubenswrapper[36504]: I1203 22:10:36.223442 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-images\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:36.223601 master-0 kubenswrapper[36504]: I1203 22:10:36.223476 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-hostroot\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.223601 master-0 kubenswrapper[36504]: I1203 22:10:36.223537 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.223601 master-0 kubenswrapper[36504]: I1203 22:10:36.223582 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-kubernetes\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.223711 master-0 kubenswrapper[36504]: I1203 22:10:36.223616 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-netd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.223711 master-0 kubenswrapper[36504]: I1203 22:10:36.223677 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-utilities\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.223715 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnfgr\" (UniqueName: \"kubernetes.io/projected/d0be52f3-b318-4630-b4da-f3c4a57d5818-kube-api-access-qnfgr\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.223750 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.223839 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.223915 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: 
\"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.223952 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-images\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.223968 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-containers\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.223987 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-socket-dir-parent\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224057 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-netns\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224093 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224116 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-node-pullsecrets\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224144 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224174 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcb88\" (UniqueName: \"kubernetes.io/projected/d6fafa97-812d-4588-95f8-7c4d85f53098-kube-api-access-gcb88\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " 
pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224194 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-metrics-client-ca\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224225 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224246 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-var-lib-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224246 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-multus\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224264 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-log-socket\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224293 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-log-socket\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224321 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/814c8acf-fb8d-4f57-b8db-21304402c1f1-host-slash\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224340 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-utilities\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224389 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysconfig\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224425 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-node-bootstrap-token\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224435 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-catalog-content\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224463 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-docker\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224505 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a124c14f-20c6-4df3-956f-a858de0c73c9-proxy-tls\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224516 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-os-release\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224541 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s87hj\" (UniqueName: \"kubernetes.io/projected/77e36f4e-845b-4b82-8abc-b634636c087a-kube-api-access-s87hj\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224571 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/def52ba3-77c1-4e0c-8a0d-44ff4d677607-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224578 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-run\") pod \"tuned-fvghq\" (UID: 
\"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224577 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-containers\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224621 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-sys\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224623 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-bin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224665 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-cni-bin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224695 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-bin\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224758 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0be52f3-b318-4630-b4da-f3c4a57d5818-metrics-client-ca\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224806 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd18a700-53b2-430c-a34f-dbb6331cfbe5-proxy-tls\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224834 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224878 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224908 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bd18a700-53b2-430c-a34f-dbb6331cfbe5-rootfs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224939 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-multus-certs\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224961 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-conf\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225002 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-os-release\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225018 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225050 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjtq\" (UniqueName: \"kubernetes.io/projected/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-kube-api-access-hzjtq\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225053 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-ssl-certs\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225078 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npkww\" (UniqueName: \"kubernetes.io/projected/40f8e70d-5f98-47f1-afa8-ea67242981fc-kube-api-access-npkww\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: 
\"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225105 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll9bs\" (UniqueName: \"kubernetes.io/projected/bd18a700-53b2-430c-a34f-dbb6331cfbe5-kube-api-access-ll9bs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225107 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a49c320-f31d-4f6d-98c3-48d24346b873-catalog-content\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225127 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-systemd\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225179 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-cnibin\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225196 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hfrr\" (UniqueName: \"kubernetes.io/projected/2d592f19-c7b9-4b29-9ca2-848572067908-kube-api-access-8hfrr\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225224 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-netns\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225243 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-cnibin\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225289 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-node-log\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225305 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-etc-kubernetes\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225337 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-modprobe-d\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225376 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-var-lib-kubelet\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225380 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/66aa2598-f4b6-4d3a-9623-aeb707e4912b-hosts-file\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225406 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-systemd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225406 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-catalog-content\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224471 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.224545 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225248 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-node-log\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225635 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-multus-certs\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225644 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-system-cni-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225701 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-k8s-cni-cncf-io\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225710 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-socket-dir-parent\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225723 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-etc-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225806 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-run-netns\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225851 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysctl-conf\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225923 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-os-release\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225986 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-lib-modules\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226030 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-bin\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226041 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10fc6516-cd4d-4291-a26d-8376ba0affef-utilities\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226060 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226140 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-systemd\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226143 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-node-pullsecrets\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.225127 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-utilities\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226219 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226234 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/40f8e70d-5f98-47f1-afa8-ea67242981fc-audit-log\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226246 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-run-netns\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226263 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-hostroot\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 
22:10:36.226300 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-etc-docker\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226306 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c807d487-5b8f-4747-87ee-df0637e2e11f-utilities\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226320 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-ovn\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226348 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ffad8fc8-4378-44de-8864-dd2f666ade68-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226364 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-host\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226376 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-kubernetes\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226400 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226428 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-kubelet\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226448 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-cni-netd\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 
kubenswrapper[36504]: I1203 22:10:36.226487 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-var-lib-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226513 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-multus-conf-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226999 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-slash\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226571 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/814c8acf-fb8d-4f57-b8db-21304402c1f1-host-slash\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226579 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-etc-sysconfig\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226595 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/892d5611-debf-402f-abc5-3f99aa080159-host-etc-kube\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226637 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-dir\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226640 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-textfile\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226653 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/77e36f4e-845b-4b82-8abc-b634636c087a-tmpfs\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 
22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226668 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226682 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b522af85-394e-4965-9bf4-83f48fb8ad94-audit-dir\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.226552 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-catalog-content\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227116 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-host-slash\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227153 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2xtm\" (UniqueName: \"kubernetes.io/projected/7c8ec36d-9179-40ab-a448-440b4501b3e0-kube-api-access-t2xtm\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227267 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-systemd-units\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227304 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227329 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-srv-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227423 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-systemd-units\") pod \"ovnkube-node-k2j45\" (UID: 
\"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227449 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53713eab-c920-4d5a-ae05-7cdb59ace852-run-openvswitch\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227537 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-system-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227575 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227623 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-root\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227672 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/def52ba3-77c1-4e0c-8a0d-44ff4d677607-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227702 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d592f19-c7b9-4b29-9ca2-848572067908-volume-directive-shadow\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227729 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227790 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-system-cni-dir\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 
master-0 kubenswrapper[36504]: I1203 22:10:36.227821 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/da949cf7-ab12-43ff-8e45-da1c2fd46e20-etc-docker\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.227959 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2d592f19-c7b9-4b29-9ca2-848572067908-volume-directive-shadow\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.228081 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.228123 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.228145 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.228192 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd18a700-53b2-430c-a34f-dbb6331cfbe5-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.228214 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-kubelet\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.228339 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.228482 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8194009-3743-4da7-baf1-f9bb0afd6187-host-var-lib-kubelet\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:36.229825 master-0 kubenswrapper[36504]: I1203 22:10:36.229162 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 22:10:36.241167 master-0 kubenswrapper[36504]: I1203 22:10:36.241125 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 22:10:36.245014 master-0 kubenswrapper[36504]: I1203 22:10:36.244845 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-image-import-ca\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.260987 master-0 kubenswrapper[36504]: I1203 22:10:36.260936 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 22:10:36.263995 master-0 kubenswrapper[36504]: I1203 22:10:36.263958 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-config\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.290981 master-0 kubenswrapper[36504]: I1203 22:10:36.290910 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 22:10:36.295939 master-0 kubenswrapper[36504]: I1203 22:10:36.295896 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b522af85-394e-4965-9bf4-83f48fb8ad94-trusted-ca-bundle\") pod \"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:36.301595 master-0 kubenswrapper[36504]: I1203 22:10:36.300340 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 22:10:36.320909 master-0 kubenswrapper[36504]: I1203 22:10:36.320845 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 22:10:36.325987 master-0 kubenswrapper[36504]: I1203 22:10:36.325940 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-env-overrides\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 22:10:36.330653 master-0 kubenswrapper[36504]: I1203 22:10:36.330589 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/def52ba3-77c1-4e0c-8a0d-44ff4d677607-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 
03 22:10:36.330852 master-0 kubenswrapper[36504]: I1203 22:10:36.330755 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bd18a700-53b2-430c-a34f-dbb6331cfbe5-rootfs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:36.331082 master-0 kubenswrapper[36504]: I1203 22:10:36.331043 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-root\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.331121 master-0 kubenswrapper[36504]: I1203 22:10:36.331081 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/def52ba3-77c1-4e0c-8a0d-44ff4d677607-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:36.331225 master-0 kubenswrapper[36504]: I1203 22:10:36.331188 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bd18a700-53b2-430c-a34f-dbb6331cfbe5-rootfs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:36.331274 master-0 kubenswrapper[36504]: I1203 22:10:36.331257 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-root\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.331679 master-0 kubenswrapper[36504]: I1203 22:10:36.331646 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-wtmp\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.331803 master-0 kubenswrapper[36504]: I1203 22:10:36.331738 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-wtmp\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.331849 master-0 kubenswrapper[36504]: I1203 22:10:36.331834 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-sys\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.331903 master-0 kubenswrapper[36504]: I1203 22:10:36.331880 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " 
pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.331946 master-0 kubenswrapper[36504]: I1203 22:10:36.331918 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.331978 master-0 kubenswrapper[36504]: I1203 22:10:36.331960 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-sys\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:36.332027 master-0 kubenswrapper[36504]: I1203 22:10:36.332008 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.332160 master-0 kubenswrapper[36504]: I1203 22:10:36.332124 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.332252 master-0 kubenswrapper[36504]: I1203 22:10:36.332213 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.343320 master-0 kubenswrapper[36504]: I1203 22:10:36.343249 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 22:10:36.350241 master-0 kubenswrapper[36504]: I1203 22:10:36.349878 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/1a0f647a-0260-4737-8ae2-cc90d01d33d1-ovnkube-identity-cm\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 22:10:36.361190 master-0 kubenswrapper[36504]: I1203 22:10:36.361136 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 22:10:36.382100 master-0 kubenswrapper[36504]: I1203 22:10:36.382030 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 22:10:36.392554 master-0 kubenswrapper[36504]: I1203 22:10:36.392489 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/814c8acf-fb8d-4f57-b8db-21304402c1f1-iptables-alerter-script\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 22:10:36.401617 master-0 kubenswrapper[36504]: I1203 22:10:36.401578 36504 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 22:10:36.406151 master-0 kubenswrapper[36504]: I1203 22:10:36.406053 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53713eab-c920-4d5a-ae05-7cdb59ace852-ovnkube-script-lib\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:36.421146 master-0 kubenswrapper[36504]: I1203 22:10:36.421078 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 22:10:36.439088 master-0 kubenswrapper[36504]: I1203 22:10:36.439012 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.440955 master-0 kubenswrapper[36504]: I1203 22:10:36.440932 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 22:10:36.444998 master-0 kubenswrapper[36504]: I1203 22:10:36.444971 36504 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 22:10:36.447758 master-0 kubenswrapper[36504]: I1203 22:10:36.447734 36504 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Dec 03 22:10:36.447918 master-0 kubenswrapper[36504]: I1203 22:10:36.447903 36504 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Dec 03 22:10:36.449011 master-0 kubenswrapper[36504]: I1203 22:10:36.448989 36504 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Dec 03 22:10:36.449431 master-0 kubenswrapper[36504]: I1203 22:10:36.449414 36504 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Dec 03 22:10:36.450862 master-0 kubenswrapper[36504]: I1203 22:10:36.450762 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:36.461081 master-0 kubenswrapper[36504]: I1203 22:10:36.461042 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 22:10:36.480639 master-0 kubenswrapper[36504]: I1203 22:10:36.480602 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 22:10:36.500916 master-0 kubenswrapper[36504]: I1203 22:10:36.500776 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 22:10:36.510247 master-0 kubenswrapper[36504]: I1203 22:10:36.510199 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54767c36-ca29-4c91-9a8a-9699ecfa4afb-config-volume\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 22:10:36.521537 master-0 kubenswrapper[36504]: I1203 22:10:36.521508 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 22:10:36.523637 master-0 kubenswrapper[36504]: I1203 22:10:36.523599 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54767c36-ca29-4c91-9a8a-9699ecfa4afb-metrics-tls\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 22:10:36.535143 master-0 kubenswrapper[36504]: I1203 22:10:36.535085 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") pod \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " Dec 03 22:10:36.535332 master-0 kubenswrapper[36504]: I1203 22:10:36.535306 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") pod \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " Dec 03 22:10:36.535411 master-0 kubenswrapper[36504]: I1203 22:10:36.535360 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock" (OuterVolumeSpecName: "var-lock") pod "5c8c7291-3150-46a5-9d14-57a23bb51cc0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:36.535560 master-0 kubenswrapper[36504]: I1203 22:10:36.535499 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c8c7291-3150-46a5-9d14-57a23bb51cc0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:36.536810 master-0 kubenswrapper[36504]: I1203 22:10:36.536787 36504 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:36.536977 master-0 kubenswrapper[36504]: I1203 22:10:36.536813 36504 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:36.541599 master-0 kubenswrapper[36504]: I1203 22:10:36.541572 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 22:10:36.543730 master-0 kubenswrapper[36504]: I1203 22:10:36.543710 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-serving-cert\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.561326 master-0 kubenswrapper[36504]: I1203 22:10:36.561284 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 22:10:36.562064 master-0 kubenswrapper[36504]: I1203 22:10:36.562006 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-serving-ca\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.581289 master-0 kubenswrapper[36504]: I1203 22:10:36.581235 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 22:10:36.590161 master-0 kubenswrapper[36504]: I1203 22:10:36.590126 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-trusted-ca-bundle\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.602243 master-0 kubenswrapper[36504]: I1203 22:10:36.602193 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 22:10:36.621406 master-0 kubenswrapper[36504]: I1203 22:10:36.621357 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 22:10:36.641709 master-0 kubenswrapper[36504]: I1203 22:10:36.641640 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 22:10:36.643173 master-0 kubenswrapper[36504]: I1203 22:10:36.643134 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/246b7846-0dfd-43a8-bcfa-81e7435060dc-audit-policies\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.645507 master-0 kubenswrapper[36504]: I1203 22:10:36.645447 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:36.664644 master-0 kubenswrapper[36504]: I1203 22:10:36.664592 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 22:10:36.674944 master-0 kubenswrapper[36504]: I1203 22:10:36.674909 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-encryption-config\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.680650 master-0 kubenswrapper[36504]: I1203 22:10:36.680612 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 22:10:36.685125 master-0 kubenswrapper[36504]: I1203 22:10:36.685077 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/246b7846-0dfd-43a8-bcfa-81e7435060dc-etcd-client\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:36.702710 master-0 kubenswrapper[36504]: I1203 22:10:36.702666 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 03 22:10:36.710229 master-0 kubenswrapper[36504]: I1203 22:10:36.710197 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/922419d4-b528-472e-8215-4a55a96dab08-tls-certificates\") pod \"prometheus-operator-admission-webhook-6d4cbfb4b-rqszb\" (UID: \"922419d4-b528-472e-8215-4a55a96dab08\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 22:10:36.720569 master-0 kubenswrapper[36504]: I1203 22:10:36.720521 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 22:10:36.724475 master-0 kubenswrapper[36504]: I1203 22:10:36.724434 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-service-ca\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.746082 master-0 kubenswrapper[36504]: I1203 22:10:36.746016 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 22:10:36.764936 master-0 kubenswrapper[36504]: I1203 22:10:36.764760 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 22:10:36.776739 master-0 kubenswrapper[36504]: I1203 22:10:36.776684 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-serving-cert\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:36.782175 master-0 kubenswrapper[36504]: I1203 22:10:36.782134 36504 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 22:10:36.800807 master-0 kubenswrapper[36504]: I1203 22:10:36.800589 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 22:10:36.805173 master-0 kubenswrapper[36504]: I1203 22:10:36.805128 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-default-certificate\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:36.823045 master-0 kubenswrapper[36504]: I1203 22:10:36.822998 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 22:10:36.823490 master-0 kubenswrapper[36504]: I1203 22:10:36.823460 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/698e6d87-1a58-493c-8b69-d22c89d26ac5-service-ca-bundle\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:36.841438 master-0 kubenswrapper[36504]: I1203 22:10:36.841388 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 22:10:36.842659 master-0 kubenswrapper[36504]: I1203 22:10:36.842619 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-stats-auth\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:36.861486 master-0 kubenswrapper[36504]: I1203 22:10:36.861434 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 22:10:36.871353 master-0 kubenswrapper[36504]: I1203 22:10:36.871309 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/698e6d87-1a58-493c-8b69-d22c89d26ac5-metrics-certs\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:36.880447 master-0 kubenswrapper[36504]: I1203 22:10:36.880371 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 22:10:36.900260 master-0 kubenswrapper[36504]: I1203 22:10:36.900213 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 22:10:36.903517 master-0 kubenswrapper[36504]: I1203 22:10:36.903476 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-samples-operator-tls\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 22:10:36.921613 master-0 kubenswrapper[36504]: I1203 22:10:36.921579 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-lm87f" Dec 03 22:10:36.941658 master-0 kubenswrapper[36504]: 
I1203 22:10:36.941581 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 22:10:36.961335 master-0 kubenswrapper[36504]: I1203 22:10:36.961271 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 22:10:36.980600 master-0 kubenswrapper[36504]: I1203 22:10:36.980504 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 03 22:10:36.983548 master-0 kubenswrapper[36504]: I1203 22:10:36.983506 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/858384f3-5741-4e67-8669-2eb2b2dcaf7f-cert\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 22:10:37.001461 master-0 kubenswrapper[36504]: I1203 22:10:37.001427 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 22:10:37.021439 master-0 kubenswrapper[36504]: I1203 22:10:37.021295 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-x894t" Dec 03 22:10:37.041052 master-0 kubenswrapper[36504]: I1203 22:10:37.040995 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Dec 03 22:10:37.044561 master-0 kubenswrapper[36504]: I1203 22:10:37.044508 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/858384f3-5741-4e67-8669-2eb2b2dcaf7f-auth-proxy-config\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 22:10:37.060375 master-0 kubenswrapper[36504]: I1203 22:10:37.060331 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 03 22:10:37.066088 master-0 kubenswrapper[36504]: I1203 22:10:37.066022 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a7e0eea-3da8-43de-87bc-d10231e7c239-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 22:10:37.079187 master-0 kubenswrapper[36504]: I1203 22:10:37.079114 36504 request.go:700] Waited for 1.011878042s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-config-operator/secrets?fieldSelector=metadata.name%3Dconfig-operator-serving-cert&limit=500&resourceVersion=0 Dec 03 22:10:37.081627 master-0 kubenswrapper[36504]: I1203 22:10:37.081286 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 22:10:37.084222 master-0 kubenswrapper[36504]: I1203 22:10:37.084177 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e50b85a6-7767-4fca-8133-8243bdd85e5d-serving-cert\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:10:37.103815 master-0 kubenswrapper[36504]: E1203 22:10:37.103756 36504 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.104155 master-0 kubenswrapper[36504]: I1203 22:10:37.103836 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-6lwtj" Dec 03 22:10:37.104251 master-0 kubenswrapper[36504]: I1203 22:10:37.104218 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13238af3704fe583f617f61e755cf4c2" path="/var/lib/kubelet/pods/13238af3704fe583f617f61e755cf4c2/volumes" Dec 03 22:10:37.104366 master-0 kubenswrapper[36504]: E1203 22:10:37.104346 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-trusted-ca-bundle podName:578b2d03-b8b3-4c75-adde-73899c472ad7 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.604316987 +0000 UTC m=+2.824089014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-trusted-ca-bundle") pod "insights-operator-59d99f9b7b-x4tfh" (UID: "578b2d03-b8b3-4c75-adde-73899c472ad7") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.104670 master-0 kubenswrapper[36504]: E1203 22:10:37.104627 36504 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.104749 master-0 kubenswrapper[36504]: I1203 22:10:37.104682 36504 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 03 22:10:37.104824 master-0 kubenswrapper[36504]: E1203 22:10:37.104745 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cluster-baremetal-operator-tls podName:fa9b5917-d4f3-4372-a200-45b57412f92f nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.60471849 +0000 UTC m=+2.824490497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5fdc576499-q9tf6" (UID: "fa9b5917-d4f3-4372-a200-45b57412f92f") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.109382 master-0 kubenswrapper[36504]: E1203 22:10:37.109345 36504 configmap.go:193] Couldn't get configMap openshift-machine-api/baremetal-kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.109485 master-0 kubenswrapper[36504]: E1203 22:10:37.109413 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-config podName:fa9b5917-d4f3-4372-a200-45b57412f92f nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.609396488 +0000 UTC m=+2.829168495 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-config") pod "cluster-baremetal-operator-5fdc576499-q9tf6" (UID: "fa9b5917-d4f3-4372-a200-45b57412f92f") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.109541 master-0 kubenswrapper[36504]: E1203 22:10:37.109521 36504 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.109597 master-0 kubenswrapper[36504]: E1203 22:10:37.109566 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cert podName:fa9b5917-d4f3-4372-a200-45b57412f92f nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.609555183 +0000 UTC m=+2.829327190 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cert") pod "cluster-baremetal-operator-5fdc576499-q9tf6" (UID: "fa9b5917-d4f3-4372-a200-45b57412f92f") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.109597 master-0 kubenswrapper[36504]: E1203 22:10:37.109587 36504 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.109689 master-0 kubenswrapper[36504]: E1203 22:10:37.109614 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578b2d03-b8b3-4c75-adde-73899c472ad7-serving-cert podName:578b2d03-b8b3-4c75-adde-73899c472ad7 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.609608914 +0000 UTC m=+2.829380921 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/578b2d03-b8b3-4c75-adde-73899c472ad7-serving-cert") pod "insights-operator-59d99f9b7b-x4tfh" (UID: "578b2d03-b8b3-4c75-adde-73899c472ad7") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.112859 master-0 kubenswrapper[36504]: E1203 22:10:37.112839 36504 secret.go:189] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.112924 master-0 kubenswrapper[36504]: E1203 22:10:37.112888 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-control-plane-machine-set-operator-tls podName:ba624ed0-32cc-4c87-81a5-708a8a8a7f88 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.612876316 +0000 UTC m=+2.832648323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-66f4cc99d4-jlq49" (UID: "ba624ed0-32cc-4c87-81a5-708a8a8a7f88") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.112924 master-0 kubenswrapper[36504]: E1203 22:10:37.112921 36504 configmap.go:193] Couldn't get configMap openshift-machine-api/cluster-baremetal-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.112993 master-0 kubenswrapper[36504]: E1203 22:10:37.112932 36504 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.112993 master-0 kubenswrapper[36504]: E1203 22:10:37.112949 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-images podName:fa9b5917-d4f3-4372-a200-45b57412f92f nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.612938538 +0000 UTC m=+2.832710535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-images") pod "cluster-baremetal-operator-5fdc576499-q9tf6" (UID: "fa9b5917-d4f3-4372-a200-45b57412f92f") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.113058 master-0 kubenswrapper[36504]: E1203 22:10:37.113008 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-service-ca-bundle podName:578b2d03-b8b3-4c75-adde-73899c472ad7 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.61299488 +0000 UTC m=+2.832766967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-service-ca-bundle") pod "insights-operator-59d99f9b7b-x4tfh" (UID: "578b2d03-b8b3-4c75-adde-73899c472ad7") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.113133 master-0 kubenswrapper[36504]: E1203 22:10:37.113115 36504 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.113231 master-0 kubenswrapper[36504]: E1203 22:10:37.113217 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a7e0eea-3da8-43de-87bc-d10231e7c239-cco-trusted-ca podName:3a7e0eea-3da8-43de-87bc-d10231e7c239 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.613205647 +0000 UTC m=+2.832977654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/3a7e0eea-3da8-43de-87bc-d10231e7c239-cco-trusted-ca") pod "cloud-credential-operator-7c4dc67499-jhd6n" (UID: "3a7e0eea-3da8-43de-87bc-d10231e7c239") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.127477 master-0 kubenswrapper[36504]: I1203 22:10:37.127432 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 03 22:10:37.141517 master-0 kubenswrapper[36504]: I1203 22:10:37.141457 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 03 22:10:37.161319 master-0 kubenswrapper[36504]: I1203 22:10:37.161130 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 03 22:10:37.180301 master-0 kubenswrapper[36504]: I1203 22:10:37.180234 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 22:10:37.201613 master-0 kubenswrapper[36504]: I1203 22:10:37.201554 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 22:10:37.221236 master-0 kubenswrapper[36504]: I1203 22:10:37.221187 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 22:10:37.224625 master-0 kubenswrapper[36504]: E1203 22:10:37.224588 36504 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.224625 master-0 kubenswrapper[36504]: E1203 22:10:37.224614 36504 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.224754 master-0 kubenswrapper[36504]: E1203 22:10:37.224683 36504 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.224819 master-0 kubenswrapper[36504]: E1203 22:10:37.224803 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert podName:64856d96-023f-46db-819c-02f1adea5aab nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.72466545 +0000 UTC m=+2.944437457 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert") pod "route-controller-manager-8667dd96f5-qf2rc" (UID: "64856d96-023f-46db-819c-02f1adea5aab") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.224876 master-0 kubenswrapper[36504]: E1203 22:10:37.224824 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-webhook-cert podName:77e36f4e-845b-4b82-8abc-b634636c087a nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.724817094 +0000 UTC m=+2.944589101 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-webhook-cert") pod "packageserver-684c49c488-fpmzc" (UID: "77e36f4e-845b-4b82-8abc-b634636c087a") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.224876 master-0 kubenswrapper[36504]: E1203 22:10:37.224838 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-tls podName:d0be52f3-b318-4630-b4da-f3c4a57d5818 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.724831015 +0000 UTC m=+2.944603022 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-tls") pod "openshift-state-metrics-57cbc648f8-rhf8p" (UID: "d0be52f3-b318-4630-b4da-f3c4a57d5818") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.224876 master-0 kubenswrapper[36504]: E1203 22:10:37.224869 36504 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225007 master-0 kubenswrapper[36504]: E1203 22:10:37.224904 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-tls podName:bcbec7ef-0b98-4346-8c6b-c5fa37e90286 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.724897117 +0000 UTC m=+2.944669124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-tls") pod "node-exporter-nkjnl" (UID: "bcbec7ef-0b98-4346-8c6b-c5fa37e90286") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225007 master-0 kubenswrapper[36504]: E1203 22:10:37.224921 36504 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225007 master-0 kubenswrapper[36504]: E1203 22:10:37.224945 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-profile-collector-cert podName:f9a3f900-60e4-49c2-85ec-88d19852d1b9 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.724939628 +0000 UTC m=+2.944711635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-profile-collector-cert") pod "catalog-operator-7cf5cf757f-shpjd" (UID: "f9a3f900-60e4-49c2-85ec-88d19852d1b9") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225007 master-0 kubenswrapper[36504]: E1203 22:10:37.224998 36504 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.225151 master-0 kubenswrapper[36504]: E1203 22:10:37.225021 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-config podName:8b56c318-09b7-47f0-a7bf-32eb96e836ca nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725016131 +0000 UTC m=+2.944788128 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-config") pod "machine-api-operator-7486ff55f-w9xk2" (UID: "8b56c318-09b7-47f0-a7bf-32eb96e836ca") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.225151 master-0 kubenswrapper[36504]: E1203 22:10:37.225042 36504 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.225151 master-0 kubenswrapper[36504]: E1203 22:10:37.225067 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a124c14f-20c6-4df3-956f-a858de0c73c9-mcc-auth-proxy-config podName:a124c14f-20c6-4df3-956f-a858de0c73c9 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725059682 +0000 UTC m=+2.944831689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcc-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/a124c14f-20c6-4df3-956f-a858de0c73c9-mcc-auth-proxy-config") pod "machine-config-controller-74cddd4fb5-7zg56" (UID: "a124c14f-20c6-4df3-956f-a858de0c73c9") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.225151 master-0 kubenswrapper[36504]: E1203 22:10:37.225083 36504 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225151 master-0 kubenswrapper[36504]: E1203 22:10:37.225103 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert podName:7c8ec36d-9179-40ab-a448-440b4501b3e0 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725098793 +0000 UTC m=+2.944870800 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert") pod "ingress-canary-qsfnw" (UID: "7c8ec36d-9179-40ab-a448-440b4501b3e0") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225151 master-0 kubenswrapper[36504]: E1203 22:10:37.225124 36504 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225151 master-0 kubenswrapper[36504]: E1203 22:10:37.225146 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs podName:452012bf-eae1-4e69-9ba1-034309e9f2c8 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725141475 +0000 UTC m=+2.944913482 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs") pod "multus-admission-controller-5bdcc987c4-5cs48" (UID: "452012bf-eae1-4e69-9ba1-034309e9f2c8") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225372 master-0 kubenswrapper[36504]: E1203 22:10:37.225173 36504 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.225372 master-0 kubenswrapper[36504]: E1203 22:10:37.225192 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-metrics-client-ca podName:bcbec7ef-0b98-4346-8c6b-c5fa37e90286 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725187326 +0000 UTC m=+2.944959333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-metrics-client-ca") pod "node-exporter-nkjnl" (UID: "bcbec7ef-0b98-4346-8c6b-c5fa37e90286") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.225372 master-0 kubenswrapper[36504]: E1203 22:10:37.225211 36504 secret.go:189] Couldn't get secret openshift-monitoring/metrics-client-certs: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225372 master-0 kubenswrapper[36504]: E1203 22:10:37.225235 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs podName:40f8e70d-5f98-47f1-afa8-ea67242981fc nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725229878 +0000 UTC m=+2.945001885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-client-certs" (UniqueName: "kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs") pod "metrics-server-b9f5dccb6-4h4jv" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225860 master-0 kubenswrapper[36504]: E1203 22:10:37.225830 36504 secret.go:189] Couldn't get secret openshift-cluster-storage-operator/cluster-storage-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225860 master-0 kubenswrapper[36504]: E1203 22:10:37.225843 36504 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225990 master-0 kubenswrapper[36504]: E1203 22:10:37.225873 36504 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225990 master-0 kubenswrapper[36504]: E1203 22:10:37.225877 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e96335e-1866-41c8-b128-b95e783a9be4-cluster-storage-operator-serving-cert podName:6e96335e-1866-41c8-b128-b95e783a9be4 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725865208 +0000 UTC m=+2.945637215 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-storage-operator-serving-cert" (UniqueName: "kubernetes.io/secret/6e96335e-1866-41c8-b128-b95e783a9be4-cluster-storage-operator-serving-cert") pod "cluster-storage-operator-f84784664-hv5z8" (UID: "6e96335e-1866-41c8-b128-b95e783a9be4") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225990 master-0 kubenswrapper[36504]: E1203 22:10:37.225898 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-tls podName:d6fafa97-812d-4588-95f8-7c4d85f53098 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725890198 +0000 UTC m=+2.945662205 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-tls") pod "prometheus-operator-565bdcb8-7s9vg" (UID: "d6fafa97-812d-4588-95f8-7c4d85f53098") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225990 master-0 kubenswrapper[36504]: E1203 22:10:37.225907 36504 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-server-audit-profiles: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.225990 master-0 kubenswrapper[36504]: E1203 22:10:37.225914 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-srv-cert podName:f9a3f900-60e4-49c2-85ec-88d19852d1b9 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725906569 +0000 UTC m=+2.945678576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-srv-cert") pod "catalog-operator-7cf5cf757f-shpjd" (UID: "f9a3f900-60e4-49c2-85ec-88d19852d1b9") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225990 master-0 kubenswrapper[36504]: E1203 22:10:37.225927 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles podName:40f8e70d-5f98-47f1-afa8-ea67242981fc nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725919779 +0000 UTC m=+2.945691776 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-server-audit-profiles" (UniqueName: "kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles") pod "metrics-server-b9f5dccb6-4h4jv" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.225990 master-0 kubenswrapper[36504]: E1203 22:10:37.225945 36504 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-gfs59jbdhk2g: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225990 master-0 kubenswrapper[36504]: E1203 22:10:37.225975 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle podName:40f8e70d-5f98-47f1-afa8-ea67242981fc nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.725967001 +0000 UTC m=+2.945739008 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca-bundle" (UniqueName: "kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle") pod "metrics-server-b9f5dccb6-4h4jv" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.225990 master-0 kubenswrapper[36504]: E1203 22:10:37.225987 36504 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226010 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-images podName:a9940ff5-36a6-4c04-a51d-66f7d83bea7c nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726004392 +0000 UTC m=+2.945776399 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-images") pod "machine-config-operator-664c9d94c9-bdps5" (UID: "a9940ff5-36a6-4c04-a51d-66f7d83bea7c") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226010 36504 configmap.go:193] Couldn't get configMap openshift-monitoring/kubelet-serving-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226047 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle podName:40f8e70d-5f98-47f1-afa8-ea67242981fc nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726038023 +0000 UTC m=+2.945810030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "configmap-kubelet-serving-ca-bundle" (UniqueName: "kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle") pod "metrics-server-b9f5dccb6-4h4jv" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226050 36504 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226060 36504 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226067 36504 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226153 36504 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226080 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-apiservice-cert podName:77e36f4e-845b-4b82-8abc-b634636c087a nodeName:}" failed. 
No retries permitted until 2025-12-03 22:10:37.726072664 +0000 UTC m=+2.945844661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-apiservice-cert") pod "packageserver-684c49c488-fpmzc" (UID: "77e36f4e-845b-4b82-8abc-b634636c087a") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226196 36504 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226205 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-profile-collector-cert podName:ac3d3235-531e-4c7d-9fc9-e65c97970d0f nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726197028 +0000 UTC m=+2.945969035 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-profile-collector-cert") pod "olm-operator-76bd5d69c7-6tjzq" (UID: "ac3d3235-531e-4c7d-9fc9-e65c97970d0f") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226216 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-kube-rbac-proxy-config podName:d6fafa97-812d-4588-95f8-7c4d85f53098 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726211199 +0000 UTC m=+2.945983206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-565bdcb8-7s9vg" (UID: "d6fafa97-812d-4588-95f8-7c4d85f53098") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226236 36504 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226257 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-node-bootstrap-token podName:66f7b08c-52e8-4795-9cf0-74402a9cc0bb nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726232449 +0000 UTC m=+2.946004516 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-node-bootstrap-token") pod "machine-config-server-vgm8c" (UID: "66f7b08c-52e8-4795-9cf0-74402a9cc0bb") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226281 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-custom-resource-state-configmap podName:2d592f19-c7b9-4b29-9ca2-848572067908 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.72626737 +0000 UTC m=+2.946039387 (durationBeforeRetry 500ms). 
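
The wall of "failed to sync secret cache / failed to sync configmap cache: timed out waiting for the condition" errors around this point all have the same cause: MountVolume.SetUp asks the kubelet's secret/configMap cache for an object whose initial LIST has not completed yet (the matching "Caches populated" lines arrive a moment later, slowed in part by the client-side throttling noted earlier), so each operation is parked by nestedpendingoperations.go with "No retries permitted until ..." and retried after the 500 ms durationBeforeRetry. The following is a stdlib-only sketch of that wait-then-retry pattern; waitFor, the timings, and the simulated informer goroutine are illustrative assumptions, not kubelet code.

```go
// Sketch of "wait for a cache to sync, otherwise fail and retry after a fixed
// backoff", the pattern behind the MountVolume.SetUp errors above.
// waitFor, the durations, and the simulated informer are illustrative only.
package main

import (
	"errors"
	"fmt"
	"sync/atomic"
	"time"
)

var errTimedOut = errors.New("timed out waiting for the condition")

// waitFor polls cond every interval until it returns true or timeout elapses.
func waitFor(cond func() bool, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for !cond() {
		if time.Now().After(deadline) {
			return errTimedOut
		}
		time.Sleep(interval)
	}
	return nil
}

func main() {
	var synced atomic.Bool
	go func() { // stand-in for the informer finishing its initial LIST later on
		time.Sleep(300 * time.Millisecond)
		synced.Store(true)
	}()

	const backoff = 500 * time.Millisecond // matches durationBeforeRetry in the log
	for attempt := 1; ; attempt++ {
		err := waitFor(synced.Load, 10*time.Millisecond, 100*time.Millisecond)
		if err == nil {
			fmt.Printf("attempt %d: cache synced, SetUp can proceed\n", attempt)
			return
		}
		fmt.Printf("attempt %d: %v; no retries permitted for %s\n", attempt, err, backoff)
		time.Sleep(backoff)
	}
}
```

In the kubelet this 500 ms delay is the first step of a growing backoff rather than the fixed interval used in the sketch, so repeated failures would show progressively longer durations.
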
Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-7dcc7f9bd6-kldf9" (UID: "2d592f19-c7b9-4b29-9ca2-848572067908") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226299 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0be52f3-b318-4630-b4da-f3c4a57d5818-metrics-client-ca podName:d0be52f3-b318-4630-b4da-f3c4a57d5818 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726291421 +0000 UTC m=+2.946063438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/d0be52f3-b318-4630-b4da-f3c4a57d5818-metrics-client-ca") pod "openshift-state-metrics-57cbc648f8-rhf8p" (UID: "d0be52f3-b318-4630-b4da-f3c4a57d5818") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226298 master-0 kubenswrapper[36504]: E1203 22:10:37.226311 36504 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226331 36504 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226346 36504 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226370 36504 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226372 36504 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226334 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d6fafa97-812d-4588-95f8-7c4d85f53098-metrics-client-ca podName:d6fafa97-812d-4588-95f8-7c4d85f53098 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726328432 +0000 UTC m=+2.946100439 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/d6fafa97-812d-4588-95f8-7c4d85f53098-metrics-client-ca") pod "prometheus-operator-565bdcb8-7s9vg" (UID: "d6fafa97-812d-4588-95f8-7c4d85f53098") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226414 36504 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226418 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-kube-rbac-proxy-config podName:d0be52f3-b318-4630-b4da-f3c4a57d5818 nodeName:}" failed. 
No retries permitted until 2025-12-03 22:10:37.726406965 +0000 UTC m=+2.946178992 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-kube-rbac-proxy-config") pod "openshift-state-metrics-57cbc648f8-rhf8p" (UID: "d0be52f3-b318-4630-b4da-f3c4a57d5818") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226440 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-images podName:8b56c318-09b7-47f0-a7bf-32eb96e836ca nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726430246 +0000 UTC m=+2.946202263 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-images") pod "machine-api-operator-7486ff55f-w9xk2" (UID: "8b56c318-09b7-47f0-a7bf-32eb96e836ca") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226447 36504 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226455 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-proxy-tls podName:a9940ff5-36a6-4c04-a51d-66f7d83bea7c nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726448676 +0000 UTC m=+2.946220693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-proxy-tls") pod "machine-config-operator-664c9d94c9-bdps5" (UID: "a9940ff5-36a6-4c04-a51d-66f7d83bea7c") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226472 36504 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226476 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd18a700-53b2-430c-a34f-dbb6331cfbe5-proxy-tls podName:bd18a700-53b2-430c-a34f-dbb6331cfbe5 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726464617 +0000 UTC m=+2.946236634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bd18a700-53b2-430c-a34f-dbb6331cfbe5-proxy-tls") pod "machine-config-daemon-j9wwr" (UID: "bd18a700-53b2-430c-a34f-dbb6331cfbe5") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226495 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-auth-proxy-config podName:def52ba3-77c1-4e0c-8a0d-44ff4d677607 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726487217 +0000 UTC m=+2.946259244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" (UID: "def52ba3-77c1-4e0c-8a0d-44ff4d677607") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226499 36504 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226510 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-images podName:def52ba3-77c1-4e0c-8a0d-44ff4d677607 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726502998 +0000 UTC m=+2.946275015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-images") pod "cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" (UID: "def52ba3-77c1-4e0c-8a0d-44ff4d677607") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226524 36504 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226531 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-machine-approver-tls podName:01b80ad5-7d7c-4ecd-90b0-2913d4559b5f nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726521459 +0000 UTC m=+2.946293476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-machine-approver-tls") pod "machine-approver-cb84b9cdf-wkcnd" (UID: "01b80ad5-7d7c-4ecd-90b0-2913d4559b5f") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226546 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b56c318-09b7-47f0-a7bf-32eb96e836ca-machine-api-operator-tls podName:8b56c318-09b7-47f0-a7bf-32eb96e836ca nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726540719 +0000 UTC m=+2.946312726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/8b56c318-09b7-47f0-a7bf-32eb96e836ca-machine-api-operator-tls") pod "machine-api-operator-7486ff55f-w9xk2" (UID: "8b56c318-09b7-47f0-a7bf-32eb96e836ca") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226559 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-config podName:01b80ad5-7d7c-4ecd-90b0-2913d4559b5f nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.72655381 +0000 UTC m=+2.946325817 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-config") pod "machine-approver-cb84b9cdf-wkcnd" (UID: "01b80ad5-7d7c-4ecd-90b0-2913d4559b5f") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226563 36504 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226576 36504 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226598 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-auth-proxy-config podName:01b80ad5-7d7c-4ecd-90b0-2913d4559b5f nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726593171 +0000 UTC m=+2.946365168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-auth-proxy-config") pod "machine-approver-cb84b9cdf-wkcnd" (UID: "01b80ad5-7d7c-4ecd-90b0-2913d4559b5f") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226616 36504 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226624 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-certs podName:66f7b08c-52e8-4795-9cf0-74402a9cc0bb nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726615292 +0000 UTC m=+2.946387319 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-certs") pod "machine-config-server-vgm8c" (UID: "66f7b08c-52e8-4795-9cf0-74402a9cc0bb") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226640 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles podName:f6498ac1-7d07-4a5f-a968-d8bda72d1002 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726633482 +0000 UTC m=+2.946405489 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles") pod "controller-manager-77778bd57c-xdhvs" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226655 36504 secret.go:189] Couldn't get secret openshift-monitoring/metrics-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226658 36504 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226688 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls podName:40f8e70d-5f98-47f1-afa8-ea67242981fc nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726679974 +0000 UTC m=+2.946452001 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-metrics-server-tls" (UniqueName: "kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls") pod "metrics-server-b9f5dccb6-4h4jv" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226711 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-kube-rbac-proxy-config podName:2d592f19-c7b9-4b29-9ca2-848572067908 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726704804 +0000 UTC m=+2.946476811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-7dcc7f9bd6-kldf9" (UID: "2d592f19-c7b9-4b29-9ca2-848572067908") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226716 36504 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226730 36504 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226744 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-metrics-client-ca podName:2d592f19-c7b9-4b29-9ca2-848572067908 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726736295 +0000 UTC m=+2.946508312 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-metrics-client-ca") pod "kube-state-metrics-7dcc7f9bd6-kldf9" (UID: "2d592f19-c7b9-4b29-9ca2-848572067908") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.226753 master-0 kubenswrapper[36504]: E1203 22:10:37.226807 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a124c14f-20c6-4df3-956f-a858de0c73c9-proxy-tls podName:a124c14f-20c6-4df3-956f-a858de0c73c9 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726753776 +0000 UTC m=+2.946525803 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a124c14f-20c6-4df3-956f-a858de0c73c9-proxy-tls") pod "machine-config-controller-74cddd4fb5-7zg56" (UID: "a124c14f-20c6-4df3-956f-a858de0c73c9") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.227726 master-0 kubenswrapper[36504]: E1203 22:10:37.226843 36504 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.227726 master-0 kubenswrapper[36504]: E1203 22:10:37.226847 36504 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.227726 master-0 kubenswrapper[36504]: E1203 22:10:37.226866 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert podName:f6498ac1-7d07-4a5f-a968-d8bda72d1002 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726860219 +0000 UTC m=+2.946632226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert") pod "controller-manager-77778bd57c-xdhvs" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.227726 master-0 kubenswrapper[36504]: E1203 22:10:37.226880 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca podName:64856d96-023f-46db-819c-02f1adea5aab nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.72687265 +0000 UTC m=+2.946644667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca") pod "route-controller-manager-8667dd96f5-qf2rc" (UID: "64856d96-023f-46db-819c-02f1adea5aab") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.227726 master-0 kubenswrapper[36504]: E1203 22:10:37.226903 36504 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.227726 master-0 kubenswrapper[36504]: E1203 22:10:37.226941 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-kube-rbac-proxy-config podName:bcbec7ef-0b98-4346-8c6b-c5fa37e90286 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.726933132 +0000 UTC m=+2.946705149 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-kube-rbac-proxy-config") pod "node-exporter-nkjnl" (UID: "bcbec7ef-0b98-4346-8c6b-c5fa37e90286") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.228221 master-0 kubenswrapper[36504]: E1203 22:10:37.228129 36504 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.228221 master-0 kubenswrapper[36504]: E1203 22:10:37.228152 36504 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.228221 master-0 kubenswrapper[36504]: E1203 22:10:37.228181 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls podName:2d592f19-c7b9-4b29-9ca2-848572067908 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.72816912 +0000 UTC m=+2.947941147 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls") pod "kube-state-metrics-7dcc7f9bd6-kldf9" (UID: "2d592f19-c7b9-4b29-9ca2-848572067908") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.228221 master-0 kubenswrapper[36504]: E1203 22:10:37.228153 36504 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.228221 master-0 kubenswrapper[36504]: E1203 22:10:37.228203 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-srv-cert podName:ac3d3235-531e-4c7d-9fc9-e65c97970d0f nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.728191481 +0000 UTC m=+2.947963588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-srv-cert") pod "olm-operator-76bd5d69c7-6tjzq" (UID: "ac3d3235-531e-4c7d-9fc9-e65c97970d0f") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.228474 master-0 kubenswrapper[36504]: E1203 22:10:37.228240 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/def52ba3-77c1-4e0c-8a0d-44ff4d677607-cloud-controller-manager-operator-tls podName:def52ba3-77c1-4e0c-8a0d-44ff4d677607 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.728231032 +0000 UTC m=+2.948003049 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/def52ba3-77c1-4e0c-8a0d-44ff4d677607-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" (UID: "def52ba3-77c1-4e0c-8a0d-44ff4d677607") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:37.228474 master-0 kubenswrapper[36504]: E1203 22:10:37.228308 36504 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.228474 master-0 kubenswrapper[36504]: E1203 22:10:37.228360 36504 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.228474 master-0 kubenswrapper[36504]: E1203 22:10:37.228399 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config podName:f6498ac1-7d07-4a5f-a968-d8bda72d1002 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.728391467 +0000 UTC m=+2.948163464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config") pod "controller-manager-77778bd57c-xdhvs" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.228474 master-0 kubenswrapper[36504]: E1203 22:10:37.228324 36504 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.228474 master-0 kubenswrapper[36504]: E1203 22:10:37.228344 36504 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.228474 master-0 kubenswrapper[36504]: E1203 22:10:37.228453 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd18a700-53b2-430c-a34f-dbb6331cfbe5-mcd-auth-proxy-config podName:bd18a700-53b2-430c-a34f-dbb6331cfbe5 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.728408128 +0000 UTC m=+2.948180125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/bd18a700-53b2-430c-a34f-dbb6331cfbe5-mcd-auth-proxy-config") pod "machine-config-daemon-j9wwr" (UID: "bd18a700-53b2-430c-a34f-dbb6331cfbe5") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.228474 master-0 kubenswrapper[36504]: E1203 22:10:37.228472 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca podName:f6498ac1-7d07-4a5f-a968-d8bda72d1002 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.728465319 +0000 UTC m=+2.948237326 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca") pod "controller-manager-77778bd57c-xdhvs" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.228474 master-0 kubenswrapper[36504]: E1203 22:10:37.228485 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config podName:64856d96-023f-46db-819c-02f1adea5aab nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.72847943 +0000 UTC m=+2.948251437 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config") pod "route-controller-manager-8667dd96f5-qf2rc" (UID: "64856d96-023f-46db-819c-02f1adea5aab") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.229355 master-0 kubenswrapper[36504]: E1203 22:10:37.229317 36504 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.229414 master-0 kubenswrapper[36504]: E1203 22:10:37.229366 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-auth-proxy-config podName:a9940ff5-36a6-4c04-a51d-66f7d83bea7c nodeName:}" failed. No retries permitted until 2025-12-03 22:10:37.729354138 +0000 UTC m=+2.949126235 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-auth-proxy-config") pod "machine-config-operator-664c9d94c9-bdps5" (UID: "a9940ff5-36a6-4c04-a51d-66f7d83bea7c") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:37.245712 master-0 kubenswrapper[36504]: I1203 22:10:37.242295 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 22:10:37.265851 master-0 kubenswrapper[36504]: I1203 22:10:37.265004 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 03 22:10:37.282217 master-0 kubenswrapper[36504]: I1203 22:10:37.282054 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-2jjj9" Dec 03 22:10:37.301984 master-0 kubenswrapper[36504]: I1203 22:10:37.301913 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 03 22:10:37.321251 master-0 kubenswrapper[36504]: I1203 22:10:37.321090 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 03 22:10:37.342809 master-0 kubenswrapper[36504]: I1203 22:10:37.340667 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 03 22:10:37.361903 master-0 kubenswrapper[36504]: I1203 22:10:37.361637 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 03 22:10:37.380970 master-0 kubenswrapper[36504]: I1203 22:10:37.380921 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 03 22:10:37.401679 master-0 kubenswrapper[36504]: I1203 
22:10:37.401622 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zj6wp" Dec 03 22:10:37.421276 master-0 kubenswrapper[36504]: I1203 22:10:37.421220 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 03 22:10:37.443744 master-0 kubenswrapper[36504]: I1203 22:10:37.441245 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 03 22:10:37.450955 master-0 kubenswrapper[36504]: I1203 22:10:37.450454 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:37.463327 master-0 kubenswrapper[36504]: I1203 22:10:37.463266 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 03 22:10:37.501246 master-0 kubenswrapper[36504]: I1203 22:10:37.501171 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 22:10:37.520662 master-0 kubenswrapper[36504]: I1203 22:10:37.520624 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-c2z8w" Dec 03 22:10:37.542721 master-0 kubenswrapper[36504]: I1203 22:10:37.542583 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 22:10:37.561084 master-0 kubenswrapper[36504]: I1203 22:10:37.561044 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 22:10:37.582120 master-0 kubenswrapper[36504]: I1203 22:10:37.581102 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 22:10:37.601478 master-0 kubenswrapper[36504]: I1203 22:10:37.601430 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-tphv8" Dec 03 22:10:37.621628 master-0 kubenswrapper[36504]: I1203 22:10:37.621550 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 22:10:37.645196 master-0 kubenswrapper[36504]: I1203 22:10:37.643394 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 22:10:37.665870 master-0 kubenswrapper[36504]: I1203 22:10:37.662174 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 22:10:37.670817 master-0 kubenswrapper[36504]: I1203 22:10:37.670721 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:37.671127 master-0 kubenswrapper[36504]: I1203 22:10:37.671083 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-config\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:37.671203 master-0 kubenswrapper[36504]: I1203 22:10:37.671179 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/578b2d03-b8b3-4c75-adde-73899c472ad7-serving-cert\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:37.671265 master-0 kubenswrapper[36504]: I1203 22:10:37.671228 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cert\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:37.671500 master-0 kubenswrapper[36504]: I1203 22:10:37.671460 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:37.671594 master-0 kubenswrapper[36504]: I1203 22:10:37.671569 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-images\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:37.671678 master-0 kubenswrapper[36504]: I1203 22:10:37.671651 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 22:10:37.671939 master-0 kubenswrapper[36504]: I1203 22:10:37.671884 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a7e0eea-3da8-43de-87bc-d10231e7c239-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 22:10:37.672266 master-0 kubenswrapper[36504]: I1203 22:10:37.672223 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:37.672739 master-0 kubenswrapper[36504]: I1203 22:10:37.672699 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-trusted-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:37.673106 master-0 kubenswrapper[36504]: I1203 22:10:37.672989 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:37.673200 master-0 kubenswrapper[36504]: I1203 22:10:37.673173 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-config\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:37.675342 master-0 kubenswrapper[36504]: I1203 22:10:37.675284 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fa9b5917-d4f3-4372-a200-45b57412f92f-images\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:37.675708 master-0 kubenswrapper[36504]: I1203 22:10:37.675629 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa9b5917-d4f3-4372-a200-45b57412f92f-cert\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:37.675925 master-0 kubenswrapper[36504]: I1203 22:10:37.675870 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578b2d03-b8b3-4c75-adde-73899c472ad7-service-ca-bundle\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:37.676040 master-0 kubenswrapper[36504]: I1203 22:10:37.675921 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 22:10:37.676267 master-0 kubenswrapper[36504]: I1203 22:10:37.676225 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3a7e0eea-3da8-43de-87bc-d10231e7c239-cco-trusted-ca\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 22:10:37.678144 master-0 kubenswrapper[36504]: I1203 22:10:37.678099 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/578b2d03-b8b3-4c75-adde-73899c472ad7-serving-cert\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:37.684334 master-0 kubenswrapper[36504]: I1203 22:10:37.683757 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 22:10:37.705132 master-0 kubenswrapper[36504]: I1203 22:10:37.701070 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 22:10:37.723336 master-0 kubenswrapper[36504]: I1203 22:10:37.721034 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 22:10:37.741235 master-0 kubenswrapper[36504]: I1203 22:10:37.741139 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-5skvl" Dec 03 22:10:37.760609 master-0 kubenswrapper[36504]: I1203 22:10:37.760543 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 22:10:37.773940 master-0 kubenswrapper[36504]: I1203 22:10:37.773856 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/def52ba3-77c1-4e0c-8a0d-44ff4d677607-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:37.773940 master-0 kubenswrapper[36504]: I1203 22:10:37.773935 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:37.774294 master-0 kubenswrapper[36504]: I1203 22:10:37.773985 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:37.774294 master-0 kubenswrapper[36504]: I1203 22:10:37.774012 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:37.774294 master-0 kubenswrapper[36504]: I1203 22:10:37.774057 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 
22:10:37.774294 master-0 kubenswrapper[36504]: I1203 22:10:37.774081 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:37.774294 master-0 kubenswrapper[36504]: I1203 22:10:37.774106 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd18a700-53b2-430c-a34f-dbb6331cfbe5-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:37.774294 master-0 kubenswrapper[36504]: I1203 22:10:37.774162 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-tls\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:37.774294 master-0 kubenswrapper[36504]: I1203 22:10:37.774189 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-webhook-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:37.774294 master-0 kubenswrapper[36504]: I1203 22:10:37.774230 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-tls\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:37.774294 master-0 kubenswrapper[36504]: I1203 22:10:37.774260 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a124c14f-20c6-4df3-956f-a858de0c73c9-mcc-auth-proxy-config\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 22:10:37.774297 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 22:10:37.774345 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-config\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 
22:10:37.774417 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 22:10:37.774490 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 22:10:37.774565 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b56c318-09b7-47f0-a7bf-32eb96e836ca-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 22:10:37.774591 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6fafa97-812d-4588-95f8-7c4d85f53098-metrics-client-ca\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 22:10:37.774637 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 22:10:37.774662 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-metrics-client-ca\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 22:10:37.774683 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-profile-collector-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:37.774730 master-0 kubenswrapper[36504]: I1203 22:10:37.774717 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.774814 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.774844 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-auth-proxy-config\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.774869 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.774687 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-config\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.774929 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e96335e-1866-41c8-b128-b95e783a9be4-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.774952 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8b56c318-09b7-47f0-a7bf-32eb96e836ca-machine-api-operator-tls\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.774982 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.775025 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.775055 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.775085 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.775115 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:37.775199 master-0 kubenswrapper[36504]: I1203 22:10:37.775167 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775231 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-apiservice-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775257 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775316 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-srv-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775340 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-proxy-tls\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775361 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a124c14f-20c6-4df3-956f-a858de0c73c9-mcc-auth-proxy-config\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775391 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775419 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-certs\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775444 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-images\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775492 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775518 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775581 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775604 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-images\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775631 36504 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775648 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f9a3f900-60e4-49c2-85ec-88d19852d1b9-srv-cert\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:37.775662 master-0 kubenswrapper[36504]: I1203 22:10:37.775664 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd18a700-53b2-430c-a34f-dbb6331cfbe5-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.775670 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-metrics-client-ca\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.775745 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.775806 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.775855 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8b56c318-09b7-47f0-a7bf-32eb96e836ca-images\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.775867 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-node-bootstrap-token\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.775972 36504 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a124c14f-20c6-4df3-956f-a858de0c73c9-proxy-tls\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.776012 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-profile-collector-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.776021 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-proxy-tls\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.776048 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0be52f3-b318-4630-b4da-f3c4a57d5818-metrics-client-ca\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.776103 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.776110 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-images\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.776133 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd18a700-53b2-430c-a34f-dbb6331cfbe5-proxy-tls\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.776160 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:37.776291 master-0 kubenswrapper[36504]: I1203 22:10:37.776275 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:37.776881 master-0 kubenswrapper[36504]: I1203 22:10:37.776313 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-srv-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:37.776881 master-0 kubenswrapper[36504]: I1203 22:10:37.776592 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-srv-cert\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:37.781064 master-0 kubenswrapper[36504]: I1203 22:10:37.781007 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 22:10:37.801888 master-0 kubenswrapper[36504]: I1203 22:10:37.800920 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-k4746" Dec 03 22:10:37.821436 master-0 kubenswrapper[36504]: I1203 22:10:37.821232 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 22:10:37.828528 master-0 kubenswrapper[36504]: I1203 22:10:37.827607 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-machine-approver-tls\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:37.840879 master-0 kubenswrapper[36504]: I1203 22:10:37.840633 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 22:10:37.848882 master-0 kubenswrapper[36504]: I1203 22:10:37.848309 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-auth-proxy-config\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:37.854968 master-0 kubenswrapper[36504]: E1203 22:10:37.854912 36504 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Dec 03 22:10:37.861821 master-0 kubenswrapper[36504]: I1203 22:10:37.861526 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 22:10:37.865598 master-0 kubenswrapper[36504]: I1203 22:10:37.865543 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-config\") pod 
\"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:37.881785 master-0 kubenswrapper[36504]: I1203 22:10:37.881709 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 22:10:37.901014 master-0 kubenswrapper[36504]: I1203 22:10:37.900940 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 22:10:37.907797 master-0 kubenswrapper[36504]: I1203 22:10:37.907173 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a124c14f-20c6-4df3-956f-a858de0c73c9-proxy-tls\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 22:10:37.921961 master-0 kubenswrapper[36504]: I1203 22:10:37.921903 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-5q257" Dec 03 22:10:37.940808 master-0 kubenswrapper[36504]: I1203 22:10:37.940738 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 22:10:37.961722 master-0 kubenswrapper[36504]: I1203 22:10:37.961666 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 03 22:10:37.966789 master-0 kubenswrapper[36504]: I1203 22:10:37.966699 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-images\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:37.981574 master-0 kubenswrapper[36504]: I1203 22:10:37.981522 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:10:38.000858 master-0 kubenswrapper[36504]: I1203 22:10:38.000809 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 03 22:10:38.004957 master-0 kubenswrapper[36504]: I1203 22:10:38.004900 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/def52ba3-77c1-4e0c-8a0d-44ff4d677607-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:38.021852 master-0 kubenswrapper[36504]: I1203 22:10:38.021739 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 03 22:10:38.026948 master-0 kubenswrapper[36504]: I1203 22:10:38.026900 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/def52ba3-77c1-4e0c-8a0d-44ff4d677607-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:38.041335 master-0 kubenswrapper[36504]: I1203 22:10:38.041255 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-w6qk9" Dec 03 22:10:38.061757 master-0 kubenswrapper[36504]: I1203 22:10:38.061620 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-gk9qh" Dec 03 22:10:38.079649 master-0 kubenswrapper[36504]: I1203 22:10:38.079589 36504 request.go:700] Waited for 1.99466583s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-xm8lg&limit=500&resourceVersion=0 Dec 03 22:10:38.081908 master-0 kubenswrapper[36504]: I1203 22:10:38.081872 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-xm8lg" Dec 03 22:10:38.101349 master-0 kubenswrapper[36504]: I1203 22:10:38.101303 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 22:10:38.107676 master-0 kubenswrapper[36504]: I1203 22:10:38.107618 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-node-bootstrap-token\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 22:10:38.120370 master-0 kubenswrapper[36504]: I1203 22:10:38.120332 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 22:10:38.126363 master-0 kubenswrapper[36504]: I1203 22:10:38.126300 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-certs\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 22:10:38.141015 master-0 kubenswrapper[36504]: I1203 22:10:38.140954 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 22:10:38.144850 master-0 kubenswrapper[36504]: I1203 22:10:38.144815 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-webhook-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:38.146351 master-0 kubenswrapper[36504]: I1203 22:10:38.146308 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/77e36f4e-845b-4b82-8abc-b634636c087a-apiservice-cert\") pod \"packageserver-684c49c488-fpmzc\" (UID: 
\"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:38.160390 master-0 kubenswrapper[36504]: I1203 22:10:38.160330 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-5qqsp" Dec 03 22:10:38.180509 master-0 kubenswrapper[36504]: I1203 22:10:38.180451 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-7n9xc" Dec 03 22:10:38.200298 master-0 kubenswrapper[36504]: I1203 22:10:38.200231 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 03 22:10:38.205416 master-0 kubenswrapper[36504]: I1203 22:10:38.205390 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e96335e-1866-41c8-b128-b95e783a9be4-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 22:10:38.221799 master-0 kubenswrapper[36504]: I1203 22:10:38.221729 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 22:10:38.226549 master-0 kubenswrapper[36504]: I1203 22:10:38.226498 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd18a700-53b2-430c-a34f-dbb6331cfbe5-proxy-tls\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:38.241330 master-0 kubenswrapper[36504]: I1203 22:10:38.241275 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 03 22:10:38.245536 master-0 kubenswrapper[36504]: I1203 22:10:38.245482 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-tls\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:38.260143 master-0 kubenswrapper[36504]: I1203 22:10:38.260074 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 03 22:10:38.266474 master-0 kubenswrapper[36504]: I1203 22:10:38.266423 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6fafa97-812d-4588-95f8-7c4d85f53098-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:38.280254 master-0 kubenswrapper[36504]: I1203 22:10:38.280185 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-9g5tm" Dec 03 22:10:38.300303 master-0 kubenswrapper[36504]: I1203 22:10:38.300249 36504 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-ml7r7" Dec 03 22:10:38.320722 master-0 kubenswrapper[36504]: I1203 22:10:38.320608 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-mss7s" Dec 03 22:10:38.341015 master-0 kubenswrapper[36504]: I1203 22:10:38.340950 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-88jcx" Dec 03 22:10:38.361084 master-0 kubenswrapper[36504]: I1203 22:10:38.361016 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 03 22:10:38.365914 master-0 kubenswrapper[36504]: I1203 22:10:38.365869 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6fafa97-812d-4588-95f8-7c4d85f53098-metrics-client-ca\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:38.366136 master-0 kubenswrapper[36504]: I1203 22:10:38.365880 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-metrics-client-ca\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:38.366199 master-0 kubenswrapper[36504]: I1203 22:10:38.366123 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-metrics-client-ca\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:38.366436 master-0 kubenswrapper[36504]: I1203 22:10:38.366401 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0be52f3-b318-4630-b4da-f3c4a57d5818-metrics-client-ca\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:38.381150 master-0 kubenswrapper[36504]: I1203 22:10:38.381077 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 03 22:10:38.384628 master-0 kubenswrapper[36504]: I1203 22:10:38.384584 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-tls\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:38.400614 master-0 kubenswrapper[36504]: I1203 22:10:38.400548 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 03 22:10:38.407223 master-0 kubenswrapper[36504]: I1203 22:10:38.407179 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0be52f3-b318-4630-b4da-f3c4a57d5818-openshift-state-metrics-kube-rbac-proxy-config\") 
pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:38.420204 master-0 kubenswrapper[36504]: I1203 22:10:38.420156 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-8whkp" Dec 03 22:10:38.441596 master-0 kubenswrapper[36504]: I1203 22:10:38.441501 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-cwgtj" Dec 03 22:10:38.461517 master-0 kubenswrapper[36504]: I1203 22:10:38.461459 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 03 22:10:38.465357 master-0 kubenswrapper[36504]: I1203 22:10:38.465313 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-tls\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:38.481137 master-0 kubenswrapper[36504]: I1203 22:10:38.481068 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 03 22:10:38.487532 master-0 kubenswrapper[36504]: I1203 22:10:38.487387 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:38.500841 master-0 kubenswrapper[36504]: I1203 22:10:38.500783 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 03 22:10:38.505412 master-0 kubenswrapper[36504]: I1203 22:10:38.505354 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-tls\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:38.520648 master-0 kubenswrapper[36504]: I1203 22:10:38.520590 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-t6rs7" Dec 03 22:10:38.540442 master-0 kubenswrapper[36504]: I1203 22:10:38.540390 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 03 22:10:38.546134 master-0 kubenswrapper[36504]: I1203 22:10:38.546082 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:38.561567 master-0 kubenswrapper[36504]: I1203 22:10:38.561516 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 03 22:10:38.565516 master-0 
kubenswrapper[36504]: I1203 22:10:38.565451 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2d592f19-c7b9-4b29-9ca2-848572067908-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:38.580440 master-0 kubenswrapper[36504]: I1203 22:10:38.580312 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 03 22:10:38.585748 master-0 kubenswrapper[36504]: I1203 22:10:38.585704 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:38.600811 master-0 kubenswrapper[36504]: I1203 22:10:38.600746 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 03 22:10:38.606883 master-0 kubenswrapper[36504]: I1203 22:10:38.606825 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:38.621560 master-0 kubenswrapper[36504]: I1203 22:10:38.621501 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 03 22:10:38.625511 master-0 kubenswrapper[36504]: I1203 22:10:38.625464 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:38.640316 master-0 kubenswrapper[36504]: I1203 22:10:38.640248 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-gfs59jbdhk2g" Dec 03 22:10:38.646679 master-0 kubenswrapper[36504]: I1203 22:10:38.646625 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:38.661137 master-0 kubenswrapper[36504]: I1203 22:10:38.661086 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 03 22:10:38.666001 master-0 kubenswrapper[36504]: I1203 22:10:38.665958 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " 
pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:38.681227 master-0 kubenswrapper[36504]: I1203 22:10:38.681026 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 22:10:38.686067 master-0 kubenswrapper[36504]: I1203 22:10:38.686020 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7c8ec36d-9179-40ab-a448-440b4501b3e0-cert\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 22:10:38.699967 master-0 kubenswrapper[36504]: I1203 22:10:38.699912 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-5qkbc" Dec 03 22:10:38.727199 master-0 kubenswrapper[36504]: I1203 22:10:38.726824 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 22:10:38.740432 master-0 kubenswrapper[36504]: I1203 22:10:38.740348 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 22:10:38.760921 master-0 kubenswrapper[36504]: I1203 22:10:38.760869 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-lrw2p" Dec 03 22:10:38.774374 master-0 kubenswrapper[36504]: E1203 22:10:38.774307 36504 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.774630 master-0 kubenswrapper[36504]: E1203 22:10:38.774441 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config podName:64856d96-023f-46db-819c-02f1adea5aab nodeName:}" failed. No retries permitted until 2025-12-03 22:10:39.774415238 +0000 UTC m=+4.994187255 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config") pod "route-controller-manager-8667dd96f5-qf2rc" (UID: "64856d96-023f-46db-819c-02f1adea5aab") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.774630 master-0 kubenswrapper[36504]: E1203 22:10:38.774314 36504 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.774630 master-0 kubenswrapper[36504]: E1203 22:10:38.774542 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config podName:f6498ac1-7d07-4a5f-a968-d8bda72d1002 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:39.774521581 +0000 UTC m=+4.994293588 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config") pod "controller-manager-77778bd57c-xdhvs" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.774630 master-0 kubenswrapper[36504]: E1203 22:10:38.774314 36504 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.774630 master-0 kubenswrapper[36504]: E1203 22:10:38.774579 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca podName:f6498ac1-7d07-4a5f-a968-d8bda72d1002 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:39.774571263 +0000 UTC m=+4.994343360 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca") pod "controller-manager-77778bd57c-xdhvs" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.774972 master-0 kubenswrapper[36504]: E1203 22:10:38.774946 36504 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:38.775046 master-0 kubenswrapper[36504]: E1203 22:10:38.774977 36504 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:38.775046 master-0 kubenswrapper[36504]: E1203 22:10:38.775003 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs podName:452012bf-eae1-4e69-9ba1-034309e9f2c8 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:39.774991016 +0000 UTC m=+4.994763093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs") pod "multus-admission-controller-5bdcc987c4-5cs48" (UID: "452012bf-eae1-4e69-9ba1-034309e9f2c8") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:38.775046 master-0 kubenswrapper[36504]: E1203 22:10:38.775036 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert podName:64856d96-023f-46db-819c-02f1adea5aab nodeName:}" failed. No retries permitted until 2025-12-03 22:10:39.775021307 +0000 UTC m=+4.994793384 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert") pod "route-controller-manager-8667dd96f5-qf2rc" (UID: "64856d96-023f-46db-819c-02f1adea5aab") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:38.775470 master-0 kubenswrapper[36504]: E1203 22:10:38.775427 36504 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:38.775536 master-0 kubenswrapper[36504]: E1203 22:10:38.775528 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert podName:f6498ac1-7d07-4a5f-a968-d8bda72d1002 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:39.775508032 +0000 UTC m=+4.995280039 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert") pod "controller-manager-77778bd57c-xdhvs" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002") : failed to sync secret cache: timed out waiting for the condition Dec 03 22:10:38.776680 master-0 kubenswrapper[36504]: E1203 22:10:38.776661 36504 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.776757 master-0 kubenswrapper[36504]: E1203 22:10:38.776709 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles podName:f6498ac1-7d07-4a5f-a968-d8bda72d1002 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:39.776696789 +0000 UTC m=+4.996468796 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles") pod "controller-manager-77778bd57c-xdhvs" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.776757 master-0 kubenswrapper[36504]: E1203 22:10:38.776713 36504 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.776901 master-0 kubenswrapper[36504]: E1203 22:10:38.776764 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca podName:64856d96-023f-46db-819c-02f1adea5aab nodeName:}" failed. No retries permitted until 2025-12-03 22:10:39.776751551 +0000 UTC m=+4.996523618 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca") pod "route-controller-manager-8667dd96f5-qf2rc" (UID: "64856d96-023f-46db-819c-02f1adea5aab") : failed to sync configmap cache: timed out waiting for the condition Dec 03 22:10:38.780311 master-0 kubenswrapper[36504]: I1203 22:10:38.780272 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 22:10:38.801267 master-0 kubenswrapper[36504]: I1203 22:10:38.801166 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-vbccl" Dec 03 22:10:38.824305 master-0 kubenswrapper[36504]: I1203 22:10:38.824246 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 22:10:38.842050 master-0 kubenswrapper[36504]: I1203 22:10:38.841922 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 22:10:38.862376 master-0 kubenswrapper[36504]: I1203 22:10:38.862320 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 22:10:38.881137 master-0 kubenswrapper[36504]: I1203 22:10:38.881078 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 22:10:38.900657 master-0 kubenswrapper[36504]: I1203 22:10:38.900470 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 22:10:38.920880 master-0 kubenswrapper[36504]: I1203 22:10:38.920830 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-dhpt2" Dec 03 22:10:38.942476 master-0 kubenswrapper[36504]: I1203 22:10:38.942401 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 22:10:38.960480 master-0 kubenswrapper[36504]: I1203 22:10:38.960411 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 22:10:38.980963 master-0 kubenswrapper[36504]: I1203 22:10:38.980904 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 22:10:39.002395 master-0 kubenswrapper[36504]: I1203 22:10:39.001075 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 22:10:39.025730 master-0 kubenswrapper[36504]: I1203 22:10:39.025647 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 22:10:39.040985 master-0 kubenswrapper[36504]: I1203 22:10:39.040917 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 22:10:39.061855 master-0 kubenswrapper[36504]: I1203 22:10:39.061244 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-hf9lh" Dec 03 22:10:39.079852 master-0 kubenswrapper[36504]: I1203 22:10:39.079790 36504 request.go:700] Waited for 2.983786098s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/serviceaccounts/csi-snapshot-controller-operator/token Dec 03 22:10:39.095735 master-0 kubenswrapper[36504]: I1203 22:10:39.095580 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkhtr\" (UniqueName: \"kubernetes.io/projected/add88bf0-c88d-427d-94bb-897e088a1378-kube-api-access-hkhtr\") pod \"csi-snapshot-controller-operator-7b795784b8-l9q2j\" (UID: \"add88bf0-c88d-427d-94bb-897e088a1378\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-7b795784b8-l9q2j" Dec 03 22:10:39.117272 master-0 kubenswrapper[36504]: I1203 22:10:39.117217 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd9vn\" (UniqueName: \"kubernetes.io/projected/b8194009-3743-4da7-baf1-f9bb0afd6187-kube-api-access-rd9vn\") pod \"multus-6jlh8\" (UID: \"b8194009-3743-4da7-baf1-f9bb0afd6187\") " pod="openshift-multus/multus-6jlh8" Dec 03 22:10:39.135053 master-0 kubenswrapper[36504]: I1203 22:10:39.134996 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4c4r\" (UniqueName: \"kubernetes.io/projected/b7f68d19-71d4-4129-a575-3ee57fa53493-kube-api-access-t4c4r\") pod \"cluster-node-tuning-operator-bbd9b9dff-96glt\" (UID: \"b7f68d19-71d4-4129-a575-3ee57fa53493\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-bbd9b9dff-96glt" Dec 03 22:10:39.153621 master-0 kubenswrapper[36504]: I1203 22:10:39.153569 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgddm\" (UniqueName: \"kubernetes.io/projected/04f5fc52-4ec2-48c3-8441-2b15ad632233-kube-api-access-tgddm\") pod \"package-server-manager-75b4d49d4c-psjj5\" (UID: \"04f5fc52-4ec2-48c3-8441-2b15ad632233\") " pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 22:10:39.172924 master-0 kubenswrapper[36504]: I1203 22:10:39.172871 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7fsz\" (UniqueName: \"kubernetes.io/projected/814c8acf-fb8d-4f57-b8db-21304402c1f1-kube-api-access-x7fsz\") pod \"iptables-alerter-clt4v\" (UID: \"814c8acf-fb8d-4f57-b8db-21304402c1f1\") " pod="openshift-network-operator/iptables-alerter-clt4v" Dec 03 22:10:39.192997 master-0 kubenswrapper[36504]: I1203 22:10:39.192933 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fsxc\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-kube-api-access-4fsxc\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 22:10:39.216868 master-0 kubenswrapper[36504]: I1203 22:10:39.216810 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mbz\" (UniqueName: \"kubernetes.io/projected/29ac4a9d-1228-49c7-9051-338e7dc98a38-kube-api-access-p4mbz\") pod \"ovnkube-control-plane-f9f7f4946-8qg8w\" (UID: \"29ac4a9d-1228-49c7-9051-338e7dc98a38\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-f9f7f4946-8qg8w" Dec 03 22:10:39.232011 master-0 kubenswrapper[36504]: I1203 22:10:39.231959 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7zsw\" (UniqueName: \"kubernetes.io/projected/b522af85-394e-4965-9bf4-83f48fb8ad94-kube-api-access-b7zsw\") pod 
\"apiserver-64554dd846-6vfz6\" (UID: \"b522af85-394e-4965-9bf4-83f48fb8ad94\") " pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:39.258328 master-0 kubenswrapper[36504]: I1203 22:10:39.258258 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df8nl\" (UniqueName: \"kubernetes.io/projected/b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23-kube-api-access-df8nl\") pod \"tuned-fvghq\" (UID: \"b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23\") " pod="openshift-cluster-node-tuning-operator/tuned-fvghq" Dec 03 22:10:39.277308 master-0 kubenswrapper[36504]: I1203 22:10:39.277241 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvkz7\" (UniqueName: \"kubernetes.io/projected/53713eab-c920-4d5a-ae05-7cdb59ace852-kube-api-access-nvkz7\") pod \"ovnkube-node-k2j45\" (UID: \"53713eab-c920-4d5a-ae05-7cdb59ace852\") " pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:39.299138 master-0 kubenswrapper[36504]: I1203 22:10:39.299076 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5q4k\" (UniqueName: \"kubernetes.io/projected/e50b85a6-7767-4fca-8133-8243bdd85e5d-kube-api-access-z5q4k\") pod \"openshift-config-operator-68c95b6cf5-2cs5d\" (UID: \"e50b85a6-7767-4fca-8133-8243bdd85e5d\") " pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:10:39.312452 master-0 kubenswrapper[36504]: I1203 22:10:39.312394 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4gds\" (UniqueName: \"kubernetes.io/projected/ba624ed0-32cc-4c87-81a5-708a8a8a7f88-kube-api-access-n4gds\") pod \"control-plane-machine-set-operator-66f4cc99d4-jlq49\" (UID: \"ba624ed0-32cc-4c87-81a5-708a8a8a7f88\") " pod="openshift-machine-api/control-plane-machine-set-operator-66f4cc99d4-jlq49" Dec 03 22:10:39.336811 master-0 kubenswrapper[36504]: I1203 22:10:39.336692 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7qqf\" (UniqueName: \"kubernetes.io/projected/bebd69d2-5b0f-4b66-8722-d6861eba3e12-kube-api-access-n7qqf\") pod \"cluster-monitoring-operator-69cc794c58-vns7s\" (UID: \"bebd69d2-5b0f-4b66-8722-d6861eba3e12\") " pod="openshift-monitoring/cluster-monitoring-operator-69cc794c58-vns7s" Dec 03 22:10:39.352816 master-0 kubenswrapper[36504]: I1203 22:10:39.352645 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2krlg\" (UniqueName: \"kubernetes.io/projected/a4399d20-f9a6-4ab1-86be-e2845394eaba-kube-api-access-2krlg\") pod \"marketplace-operator-7d67745bb7-4jd6d\" (UID: \"a4399d20-f9a6-4ab1-86be-e2845394eaba\") " pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:10:39.385061 master-0 kubenswrapper[36504]: I1203 22:10:39.384948 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6m8f\" (UniqueName: \"kubernetes.io/projected/785612fc-3f78-4f1a-bc83-7afe5d3b8056-kube-api-access-j6m8f\") pod \"authentication-operator-7479ffdf48-fqnsm\" (UID: \"785612fc-3f78-4f1a-bc83-7afe5d3b8056\") " pod="openshift-authentication-operator/authentication-operator-7479ffdf48-fqnsm" Dec 03 22:10:39.394429 master-0 kubenswrapper[36504]: I1203 22:10:39.394355 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kfg5\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-kube-api-access-2kfg5\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: 
\"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 22:10:40.005031 master-0 kubenswrapper[36504]: I1203 22:10:40.004904 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntg2z\" (UniqueName: \"kubernetes.io/projected/2b014bee-5931-4856-b9e8-e38a134a1b6b-kube-api-access-ntg2z\") pod \"migrator-5bcf58cf9c-qc9zc\" (UID: \"2b014bee-5931-4856-b9e8-e38a134a1b6b\") " pod="openshift-kube-storage-version-migrator/migrator-5bcf58cf9c-qc9zc" Dec 03 22:10:40.005714 master-0 kubenswrapper[36504]: I1203 22:10:40.005448 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:40.005714 master-0 kubenswrapper[36504]: I1203 22:10:40.005563 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:40.005714 master-0 kubenswrapper[36504]: I1203 22:10:40.005637 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:40.005917 master-0 kubenswrapper[36504]: I1203 22:10:40.005832 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:40.005917 master-0 kubenswrapper[36504]: I1203 22:10:40.005866 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:40.005917 master-0 kubenswrapper[36504]: I1203 22:10:40.005908 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:40.006111 master-0 kubenswrapper[36504]: I1203 22:10:40.006079 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " 
pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:10:40.006170 master-0 kubenswrapper[36504]: I1203 22:10:40.006152 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:40.006268 master-0 kubenswrapper[36504]: I1203 22:10:40.006233 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:40.006444 master-0 kubenswrapper[36504]: I1203 22:10:40.006414 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:40.006545 master-0 kubenswrapper[36504]: I1203 22:10:40.006526 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:40.006731 master-0 kubenswrapper[36504]: I1203 22:10:40.006704 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:40.006807 master-0 kubenswrapper[36504]: I1203 22:10:40.006785 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:40.006930 master-0 kubenswrapper[36504]: I1203 22:10:40.006910 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:40.007111 master-0 kubenswrapper[36504]: I1203 22:10:40.007090 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:40.007163 master-0 
kubenswrapper[36504]: I1203 22:10:40.007110 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/452012bf-eae1-4e69-9ba1-034309e9f2c8-webhook-certs\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:10:40.080045 master-0 kubenswrapper[36504]: I1203 22:10:40.079808 36504 request.go:700] Waited for 3.855040421s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/serviceaccounts/node-exporter/token Dec 03 22:10:40.123793 master-0 kubenswrapper[36504]: I1203 22:10:40.119806 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwdzk\" (UniqueName: \"kubernetes.io/projected/bcbec7ef-0b98-4346-8c6b-c5fa37e90286-kube-api-access-dwdzk\") pod \"node-exporter-nkjnl\" (UID: \"bcbec7ef-0b98-4346-8c6b-c5fa37e90286\") " pod="openshift-monitoring/node-exporter-nkjnl" Dec 03 22:10:40.123793 master-0 kubenswrapper[36504]: I1203 22:10:40.121292 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl7tn\" (UniqueName: \"kubernetes.io/projected/54767c36-ca29-4c91-9a8a-9699ecfa4afb-kube-api-access-bl7tn\") pod \"dns-default-9skcn\" (UID: \"54767c36-ca29-4c91-9a8a-9699ecfa4afb\") " pod="openshift-dns/dns-default-9skcn" Dec 03 22:10:40.123793 master-0 kubenswrapper[36504]: I1203 22:10:40.122382 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj79k\" (UniqueName: \"kubernetes.io/projected/fa9b5917-d4f3-4372-a200-45b57412f92f-kube-api-access-pj79k\") pod \"cluster-baremetal-operator-5fdc576499-q9tf6\" (UID: \"fa9b5917-d4f3-4372-a200-45b57412f92f\") " pod="openshift-machine-api/cluster-baremetal-operator-5fdc576499-q9tf6" Dec 03 22:10:40.123793 master-0 kubenswrapper[36504]: I1203 22:10:40.122407 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94vvg\" (UniqueName: \"kubernetes.io/projected/5f088999-ec66-402e-9634-8c762206d6b4-kube-api-access-94vvg\") pod \"service-ca-operator-56f5898f45-mjdfr\" (UID: \"5f088999-ec66-402e-9634-8c762206d6b4\") " pod="openshift-service-ca-operator/service-ca-operator-56f5898f45-mjdfr" Dec 03 22:10:40.129790 master-0 kubenswrapper[36504]: I1203 22:10:40.127784 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtxdk\" (UniqueName: \"kubernetes.io/projected/da949cf7-ab12-43ff-8e45-da1c2fd46e20-kube-api-access-dtxdk\") pod \"operator-controller-controller-manager-5f78c89466-kz8nk\" (UID: \"da949cf7-ab12-43ff-8e45-da1c2fd46e20\") " pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:40.129790 master-0 kubenswrapper[36504]: I1203 22:10:40.128535 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z9vv\" (UniqueName: \"kubernetes.io/projected/9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6-kube-api-access-4z9vv\") pod \"catalogd-controller-manager-754cfd84-bnstw\" (UID: \"9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6\") " pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:40.129790 master-0 kubenswrapper[36504]: I1203 22:10:40.128959 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxl5r\" (UniqueName: 
\"kubernetes.io/projected/812401c0-d1ac-4857-b939-217b7b07f8bc-kube-api-access-mxl5r\") pod \"network-metrics-daemon-h6569\" (UID: \"812401c0-d1ac-4857-b939-217b7b07f8bc\") " pod="openshift-multus/network-metrics-daemon-h6569" Dec 03 22:10:40.129790 master-0 kubenswrapper[36504]: I1203 22:10:40.129302 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-575vn\" (UniqueName: \"kubernetes.io/projected/a9a3f403-a742-4977-901a-cf4a8eb7df5a-kube-api-access-575vn\") pod \"dns-operator-6b7bcd6566-qcg9x\" (UID: \"a9a3f403-a742-4977-901a-cf4a8eb7df5a\") " pod="openshift-dns-operator/dns-operator-6b7bcd6566-qcg9x" Dec 03 22:10:40.129790 master-0 kubenswrapper[36504]: I1203 22:10:40.129565 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq55c\" (UniqueName: \"kubernetes.io/projected/a124c14f-20c6-4df3-956f-a858de0c73c9-kube-api-access-fq55c\") pod \"machine-config-controller-74cddd4fb5-7zg56\" (UID: \"a124c14f-20c6-4df3-956f-a858de0c73c9\") " pod="openshift-machine-config-operator/machine-config-controller-74cddd4fb5-7zg56" Dec 03 22:10:40.134787 master-0 kubenswrapper[36504]: I1203 22:10:40.130682 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9427b8-d62c-45f7-97d0-1f7667ff27aa-kube-api-access\") pod \"cluster-version-operator-7c49fbfc6f-xg98g\" (UID: \"0e9427b8-d62c-45f7-97d0-1f7667ff27aa\") " pod="openshift-cluster-version/cluster-version-operator-7c49fbfc6f-xg98g" Dec 03 22:10:40.134787 master-0 kubenswrapper[36504]: I1203 22:10:40.131568 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2l8\" (UniqueName: \"kubernetes.io/projected/c8da5d44-680e-4169-abc6-607bdc37a64d-kube-api-access-pm2l8\") pod \"cluster-olm-operator-589f5cdc9d-25qxh\" (UID: \"c8da5d44-680e-4169-abc6-607bdc37a64d\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-589f5cdc9d-25qxh" Dec 03 22:10:40.134787 master-0 kubenswrapper[36504]: I1203 22:10:40.132943 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvw7p\" (UniqueName: \"kubernetes.io/projected/50076985-bbaa-4bcf-9d1a-cc25bed016a7-kube-api-access-jvw7p\") pod \"kube-storage-version-migrator-operator-67c4cff67d-vcd7b\" (UID: \"50076985-bbaa-4bcf-9d1a-cc25bed016a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-67c4cff67d-vcd7b" Dec 03 22:10:40.134787 master-0 kubenswrapper[36504]: I1203 22:10:40.134749 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvmxp\" (UniqueName: \"kubernetes.io/projected/892d5611-debf-402f-abc5-3f99aa080159-kube-api-access-bvmxp\") pod \"network-operator-6cbf58c977-zk7jw\" (UID: \"892d5611-debf-402f-abc5-3f99aa080159\") " pod="openshift-network-operator/network-operator-6cbf58c977-zk7jw" Dec 03 22:10:40.138789 master-0 kubenswrapper[36504]: I1203 22:10:40.135269 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgwxt\" (UniqueName: \"kubernetes.io/projected/62b43fe1-63f5-4d29-90a2-f36cb9e880ff-kube-api-access-wgwxt\") pod \"cluster-samples-operator-6d64b47964-66dsl\" (UID: \"62b43fe1-63f5-4d29-90a2-f36cb9e880ff\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6d64b47964-66dsl" Dec 03 22:10:40.138789 master-0 kubenswrapper[36504]: I1203 22:10:40.135393 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mw7l6\" (UniqueName: \"kubernetes.io/projected/66aa2598-f4b6-4d3a-9623-aeb707e4912b-kube-api-access-mw7l6\") pod \"node-resolver-4dx8h\" (UID: \"66aa2598-f4b6-4d3a-9623-aeb707e4912b\") " pod="openshift-dns/node-resolver-4dx8h" Dec 03 22:10:40.138789 master-0 kubenswrapper[36504]: I1203 22:10:40.135972 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0869de9b-6f5b-4c31-81ad-02a9c8888193-bound-sa-token\") pod \"ingress-operator-85dbd94574-2hxlh\" (UID: \"0869de9b-6f5b-4c31-81ad-02a9c8888193\") " pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" Dec 03 22:10:40.138789 master-0 kubenswrapper[36504]: I1203 22:10:40.136955 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx9sj\" (UniqueName: \"kubernetes.io/projected/1b47f2ef-9923-411f-9f2f-ddaea8bc7053-kube-api-access-dx9sj\") pod \"service-ca-6b8bb995f7-69t6v\" (UID: \"1b47f2ef-9923-411f-9f2f-ddaea8bc7053\") " pod="openshift-service-ca/service-ca-6b8bb995f7-69t6v" Dec 03 22:10:40.138789 master-0 kubenswrapper[36504]: I1203 22:10:40.137413 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzl8x\" (UniqueName: \"kubernetes.io/projected/787c50e1-35b5-43d7-9c26-8dd5399693d3-kube-api-access-jzl8x\") pod \"network-check-source-6964bb78b7-lntt5\" (UID: \"787c50e1-35b5-43d7-9c26-8dd5399693d3\") " pod="openshift-network-diagnostics/network-check-source-6964bb78b7-lntt5" Dec 03 22:10:40.138789 master-0 kubenswrapper[36504]: I1203 22:10:40.137924 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kvxd\" (UniqueName: \"kubernetes.io/projected/3a7e0eea-3da8-43de-87bc-d10231e7c239-kube-api-access-6kvxd\") pod \"cloud-credential-operator-7c4dc67499-jhd6n\" (UID: \"3a7e0eea-3da8-43de-87bc-d10231e7c239\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-7c4dc67499-jhd6n" Dec 03 22:10:40.139067 master-0 kubenswrapper[36504]: I1203 22:10:40.138904 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsr8k\" (UniqueName: \"kubernetes.io/projected/1a0f647a-0260-4737-8ae2-cc90d01d33d1-kube-api-access-lsr8k\") pod \"network-node-identity-r24k4\" (UID: \"1a0f647a-0260-4737-8ae2-cc90d01d33d1\") " pod="openshift-network-node-identity/network-node-identity-r24k4" Dec 03 22:10:40.142788 master-0 kubenswrapper[36504]: I1203 22:10:40.139707 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfd7g\" (UniqueName: \"kubernetes.io/projected/578b2d03-b8b3-4c75-adde-73899c472ad7-kube-api-access-gfd7g\") pod \"insights-operator-59d99f9b7b-x4tfh\" (UID: \"578b2d03-b8b3-4c75-adde-73899c472ad7\") " pod="openshift-insights/insights-operator-59d99f9b7b-x4tfh" Dec 03 22:10:40.142788 master-0 kubenswrapper[36504]: I1203 22:10:40.140889 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgrbd\" (UniqueName: \"kubernetes.io/projected/858384f3-5741-4e67-8669-2eb2b2dcaf7f-kube-api-access-qgrbd\") pod \"cluster-autoscaler-operator-7f88444875-kb5rx\" (UID: \"858384f3-5741-4e67-8669-2eb2b2dcaf7f\") " pod="openshift-machine-api/cluster-autoscaler-operator-7f88444875-kb5rx" Dec 03 22:10:40.142788 master-0 kubenswrapper[36504]: I1203 22:10:40.141917 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4clxk\" (UniqueName: 
\"kubernetes.io/projected/82055cfc-b4ce-4a00-a51d-141059947693-kube-api-access-4clxk\") pod \"etcd-operator-7978bf889c-w8hsm\" (UID: \"82055cfc-b4ce-4a00-a51d-141059947693\") " pod="openshift-etcd-operator/etcd-operator-7978bf889c-w8hsm" Dec 03 22:10:40.142788 master-0 kubenswrapper[36504]: I1203 22:10:40.142001 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08432be8-0086-48d2-a93d-7a474e96749d-kube-api-access\") pod \"kube-apiserver-operator-5b557b5f57-jxw8c\" (UID: \"08432be8-0086-48d2-a93d-7a474e96749d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-5b557b5f57-jxw8c" Dec 03 22:10:40.143002 master-0 kubenswrapper[36504]: I1203 22:10:40.142920 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx6rt\" (UniqueName: \"kubernetes.io/projected/6976b503-87da-48fc-b097-d1b315fbee3f-kube-api-access-vx6rt\") pod \"openshift-controller-manager-operator-7c4697b5f5-458zh\" (UID: \"6976b503-87da-48fc-b097-d1b315fbee3f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-7c4697b5f5-458zh" Dec 03 22:10:40.145475 master-0 kubenswrapper[36504]: I1203 22:10:40.143907 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c1ee4db7-f2d3-4064-a189-f66fd0a021eb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5f574c6c79-6mvwr\" (UID: \"c1ee4db7-f2d3-4064-a189-f66fd0a021eb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5f574c6c79-6mvwr" Dec 03 22:10:40.145475 master-0 kubenswrapper[36504]: I1203 22:10:40.144218 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89p9d\" (UniqueName: \"kubernetes.io/projected/28c42112-a09e-4b7a-b23b-c06bef69cbfb-kube-api-access-89p9d\") pod \"csi-snapshot-controller-86897dd478-g4ldp\" (UID: \"28c42112-a09e-4b7a-b23b-c06bef69cbfb\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-86897dd478-g4ldp" Dec 03 22:10:40.145475 master-0 kubenswrapper[36504]: I1203 22:10:40.144275 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8rjd\" (UniqueName: \"kubernetes.io/projected/6e96335e-1866-41c8-b128-b95e783a9be4-kube-api-access-v8rjd\") pod \"cluster-storage-operator-f84784664-hv5z8\" (UID: \"6e96335e-1866-41c8-b128-b95e783a9be4\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-f84784664-hv5z8" Dec 03 22:10:40.145475 master-0 kubenswrapper[36504]: I1203 22:10:40.145284 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s68fd\" (UniqueName: \"kubernetes.io/projected/e6d5d61a-c5de-4619-9afb-7fad63ba0525-kube-api-access-s68fd\") pod \"network-check-target-78hts\" (UID: \"e6d5d61a-c5de-4619-9afb-7fad63ba0525\") " pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 22:10:40.145475 master-0 kubenswrapper[36504]: I1203 22:10:40.145309 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzklx\" (UniqueName: \"kubernetes.io/projected/f59094ec-47dd-4547-ad41-b15a7933f461-kube-api-access-mzklx\") pod \"openshift-apiserver-operator-667484ff5-st2db\" (UID: \"f59094ec-47dd-4547-ad41-b15a7933f461\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-667484ff5-st2db" Dec 03 22:10:40.146501 master-0 kubenswrapper[36504]: I1203 22:10:40.146459 36504 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5h2wx\" (UniqueName: \"kubernetes.io/projected/246b7846-0dfd-43a8-bcfa-81e7435060dc-kube-api-access-5h2wx\") pod \"apiserver-67d47fb995-88vr2\" (UID: \"246b7846-0dfd-43a8-bcfa-81e7435060dc\") " pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:40.146990 master-0 kubenswrapper[36504]: I1203 22:10:40.146934 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkg95\" (UniqueName: \"kubernetes.io/projected/e403ab42-1840-4292-a37c-a8d4feeb54ca-kube-api-access-tkg95\") pod \"community-operators-k98b2\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:40.147503 master-0 kubenswrapper[36504]: I1203 22:10:40.147440 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/39f0e973-7864-4842-af8e-47718ab1804c-bound-sa-token\") pod \"cluster-image-registry-operator-65dc4bcb88-2vvsj\" (UID: \"39f0e973-7864-4842-af8e-47718ab1804c\") " pod="openshift-image-registry/cluster-image-registry-operator-65dc4bcb88-2vvsj" Dec 03 22:10:40.148899 master-0 kubenswrapper[36504]: I1203 22:10:40.148858 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97kqz\" (UniqueName: \"kubernetes.io/projected/698e6d87-1a58-493c-8b69-d22c89d26ac5-kube-api-access-97kqz\") pod \"router-default-54f97f57-xq6ch\" (UID: \"698e6d87-1a58-493c-8b69-d22c89d26ac5\") " pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:40.149895 master-0 kubenswrapper[36504]: I1203 22:10:40.149860 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fdfbaebe-d655-4c1e-a039-08802c5c35c5-kube-api-access\") pod \"kube-controller-manager-operator-b5dddf8f5-llvrh\" (UID: \"fdfbaebe-d655-4c1e-a039-08802c5c35c5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-b5dddf8f5-llvrh" Dec 03 22:10:40.154672 master-0 kubenswrapper[36504]: I1203 22:10:40.154626 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9q7k\" (UniqueName: \"kubernetes.io/projected/452012bf-eae1-4e69-9ba1-034309e9f2c8-kube-api-access-b9q7k\") pod \"multus-admission-controller-5bdcc987c4-5cs48\" (UID: \"452012bf-eae1-4e69-9ba1-034309e9f2c8\") " pod="openshift-multus/multus-admission-controller-5bdcc987c4-5cs48" Dec 03 22:10:40.158756 master-0 kubenswrapper[36504]: I1203 22:10:40.158725 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcq9j\" (UniqueName: \"kubernetes.io/projected/ffad8fc8-4378-44de-8864-dd2f666ade68-kube-api-access-xcq9j\") pod \"multus-additional-cni-plugins-qz5vh\" (UID: \"ffad8fc8-4378-44de-8864-dd2f666ade68\") " pod="openshift-multus/multus-additional-cni-plugins-qz5vh" Dec 03 22:10:40.176681 master-0 kubenswrapper[36504]: I1203 22:10:40.176614 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlw7s\" (UniqueName: \"kubernetes.io/projected/c807d487-5b8f-4747-87ee-df0637e2e11f-kube-api-access-nlw7s\") pod \"redhat-operators-qht46\" (UID: \"c807d487-5b8f-4747-87ee-df0637e2e11f\") " pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:40.195197 master-0 kubenswrapper[36504]: I1203 22:10:40.195119 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thgv2\" (UniqueName: 
\"kubernetes.io/projected/a9940ff5-36a6-4c04-a51d-66f7d83bea7c-kube-api-access-thgv2\") pod \"machine-config-operator-664c9d94c9-bdps5\" (UID: \"a9940ff5-36a6-4c04-a51d-66f7d83bea7c\") " pod="openshift-machine-config-operator/machine-config-operator-664c9d94c9-bdps5" Dec 03 22:10:40.214446 master-0 kubenswrapper[36504]: I1203 22:10:40.214391 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvkk\" (UniqueName: \"kubernetes.io/projected/def52ba3-77c1-4e0c-8a0d-44ff4d677607-kube-api-access-dcvkk\") pod \"cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx\" (UID: \"def52ba3-77c1-4e0c-8a0d-44ff4d677607\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx" Dec 03 22:10:40.236583 master-0 kubenswrapper[36504]: I1203 22:10:40.236532 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt8l5\" (UniqueName: \"kubernetes.io/projected/f6498ac1-7d07-4a5f-a968-d8bda72d1002-kube-api-access-bt8l5\") pod \"controller-manager-77778bd57c-xdhvs\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:40.243130 master-0 kubenswrapper[36504]: I1203 22:10:40.242895 36504 scope.go:117] "RemoveContainer" containerID="e64206fa7e2ac5064c9f543b141703a3c454e2abcc78e0ccc7a94ea5a9ffd082" Dec 03 22:10:40.266563 master-0 kubenswrapper[36504]: I1203 22:10:40.266436 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v28xw\" (UniqueName: \"kubernetes.io/projected/01b80ad5-7d7c-4ecd-90b0-2913d4559b5f-kube-api-access-v28xw\") pod \"machine-approver-cb84b9cdf-wkcnd\" (UID: \"01b80ad5-7d7c-4ecd-90b0-2913d4559b5f\") " pod="openshift-cluster-machine-approver/machine-approver-cb84b9cdf-wkcnd" Dec 03 22:10:40.332523 master-0 kubenswrapper[36504]: I1203 22:10:40.332479 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9pcw\" (UniqueName: \"kubernetes.io/projected/10fc6516-cd4d-4291-a26d-8376ba0affef-kube-api-access-h9pcw\") pod \"redhat-marketplace-tcqzq\" (UID: \"10fc6516-cd4d-4291-a26d-8376ba0affef\") " pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:40.341161 master-0 kubenswrapper[36504]: I1203 22:10:40.341104 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjtq\" (UniqueName: \"kubernetes.io/projected/ac3d3235-531e-4c7d-9fc9-e65c97970d0f-kube-api-access-hzjtq\") pod \"olm-operator-76bd5d69c7-6tjzq\" (UID: \"ac3d3235-531e-4c7d-9fc9-e65c97970d0f\") " pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:40.353450 master-0 kubenswrapper[36504]: I1203 22:10:40.353410 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtx6m\" (UniqueName: \"kubernetes.io/projected/8b56c318-09b7-47f0-a7bf-32eb96e836ca-kube-api-access-qtx6m\") pod \"machine-api-operator-7486ff55f-w9xk2\" (UID: \"8b56c318-09b7-47f0-a7bf-32eb96e836ca\") " pod="openshift-machine-api/machine-api-operator-7486ff55f-w9xk2" Dec 03 22:10:40.354229 master-0 kubenswrapper[36504]: I1203 22:10:40.354177 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvzqm\" (UniqueName: \"kubernetes.io/projected/f9a3f900-60e4-49c2-85ec-88d19852d1b9-kube-api-access-jvzqm\") pod \"catalog-operator-7cf5cf757f-shpjd\" (UID: \"f9a3f900-60e4-49c2-85ec-88d19852d1b9\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:40.357691 master-0 kubenswrapper[36504]: I1203 22:10:40.357654 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npkww\" (UniqueName: \"kubernetes.io/projected/40f8e70d-5f98-47f1-afa8-ea67242981fc-kube-api-access-npkww\") pod \"metrics-server-b9f5dccb6-4h4jv\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:40.372112 master-0 kubenswrapper[36504]: I1203 22:10:40.372072 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll9bs\" (UniqueName: \"kubernetes.io/projected/bd18a700-53b2-430c-a34f-dbb6331cfbe5-kube-api-access-ll9bs\") pod \"machine-config-daemon-j9wwr\" (UID: \"bd18a700-53b2-430c-a34f-dbb6331cfbe5\") " pod="openshift-machine-config-operator/machine-config-daemon-j9wwr" Dec 03 22:10:40.396712 master-0 kubenswrapper[36504]: I1203 22:10:40.396662 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hfrr\" (UniqueName: \"kubernetes.io/projected/2d592f19-c7b9-4b29-9ca2-848572067908-kube-api-access-8hfrr\") pod \"kube-state-metrics-7dcc7f9bd6-kldf9\" (UID: \"2d592f19-c7b9-4b29-9ca2-848572067908\") " pod="openshift-monitoring/kube-state-metrics-7dcc7f9bd6-kldf9" Dec 03 22:10:40.417733 master-0 kubenswrapper[36504]: I1203 22:10:40.417659 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tqg\" (UniqueName: \"kubernetes.io/projected/64856d96-023f-46db-819c-02f1adea5aab-kube-api-access-h2tqg\") pod \"route-controller-manager-8667dd96f5-qf2rc\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:40.434424 master-0 kubenswrapper[36504]: I1203 22:10:40.434349 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s87hj\" (UniqueName: \"kubernetes.io/projected/77e36f4e-845b-4b82-8abc-b634636c087a-kube-api-access-s87hj\") pod \"packageserver-684c49c488-fpmzc\" (UID: \"77e36f4e-845b-4b82-8abc-b634636c087a\") " pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:40.457179 master-0 kubenswrapper[36504]: I1203 22:10:40.457116 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnfgr\" (UniqueName: \"kubernetes.io/projected/d0be52f3-b318-4630-b4da-f3c4a57d5818-kube-api-access-qnfgr\") pod \"openshift-state-metrics-57cbc648f8-rhf8p\" (UID: \"d0be52f3-b318-4630-b4da-f3c4a57d5818\") " pod="openshift-monitoring/openshift-state-metrics-57cbc648f8-rhf8p" Dec 03 22:10:40.506818 master-0 kubenswrapper[36504]: I1203 22:10:40.506728 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b658f\" (UniqueName: \"kubernetes.io/projected/66f7b08c-52e8-4795-9cf0-74402a9cc0bb-kube-api-access-b658f\") pod \"machine-config-server-vgm8c\" (UID: \"66f7b08c-52e8-4795-9cf0-74402a9cc0bb\") " pod="openshift-machine-config-operator/machine-config-server-vgm8c" Dec 03 22:10:40.507021 master-0 kubenswrapper[36504]: I1203 22:10:40.506874 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcb88\" (UniqueName: \"kubernetes.io/projected/d6fafa97-812d-4588-95f8-7c4d85f53098-kube-api-access-gcb88\") pod \"prometheus-operator-565bdcb8-7s9vg\" (UID: \"d6fafa97-812d-4588-95f8-7c4d85f53098\") " 
pod="openshift-monitoring/prometheus-operator-565bdcb8-7s9vg" Dec 03 22:10:40.552320 master-0 kubenswrapper[36504]: E1203 22:10:40.552159 36504 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:40.552320 master-0 kubenswrapper[36504]: E1203 22:10:40.552214 36504 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-5-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:40.552320 master-0 kubenswrapper[36504]: E1203 22:10:40.552295 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access podName:5c8c7291-3150-46a5-9d14-57a23bb51cc0 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:41.052270444 +0000 UTC m=+6.272042451 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access") pod "installer-5-master-0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:40.557677 master-0 kubenswrapper[36504]: I1203 22:10:40.557632 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dfd\" (UniqueName: \"kubernetes.io/projected/0a49c320-f31d-4f6d-98c3-48d24346b873-kube-api-access-s7dfd\") pod \"certified-operators-kp794\" (UID: \"0a49c320-f31d-4f6d-98c3-48d24346b873\") " pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:40.559758 master-0 kubenswrapper[36504]: I1203 22:10:40.559720 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2xtm\" (UniqueName: \"kubernetes.io/projected/7c8ec36d-9179-40ab-a448-440b4501b3e0-kube-api-access-t2xtm\") pod \"ingress-canary-qsfnw\" (UID: \"7c8ec36d-9179-40ab-a448-440b4501b3e0\") " pod="openshift-ingress-canary/ingress-canary-qsfnw" Dec 03 22:10:40.974871 master-0 kubenswrapper[36504]: I1203 22:10:40.971566 36504 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Dec 03 22:10:40.974871 master-0 kubenswrapper[36504]: I1203 22:10:40.971760 36504 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: E1203 22:10:40.992838 36504 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.898s" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.992936 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993086 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993124 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993201 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993229 36504 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993305 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993361 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-68c95b6cf5-2cs5d" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993419 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993447 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993542 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993593 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-75b4d49d4c-psjj5" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993619 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993644 36504 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="ea6b79e2-24a1-4ee2-8309-70660598fa75" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.993923 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.994664 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:40.995277 master-0 kubenswrapper[36504]: I1203 22:10:40.995026 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-9skcn" Dec 03 22:10:41.009878 master-0 kubenswrapper[36504]: I1203 22:10:41.009695 36504 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Dec 03 22:10:41.013353 master-0 kubenswrapper[36504]: I1203 22:10:41.013288 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Dec 03 22:10:41.013353 master-0 kubenswrapper[36504]: I1203 22:10:41.013342 36504 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="ea6b79e2-24a1-4ee2-8309-70660598fa75" Dec 03 22:10:41.013675 master-0 kubenswrapper[36504]: I1203 22:10:41.013408 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:41.013675 master-0 kubenswrapper[36504]: I1203 22:10:41.013460 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-dns/dns-default-9skcn" Dec 03 22:10:41.013675 master-0 kubenswrapper[36504]: I1203 22:10:41.013592 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:41.014926 master-0 kubenswrapper[36504]: I1203 22:10:41.014860 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:41.015034 master-0 kubenswrapper[36504]: I1203 22:10:41.015019 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:41.015114 master-0 kubenswrapper[36504]: I1203 22:10:41.015047 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Dec 03 22:10:41.015896 master-0 kubenswrapper[36504]: I1203 22:10:41.015262 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:41.015896 master-0 kubenswrapper[36504]: I1203 22:10:41.015630 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:41.016284 master-0 kubenswrapper[36504]: I1203 22:10:41.016214 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Dec 03 22:10:41.016653 master-0 kubenswrapper[36504]: I1203 22:10:41.016292 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 22:10:41.016653 master-0 kubenswrapper[36504]: I1203 22:10:41.016418 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Dec 03 22:10:41.016653 master-0 kubenswrapper[36504]: I1203 22:10:41.016459 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-6d4cbfb4b-rqszb" Dec 03 22:10:41.016653 master-0 kubenswrapper[36504]: I1203 22:10:41.016517 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:41.016653 master-0 kubenswrapper[36504]: I1203 22:10:41.016567 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:10:41.016653 master-0 kubenswrapper[36504]: I1203 22:10:41.016616 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:41.031956 master-0 kubenswrapper[36504]: I1203 22:10:41.031479 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/5.log" Dec 03 22:10:41.032203 master-0 kubenswrapper[36504]: I1203 22:10:41.032068 36504 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:10:41.032203 master-0 kubenswrapper[36504]: I1203 22:10:41.032090 36504 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:10:41.032203 master-0 kubenswrapper[36504]: I1203 22:10:41.032124 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-85dbd94574-2hxlh" 
event={"ID":"0869de9b-6f5b-4c31-81ad-02a9c8888193","Type":"ContainerStarted","Data":"42aa1c19c5ad4f07f80b8e66b25f81f39d7836ee577a1529052f99b0e690803f"} Dec 03 22:10:41.048386 master-0 kubenswrapper[36504]: I1203 22:10:41.048325 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Dec 03 22:10:41.082647 master-0 kubenswrapper[36504]: I1203 22:10:41.082581 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:41.136486 master-0 kubenswrapper[36504]: I1203 22:10:41.136405 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:41.138344 master-0 kubenswrapper[36504]: E1203 22:10:41.138304 36504 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:41.138344 master-0 kubenswrapper[36504]: E1203 22:10:41.138333 36504 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-5-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:41.138484 master-0 kubenswrapper[36504]: E1203 22:10:41.138379 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access podName:5c8c7291-3150-46a5-9d14-57a23bb51cc0 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:42.138362645 +0000 UTC m=+7.358134692 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access") pod "installer-5-master-0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:41.267398 master-0 kubenswrapper[36504]: I1203 22:10:41.267259 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:41.267398 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:41.267398 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:41.267398 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:41.267398 master-0 kubenswrapper[36504]: I1203 22:10:41.267352 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:41.520189 master-0 kubenswrapper[36504]: I1203 22:10:41.519978 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:41.521635 master-0 kubenswrapper[36504]: I1203 22:10:41.521559 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-754cfd84-bnstw" Dec 03 22:10:41.575124 master-0 kubenswrapper[36504]: I1203 22:10:41.574972 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=6.574938166 podStartE2EDuration="6.574938166s" podCreationTimestamp="2025-12-03 22:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:41.574378748 +0000 UTC m=+6.794150815" watchObservedRunningTime="2025-12-03 22:10:41.574938166 +0000 UTC m=+6.794710213" Dec 03 22:10:42.166634 master-0 kubenswrapper[36504]: I1203 22:10:42.166505 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:42.167688 master-0 kubenswrapper[36504]: E1203 22:10:42.166740 36504 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:42.167688 master-0 kubenswrapper[36504]: E1203 22:10:42.166794 36504 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-5-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:42.167688 master-0 kubenswrapper[36504]: E1203 22:10:42.166875 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access podName:5c8c7291-3150-46a5-9d14-57a23bb51cc0 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:44.166852569 +0000 UTC m=+9.386624596 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access") pod "installer-5-master-0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:42.261545 master-0 kubenswrapper[36504]: I1203 22:10:42.261469 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:42.261545 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:42.261545 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:42.261545 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:42.262047 master-0 kubenswrapper[36504]: I1203 22:10:42.261556 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:42.357363 master-0 kubenswrapper[36504]: I1203 22:10:42.357281 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:42.421742 master-0 kubenswrapper[36504]: I1203 22:10:42.421539 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:42.427844 master-0 kubenswrapper[36504]: I1203 22:10:42.427734 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:10:42.716942 master-0 kubenswrapper[36504]: I1203 22:10:42.716690 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:10:42.744042 master-0 kubenswrapper[36504]: I1203 22:10:42.743912 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=7.743894115 podStartE2EDuration="7.743894115s" podCreationTimestamp="2025-12-03 22:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:10:42.742540332 +0000 UTC m=+7.962312349" watchObservedRunningTime="2025-12-03 22:10:42.743894115 +0000 UTC m=+7.963666122" Dec 03 22:10:42.776668 master-0 kubenswrapper[36504]: I1203 22:10:42.776409 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:42.782282 master-0 kubenswrapper[36504]: I1203 22:10:42.782227 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:42.787632 master-0 kubenswrapper[36504]: I1203 22:10:42.787585 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:42.790698 master-0 kubenswrapper[36504]: I1203 22:10:42.790661 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-5f78c89466-kz8nk" Dec 03 22:10:43.260532 master-0 
kubenswrapper[36504]: I1203 22:10:43.260482 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:43.260532 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:43.260532 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:43.260532 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:43.261136 master-0 kubenswrapper[36504]: I1203 22:10:43.260543 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:43.281848 master-0 kubenswrapper[36504]: I1203 22:10:43.278692 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:43.287829 master-0 kubenswrapper[36504]: I1203 22:10:43.285001 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-76bd5d69c7-6tjzq" Dec 03 22:10:43.665640 master-0 kubenswrapper[36504]: I1203 22:10:43.665559 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:43.672269 master-0 kubenswrapper[36504]: I1203 22:10:43.672225 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:10:44.198013 master-0 kubenswrapper[36504]: I1203 22:10:44.197923 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:44.198415 master-0 kubenswrapper[36504]: E1203 22:10:44.198359 36504 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:44.198415 master-0 kubenswrapper[36504]: E1203 22:10:44.198412 36504 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-5-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:44.198547 master-0 kubenswrapper[36504]: E1203 22:10:44.198490 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access podName:5c8c7291-3150-46a5-9d14-57a23bb51cc0 nodeName:}" failed. No retries permitted until 2025-12-03 22:10:48.198466141 +0000 UTC m=+13.418238208 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access") pod "installer-5-master-0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:44.260717 master-0 kubenswrapper[36504]: I1203 22:10:44.260613 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:44.260717 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:44.260717 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:44.260717 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:44.260717 master-0 kubenswrapper[36504]: I1203 22:10:44.260697 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:44.290443 master-0 kubenswrapper[36504]: I1203 22:10:44.290354 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:44.294270 master-0 kubenswrapper[36504]: I1203 22:10:44.294203 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:10:45.011173 master-0 kubenswrapper[36504]: I1203 22:10:45.011131 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:45.017998 master-0 kubenswrapper[36504]: I1203 22:10:45.017938 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-64554dd846-6vfz6" Dec 03 22:10:45.134014 master-0 kubenswrapper[36504]: I1203 22:10:45.133958 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:45.143042 master-0 kubenswrapper[36504]: I1203 22:10:45.141008 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:45.260196 master-0 kubenswrapper[36504]: I1203 22:10:45.260129 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:45.260196 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:45.260196 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:45.260196 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:45.260196 master-0 kubenswrapper[36504]: I1203 22:10:45.260180 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:45.262106 master-0 kubenswrapper[36504]: I1203 22:10:45.262015 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-oauth-apiserver/apiserver-67d47fb995-88vr2" Dec 03 22:10:45.894876 master-0 kubenswrapper[36504]: I1203 22:10:45.894788 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:45.896866 master-0 kubenswrapper[36504]: I1203 22:10:45.896823 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:45.897071 master-0 kubenswrapper[36504]: I1203 22:10:45.897028 36504 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:10:45.897071 master-0 kubenswrapper[36504]: I1203 22:10:45.897065 36504 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:10:45.907808 master-0 kubenswrapper[36504]: I1203 22:10:45.903481 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:10:45.944815 master-0 kubenswrapper[36504]: I1203 22:10:45.944425 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:46.008175 master-0 kubenswrapper[36504]: I1203 22:10:46.007949 36504 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 22:10:46.008413 master-0 kubenswrapper[36504]: I1203 22:10:46.008191 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="f08d11be0e2919664ff2ea4b2440d0e0" containerName="startup-monitor" containerID="cri-o://42ac76745cd48697a9d60b7a3008b7dd9f6c94eb9ad7c1bc7b99f348cd44c91a" gracePeriod=5 Dec 03 22:10:46.074725 master-0 kubenswrapper[36504]: I1203 22:10:46.074662 36504 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:10:46.260731 master-0 kubenswrapper[36504]: I1203 22:10:46.260598 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:46.260731 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:46.260731 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:46.260731 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:46.260731 master-0 kubenswrapper[36504]: I1203 22:10:46.260682 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:46.310874 master-0 kubenswrapper[36504]: I1203 22:10:46.310801 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:47.261707 master-0 kubenswrapper[36504]: I1203 22:10:47.260853 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:47.261707 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:47.261707 master-0 kubenswrapper[36504]: [+]process-running ok 
Dec 03 22:10:47.261707 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:47.261707 master-0 kubenswrapper[36504]: I1203 22:10:47.260957 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:47.325541 master-0 kubenswrapper[36504]: I1203 22:10:47.325483 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:47.377308 master-0 kubenswrapper[36504]: I1203 22:10:47.377137 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:47.705112 master-0 kubenswrapper[36504]: I1203 22:10:47.705045 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:47.924245 master-0 kubenswrapper[36504]: I1203 22:10:47.924174 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:47.984080 master-0 kubenswrapper[36504]: I1203 22:10:47.982837 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:48.004193 master-0 kubenswrapper[36504]: I1203 22:10:48.004101 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:48.067342 master-0 kubenswrapper[36504]: I1203 22:10:48.067221 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k98b2" Dec 03 22:10:48.133878 master-0 kubenswrapper[36504]: I1203 22:10:48.133691 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:48.137867 master-0 kubenswrapper[36504]: I1203 22:10:48.137820 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-684c49c488-fpmzc" Dec 03 22:10:48.261321 master-0 kubenswrapper[36504]: I1203 22:10:48.259206 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:48.261321 master-0 kubenswrapper[36504]: E1203 22:10:48.260742 36504 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:48.261321 master-0 kubenswrapper[36504]: E1203 22:10:48.260762 36504 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-5-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:48.261321 master-0 kubenswrapper[36504]: E1203 22:10:48.260825 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access podName:5c8c7291-3150-46a5-9d14-57a23bb51cc0 nodeName:}" failed. 
No retries permitted until 2025-12-03 22:10:56.260809557 +0000 UTC m=+21.480581584 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access") pod "installer-5-master-0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:48.264918 master-0 kubenswrapper[36504]: I1203 22:10:48.263573 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:48.264918 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:48.264918 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:48.264918 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:48.264918 master-0 kubenswrapper[36504]: I1203 22:10:48.263624 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.241723 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jsp5g"] Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242034 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0269ada6-cb6e-4c98-bd24-752ae0286498" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242049 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0269ada6-cb6e-4c98-bd24-752ae0286498" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242079 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242087 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242101 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242110 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242126 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70e52a8c-7f9e-47fa-85ca-41f90dcb9747" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242135 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e52a8c-7f9e-47fa-85ca-41f90dcb9747" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242157 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dcf886d-2028-4acd-83ac-850c4b278810" containerName="collect-profiles" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242165 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dcf886d-2028-4acd-83ac-850c4b278810" 
containerName="collect-profiles" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242180 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d60f02e-1803-461e-9606-667d91fcae14" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242188 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d60f02e-1803-461e-9606-667d91fcae14" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242203 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2364d3-47b2-4784-9c42-76bf2547b797" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242211 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2364d3-47b2-4784-9c42-76bf2547b797" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242227 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242237 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242251 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="237bf861-d24e-4fd7-9aee-24b6a79cd6c2" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242259 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="237bf861-d24e-4fd7-9aee-24b6a79cd6c2" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242271 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3497f5dd-4c6f-4108-a948-481cef475ba9" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242279 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3497f5dd-4c6f-4108-a948-481cef475ba9" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242296 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08d11be0e2919664ff2ea4b2440d0e0" containerName="startup-monitor" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242304 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08d11be0e2919664ff2ea4b2440d0e0" containerName="startup-monitor" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242321 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerName="assisted-installer-controller" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242331 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerName="assisted-installer-controller" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242346 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea59f59-8970-4eea-994d-9763792ee704" containerName="collect-profiles" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242355 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea59f59-8970-4eea-994d-9763792ee704" containerName="collect-profiles" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242366 36504 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="25602d69-3aec-487d-8d62-c2c21f27e2b7" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242374 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="25602d69-3aec-487d-8d62-c2c21f27e2b7" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242391 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8c7291-3150-46a5-9d14-57a23bb51cc0" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242400 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8c7291-3150-46a5-9d14-57a23bb51cc0" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242417 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ee3291-2979-4903-98a2-355855cedd55" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242426 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ee3291-2979-4903-98a2-355855cedd55" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242440 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13238af3704fe583f617f61e755cf4c2" containerName="setup" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242448 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="13238af3704fe583f617f61e755cf4c2" containerName="setup" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: E1203 22:10:49.242466 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dada903-4b2b-450a-a55f-502ff892fd9f" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242474 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dada903-4b2b-450a-a55f-502ff892fd9f" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242606 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="25602d69-3aec-487d-8d62-c2c21f27e2b7" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242634 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver-insecure-readyz" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242656 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="237bf861-d24e-4fd7-9aee-24b6a79cd6c2" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242678 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dada903-4b2b-450a-a55f-502ff892fd9f" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242695 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08d11be0e2919664ff2ea4b2440d0e0" containerName="startup-monitor" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242706 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="3497f5dd-4c6f-4108-a948-481cef475ba9" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242722 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="05dd6e8e0dea56089da96190349dd4c1" containerName="cluster-policy-controller" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242745 36504 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="31ee3291-2979-4903-98a2-355855cedd55" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242970 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c2364d3-47b2-4784-9c42-76bf2547b797" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.242989 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="13238af3704fe583f617f61e755cf4c2" containerName="setup" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.243012 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c3edb2-12e8-45b0-99ac-9a794dd2881d" containerName="assisted-installer-controller" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.243035 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8c7291-3150-46a5-9d14-57a23bb51cc0" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.243055 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dcf886d-2028-4acd-83ac-850c4b278810" containerName="collect-profiles" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.243069 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="13238af3704fe583f617f61e755cf4c2" containerName="kube-apiserver" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.243087 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea59f59-8970-4eea-994d-9763792ee704" containerName="collect-profiles" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.243101 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0269ada6-cb6e-4c98-bd24-752ae0286498" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.243115 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d60f02e-1803-461e-9606-667d91fcae14" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.243125 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="70e52a8c-7f9e-47fa-85ca-41f90dcb9747" containerName="installer" Dec 03 22:10:49.243881 master-0 kubenswrapper[36504]: I1203 22:10:49.243743 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.247278 master-0 kubenswrapper[36504]: I1203 22:10:49.247180 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 22:10:49.248851 master-0 kubenswrapper[36504]: I1203 22:10:49.248803 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-l4d7p" Dec 03 22:10:49.262459 master-0 kubenswrapper[36504]: I1203 22:10:49.262256 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:49.262459 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:49.262459 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:49.262459 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:49.262459 master-0 kubenswrapper[36504]: I1203 22:10:49.262331 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:49.373322 master-0 kubenswrapper[36504]: I1203 22:10:49.373269 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78nqb\" (UniqueName: \"kubernetes.io/projected/688984b6-3531-488d-a769-908bb88fa342-kube-api-access-78nqb\") pod \"node-ca-jsp5g\" (UID: \"688984b6-3531-488d-a769-908bb88fa342\") " pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.373554 master-0 kubenswrapper[36504]: I1203 22:10:49.373359 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/688984b6-3531-488d-a769-908bb88fa342-host\") pod \"node-ca-jsp5g\" (UID: \"688984b6-3531-488d-a769-908bb88fa342\") " pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.373554 master-0 kubenswrapper[36504]: I1203 22:10:49.373417 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/688984b6-3531-488d-a769-908bb88fa342-serviceca\") pod \"node-ca-jsp5g\" (UID: \"688984b6-3531-488d-a769-908bb88fa342\") " pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.475366 master-0 kubenswrapper[36504]: I1203 22:10:49.475281 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/688984b6-3531-488d-a769-908bb88fa342-serviceca\") pod \"node-ca-jsp5g\" (UID: \"688984b6-3531-488d-a769-908bb88fa342\") " pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.475630 master-0 kubenswrapper[36504]: I1203 22:10:49.475398 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78nqb\" (UniqueName: \"kubernetes.io/projected/688984b6-3531-488d-a769-908bb88fa342-kube-api-access-78nqb\") pod \"node-ca-jsp5g\" (UID: \"688984b6-3531-488d-a769-908bb88fa342\") " pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.475630 master-0 kubenswrapper[36504]: I1203 22:10:49.475535 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/688984b6-3531-488d-a769-908bb88fa342-host\") pod \"node-ca-jsp5g\" (UID: \"688984b6-3531-488d-a769-908bb88fa342\") " pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.475789 master-0 kubenswrapper[36504]: I1203 22:10:49.475695 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/688984b6-3531-488d-a769-908bb88fa342-host\") pod \"node-ca-jsp5g\" (UID: \"688984b6-3531-488d-a769-908bb88fa342\") " pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.476695 master-0 kubenswrapper[36504]: I1203 22:10:49.476633 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/688984b6-3531-488d-a769-908bb88fa342-serviceca\") pod \"node-ca-jsp5g\" (UID: \"688984b6-3531-488d-a769-908bb88fa342\") " pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.490727 master-0 kubenswrapper[36504]: I1203 22:10:49.490680 36504 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 22:10:49.493309 master-0 kubenswrapper[36504]: I1203 22:10:49.493276 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78nqb\" (UniqueName: \"kubernetes.io/projected/688984b6-3531-488d-a769-908bb88fa342-kube-api-access-78nqb\") pod \"node-ca-jsp5g\" (UID: \"688984b6-3531-488d-a769-908bb88fa342\") " pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.575486 master-0 kubenswrapper[36504]: I1203 22:10:49.575314 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jsp5g" Dec 03 22:10:49.605011 master-0 kubenswrapper[36504]: W1203 22:10:49.604955 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod688984b6_3531_488d_a769_908bb88fa342.slice/crio-d3101a72a16d34ea7c4eb99d092ca37b467f4d1be70e7bf279d59cc411dd85e0 WatchSource:0}: Error finding container d3101a72a16d34ea7c4eb99d092ca37b467f4d1be70e7bf279d59cc411dd85e0: Status 404 returned error can't find the container with id d3101a72a16d34ea7c4eb99d092ca37b467f4d1be70e7bf279d59cc411dd85e0 Dec 03 22:10:49.607641 master-0 kubenswrapper[36504]: I1203 22:10:49.607616 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:10:49.620248 master-0 kubenswrapper[36504]: I1203 22:10:49.620156 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 22:10:49.625231 master-0 kubenswrapper[36504]: I1203 22:10:49.625116 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-78hts" Dec 03 22:10:50.006891 master-0 kubenswrapper[36504]: I1203 22:10:50.006824 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:10:50.010268 master-0 kubenswrapper[36504]: I1203 22:10:50.010227 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:10:50.137983 master-0 kubenswrapper[36504]: I1203 22:10:50.137501 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jsp5g" 
event={"ID":"688984b6-3531-488d-a769-908bb88fa342","Type":"ContainerStarted","Data":"d3101a72a16d34ea7c4eb99d092ca37b467f4d1be70e7bf279d59cc411dd85e0"} Dec 03 22:10:50.262739 master-0 kubenswrapper[36504]: I1203 22:10:50.261659 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:50.262739 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:50.262739 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:50.262739 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:50.262739 master-0 kubenswrapper[36504]: I1203 22:10:50.261790 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:50.266827 master-0 kubenswrapper[36504]: I1203 22:10:50.266785 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:50.272031 master-0 kubenswrapper[36504]: I1203 22:10:50.271976 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7cf5cf757f-shpjd" Dec 03 22:10:50.431591 master-0 kubenswrapper[36504]: I1203 22:10:50.431423 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:50.477507 master-0 kubenswrapper[36504]: I1203 22:10:50.477098 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kp794" Dec 03 22:10:50.541152 master-0 kubenswrapper[36504]: I1203 22:10:50.540501 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:50.605096 master-0 kubenswrapper[36504]: I1203 22:10:50.604892 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qht46" Dec 03 22:10:51.138363 master-0 kubenswrapper[36504]: I1203 22:10:51.138317 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:51.144030 master-0 kubenswrapper[36504]: I1203 22:10:51.143995 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f08d11be0e2919664ff2ea4b2440d0e0/startup-monitor/0.log" Dec 03 22:10:51.144116 master-0 kubenswrapper[36504]: I1203 22:10:51.144041 36504 generic.go:334] "Generic (PLEG): container finished" podID="f08d11be0e2919664ff2ea4b2440d0e0" containerID="42ac76745cd48697a9d60b7a3008b7dd9f6c94eb9ad7c1bc7b99f348cd44c91a" exitCode=137 Dec 03 22:10:51.228753 master-0 kubenswrapper[36504]: I1203 22:10:51.227577 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tcqzq" Dec 03 22:10:51.261153 master-0 kubenswrapper[36504]: I1203 22:10:51.261054 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http 
failed: reason withheld Dec 03 22:10:51.261153 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:51.261153 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:51.261153 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:51.261153 master-0 kubenswrapper[36504]: I1203 22:10:51.261115 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:51.731125 master-0 kubenswrapper[36504]: I1203 22:10:51.731032 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:51.731797 master-0 kubenswrapper[36504]: I1203 22:10:51.731349 36504 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:10:51.751961 master-0 kubenswrapper[36504]: I1203 22:10:51.751938 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f08d11be0e2919664ff2ea4b2440d0e0/startup-monitor/0.log" Dec 03 22:10:51.752058 master-0 kubenswrapper[36504]: I1203 22:10:51.752015 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:51.758714 master-0 kubenswrapper[36504]: I1203 22:10:51.758471 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k2j45" Dec 03 22:10:51.843510 master-0 kubenswrapper[36504]: I1203 22:10:51.843378 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-manifests\") pod \"f08d11be0e2919664ff2ea4b2440d0e0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " Dec 03 22:10:51.843510 master-0 kubenswrapper[36504]: I1203 22:10:51.843455 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-log\") pod \"f08d11be0e2919664ff2ea4b2440d0e0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " Dec 03 22:10:51.843510 master-0 kubenswrapper[36504]: I1203 22:10:51.843500 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-resource-dir\") pod \"f08d11be0e2919664ff2ea4b2440d0e0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.843525 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-manifests" (OuterVolumeSpecName: "manifests") pod "f08d11be0e2919664ff2ea4b2440d0e0" (UID: "f08d11be0e2919664ff2ea4b2440d0e0"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.843543 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-pod-resource-dir\") pod \"f08d11be0e2919664ff2ea4b2440d0e0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.843755 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-log" (OuterVolumeSpecName: "var-log") pod "f08d11be0e2919664ff2ea4b2440d0e0" (UID: "f08d11be0e2919664ff2ea4b2440d0e0"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.843828 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f08d11be0e2919664ff2ea4b2440d0e0" (UID: "f08d11be0e2919664ff2ea4b2440d0e0"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.845699 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-lock" (OuterVolumeSpecName: "var-lock") pod "f08d11be0e2919664ff2ea4b2440d0e0" (UID: "f08d11be0e2919664ff2ea4b2440d0e0"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.845664 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-lock\") pod \"f08d11be0e2919664ff2ea4b2440d0e0\" (UID: \"f08d11be0e2919664ff2ea4b2440d0e0\") " Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.848369 36504 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-manifests\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.848390 36504 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-log\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.848401 36504 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:51.848716 master-0 kubenswrapper[36504]: I1203 22:10:51.848413 36504 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:51.849236 master-0 kubenswrapper[36504]: I1203 22:10:51.849081 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f08d11be0e2919664ff2ea4b2440d0e0" (UID: "f08d11be0e2919664ff2ea4b2440d0e0"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:10:51.956975 master-0 kubenswrapper[36504]: I1203 22:10:51.953154 36504 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f08d11be0e2919664ff2ea4b2440d0e0-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:10:52.153636 master-0 kubenswrapper[36504]: I1203 22:10:52.153584 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f08d11be0e2919664ff2ea4b2440d0e0/startup-monitor/0.log" Dec 03 22:10:52.153882 master-0 kubenswrapper[36504]: I1203 22:10:52.153691 36504 scope.go:117] "RemoveContainer" containerID="42ac76745cd48697a9d60b7a3008b7dd9f6c94eb9ad7c1bc7b99f348cd44c91a" Dec 03 22:10:52.153882 master-0 kubenswrapper[36504]: I1203 22:10:52.153844 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:10:52.156165 master-0 kubenswrapper[36504]: I1203 22:10:52.156100 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jsp5g" event={"ID":"688984b6-3531-488d-a769-908bb88fa342","Type":"ContainerStarted","Data":"a40e377b97f96e49fc7d1db8df26d260d5ec7526539eb434da143d3d3f4cf892"} Dec 03 22:10:52.180333 master-0 kubenswrapper[36504]: I1203 22:10:52.180250 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jsp5g" podStartSLOduration=1.044524107 podStartE2EDuration="3.180224371s" podCreationTimestamp="2025-12-03 22:10:49 +0000 UTC" firstStartedPulling="2025-12-03 22:10:49.607519253 +0000 UTC m=+14.827291260" lastFinishedPulling="2025-12-03 22:10:51.743219517 +0000 UTC m=+16.962991524" observedRunningTime="2025-12-03 22:10:52.178508878 +0000 UTC m=+17.398280915" watchObservedRunningTime="2025-12-03 22:10:52.180224371 +0000 UTC m=+17.399996378" Dec 03 22:10:52.209971 master-0 kubenswrapper[36504]: I1203 22:10:52.209895 36504 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f764149f-1535-4977-85f4-b3faf08cbfe8" Dec 03 22:10:52.260641 master-0 kubenswrapper[36504]: I1203 22:10:52.260549 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:52.260641 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:52.260641 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:52.260641 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:52.261058 master-0 kubenswrapper[36504]: I1203 22:10:52.260662 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:53.105348 master-0 kubenswrapper[36504]: I1203 22:10:53.105295 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f08d11be0e2919664ff2ea4b2440d0e0" path="/var/lib/kubelet/pods/f08d11be0e2919664ff2ea4b2440d0e0/volumes" Dec 03 22:10:53.106326 master-0 kubenswrapper[36504]: I1203 22:10:53.105543 36504 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Dec 03 22:10:53.128074 master-0 kubenswrapper[36504]: I1203 22:10:53.127996 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 22:10:53.128074 master-0 kubenswrapper[36504]: I1203 22:10:53.128052 36504 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f764149f-1535-4977-85f4-b3faf08cbfe8" Dec 03 22:10:53.131546 master-0 kubenswrapper[36504]: I1203 22:10:53.131500 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 22:10:53.131631 master-0 kubenswrapper[36504]: I1203 22:10:53.131543 36504 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f764149f-1535-4977-85f4-b3faf08cbfe8" Dec 03 22:10:53.261093 master-0 kubenswrapper[36504]: I1203 22:10:53.261034 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:53.261093 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:53.261093 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:53.261093 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:53.261515 master-0 kubenswrapper[36504]: I1203 22:10:53.261101 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:54.260742 master-0 kubenswrapper[36504]: I1203 22:10:54.260664 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:54.260742 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:54.260742 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:54.260742 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:54.261348 master-0 kubenswrapper[36504]: I1203 22:10:54.260815 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:55.260492 master-0 kubenswrapper[36504]: I1203 22:10:55.260412 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:55.260492 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:55.260492 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:55.260492 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:55.261281 master-0 kubenswrapper[36504]: I1203 22:10:55.260525 36504 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:56.264912 master-0 kubenswrapper[36504]: I1203 22:10:56.264835 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:56.264912 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:56.264912 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:56.264912 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:56.264912 master-0 kubenswrapper[36504]: I1203 22:10:56.264907 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:56.312455 master-0 kubenswrapper[36504]: I1203 22:10:56.312256 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:10:56.312740 master-0 kubenswrapper[36504]: E1203 22:10:56.312486 36504 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:56.312740 master-0 kubenswrapper[36504]: E1203 22:10:56.312520 36504 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-5-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:56.312740 master-0 kubenswrapper[36504]: E1203 22:10:56.312621 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access podName:5c8c7291-3150-46a5-9d14-57a23bb51cc0 nodeName:}" failed. No retries permitted until 2025-12-03 22:11:12.312601059 +0000 UTC m=+37.532373076 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access") pod "installer-5-master-0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:10:56.653306 master-0 kubenswrapper[36504]: I1203 22:10:56.653212 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Dec 03 22:10:57.260929 master-0 kubenswrapper[36504]: I1203 22:10:57.260858 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:57.260929 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:57.260929 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:57.260929 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:57.260929 master-0 kubenswrapper[36504]: I1203 22:10:57.260923 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:58.260428 master-0 kubenswrapper[36504]: I1203 22:10:58.260345 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:58.260428 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:58.260428 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:58.260428 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:58.260428 master-0 kubenswrapper[36504]: I1203 22:10:58.260421 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:10:59.261424 master-0 kubenswrapper[36504]: I1203 22:10:59.261326 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:10:59.261424 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:10:59.261424 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:10:59.261424 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:10:59.261424 master-0 kubenswrapper[36504]: I1203 22:10:59.261391 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:11:00.139464 master-0 kubenswrapper[36504]: I1203 22:11:00.139411 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-576975bfdb-gd855"] Dec 03 22:11:00.140327 master-0 kubenswrapper[36504]: I1203 22:11:00.140292 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.141880 master-0 kubenswrapper[36504]: I1203 22:11:00.141842 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-52h2hp3iuputk" Dec 03 22:11:00.153302 master-0 kubenswrapper[36504]: I1203 22:11:00.153246 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-576975bfdb-gd855"] Dec 03 22:11:00.172045 master-0 kubenswrapper[36504]: I1203 22:11:00.171979 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-b9f5dccb6-4h4jv"] Dec 03 22:11:00.172422 master-0 kubenswrapper[36504]: I1203 22:11:00.172383 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" podUID="40f8e70d-5f98-47f1-afa8-ea67242981fc" containerName="metrics-server" containerID="cri-o://eac3faec501ffdc007c07d93a9508e47b671bbce7cc0a7b3a4970c2ac98f0e4b" gracePeriod=170 Dec 03 22:11:00.180666 master-0 kubenswrapper[36504]: I1203 22:11:00.180611 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57hrm\" (UniqueName: \"kubernetes.io/projected/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-kube-api-access-57hrm\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.181138 master-0 kubenswrapper[36504]: I1203 22:11:00.181077 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-secret-metrics-client-certs\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.181564 master-0 kubenswrapper[36504]: I1203 22:11:00.181525 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-client-ca-bundle\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.181564 master-0 kubenswrapper[36504]: I1203 22:11:00.181561 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-secret-metrics-server-tls\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.181689 master-0 kubenswrapper[36504]: I1203 22:11:00.181586 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-audit-log\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.181689 master-0 kubenswrapper[36504]: I1203 22:11:00.181655 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.181689 master-0 kubenswrapper[36504]: I1203 22:11:00.181685 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-metrics-server-audit-profiles\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.203135 master-0 kubenswrapper[36504]: I1203 22:11:00.203058 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-55c7ddc498-h4b86"] Dec 03 22:11:00.206196 master-0 kubenswrapper[36504]: I1203 22:11:00.206127 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.211738 master-0 kubenswrapper[36504]: I1203 22:11:00.211653 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Dec 03 22:11:00.213373 master-0 kubenswrapper[36504]: I1203 22:11:00.212690 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Dec 03 22:11:00.213504 master-0 kubenswrapper[36504]: I1203 22:11:00.213449 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Dec 03 22:11:00.213973 master-0 kubenswrapper[36504]: I1203 22:11:00.213924 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Dec 03 22:11:00.220068 master-0 kubenswrapper[36504]: I1203 22:11:00.219992 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Dec 03 22:11:00.240105 master-0 kubenswrapper[36504]: I1203 22:11:00.239246 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-56c9b9fa8d9gs" Dec 03 22:11:00.240372 master-0 kubenswrapper[36504]: I1203 22:11:00.240143 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-55c7ddc498-h4b86"] Dec 03 22:11:00.261228 master-0 kubenswrapper[36504]: I1203 22:11:00.261167 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:11:00.261228 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:11:00.261228 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:11:00.261228 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:11:00.261904 master-0 kubenswrapper[36504]: I1203 22:11:00.261242 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:11:00.283312 master-0 kubenswrapper[36504]: I1203 22:11:00.283257 36504 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-secret-metrics-server-tls\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.283312 master-0 kubenswrapper[36504]: I1203 22:11:00.283301 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-audit-log\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.283312 master-0 kubenswrapper[36504]: I1203 22:11:00.283320 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-client-ca-bundle\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283353 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af62c74-a1a5-497c-80d8-9cb916e8e762-serving-certs-ca-bundle\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283378 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-federate-client-tls\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283399 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-telemeter-client-tls\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283423 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lcj7\" (UniqueName: \"kubernetes.io/projected/4af62c74-a1a5-497c-80d8-9cb916e8e762-kube-api-access-7lcj7\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283442 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283463 36504 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-metrics-server-audit-profiles\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283479 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af62c74-a1a5-497c-80d8-9cb916e8e762-telemeter-trusted-ca-bundle\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283503 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4af62c74-a1a5-497c-80d8-9cb916e8e762-metrics-client-ca\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283525 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57hrm\" (UniqueName: \"kubernetes.io/projected/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-kube-api-access-57hrm\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283547 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-secret-metrics-client-certs\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283564 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.283608 master-0 kubenswrapper[36504]: I1203 22:11:00.283614 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-secret-telemeter-client\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.284023 master-0 kubenswrapper[36504]: I1203 22:11:00.283962 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-audit-log\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.284872 master-0 kubenswrapper[36504]: I1203 22:11:00.284843 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.285313 master-0 kubenswrapper[36504]: I1203 22:11:00.285280 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-metrics-server-audit-profiles\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.286859 master-0 kubenswrapper[36504]: I1203 22:11:00.286817 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-secret-metrics-server-tls\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.286975 master-0 kubenswrapper[36504]: I1203 22:11:00.286944 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-client-ca-bundle\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.288002 master-0 kubenswrapper[36504]: I1203 22:11:00.287972 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-secret-metrics-client-certs\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.302121 master-0 kubenswrapper[36504]: I1203 22:11:00.302067 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57hrm\" (UniqueName: \"kubernetes.io/projected/1993ad4d-95fc-4af9-9ce3-e29112a1a3a4-kube-api-access-57hrm\") pod \"metrics-server-576975bfdb-gd855\" (UID: \"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4\") " pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.384954 master-0 kubenswrapper[36504]: I1203 22:11:00.384906 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-secret-telemeter-client\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.385204 master-0 kubenswrapper[36504]: I1203 22:11:00.384978 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af62c74-a1a5-497c-80d8-9cb916e8e762-serving-certs-ca-bundle\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.385204 master-0 kubenswrapper[36504]: I1203 22:11:00.384997 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-federate-client-tls\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.385204 master-0 kubenswrapper[36504]: I1203 22:11:00.385020 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-telemeter-client-tls\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.385204 master-0 kubenswrapper[36504]: I1203 22:11:00.385048 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lcj7\" (UniqueName: \"kubernetes.io/projected/4af62c74-a1a5-497c-80d8-9cb916e8e762-kube-api-access-7lcj7\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.385204 master-0 kubenswrapper[36504]: I1203 22:11:00.385072 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af62c74-a1a5-497c-80d8-9cb916e8e762-telemeter-trusted-ca-bundle\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.385204 master-0 kubenswrapper[36504]: I1203 22:11:00.385096 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4af62c74-a1a5-497c-80d8-9cb916e8e762-metrics-client-ca\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.385204 master-0 kubenswrapper[36504]: I1203 22:11:00.385121 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.386602 master-0 kubenswrapper[36504]: I1203 22:11:00.386549 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af62c74-a1a5-497c-80d8-9cb916e8e762-serving-certs-ca-bundle\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.386751 master-0 kubenswrapper[36504]: I1203 22:11:00.386703 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4af62c74-a1a5-497c-80d8-9cb916e8e762-metrics-client-ca\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.387132 master-0 kubenswrapper[36504]: I1203 22:11:00.387091 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4af62c74-a1a5-497c-80d8-9cb916e8e762-telemeter-trusted-ca-bundle\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.388328 master-0 kubenswrapper[36504]: I1203 22:11:00.388293 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.391790 master-0 kubenswrapper[36504]: I1203 22:11:00.391696 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-secret-telemeter-client\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.391790 master-0 kubenswrapper[36504]: I1203 22:11:00.391747 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-telemeter-client-tls\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.392373 master-0 kubenswrapper[36504]: I1203 22:11:00.392340 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4af62c74-a1a5-497c-80d8-9cb916e8e762-federate-client-tls\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.402795 master-0 kubenswrapper[36504]: I1203 22:11:00.402749 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lcj7\" (UniqueName: \"kubernetes.io/projected/4af62c74-a1a5-497c-80d8-9cb916e8e762-kube-api-access-7lcj7\") pod \"telemeter-client-55c7ddc498-h4b86\" (UID: \"4af62c74-a1a5-497c-80d8-9cb916e8e762\") " pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.461963 master-0 kubenswrapper[36504]: I1203 22:11:00.461883 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:00.548843 master-0 kubenswrapper[36504]: I1203 22:11:00.548754 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" Dec 03 22:11:00.948910 master-0 kubenswrapper[36504]: I1203 22:11:00.948834 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-576975bfdb-gd855"] Dec 03 22:11:00.972750 master-0 kubenswrapper[36504]: W1203 22:11:00.972703 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1993ad4d_95fc_4af9_9ce3_e29112a1a3a4.slice/crio-e631c3dd89087bb8495b0ac257f24e49b919387b8e9923bd2ebf2bb067747420 WatchSource:0}: Error finding container e631c3dd89087bb8495b0ac257f24e49b919387b8e9923bd2ebf2bb067747420: Status 404 returned error can't find the container with id e631c3dd89087bb8495b0ac257f24e49b919387b8e9923bd2ebf2bb067747420 Dec 03 22:11:01.246982 master-0 kubenswrapper[36504]: I1203 22:11:01.246850 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" event={"ID":"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4","Type":"ContainerStarted","Data":"ed393c790b5a9cff98a1dad86f25746dba868e8b90b173529d959ca4492fc867"} Dec 03 22:11:01.246982 master-0 kubenswrapper[36504]: I1203 22:11:01.246897 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" event={"ID":"1993ad4d-95fc-4af9-9ce3-e29112a1a3a4","Type":"ContainerStarted","Data":"e631c3dd89087bb8495b0ac257f24e49b919387b8e9923bd2ebf2bb067747420"} Dec 03 22:11:01.267917 master-0 kubenswrapper[36504]: I1203 22:11:01.267840 36504 patch_prober.go:28] interesting pod/router-default-54f97f57-xq6ch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 22:11:01.267917 master-0 kubenswrapper[36504]: [-]has-synced failed: reason withheld Dec 03 22:11:01.267917 master-0 kubenswrapper[36504]: [+]process-running ok Dec 03 22:11:01.267917 master-0 kubenswrapper[36504]: healthz check failed Dec 03 22:11:01.268623 master-0 kubenswrapper[36504]: I1203 22:11:01.267946 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-54f97f57-xq6ch" podUID="698e6d87-1a58-493c-8b69-d22c89d26ac5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 22:11:01.361751 master-0 kubenswrapper[36504]: I1203 22:11:01.361684 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" podStartSLOduration=1.361658738 podStartE2EDuration="1.361658738s" podCreationTimestamp="2025-12-03 22:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:11:01.2981199 +0000 UTC m=+26.517891927" watchObservedRunningTime="2025-12-03 22:11:01.361658738 +0000 UTC m=+26.581430745" Dec 03 22:11:01.365589 master-0 kubenswrapper[36504]: I1203 22:11:01.365549 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-55c7ddc498-h4b86"] Dec 03 22:11:01.369176 master-0 kubenswrapper[36504]: W1203 22:11:01.369126 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af62c74_a1a5_497c_80d8_9cb916e8e762.slice/crio-6a67b6ff7e7e17638f8a36e2190f664274247eb615f2e09ab153d7fbbfef1f8e WatchSource:0}: Error finding 
container 6a67b6ff7e7e17638f8a36e2190f664274247eb615f2e09ab153d7fbbfef1f8e: Status 404 returned error can't find the container with id 6a67b6ff7e7e17638f8a36e2190f664274247eb615f2e09ab153d7fbbfef1f8e Dec 03 22:11:02.259290 master-0 kubenswrapper[36504]: I1203 22:11:02.259150 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" event={"ID":"4af62c74-a1a5-497c-80d8-9cb916e8e762","Type":"ContainerStarted","Data":"6a67b6ff7e7e17638f8a36e2190f664274247eb615f2e09ab153d7fbbfef1f8e"} Dec 03 22:11:02.260993 master-0 kubenswrapper[36504]: I1203 22:11:02.260948 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:11:02.263274 master-0 kubenswrapper[36504]: I1203 22:11:02.263215 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-54f97f57-xq6ch" Dec 03 22:11:04.416748 master-0 kubenswrapper[36504]: I1203 22:11:04.416699 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-f59c45d65-v6hb2"] Dec 03 22:11:04.420620 master-0 kubenswrapper[36504]: I1203 22:11:04.420586 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.433942 master-0 kubenswrapper[36504]: I1203 22:11:04.428918 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 03 22:11:04.433942 master-0 kubenswrapper[36504]: I1203 22:11:04.429240 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 03 22:11:04.433942 master-0 kubenswrapper[36504]: I1203 22:11:04.429293 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 03 22:11:04.433942 master-0 kubenswrapper[36504]: I1203 22:11:04.428924 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 03 22:11:04.433942 master-0 kubenswrapper[36504]: I1203 22:11:04.429259 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 03 22:11:04.433942 master-0 kubenswrapper[36504]: I1203 22:11:04.429632 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-1c09f6l24b5bu" Dec 03 22:11:04.446246 master-0 kubenswrapper[36504]: I1203 22:11:04.446194 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f59c45d65-v6hb2"] Dec 03 22:11:04.451903 master-0 kubenswrapper[36504]: I1203 22:11:04.450561 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.451903 master-0 kubenswrapper[36504]: I1203 22:11:04.450633 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.451903 master-0 kubenswrapper[36504]: I1203 22:11:04.450668 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m989t\" (UniqueName: \"kubernetes.io/projected/d0e75f01-426f-4c0d-8398-76bd3edc06cb-kube-api-access-m989t\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.451903 master-0 kubenswrapper[36504]: I1203 22:11:04.450687 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-tls\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.451903 master-0 kubenswrapper[36504]: I1203 22:11:04.450730 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.451903 master-0 kubenswrapper[36504]: I1203 22:11:04.450761 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-grpc-tls\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.451903 master-0 kubenswrapper[36504]: I1203 22:11:04.450790 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0e75f01-426f-4c0d-8398-76bd3edc06cb-metrics-client-ca\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.451903 master-0 kubenswrapper[36504]: I1203 22:11:04.450828 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.553779 master-0 kubenswrapper[36504]: I1203 22:11:04.552681 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.553779 master-0 kubenswrapper[36504]: I1203 
22:11:04.552747 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.553779 master-0 kubenswrapper[36504]: I1203 22:11:04.552804 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m989t\" (UniqueName: \"kubernetes.io/projected/d0e75f01-426f-4c0d-8398-76bd3edc06cb-kube-api-access-m989t\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.553779 master-0 kubenswrapper[36504]: I1203 22:11:04.552834 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-tls\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.553779 master-0 kubenswrapper[36504]: I1203 22:11:04.552877 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.553779 master-0 kubenswrapper[36504]: I1203 22:11:04.552919 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-grpc-tls\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.553779 master-0 kubenswrapper[36504]: I1203 22:11:04.552942 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0e75f01-426f-4c0d-8398-76bd3edc06cb-metrics-client-ca\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.553779 master-0 kubenswrapper[36504]: I1203 22:11:04.552979 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.556054 master-0 kubenswrapper[36504]: I1203 22:11:04.555523 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0e75f01-426f-4c0d-8398-76bd3edc06cb-metrics-client-ca\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.558189 master-0 kubenswrapper[36504]: I1203 
22:11:04.558156 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.558306 master-0 kubenswrapper[36504]: I1203 22:11:04.558266 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-tls\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.558972 master-0 kubenswrapper[36504]: I1203 22:11:04.558910 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-grpc-tls\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.560684 master-0 kubenswrapper[36504]: I1203 22:11:04.560650 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.562148 master-0 kubenswrapper[36504]: I1203 22:11:04.562103 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.571870 master-0 kubenswrapper[36504]: I1203 22:11:04.567191 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d0e75f01-426f-4c0d-8398-76bd3edc06cb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.580604 master-0 kubenswrapper[36504]: I1203 22:11:04.580548 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m989t\" (UniqueName: \"kubernetes.io/projected/d0e75f01-426f-4c0d-8398-76bd3edc06cb-kube-api-access-m989t\") pod \"thanos-querier-f59c45d65-v6hb2\" (UID: \"d0e75f01-426f-4c0d-8398-76bd3edc06cb\") " pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:04.787614 master-0 kubenswrapper[36504]: I1203 22:11:04.787556 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:05.293264 master-0 kubenswrapper[36504]: I1203 22:11:05.293136 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" event={"ID":"4af62c74-a1a5-497c-80d8-9cb916e8e762","Type":"ContainerStarted","Data":"a39141bc00ca7172b2760e6fdfa48b92e39b32d2868f485333bf0812691db5c2"} Dec 03 22:11:05.322854 master-0 kubenswrapper[36504]: I1203 22:11:05.320838 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f59c45d65-v6hb2"] Dec 03 22:11:05.730913 master-0 kubenswrapper[36504]: I1203 22:11:05.730860 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 22:11:05.738306 master-0 kubenswrapper[36504]: I1203 22:11:05.735472 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.741820 master-0 kubenswrapper[36504]: I1203 22:11:05.740830 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 03 22:11:05.741820 master-0 kubenswrapper[36504]: I1203 22:11:05.740999 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 03 22:11:05.741820 master-0 kubenswrapper[36504]: I1203 22:11:05.741114 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 03 22:11:05.741820 master-0 kubenswrapper[36504]: I1203 22:11:05.741248 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 03 22:11:05.742386 master-0 kubenswrapper[36504]: I1203 22:11:05.742345 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 03 22:11:05.742461 master-0 kubenswrapper[36504]: I1203 22:11:05.742405 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 03 22:11:05.742889 master-0 kubenswrapper[36504]: I1203 22:11:05.742851 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 03 22:11:05.746502 master-0 kubenswrapper[36504]: I1203 22:11:05.746465 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 03 22:11:05.759788 master-0 kubenswrapper[36504]: I1203 22:11:05.759727 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 22:11:05.880175 master-0 kubenswrapper[36504]: I1203 22:11:05.880092 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880175 master-0 kubenswrapper[36504]: I1203 22:11:05.880161 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-web-config\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " 
pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880263 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880291 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-volume\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880312 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880344 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880372 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880399 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddspb\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-kube-api-access-ddspb\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880418 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880441 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 
22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880460 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-out\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.880478 master-0 kubenswrapper[36504]: I1203 22:11:05.880490 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982279 master-0 kubenswrapper[36504]: I1203 22:11:05.982121 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddspb\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-kube-api-access-ddspb\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982279 master-0 kubenswrapper[36504]: I1203 22:11:05.982193 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982279 master-0 kubenswrapper[36504]: I1203 22:11:05.982238 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982279 master-0 kubenswrapper[36504]: I1203 22:11:05.982269 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-out\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982729 master-0 kubenswrapper[36504]: I1203 22:11:05.982320 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982729 master-0 kubenswrapper[36504]: I1203 22:11:05.982353 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982729 master-0 kubenswrapper[36504]: I1203 22:11:05.982376 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-web-config\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982729 master-0 kubenswrapper[36504]: I1203 22:11:05.982421 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982729 master-0 kubenswrapper[36504]: I1203 22:11:05.982451 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-volume\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982729 master-0 kubenswrapper[36504]: I1203 22:11:05.982478 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982729 master-0 kubenswrapper[36504]: I1203 22:11:05.982521 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.982729 master-0 kubenswrapper[36504]: I1203 22:11:05.982557 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.983659 master-0 kubenswrapper[36504]: I1203 22:11:05.983610 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.984618 master-0 kubenswrapper[36504]: I1203 22:11:05.984584 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.988147 master-0 kubenswrapper[36504]: I1203 22:11:05.988093 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.988742 master-0 
kubenswrapper[36504]: I1203 22:11:05.988606 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.990404 master-0 kubenswrapper[36504]: I1203 22:11:05.990353 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.996485 master-0 kubenswrapper[36504]: I1203 22:11:05.996432 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-web-config\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.997559 master-0 kubenswrapper[36504]: I1203 22:11:05.997429 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.997559 master-0 kubenswrapper[36504]: I1203 22:11:05.997454 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-out\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:05.998448 master-0 kubenswrapper[36504]: I1203 22:11:05.998299 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:06.002436 master-0 kubenswrapper[36504]: I1203 22:11:06.002162 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:06.003400 master-0 kubenswrapper[36504]: I1203 22:11:06.003336 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-volume\") pod \"alertmanager-main-0\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:06.123439 master-0 kubenswrapper[36504]: I1203 22:11:06.122069 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddspb\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-kube-api-access-ddspb\") pod \"alertmanager-main-0\" (UID: 
\"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:06.302736 master-0 kubenswrapper[36504]: I1203 22:11:06.302576 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" event={"ID":"d0e75f01-426f-4c0d-8398-76bd3edc06cb","Type":"ContainerStarted","Data":"b75c685f3dfe308cac2dd23a209f7ae15d07fee6c6acc3f3c6135035b85750c0"} Dec 03 22:11:06.392472 master-0 kubenswrapper[36504]: I1203 22:11:06.391974 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:11:06.602054 master-0 kubenswrapper[36504]: I1203 22:11:06.601420 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-77df56447c-s242z"] Dec 03 22:11:06.615862 master-0 kubenswrapper[36504]: I1203 22:11:06.615363 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.617958 master-0 kubenswrapper[36504]: I1203 22:11:06.617917 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 22:11:06.618070 master-0 kubenswrapper[36504]: I1203 22:11:06.618016 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 22:11:06.618203 master-0 kubenswrapper[36504]: I1203 22:11:06.618175 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 22:11:06.618547 master-0 kubenswrapper[36504]: I1203 22:11:06.618518 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 22:11:06.620987 master-0 kubenswrapper[36504]: I1203 22:11:06.620952 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-77df56447c-s242z"] Dec 03 22:11:06.624164 master-0 kubenswrapper[36504]: I1203 22:11:06.624124 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 22:11:06.707036 master-0 kubenswrapper[36504]: I1203 22:11:06.706970 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d022890-8679-440b-bfb1-dbda6ee771f0-trusted-ca\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.707282 master-0 kubenswrapper[36504]: I1203 22:11:06.707078 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d022890-8679-440b-bfb1-dbda6ee771f0-config\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.707282 master-0 kubenswrapper[36504]: I1203 22:11:06.707113 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lph2\" (UniqueName: \"kubernetes.io/projected/9d022890-8679-440b-bfb1-dbda6ee771f0-kube-api-access-4lph2\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " 
pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.707282 master-0 kubenswrapper[36504]: I1203 22:11:06.707137 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d022890-8679-440b-bfb1-dbda6ee771f0-serving-cert\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.810183 master-0 kubenswrapper[36504]: I1203 22:11:06.808636 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d022890-8679-440b-bfb1-dbda6ee771f0-trusted-ca\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.810183 master-0 kubenswrapper[36504]: I1203 22:11:06.808728 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d022890-8679-440b-bfb1-dbda6ee771f0-config\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.810183 master-0 kubenswrapper[36504]: I1203 22:11:06.808754 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lph2\" (UniqueName: \"kubernetes.io/projected/9d022890-8679-440b-bfb1-dbda6ee771f0-kube-api-access-4lph2\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.810183 master-0 kubenswrapper[36504]: I1203 22:11:06.808791 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d022890-8679-440b-bfb1-dbda6ee771f0-serving-cert\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.814786 master-0 kubenswrapper[36504]: I1203 22:11:06.811682 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d022890-8679-440b-bfb1-dbda6ee771f0-config\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.814786 master-0 kubenswrapper[36504]: I1203 22:11:06.811856 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d022890-8679-440b-bfb1-dbda6ee771f0-trusted-ca\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.820873 master-0 kubenswrapper[36504]: I1203 22:11:06.816533 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d022890-8679-440b-bfb1-dbda6ee771f0-serving-cert\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.847804 master-0 kubenswrapper[36504]: I1203 22:11:06.847479 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lph2\" (UniqueName: \"kubernetes.io/projected/9d022890-8679-440b-bfb1-dbda6ee771f0-kube-api-access-4lph2\") pod \"console-operator-77df56447c-s242z\" (UID: \"9d022890-8679-440b-bfb1-dbda6ee771f0\") " pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:06.966346 master-0 kubenswrapper[36504]: I1203 22:11:06.966246 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:07.645187 master-0 kubenswrapper[36504]: I1203 22:11:07.645118 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 22:11:07.666326 master-0 kubenswrapper[36504]: I1203 22:11:07.666089 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-77df56447c-s242z"] Dec 03 22:11:07.674375 master-0 kubenswrapper[36504]: W1203 22:11:07.674301 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef63e073_73a7_4ac5_9077_c6a8491440a8.slice/crio-547a7ab480497e319d1d714117c8bc3e504a86f60e3b65f35661301426122e98 WatchSource:0}: Error finding container 547a7ab480497e319d1d714117c8bc3e504a86f60e3b65f35661301426122e98: Status 404 returned error can't find the container with id 547a7ab480497e319d1d714117c8bc3e504a86f60e3b65f35661301426122e98 Dec 03 22:11:07.676811 master-0 kubenswrapper[36504]: W1203 22:11:07.676745 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d022890_8679_440b_bfb1_dbda6ee771f0.slice/crio-9987dc2a1631174c07f8083e6f5803623015d12c5dab88b4fcdff4e29fa74649 WatchSource:0}: Error finding container 9987dc2a1631174c07f8083e6f5803623015d12c5dab88b4fcdff4e29fa74649: Status 404 returned error can't find the container with id 9987dc2a1631174c07f8083e6f5803623015d12c5dab88b4fcdff4e29fa74649 Dec 03 22:11:08.317862 master-0 kubenswrapper[36504]: I1203 22:11:08.317685 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerStarted","Data":"547a7ab480497e319d1d714117c8bc3e504a86f60e3b65f35661301426122e98"} Dec 03 22:11:08.319274 master-0 kubenswrapper[36504]: I1203 22:11:08.319239 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77df56447c-s242z" event={"ID":"9d022890-8679-440b-bfb1-dbda6ee771f0","Type":"ContainerStarted","Data":"9987dc2a1631174c07f8083e6f5803623015d12c5dab88b4fcdff4e29fa74649"} Dec 03 22:11:08.326286 master-0 kubenswrapper[36504]: I1203 22:11:08.326231 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" event={"ID":"4af62c74-a1a5-497c-80d8-9cb916e8e762","Type":"ContainerStarted","Data":"c587e86882824c145ad6904ac8fd6ea05db63ff50210d9cbabfc4c3c72e8e7c3"} Dec 03 22:11:08.326286 master-0 kubenswrapper[36504]: I1203 22:11:08.326288 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" event={"ID":"4af62c74-a1a5-497c-80d8-9cb916e8e762","Type":"ContainerStarted","Data":"68041b3f93fadb0342b9d61c7f8dc45c299dfe02ed05acfc07afe3553cc68de0"} Dec 03 22:11:09.091455 master-0 kubenswrapper[36504]: I1203 22:11:09.091186 36504 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/telemeter-client-55c7ddc498-h4b86" podStartSLOduration=3.134730463 podStartE2EDuration="9.09116888s" podCreationTimestamp="2025-12-03 22:11:00 +0000 UTC" firstStartedPulling="2025-12-03 22:11:01.370919008 +0000 UTC m=+26.590691015" lastFinishedPulling="2025-12-03 22:11:07.327357425 +0000 UTC m=+32.547129432" observedRunningTime="2025-12-03 22:11:08.358657867 +0000 UTC m=+33.578429874" watchObservedRunningTime="2025-12-03 22:11:09.09116888 +0000 UTC m=+34.310940887" Dec 03 22:11:09.092699 master-0 kubenswrapper[36504]: I1203 22:11:09.092675 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77778bd57c-xdhvs"] Dec 03 22:11:09.092905 master-0 kubenswrapper[36504]: I1203 22:11:09.092879 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" podUID="f6498ac1-7d07-4a5f-a968-d8bda72d1002" containerName="controller-manager" containerID="cri-o://c83609fd10c7378baa8493e0ab13c737c56d73cf8e97ee62fc24dbfcfaebb463" gracePeriod=30 Dec 03 22:11:09.185001 master-0 kubenswrapper[36504]: I1203 22:11:09.184748 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc"] Dec 03 22:11:09.185001 master-0 kubenswrapper[36504]: I1203 22:11:09.184991 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" podUID="64856d96-023f-46db-819c-02f1adea5aab" containerName="route-controller-manager" containerID="cri-o://65df04b5fb3a155ca93bd902fc9d9e8367e9f5c39df53b66ea5a1b06073ed946" gracePeriod=30 Dec 03 22:11:09.344508 master-0 kubenswrapper[36504]: I1203 22:11:09.344358 36504 generic.go:334] "Generic (PLEG): container finished" podID="64856d96-023f-46db-819c-02f1adea5aab" containerID="65df04b5fb3a155ca93bd902fc9d9e8367e9f5c39df53b66ea5a1b06073ed946" exitCode=0 Dec 03 22:11:09.344508 master-0 kubenswrapper[36504]: I1203 22:11:09.344464 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" event={"ID":"64856d96-023f-46db-819c-02f1adea5aab","Type":"ContainerDied","Data":"65df04b5fb3a155ca93bd902fc9d9e8367e9f5c39df53b66ea5a1b06073ed946"} Dec 03 22:11:09.344508 master-0 kubenswrapper[36504]: I1203 22:11:09.344508 36504 scope.go:117] "RemoveContainer" containerID="5af03b1fc8b9d3242cc113182968101569dfe657f33d984d37c023bd205c8309" Dec 03 22:11:09.349949 master-0 kubenswrapper[36504]: I1203 22:11:09.349082 36504 generic.go:334] "Generic (PLEG): container finished" podID="f6498ac1-7d07-4a5f-a968-d8bda72d1002" containerID="c83609fd10c7378baa8493e0ab13c737c56d73cf8e97ee62fc24dbfcfaebb463" exitCode=0 Dec 03 22:11:09.351023 master-0 kubenswrapper[36504]: I1203 22:11:09.350975 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" event={"ID":"f6498ac1-7d07-4a5f-a968-d8bda72d1002","Type":"ContainerDied","Data":"c83609fd10c7378baa8493e0ab13c737c56d73cf8e97ee62fc24dbfcfaebb463"} Dec 03 22:11:09.579932 master-0 kubenswrapper[36504]: I1203 22:11:09.579563 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:11:09.682359 master-0 kubenswrapper[36504]: I1203 22:11:09.682256 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca\") pod \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " Dec 03 22:11:09.682359 master-0 kubenswrapper[36504]: I1203 22:11:09.682347 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert\") pod \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " Dec 03 22:11:09.684674 master-0 kubenswrapper[36504]: I1203 22:11:09.682507 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt8l5\" (UniqueName: \"kubernetes.io/projected/f6498ac1-7d07-4a5f-a968-d8bda72d1002-kube-api-access-bt8l5\") pod \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " Dec 03 22:11:09.687451 master-0 kubenswrapper[36504]: I1203 22:11:09.687032 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles\") pod \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " Dec 03 22:11:09.687451 master-0 kubenswrapper[36504]: I1203 22:11:09.682743 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca" (OuterVolumeSpecName: "client-ca") pod "f6498ac1-7d07-4a5f-a968-d8bda72d1002" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:09.687451 master-0 kubenswrapper[36504]: I1203 22:11:09.687114 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config\") pod \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\" (UID: \"f6498ac1-7d07-4a5f-a968-d8bda72d1002\") " Dec 03 22:11:09.687451 master-0 kubenswrapper[36504]: I1203 22:11:09.686308 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6498ac1-7d07-4a5f-a968-d8bda72d1002-kube-api-access-bt8l5" (OuterVolumeSpecName: "kube-api-access-bt8l5") pod "f6498ac1-7d07-4a5f-a968-d8bda72d1002" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002"). InnerVolumeSpecName "kube-api-access-bt8l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:09.687451 master-0 kubenswrapper[36504]: I1203 22:11:09.686422 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f6498ac1-7d07-4a5f-a968-d8bda72d1002" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:09.687897 master-0 kubenswrapper[36504]: I1203 22:11:09.687588 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f6498ac1-7d07-4a5f-a968-d8bda72d1002" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:09.687897 master-0 kubenswrapper[36504]: I1203 22:11:09.687690 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config" (OuterVolumeSpecName: "config") pod "f6498ac1-7d07-4a5f-a968-d8bda72d1002" (UID: "f6498ac1-7d07-4a5f-a968-d8bda72d1002"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:09.687897 master-0 kubenswrapper[36504]: I1203 22:11:09.687763 36504 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:09.687897 master-0 kubenswrapper[36504]: I1203 22:11:09.687797 36504 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6498ac1-7d07-4a5f-a968-d8bda72d1002-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:09.687897 master-0 kubenswrapper[36504]: I1203 22:11:09.687808 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt8l5\" (UniqueName: \"kubernetes.io/projected/f6498ac1-7d07-4a5f-a968-d8bda72d1002-kube-api-access-bt8l5\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:09.687897 master-0 kubenswrapper[36504]: I1203 22:11:09.687818 36504 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:09.687897 master-0 kubenswrapper[36504]: I1203 22:11:09.687828 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6498ac1-7d07-4a5f-a968-d8bda72d1002-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:09.761285 master-0 kubenswrapper[36504]: I1203 22:11:09.761243 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:11:09.789087 master-0 kubenswrapper[36504]: I1203 22:11:09.789040 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert\") pod \"64856d96-023f-46db-819c-02f1adea5aab\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " Dec 03 22:11:09.789341 master-0 kubenswrapper[36504]: I1203 22:11:09.789176 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config\") pod \"64856d96-023f-46db-819c-02f1adea5aab\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " Dec 03 22:11:09.789341 master-0 kubenswrapper[36504]: I1203 22:11:09.789251 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2tqg\" (UniqueName: \"kubernetes.io/projected/64856d96-023f-46db-819c-02f1adea5aab-kube-api-access-h2tqg\") pod \"64856d96-023f-46db-819c-02f1adea5aab\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " Dec 03 22:11:09.789341 master-0 kubenswrapper[36504]: I1203 22:11:09.789282 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca\") pod \"64856d96-023f-46db-819c-02f1adea5aab\" (UID: \"64856d96-023f-46db-819c-02f1adea5aab\") " Dec 03 22:11:09.791051 master-0 kubenswrapper[36504]: I1203 22:11:09.791027 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca" (OuterVolumeSpecName: "client-ca") pod "64856d96-023f-46db-819c-02f1adea5aab" (UID: "64856d96-023f-46db-819c-02f1adea5aab"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:09.791993 master-0 kubenswrapper[36504]: I1203 22:11:09.791971 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config" (OuterVolumeSpecName: "config") pod "64856d96-023f-46db-819c-02f1adea5aab" (UID: "64856d96-023f-46db-819c-02f1adea5aab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:09.794513 master-0 kubenswrapper[36504]: I1203 22:11:09.794480 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "64856d96-023f-46db-819c-02f1adea5aab" (UID: "64856d96-023f-46db-819c-02f1adea5aab"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:09.824444 master-0 kubenswrapper[36504]: I1203 22:11:09.824313 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64856d96-023f-46db-819c-02f1adea5aab-kube-api-access-h2tqg" (OuterVolumeSpecName: "kube-api-access-h2tqg") pod "64856d96-023f-46db-819c-02f1adea5aab" (UID: "64856d96-023f-46db-819c-02f1adea5aab"). InnerVolumeSpecName "kube-api-access-h2tqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:09.891745 master-0 kubenswrapper[36504]: I1203 22:11:09.891646 36504 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64856d96-023f-46db-819c-02f1adea5aab-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:09.891745 master-0 kubenswrapper[36504]: I1203 22:11:09.891705 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:09.891745 master-0 kubenswrapper[36504]: I1203 22:11:09.891720 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2tqg\" (UniqueName: \"kubernetes.io/projected/64856d96-023f-46db-819c-02f1adea5aab-kube-api-access-h2tqg\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:09.891745 master-0 kubenswrapper[36504]: I1203 22:11:09.891732 36504 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/64856d96-023f-46db-819c-02f1adea5aab-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: I1203 22:11:09.927023 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: E1203 22:11:09.927286 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6498ac1-7d07-4a5f-a968-d8bda72d1002" containerName="controller-manager" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: I1203 22:11:09.927299 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6498ac1-7d07-4a5f-a968-d8bda72d1002" containerName="controller-manager" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: E1203 22:11:09.927335 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64856d96-023f-46db-819c-02f1adea5aab" containerName="route-controller-manager" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: I1203 22:11:09.927342 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="64856d96-023f-46db-819c-02f1adea5aab" containerName="route-controller-manager" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: E1203 22:11:09.927358 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64856d96-023f-46db-819c-02f1adea5aab" containerName="route-controller-manager" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: I1203 22:11:09.927364 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="64856d96-023f-46db-819c-02f1adea5aab" containerName="route-controller-manager" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: I1203 22:11:09.927470 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6498ac1-7d07-4a5f-a968-d8bda72d1002" containerName="controller-manager" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: I1203 22:11:09.927491 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="64856d96-023f-46db-819c-02f1adea5aab" containerName="route-controller-manager" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: I1203 22:11:09.927714 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="64856d96-023f-46db-819c-02f1adea5aab" containerName="route-controller-manager" Dec 03 22:11:09.929802 master-0 kubenswrapper[36504]: I1203 22:11:09.929210 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.935351 master-0 kubenswrapper[36504]: I1203 22:11:09.935290 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 03 22:11:09.935607 master-0 kubenswrapper[36504]: I1203 22:11:09.935476 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-26pw7" Dec 03 22:11:09.940850 master-0 kubenswrapper[36504]: I1203 22:11:09.940655 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 03 22:11:09.940850 master-0 kubenswrapper[36504]: I1203 22:11:09.940744 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 03 22:11:09.941143 master-0 kubenswrapper[36504]: I1203 22:11:09.940891 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 03 22:11:09.941143 master-0 kubenswrapper[36504]: I1203 22:11:09.940978 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 03 22:11:09.947115 master-0 kubenswrapper[36504]: I1203 22:11:09.946177 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-3uob4dessmatv" Dec 03 22:11:09.947115 master-0 kubenswrapper[36504]: I1203 22:11:09.946371 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 03 22:11:09.949944 master-0 kubenswrapper[36504]: I1203 22:11:09.947933 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 03 22:11:09.949944 master-0 kubenswrapper[36504]: I1203 22:11:09.948112 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 03 22:11:09.949944 master-0 kubenswrapper[36504]: I1203 22:11:09.948126 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 03 22:11:09.949944 master-0 kubenswrapper[36504]: I1203 22:11:09.949123 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 03 22:11:09.972260 master-0 kubenswrapper[36504]: I1203 22:11:09.972198 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 03 22:11:09.979137 master-0 kubenswrapper[36504]: I1203 22:11:09.979074 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 22:11:09.993157 master-0 kubenswrapper[36504]: I1203 22:11:09.993098 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.993157 master-0 kubenswrapper[36504]: I1203 22:11:09.993158 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-grpc-tls\") pod 
\"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993182 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g7f4\" (UniqueName: \"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-kube-api-access-4g7f4\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993205 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993228 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993244 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993259 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993274 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-config-out\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993293 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993307 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993354 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993375 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-config\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993392 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993413 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993428 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-web-config\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993441 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993473 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:09.994080 master-0 kubenswrapper[36504]: I1203 22:11:09.993490 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095654 master-0 kubenswrapper[36504]: I1203 22:11:10.095609 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095654 master-0 kubenswrapper[36504]: I1203 22:11:10.095660 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095679 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095699 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-config-out\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095719 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095738 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095812 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095835 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-config\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095857 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 
22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095881 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095905 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-web-config\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095924 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095963 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.095977 master-0 kubenswrapper[36504]: I1203 22:11:10.095982 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.096474 master-0 kubenswrapper[36504]: I1203 22:11:10.096005 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.096474 master-0 kubenswrapper[36504]: I1203 22:11:10.096031 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.096474 master-0 kubenswrapper[36504]: I1203 22:11:10.096050 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g7f4\" (UniqueName: \"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-kube-api-access-4g7f4\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.096474 master-0 kubenswrapper[36504]: I1203 22:11:10.096077 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.097143 master-0 kubenswrapper[36504]: I1203 22:11:10.096918 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.097143 master-0 kubenswrapper[36504]: I1203 22:11:10.097061 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.097645 master-0 kubenswrapper[36504]: I1203 22:11:10.097607 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.097645 master-0 kubenswrapper[36504]: I1203 22:11:10.097618 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.100657 master-0 kubenswrapper[36504]: I1203 22:11:10.099287 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.100657 master-0 kubenswrapper[36504]: I1203 22:11:10.099405 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.100657 master-0 kubenswrapper[36504]: I1203 22:11:10.099953 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-config-out\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.100657 master-0 kubenswrapper[36504]: I1203 22:11:10.100568 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-config\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.100657 master-0 kubenswrapper[36504]: I1203 22:11:10.100622 36504 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-web-config\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.102033 master-0 kubenswrapper[36504]: I1203 22:11:10.100789 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.102859 master-0 kubenswrapper[36504]: I1203 22:11:10.102748 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.102859 master-0 kubenswrapper[36504]: I1203 22:11:10.102817 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.103694 master-0 kubenswrapper[36504]: I1203 22:11:10.103644 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.103889 master-0 kubenswrapper[36504]: I1203 22:11:10.103846 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.103947 master-0 kubenswrapper[36504]: I1203 22:11:10.103852 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.105439 master-0 kubenswrapper[36504]: I1203 22:11:10.105360 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.108758 master-0 kubenswrapper[36504]: I1203 22:11:10.108708 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.207816 master-0 kubenswrapper[36504]: I1203 22:11:10.207759 
36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7f4\" (UniqueName: \"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-kube-api-access-4g7f4\") pod \"prometheus-k8s-0\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.261317 master-0 kubenswrapper[36504]: I1203 22:11:10.261269 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:10.368017 master-0 kubenswrapper[36504]: I1203 22:11:10.367968 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79c7874b5-2k2ld"] Dec 03 22:11:10.368943 master-0 kubenswrapper[36504]: I1203 22:11:10.368913 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.370396 master-0 kubenswrapper[36504]: I1203 22:11:10.369871 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" event={"ID":"f6498ac1-7d07-4a5f-a968-d8bda72d1002","Type":"ContainerDied","Data":"86b3974531c12f65b34f84b67c60273ab31a4569e9157b7dd04d59eef5e8591d"} Dec 03 22:11:10.370396 master-0 kubenswrapper[36504]: I1203 22:11:10.369952 36504 scope.go:117] "RemoveContainer" containerID="c83609fd10c7378baa8493e0ab13c737c56d73cf8e97ee62fc24dbfcfaebb463" Dec 03 22:11:10.370396 master-0 kubenswrapper[36504]: I1203 22:11:10.370115 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77778bd57c-xdhvs" Dec 03 22:11:10.381739 master-0 kubenswrapper[36504]: I1203 22:11:10.381555 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerID="293aa3834074cb9659d77f03111207113b1608183a914bd8c4318a151eaec038" exitCode=0 Dec 03 22:11:10.381739 master-0 kubenswrapper[36504]: I1203 22:11:10.381691 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerDied","Data":"293aa3834074cb9659d77f03111207113b1608183a914bd8c4318a151eaec038"} Dec 03 22:11:10.383905 master-0 kubenswrapper[36504]: I1203 22:11:10.383872 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" event={"ID":"64856d96-023f-46db-819c-02f1adea5aab","Type":"ContainerDied","Data":"ab4d3d9867608a716d1a07ef53b79868ef1de9f69ebee20baa3f154072a5867f"} Dec 03 22:11:10.383980 master-0 kubenswrapper[36504]: I1203 22:11:10.383952 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc" Dec 03 22:11:10.389021 master-0 kubenswrapper[36504]: I1203 22:11:10.388967 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" event={"ID":"d0e75f01-426f-4c0d-8398-76bd3edc06cb","Type":"ContainerStarted","Data":"0cc316529e1f94bdc043db995b8427509ac4df8c2ae1d71b3766bcc155e44ab4"} Dec 03 22:11:10.389021 master-0 kubenswrapper[36504]: I1203 22:11:10.389018 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" event={"ID":"d0e75f01-426f-4c0d-8398-76bd3edc06cb","Type":"ContainerStarted","Data":"b634ea3c2dbaf3b9c405987b2a62021ef549b9913fd2d69d000df6395455b9d9"} Dec 03 22:11:10.389203 master-0 kubenswrapper[36504]: I1203 22:11:10.389035 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" event={"ID":"d0e75f01-426f-4c0d-8398-76bd3edc06cb","Type":"ContainerStarted","Data":"5b05b1e5e3f0bb1ac693ba8323d2224e49462e0150ed01e14b1ff65e554559f8"} Dec 03 22:11:10.408817 master-0 kubenswrapper[36504]: I1203 22:11:10.397621 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79c7874b5-2k2ld"] Dec 03 22:11:10.408817 master-0 kubenswrapper[36504]: I1203 22:11:10.401463 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wxx\" (UniqueName: \"kubernetes.io/projected/7887a864-f2b0-4b3e-ba92-90afae8011ec-kube-api-access-88wxx\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.408817 master-0 kubenswrapper[36504]: I1203 22:11:10.401581 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7887a864-f2b0-4b3e-ba92-90afae8011ec-proxy-ca-bundles\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.408817 master-0 kubenswrapper[36504]: I1203 22:11:10.401649 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7887a864-f2b0-4b3e-ba92-90afae8011ec-client-ca\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.408817 master-0 kubenswrapper[36504]: I1203 22:11:10.401703 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7887a864-f2b0-4b3e-ba92-90afae8011ec-config\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.408817 master-0 kubenswrapper[36504]: I1203 22:11:10.401826 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7887a864-f2b0-4b3e-ba92-90afae8011ec-serving-cert\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " 
pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.425613 master-0 kubenswrapper[36504]: I1203 22:11:10.425562 36504 scope.go:117] "RemoveContainer" containerID="65df04b5fb3a155ca93bd902fc9d9e8367e9f5c39df53b66ea5a1b06073ed946" Dec 03 22:11:10.503764 master-0 kubenswrapper[36504]: I1203 22:11:10.502915 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7887a864-f2b0-4b3e-ba92-90afae8011ec-client-ca\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.503764 master-0 kubenswrapper[36504]: I1203 22:11:10.502988 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7887a864-f2b0-4b3e-ba92-90afae8011ec-config\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.503764 master-0 kubenswrapper[36504]: I1203 22:11:10.503066 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7887a864-f2b0-4b3e-ba92-90afae8011ec-serving-cert\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.503764 master-0 kubenswrapper[36504]: I1203 22:11:10.503098 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88wxx\" (UniqueName: \"kubernetes.io/projected/7887a864-f2b0-4b3e-ba92-90afae8011ec-kube-api-access-88wxx\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.503764 master-0 kubenswrapper[36504]: I1203 22:11:10.503142 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7887a864-f2b0-4b3e-ba92-90afae8011ec-proxy-ca-bundles\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.504386 master-0 kubenswrapper[36504]: I1203 22:11:10.504177 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7887a864-f2b0-4b3e-ba92-90afae8011ec-proxy-ca-bundles\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.505554 master-0 kubenswrapper[36504]: I1203 22:11:10.505516 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7887a864-f2b0-4b3e-ba92-90afae8011ec-client-ca\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.506260 master-0 kubenswrapper[36504]: I1203 22:11:10.506221 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7887a864-f2b0-4b3e-ba92-90afae8011ec-config\") pod \"controller-manager-79c7874b5-2k2ld\" 
(UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.508328 master-0 kubenswrapper[36504]: I1203 22:11:10.508282 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7887a864-f2b0-4b3e-ba92-90afae8011ec-serving-cert\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.530371 master-0 kubenswrapper[36504]: I1203 22:11:10.530316 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77778bd57c-xdhvs"] Dec 03 22:11:10.532081 master-0 kubenswrapper[36504]: I1203 22:11:10.532050 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wxx\" (UniqueName: \"kubernetes.io/projected/7887a864-f2b0-4b3e-ba92-90afae8011ec-kube-api-access-88wxx\") pod \"controller-manager-79c7874b5-2k2ld\" (UID: \"7887a864-f2b0-4b3e-ba92-90afae8011ec\") " pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.540390 master-0 kubenswrapper[36504]: I1203 22:11:10.537444 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77778bd57c-xdhvs"] Dec 03 22:11:10.541927 master-0 kubenswrapper[36504]: I1203 22:11:10.541857 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc"] Dec 03 22:11:10.547624 master-0 kubenswrapper[36504]: I1203 22:11:10.547569 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8667dd96f5-qf2rc"] Dec 03 22:11:10.719615 master-0 kubenswrapper[36504]: I1203 22:11:10.719491 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:10.755750 master-0 kubenswrapper[36504]: I1203 22:11:10.755688 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 22:11:11.104498 master-0 kubenswrapper[36504]: I1203 22:11:11.104385 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64856d96-023f-46db-819c-02f1adea5aab" path="/var/lib/kubelet/pods/64856d96-023f-46db-819c-02f1adea5aab/volumes" Dec 03 22:11:11.104952 master-0 kubenswrapper[36504]: I1203 22:11:11.104934 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6498ac1-7d07-4a5f-a968-d8bda72d1002" path="/var/lib/kubelet/pods/f6498ac1-7d07-4a5f-a968-d8bda72d1002/volumes" Dec 03 22:11:11.242066 master-0 kubenswrapper[36504]: I1203 22:11:11.242012 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm"] Dec 03 22:11:11.243144 master-0 kubenswrapper[36504]: I1203 22:11:11.243102 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.246518 master-0 kubenswrapper[36504]: I1203 22:11:11.246495 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-vbccl" Dec 03 22:11:11.246714 master-0 kubenswrapper[36504]: I1203 22:11:11.246683 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 22:11:11.246874 master-0 kubenswrapper[36504]: I1203 22:11:11.246852 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 22:11:11.247116 master-0 kubenswrapper[36504]: I1203 22:11:11.247090 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 22:11:11.247235 master-0 kubenswrapper[36504]: I1203 22:11:11.247216 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 22:11:11.251008 master-0 kubenswrapper[36504]: I1203 22:11:11.250981 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 22:11:11.266962 master-0 kubenswrapper[36504]: I1203 22:11:11.266896 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm"] Dec 03 22:11:11.283126 master-0 kubenswrapper[36504]: I1203 22:11:11.283066 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx"] Dec 03 22:11:11.286516 master-0 kubenswrapper[36504]: I1203 22:11:11.284213 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" Dec 03 22:11:11.289513 master-0 kubenswrapper[36504]: I1203 22:11:11.288085 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-dg9ph" Dec 03 22:11:11.289513 master-0 kubenswrapper[36504]: I1203 22:11:11.288740 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 03 22:11:11.305461 master-0 kubenswrapper[36504]: I1203 22:11:11.304920 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx"] Dec 03 22:11:11.324720 master-0 kubenswrapper[36504]: I1203 22:11:11.324650 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-client-ca\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.324720 master-0 kubenswrapper[36504]: I1203 22:11:11.324740 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-config\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.325089 master-0 kubenswrapper[36504]: I1203 22:11:11.324798 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-serving-cert\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.325089 master-0 kubenswrapper[36504]: I1203 22:11:11.324826 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wktvk\" (UniqueName: \"kubernetes.io/projected/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-kube-api-access-wktvk\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.325089 master-0 kubenswrapper[36504]: I1203 22:11:11.324883 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/abe93580-7514-4a69-a8af-c11a0f3c8f8a-monitoring-plugin-cert\") pod \"monitoring-plugin-688c48b94d-mj6wx\" (UID: \"abe93580-7514-4a69-a8af-c11a0f3c8f8a\") " pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" Dec 03 22:11:11.426035 master-0 kubenswrapper[36504]: I1203 22:11:11.425959 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/abe93580-7514-4a69-a8af-c11a0f3c8f8a-monitoring-plugin-cert\") pod \"monitoring-plugin-688c48b94d-mj6wx\" (UID: \"abe93580-7514-4a69-a8af-c11a0f3c8f8a\") " pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" Dec 03 22:11:11.426695 master-0 kubenswrapper[36504]: I1203 22:11:11.426081 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-client-ca\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.426695 master-0 kubenswrapper[36504]: I1203 22:11:11.426143 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-config\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.426695 master-0 kubenswrapper[36504]: I1203 22:11:11.426179 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-serving-cert\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.426695 master-0 kubenswrapper[36504]: I1203 22:11:11.426202 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wktvk\" (UniqueName: \"kubernetes.io/projected/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-kube-api-access-wktvk\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.427785 master-0 kubenswrapper[36504]: I1203 22:11:11.427726 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-client-ca\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.428013 master-0 kubenswrapper[36504]: I1203 22:11:11.427968 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-config\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.429848 master-0 kubenswrapper[36504]: I1203 22:11:11.429821 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/abe93580-7514-4a69-a8af-c11a0f3c8f8a-monitoring-plugin-cert\") pod \"monitoring-plugin-688c48b94d-mj6wx\" (UID: \"abe93580-7514-4a69-a8af-c11a0f3c8f8a\") " pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" Dec 03 22:11:11.431414 master-0 kubenswrapper[36504]: I1203 22:11:11.431002 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-serving-cert\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.443959 master-0 kubenswrapper[36504]: I1203 22:11:11.443915 36504 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wktvk\" (UniqueName: \"kubernetes.io/projected/f3dca8fb-e1a1-4e6c-9946-0218a55985b3-kube-api-access-wktvk\") pod \"route-controller-manager-7b577896db-d4hsm\" (UID: \"f3dca8fb-e1a1-4e6c-9946-0218a55985b3\") " pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.597951 master-0 kubenswrapper[36504]: I1203 22:11:11.597885 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:11.637371 master-0 kubenswrapper[36504]: I1203 22:11:11.637313 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:11:11.637697 master-0 kubenswrapper[36504]: I1203 22:11:11.637681 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" Dec 03 22:11:12.365948 master-0 kubenswrapper[36504]: I1203 22:11:12.360028 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:11:12.365948 master-0 kubenswrapper[36504]: E1203 22:11:12.361004 36504 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:11:12.365948 master-0 kubenswrapper[36504]: E1203 22:11:12.361039 36504 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-5-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:11:12.365948 master-0 kubenswrapper[36504]: E1203 22:11:12.361102 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access podName:5c8c7291-3150-46a5-9d14-57a23bb51cc0 nodeName:}" failed. No retries permitted until 2025-12-03 22:11:44.361076441 +0000 UTC m=+69.580848448 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access") pod "installer-5-master-0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Dec 03 22:11:12.381905 master-0 kubenswrapper[36504]: I1203 22:11:12.381832 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-67dbd4669-gxlrj"] Dec 03 22:11:12.382845 master-0 kubenswrapper[36504]: I1203 22:11:12.382804 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.394567 master-0 kubenswrapper[36504]: I1203 22:11:12.391158 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 22:11:12.394567 master-0 kubenswrapper[36504]: W1203 22:11:12.391392 36504 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-router-certs": failed to list *v1.Secret: secrets "v4-0-config-system-router-certs" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'master-0' and this object Dec 03 22:11:12.397043 master-0 kubenswrapper[36504]: I1203 22:11:12.396996 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 22:11:12.397612 master-0 kubenswrapper[36504]: I1203 22:11:12.397564 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 22:11:12.397674 master-0 kubenswrapper[36504]: I1203 22:11:12.397621 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 22:11:12.397829 master-0 kubenswrapper[36504]: I1203 22:11:12.397799 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 22:11:12.399054 master-0 kubenswrapper[36504]: I1203 22:11:12.398059 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 22:11:12.399054 master-0 kubenswrapper[36504]: I1203 22:11:12.398109 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 22:11:12.402873 master-0 kubenswrapper[36504]: E1203 22:11:12.400804 36504 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-router-certs\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Dec 03 22:11:12.410077 master-0 kubenswrapper[36504]: I1203 22:11:12.405685 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 22:11:12.410077 master-0 kubenswrapper[36504]: I1203 22:11:12.405942 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 22:11:12.410077 master-0 kubenswrapper[36504]: I1203 22:11:12.407328 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-9tlz7" Dec 03 22:11:12.410077 master-0 kubenswrapper[36504]: I1203 22:11:12.409300 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 22:11:12.412946 master-0 kubenswrapper[36504]: I1203 22:11:12.412210 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 22:11:12.420495 master-0 kubenswrapper[36504]: 
I1203 22:11:12.419880 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" event={"ID":"d0e75f01-426f-4c0d-8398-76bd3edc06cb","Type":"ContainerStarted","Data":"0e60dab54d41350ff8e69b811ea2ea2066fac62c8c8cd3514d7993b80e9503a3"} Dec 03 22:11:12.421410 master-0 kubenswrapper[36504]: I1203 22:11:12.421355 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77df56447c-s242z" event={"ID":"9d022890-8679-440b-bfb1-dbda6ee771f0","Type":"ContainerStarted","Data":"08a7787e155d720652e6a4a2855162135f102874f838465c19b27cd087e19566"} Dec 03 22:11:12.423593 master-0 kubenswrapper[36504]: I1203 22:11:12.423560 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:12.424906 master-0 kubenswrapper[36504]: I1203 22:11:12.424873 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerStarted","Data":"d9cd3ea5d36d4b61da3ba9089ff5f3d844951b2df54a6ba1b43fba52d4e57250"} Dec 03 22:11:12.424906 master-0 kubenswrapper[36504]: I1203 22:11:12.424900 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerStarted","Data":"ee5acb547e13c4549526d636347fac56d7c0cad90b7828fd68d9c5e75aaa558c"} Dec 03 22:11:12.471647 master-0 kubenswrapper[36504]: I1203 22:11:12.471598 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blncr\" (UniqueName: \"kubernetes.io/projected/7d985732-3cd0-4aff-8050-37330d01507a-kube-api-access-blncr\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.471675 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.471727 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-error\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.471816 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-service-ca\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.471853 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-session\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.471903 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.471986 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d985732-3cd0-4aff-8050-37330d01507a-audit-dir\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.472011 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-router-certs\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.472035 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-audit-policies\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.472056 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.472073 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.472091 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: 
\"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.500860 master-0 kubenswrapper[36504]: I1203 22:11:12.472175 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-login\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.507801 master-0 kubenswrapper[36504]: I1203 22:11:12.506101 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 22:11:12.565009 master-0 kubenswrapper[36504]: I1203 22:11:12.559527 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm"] Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615494 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-login\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615567 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blncr\" (UniqueName: \"kubernetes.io/projected/7d985732-3cd0-4aff-8050-37330d01507a-kube-api-access-blncr\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615602 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615635 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-error\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615669 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-service-ca\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615688 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-session\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615721 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615774 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-router-certs\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615808 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d985732-3cd0-4aff-8050-37330d01507a-audit-dir\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615835 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-audit-policies\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615857 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615878 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.621822 master-0 kubenswrapper[36504]: I1203 22:11:12.615894 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.626940 master-0 kubenswrapper[36504]: I1203 22:11:12.623264 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.626940 master-0 kubenswrapper[36504]: I1203 22:11:12.626812 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-67dbd4669-gxlrj"] Dec 03 22:11:12.627191 master-0 kubenswrapper[36504]: I1203 22:11:12.627156 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d985732-3cd0-4aff-8050-37330d01507a-audit-dir\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.627788 master-0 kubenswrapper[36504]: I1203 22:11:12.627729 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-audit-policies\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.628192 master-0 kubenswrapper[36504]: I1203 22:11:12.628165 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-service-ca\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.628271 master-0 kubenswrapper[36504]: I1203 22:11:12.628184 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.628980 master-0 kubenswrapper[36504]: I1203 22:11:12.628919 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.637484 master-0 kubenswrapper[36504]: I1203 22:11:12.630734 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-error\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.637484 master-0 kubenswrapper[36504]: I1203 22:11:12.632554 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-session\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " 
pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.637484 master-0 kubenswrapper[36504]: I1203 22:11:12.633346 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.637484 master-0 kubenswrapper[36504]: I1203 22:11:12.637420 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.637882 master-0 kubenswrapper[36504]: I1203 22:11:12.637832 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-login\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.665429 master-0 kubenswrapper[36504]: I1203 22:11:12.665331 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blncr\" (UniqueName: \"kubernetes.io/projected/7d985732-3cd0-4aff-8050-37330d01507a-kube-api-access-blncr\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:12.676327 master-0 kubenswrapper[36504]: I1203 22:11:12.676002 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx"] Dec 03 22:11:12.688623 master-0 kubenswrapper[36504]: W1203 22:11:12.688558 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe93580_7514_4a69_a8af_c11a0f3c8f8a.slice/crio-978d758af95133f2c02d9024fe8472b79907dce2be098cd33d12708a8e9d3be6 WatchSource:0}: Error finding container 978d758af95133f2c02d9024fe8472b79907dce2be098cd33d12708a8e9d3be6: Status 404 returned error can't find the container with id 978d758af95133f2c02d9024fe8472b79907dce2be098cd33d12708a8e9d3be6 Dec 03 22:11:12.710330 master-0 kubenswrapper[36504]: I1203 22:11:12.710103 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79c7874b5-2k2ld"] Dec 03 22:11:12.711257 master-0 kubenswrapper[36504]: I1203 22:11:12.711144 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-77df56447c-s242z" podStartSLOduration=2.330678699 podStartE2EDuration="6.711117782s" podCreationTimestamp="2025-12-03 22:11:06 +0000 UTC" firstStartedPulling="2025-12-03 22:11:07.679875424 +0000 UTC m=+32.899647431" lastFinishedPulling="2025-12-03 22:11:12.060314497 +0000 UTC m=+37.280086514" observedRunningTime="2025-12-03 22:11:12.702838492 +0000 UTC m=+37.922610509" watchObservedRunningTime="2025-12-03 22:11:12.711117782 +0000 UTC m=+37.930889789" Dec 03 22:11:12.715774 master-0 kubenswrapper[36504]: W1203 22:11:12.715717 36504 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7887a864_f2b0_4b3e_ba92_90afae8011ec.slice/crio-8e3db0b03aaaa36845c62224cdb12d4996a5803a5422058483186da121f06335 WatchSource:0}: Error finding container 8e3db0b03aaaa36845c62224cdb12d4996a5803a5422058483186da121f06335: Status 404 returned error can't find the container with id 8e3db0b03aaaa36845c62224cdb12d4996a5803a5422058483186da121f06335 Dec 03 22:11:12.868955 master-0 kubenswrapper[36504]: I1203 22:11:12.868895 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-77df56447c-s242z" Dec 03 22:11:13.196233 master-0 kubenswrapper[36504]: I1203 22:11:13.194119 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6f5db8559b-xzb5p"] Dec 03 22:11:13.196233 master-0 kubenswrapper[36504]: I1203 22:11:13.195123 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6f5db8559b-xzb5p" Dec 03 22:11:13.197710 master-0 kubenswrapper[36504]: I1203 22:11:13.197683 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 22:11:13.198574 master-0 kubenswrapper[36504]: I1203 22:11:13.198541 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-gzjfz" Dec 03 22:11:13.198881 master-0 kubenswrapper[36504]: I1203 22:11:13.198715 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 22:11:13.208807 master-0 kubenswrapper[36504]: I1203 22:11:13.208758 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6f5db8559b-xzb5p"] Dec 03 22:11:13.330635 master-0 kubenswrapper[36504]: I1203 22:11:13.330301 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cs9g\" (UniqueName: \"kubernetes.io/projected/716a95ab-7d31-41ac-ad14-f756f17b3179-kube-api-access-2cs9g\") pod \"downloads-6f5db8559b-xzb5p\" (UID: \"716a95ab-7d31-41ac-ad14-f756f17b3179\") " pod="openshift-console/downloads-6f5db8559b-xzb5p" Dec 03 22:11:13.371007 master-0 kubenswrapper[36504]: I1203 22:11:13.368802 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 22:11:13.382341 master-0 kubenswrapper[36504]: I1203 22:11:13.382283 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-router-certs\") pod \"oauth-openshift-67dbd4669-gxlrj\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:13.431941 master-0 kubenswrapper[36504]: I1203 22:11:13.431882 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cs9g\" (UniqueName: \"kubernetes.io/projected/716a95ab-7d31-41ac-ad14-f756f17b3179-kube-api-access-2cs9g\") pod \"downloads-6f5db8559b-xzb5p\" (UID: \"716a95ab-7d31-41ac-ad14-f756f17b3179\") " pod="openshift-console/downloads-6f5db8559b-xzb5p" Dec 03 22:11:13.437015 master-0 kubenswrapper[36504]: I1203 22:11:13.436954 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" 
event={"ID":"d0e75f01-426f-4c0d-8398-76bd3edc06cb","Type":"ContainerStarted","Data":"b0c243ac0211dabcbcac6e6ab9b91d8c6db77996ca4cc95242d6a37ccc5df33d"} Dec 03 22:11:13.437015 master-0 kubenswrapper[36504]: I1203 22:11:13.437012 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" event={"ID":"d0e75f01-426f-4c0d-8398-76bd3edc06cb","Type":"ContainerStarted","Data":"50511735e83c92402396c23997332923e9679fd924fe93a30494ac5c66b476f7"} Dec 03 22:11:13.437471 master-0 kubenswrapper[36504]: I1203 22:11:13.437416 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:13.441082 master-0 kubenswrapper[36504]: I1203 22:11:13.440803 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" event={"ID":"7887a864-f2b0-4b3e-ba92-90afae8011ec","Type":"ContainerStarted","Data":"484453a89c780a752e935e68b06a2aaf171204c2ac19e44459140d97051ef68c"} Dec 03 22:11:13.441082 master-0 kubenswrapper[36504]: I1203 22:11:13.440898 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" event={"ID":"7887a864-f2b0-4b3e-ba92-90afae8011ec","Type":"ContainerStarted","Data":"8e3db0b03aaaa36845c62224cdb12d4996a5803a5422058483186da121f06335"} Dec 03 22:11:13.442324 master-0 kubenswrapper[36504]: I1203 22:11:13.442054 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:13.443346 master-0 kubenswrapper[36504]: I1203 22:11:13.443321 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" event={"ID":"abe93580-7514-4a69-a8af-c11a0f3c8f8a","Type":"ContainerStarted","Data":"978d758af95133f2c02d9024fe8472b79907dce2be098cd33d12708a8e9d3be6"} Dec 03 22:11:13.445790 master-0 kubenswrapper[36504]: I1203 22:11:13.445744 36504 generic.go:334] "Generic (PLEG): container finished" podID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerID="d9cd3ea5d36d4b61da3ba9089ff5f3d844951b2df54a6ba1b43fba52d4e57250" exitCode=0 Dec 03 22:11:13.445896 master-0 kubenswrapper[36504]: I1203 22:11:13.445833 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerDied","Data":"d9cd3ea5d36d4b61da3ba9089ff5f3d844951b2df54a6ba1b43fba52d4e57250"} Dec 03 22:11:13.450973 master-0 kubenswrapper[36504]: I1203 22:11:13.450905 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" event={"ID":"f3dca8fb-e1a1-4e6c-9946-0218a55985b3","Type":"ContainerStarted","Data":"ec70d612423a2f25e56a4fcf73e8a97ca54528432cb47b941eefe17784276373"} Dec 03 22:11:13.450973 master-0 kubenswrapper[36504]: I1203 22:11:13.450963 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:13.450973 master-0 kubenswrapper[36504]: I1203 22:11:13.450976 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" event={"ID":"f3dca8fb-e1a1-4e6c-9946-0218a55985b3","Type":"ContainerStarted","Data":"0eea12bc7e30ae09694ba571f8693b8b7ee620602b4748de513d43928cc71366"} Dec 03 22:11:13.457765 
master-0 kubenswrapper[36504]: I1203 22:11:13.457111 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cs9g\" (UniqueName: \"kubernetes.io/projected/716a95ab-7d31-41ac-ad14-f756f17b3179-kube-api-access-2cs9g\") pod \"downloads-6f5db8559b-xzb5p\" (UID: \"716a95ab-7d31-41ac-ad14-f756f17b3179\") " pod="openshift-console/downloads-6f5db8559b-xzb5p" Dec 03 22:11:13.459136 master-0 kubenswrapper[36504]: I1203 22:11:13.458984 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" Dec 03 22:11:13.497847 master-0 kubenswrapper[36504]: I1203 22:11:13.493686 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b577896db-d4hsm" podStartSLOduration=4.493659526 podStartE2EDuration="4.493659526s" podCreationTimestamp="2025-12-03 22:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:11:13.486548603 +0000 UTC m=+38.706320620" watchObservedRunningTime="2025-12-03 22:11:13.493659526 +0000 UTC m=+38.713431533" Dec 03 22:11:13.499473 master-0 kubenswrapper[36504]: I1203 22:11:13.498608 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" podStartSLOduration=2.7724716750000002 podStartE2EDuration="9.498586922s" podCreationTimestamp="2025-12-03 22:11:04 +0000 UTC" firstStartedPulling="2025-12-03 22:11:05.333092246 +0000 UTC m=+30.552864243" lastFinishedPulling="2025-12-03 22:11:12.059207493 +0000 UTC m=+37.278979490" observedRunningTime="2025-12-03 22:11:13.467624939 +0000 UTC m=+38.687396966" watchObservedRunningTime="2025-12-03 22:11:13.498586922 +0000 UTC m=+38.718358929" Dec 03 22:11:13.507344 master-0 kubenswrapper[36504]: I1203 22:11:13.507033 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" podStartSLOduration=4.507013517 podStartE2EDuration="4.507013517s" podCreationTimestamp="2025-12-03 22:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:11:13.507002266 +0000 UTC m=+38.726774273" watchObservedRunningTime="2025-12-03 22:11:13.507013517 +0000 UTC m=+38.726785524" Dec 03 22:11:13.545692 master-0 kubenswrapper[36504]: I1203 22:11:13.545632 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6f5db8559b-xzb5p" Dec 03 22:11:13.573814 master-0 kubenswrapper[36504]: I1203 22:11:13.573771 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79c7874b5-2k2ld" Dec 03 22:11:13.608800 master-0 kubenswrapper[36504]: I1203 22:11:13.608734 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:14.137853 master-0 kubenswrapper[36504]: I1203 22:11:14.137764 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6f5db8559b-xzb5p"] Dec 03 22:11:14.472259 master-0 kubenswrapper[36504]: I1203 22:11:14.472164 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6f5db8559b-xzb5p" event={"ID":"716a95ab-7d31-41ac-ad14-f756f17b3179","Type":"ContainerStarted","Data":"95fd0dbefb9de574e127dabe88d239cf52ca1f31a03b453a435e571c56a97826"} Dec 03 22:11:14.480026 master-0 kubenswrapper[36504]: I1203 22:11:14.479967 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-f59c45d65-v6hb2" Dec 03 22:11:16.105448 master-0 kubenswrapper[36504]: I1203 22:11:16.104902 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-67dbd4669-gxlrj"] Dec 03 22:11:16.114740 master-0 kubenswrapper[36504]: W1203 22:11:16.114669 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d985732_3cd0_4aff_8050_37330d01507a.slice/crio-31c3c6d2ee557d30a0fe4fb166e592600bd64c5ff2040e73591b3be57a3494c4 WatchSource:0}: Error finding container 31c3c6d2ee557d30a0fe4fb166e592600bd64c5ff2040e73591b3be57a3494c4: Status 404 returned error can't find the container with id 31c3c6d2ee557d30a0fe4fb166e592600bd64c5ff2040e73591b3be57a3494c4 Dec 03 22:11:16.492416 master-0 kubenswrapper[36504]: I1203 22:11:16.492330 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" event={"ID":"7d985732-3cd0-4aff-8050-37330d01507a","Type":"ContainerStarted","Data":"31c3c6d2ee557d30a0fe4fb166e592600bd64c5ff2040e73591b3be57a3494c4"} Dec 03 22:11:16.496969 master-0 kubenswrapper[36504]: I1203 22:11:16.496886 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerStarted","Data":"26ec3b66599dba0723e2e891ff101c0c8c05e81fa006226174021a06e7c6f7c8"} Dec 03 22:11:16.497296 master-0 kubenswrapper[36504]: I1203 22:11:16.497276 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerStarted","Data":"bc778f4e9e81307fa7f9857972a109cf59925855518b80afb692131b95f39008"} Dec 03 22:11:16.498347 master-0 kubenswrapper[36504]: I1203 22:11:16.498303 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerStarted","Data":"3991a80b948e18fbc5646eb1553734287ea7281af63687505e0275404e8bafe9"} Dec 03 22:11:16.501777 master-0 kubenswrapper[36504]: I1203 22:11:16.501736 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" event={"ID":"abe93580-7514-4a69-a8af-c11a0f3c8f8a","Type":"ContainerStarted","Data":"14dc1cacbc3f85735758d18fb98e0c3da3eebf9fe463b0c97f5fac8e866ef7a4"} Dec 03 22:11:16.502452 master-0 kubenswrapper[36504]: I1203 22:11:16.502431 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" Dec 03 22:11:16.502836 master-0 kubenswrapper[36504]: I1203 22:11:16.502740 36504 
kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:11:16.508716 master-0 kubenswrapper[36504]: I1203 22:11:16.508675 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" Dec 03 22:11:16.578450 master-0 kubenswrapper[36504]: I1203 22:11:16.578251 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" podStartSLOduration=2.660944234 podStartE2EDuration="5.578226502s" podCreationTimestamp="2025-12-03 22:11:11 +0000 UTC" firstStartedPulling="2025-12-03 22:11:12.694050606 +0000 UTC m=+37.913822613" lastFinishedPulling="2025-12-03 22:11:15.611332874 +0000 UTC m=+40.831104881" observedRunningTime="2025-12-03 22:11:16.528805259 +0000 UTC m=+41.748577276" watchObservedRunningTime="2025-12-03 22:11:16.578226502 +0000 UTC m=+41.797998539" Dec 03 22:11:17.527817 master-0 kubenswrapper[36504]: I1203 22:11:17.527745 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:11:17.528942 master-0 kubenswrapper[36504]: I1203 22:11:17.528719 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerStarted","Data":"3de9ee4eef380bab89092626bf758dffa1d7cea9d48eda9514ff804a4488d430"} Dec 03 22:11:17.528942 master-0 kubenswrapper[36504]: I1203 22:11:17.528757 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerStarted","Data":"68a7927c51918fe5c4fb8a44357b8b8424076604ed97acaf1ce5cef2372ca6b9"} Dec 03 22:11:18.535230 master-0 kubenswrapper[36504]: I1203 22:11:18.535157 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:11:19.943372 master-0 kubenswrapper[36504]: I1203 22:11:19.943317 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-67dbd4669-gxlrj"] Dec 03 22:11:20.462277 master-0 kubenswrapper[36504]: I1203 22:11:20.462205 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:20.463040 master-0 kubenswrapper[36504]: I1203 22:11:20.463006 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:20.699584 master-0 kubenswrapper[36504]: I1203 22:11:20.694335 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerStarted","Data":"56a2cd13eb7b79c9802629a342435d6c6c35687e7a5e6dc330c18e9a98072842"} Dec 03 22:11:20.710386 master-0 kubenswrapper[36504]: I1203 22:11:20.710320 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerStarted","Data":"f0eddaec065f93226a58ae33d3b8000bdd340ba29b6a91fbb8390d9217f4bb8d"} Dec 03 22:11:20.710550 master-0 kubenswrapper[36504]: I1203 22:11:20.710402 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerStarted","Data":"ebedcfc581bb8d540725d068b2c227219da0049c7a1c3f30a5a0fa6d8ae719ff"} Dec 03 22:11:20.710550 master-0 kubenswrapper[36504]: I1203 22:11:20.710419 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerStarted","Data":"d398d1215dba19fc46ec829344f96609f1847f0ad18f4133a39646bfb735fabd"} Dec 03 22:11:21.722288 master-0 kubenswrapper[36504]: I1203 22:11:21.722221 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerStarted","Data":"84f355f08fea0ece11dcec4bc171a070cc9b92e6f0db0c9ffcfecf32bd0c5675"} Dec 03 22:11:21.722288 master-0 kubenswrapper[36504]: I1203 22:11:21.722293 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerStarted","Data":"acc4a3ec94f5519f67f3c9eb71620d6662bb73370e9b401bf3c6e1cfd3e2e591"} Dec 03 22:11:21.722885 master-0 kubenswrapper[36504]: I1203 22:11:21.722309 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerStarted","Data":"07b2946b0d38203c0e82d06430687c3d314865f4239cf760f29db65f0176187e"} Dec 03 22:11:21.759223 master-0 kubenswrapper[36504]: I1203 22:11:21.759143 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=11.530577663999999 podStartE2EDuration="16.759121254s" podCreationTimestamp="2025-12-03 22:11:05 +0000 UTC" firstStartedPulling="2025-12-03 22:11:10.383084533 +0000 UTC m=+35.602856540" lastFinishedPulling="2025-12-03 22:11:15.611628123 +0000 UTC m=+40.831400130" observedRunningTime="2025-12-03 
22:11:20.738236689 +0000 UTC m=+45.958008696" watchObservedRunningTime="2025-12-03 22:11:21.759121254 +0000 UTC m=+46.978893261" Dec 03 22:11:21.759528 master-0 kubenswrapper[36504]: I1203 22:11:21.759495 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=6.270917026 podStartE2EDuration="12.759491096s" podCreationTimestamp="2025-12-03 22:11:09 +0000 UTC" firstStartedPulling="2025-12-03 22:11:13.447108364 +0000 UTC m=+38.666880371" lastFinishedPulling="2025-12-03 22:11:19.935682434 +0000 UTC m=+45.155454441" observedRunningTime="2025-12-03 22:11:21.753103445 +0000 UTC m=+46.972875472" watchObservedRunningTime="2025-12-03 22:11:21.759491096 +0000 UTC m=+46.979263103" Dec 03 22:11:22.734424 master-0 kubenswrapper[36504]: I1203 22:11:22.734354 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" event={"ID":"7d985732-3cd0-4aff-8050-37330d01507a","Type":"ContainerStarted","Data":"794c9e6ecc5560cf0401aa0a4ba5348b3fea5003687d288d4c9525259e576067"} Dec 03 22:11:22.760929 master-0 kubenswrapper[36504]: I1203 22:11:22.760814 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" podStartSLOduration=4.615500383 podStartE2EDuration="10.760792476s" podCreationTimestamp="2025-12-03 22:11:12 +0000 UTC" firstStartedPulling="2025-12-03 22:11:16.119090411 +0000 UTC m=+41.338862418" lastFinishedPulling="2025-12-03 22:11:22.264382494 +0000 UTC m=+47.484154511" observedRunningTime="2025-12-03 22:11:22.75775188 +0000 UTC m=+47.977523897" watchObservedRunningTime="2025-12-03 22:11:22.760792476 +0000 UTC m=+47.980564483" Dec 03 22:11:23.610276 master-0 kubenswrapper[36504]: I1203 22:11:23.610172 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:23.615363 master-0 kubenswrapper[36504]: I1203 22:11:23.615320 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:24.385343 master-0 kubenswrapper[36504]: I1203 22:11:24.384964 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-567f65dc5d-rbkjf"] Dec 03 22:11:24.385972 master-0 kubenswrapper[36504]: I1203 22:11:24.385809 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.388019 master-0 kubenswrapper[36504]: I1203 22:11:24.387963 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 22:11:24.388188 master-0 kubenswrapper[36504]: I1203 22:11:24.388167 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 22:11:24.388363 master-0 kubenswrapper[36504]: I1203 22:11:24.388191 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-7rlph" Dec 03 22:11:24.388363 master-0 kubenswrapper[36504]: I1203 22:11:24.388213 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 22:11:24.388363 master-0 kubenswrapper[36504]: I1203 22:11:24.388320 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 22:11:24.388866 master-0 kubenswrapper[36504]: I1203 22:11:24.388714 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 22:11:24.416179 master-0 kubenswrapper[36504]: I1203 22:11:24.416135 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-567f65dc5d-rbkjf"] Dec 03 22:11:24.435332 master-0 kubenswrapper[36504]: I1203 22:11:24.435295 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-oauth-serving-cert\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.435488 master-0 kubenswrapper[36504]: I1203 22:11:24.435340 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-serving-cert\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.435488 master-0 kubenswrapper[36504]: I1203 22:11:24.435373 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxzwx\" (UniqueName: \"kubernetes.io/projected/01c52731-571d-4eaf-9282-6d662b25102e-kube-api-access-qxzwx\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.435488 master-0 kubenswrapper[36504]: I1203 22:11:24.435398 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-service-ca\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.435580 master-0 kubenswrapper[36504]: I1203 22:11:24.435509 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-console-config\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.435580 master-0 
kubenswrapper[36504]: I1203 22:11:24.435531 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-oauth-config\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.539183 master-0 kubenswrapper[36504]: I1203 22:11:24.539047 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-service-ca\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.539401 master-0 kubenswrapper[36504]: I1203 22:11:24.539342 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-console-config\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.539401 master-0 kubenswrapper[36504]: I1203 22:11:24.539383 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-oauth-config\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.539598 master-0 kubenswrapper[36504]: I1203 22:11:24.539479 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-oauth-serving-cert\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.539598 master-0 kubenswrapper[36504]: I1203 22:11:24.539525 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-serving-cert\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.539598 master-0 kubenswrapper[36504]: I1203 22:11:24.539582 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxzwx\" (UniqueName: \"kubernetes.io/projected/01c52731-571d-4eaf-9282-6d662b25102e-kube-api-access-qxzwx\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.540457 master-0 kubenswrapper[36504]: I1203 22:11:24.540403 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-service-ca\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.540531 master-0 kubenswrapper[36504]: I1203 22:11:24.540498 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-oauth-serving-cert\") pod \"console-567f65dc5d-rbkjf\" (UID: 
\"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.541782 master-0 kubenswrapper[36504]: I1203 22:11:24.541702 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-console-config\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.543178 master-0 kubenswrapper[36504]: I1203 22:11:24.543156 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-oauth-config\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.553938 master-0 kubenswrapper[36504]: I1203 22:11:24.547147 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-serving-cert\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.558793 master-0 kubenswrapper[36504]: I1203 22:11:24.558645 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxzwx\" (UniqueName: \"kubernetes.io/projected/01c52731-571d-4eaf-9282-6d662b25102e-kube-api-access-qxzwx\") pod \"console-567f65dc5d-rbkjf\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:24.736991 master-0 kubenswrapper[36504]: I1203 22:11:24.736831 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:25.262163 master-0 kubenswrapper[36504]: I1203 22:11:25.262085 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:11:25.375199 master-0 kubenswrapper[36504]: I1203 22:11:25.375077 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-567f65dc5d-rbkjf"] Dec 03 22:11:25.390520 master-0 kubenswrapper[36504]: W1203 22:11:25.389789 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c52731_571d_4eaf_9282_6d662b25102e.slice/crio-5b2c868b07667944d0bf06d1b1d51cfe486835ba0bf37538466727b63e6ed24e WatchSource:0}: Error finding container 5b2c868b07667944d0bf06d1b1d51cfe486835ba0bf37538466727b63e6ed24e: Status 404 returned error can't find the container with id 5b2c868b07667944d0bf06d1b1d51cfe486835ba0bf37538466727b63e6ed24e Dec 03 22:11:25.761467 master-0 kubenswrapper[36504]: I1203 22:11:25.761330 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567f65dc5d-rbkjf" event={"ID":"01c52731-571d-4eaf-9282-6d662b25102e","Type":"ContainerStarted","Data":"5b2c868b07667944d0bf06d1b1d51cfe486835ba0bf37538466727b63e6ed24e"} Dec 03 22:11:30.823463 master-0 kubenswrapper[36504]: I1203 22:11:30.823332 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567f65dc5d-rbkjf" event={"ID":"01c52731-571d-4eaf-9282-6d662b25102e","Type":"ContainerStarted","Data":"37738bf3ff90188b706f6cd5519505fb6ffa14fcd942677e11d536507841aed9"} Dec 03 22:11:30.906632 master-0 kubenswrapper[36504]: I1203 22:11:30.906542 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-567f65dc5d-rbkjf" podStartSLOduration=1.82333796 podStartE2EDuration="6.906525s" podCreationTimestamp="2025-12-03 22:11:24 +0000 UTC" firstStartedPulling="2025-12-03 22:11:25.392794858 +0000 UTC m=+50.612566855" lastFinishedPulling="2025-12-03 22:11:30.475981878 +0000 UTC m=+55.695753895" observedRunningTime="2025-12-03 22:11:30.904814756 +0000 UTC m=+56.124586783" watchObservedRunningTime="2025-12-03 22:11:30.906525 +0000 UTC m=+56.126297007" Dec 03 22:11:32.319250 master-0 kubenswrapper[36504]: I1203 22:11:32.319131 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5588d9469d-jxz4w"] Dec 03 22:11:32.320689 master-0 kubenswrapper[36504]: I1203 22:11:32.320643 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.349886 master-0 kubenswrapper[36504]: I1203 22:11:32.349819 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 22:11:32.350942 master-0 kubenswrapper[36504]: I1203 22:11:32.350844 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5588d9469d-jxz4w"] Dec 03 22:11:32.414238 master-0 kubenswrapper[36504]: I1203 22:11:32.414165 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-service-ca\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.414656 master-0 kubenswrapper[36504]: I1203 22:11:32.414251 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-serving-cert\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.414656 master-0 kubenswrapper[36504]: I1203 22:11:32.414320 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-trusted-ca-bundle\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.414656 master-0 kubenswrapper[36504]: I1203 22:11:32.414380 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwvr\" (UniqueName: \"kubernetes.io/projected/45869470-a94b-4424-a580-3bb4d9e0c675-kube-api-access-vcwvr\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.414656 master-0 kubenswrapper[36504]: I1203 22:11:32.414501 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-console-config\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.414911 master-0 kubenswrapper[36504]: I1203 22:11:32.414850 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-oauth-config\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.414911 master-0 kubenswrapper[36504]: I1203 22:11:32.414884 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-oauth-serving-cert\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.516297 master-0 kubenswrapper[36504]: I1203 22:11:32.516222 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-oauth-config\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.516297 master-0 kubenswrapper[36504]: I1203 22:11:32.516287 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-oauth-serving-cert\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.516673 master-0 kubenswrapper[36504]: I1203 22:11:32.516334 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-service-ca\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.516673 master-0 kubenswrapper[36504]: I1203 22:11:32.516358 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-serving-cert\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.516673 master-0 kubenswrapper[36504]: I1203 22:11:32.516384 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-trusted-ca-bundle\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.516673 master-0 kubenswrapper[36504]: I1203 22:11:32.516417 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcwvr\" (UniqueName: \"kubernetes.io/projected/45869470-a94b-4424-a580-3bb4d9e0c675-kube-api-access-vcwvr\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.516673 master-0 kubenswrapper[36504]: I1203 22:11:32.516448 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-console-config\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.518399 master-0 kubenswrapper[36504]: I1203 22:11:32.518361 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-oauth-serving-cert\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.519748 master-0 kubenswrapper[36504]: I1203 22:11:32.518718 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-service-ca\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 
22:11:32.519748 master-0 kubenswrapper[36504]: I1203 22:11:32.519407 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-console-config\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.519748 master-0 kubenswrapper[36504]: I1203 22:11:32.519669 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-trusted-ca-bundle\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.521181 master-0 kubenswrapper[36504]: I1203 22:11:32.521131 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-serving-cert\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.523168 master-0 kubenswrapper[36504]: I1203 22:11:32.523108 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-oauth-config\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.541731 master-0 kubenswrapper[36504]: I1203 22:11:32.541617 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcwvr\" (UniqueName: \"kubernetes.io/projected/45869470-a94b-4424-a580-3bb4d9e0c675-kube-api-access-vcwvr\") pod \"console-5588d9469d-jxz4w\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:32.660031 master-0 kubenswrapper[36504]: I1203 22:11:32.659861 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:33.470963 master-0 kubenswrapper[36504]: I1203 22:11:33.461058 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5588d9469d-jxz4w"] Dec 03 22:11:33.470963 master-0 kubenswrapper[36504]: W1203 22:11:33.463964 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45869470_a94b_4424_a580_3bb4d9e0c675.slice/crio-d2e070e2ff15917e1d78b349c90ca91d64b28b2a86b2b1426d51bccbf65b6ed1 WatchSource:0}: Error finding container d2e070e2ff15917e1d78b349c90ca91d64b28b2a86b2b1426d51bccbf65b6ed1: Status 404 returned error can't find the container with id d2e070e2ff15917e1d78b349c90ca91d64b28b2a86b2b1426d51bccbf65b6ed1 Dec 03 22:11:33.856117 master-0 kubenswrapper[36504]: I1203 22:11:33.855963 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5588d9469d-jxz4w" event={"ID":"45869470-a94b-4424-a580-3bb4d9e0c675","Type":"ContainerStarted","Data":"0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2"} Dec 03 22:11:33.856117 master-0 kubenswrapper[36504]: I1203 22:11:33.856016 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5588d9469d-jxz4w" event={"ID":"45869470-a94b-4424-a580-3bb4d9e0c675","Type":"ContainerStarted","Data":"d2e070e2ff15917e1d78b349c90ca91d64b28b2a86b2b1426d51bccbf65b6ed1"} Dec 03 22:11:33.902344 master-0 kubenswrapper[36504]: I1203 22:11:33.902239 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5588d9469d-jxz4w" podStartSLOduration=1.902214912 podStartE2EDuration="1.902214912s" podCreationTimestamp="2025-12-03 22:11:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:11:33.897548765 +0000 UTC m=+59.117320782" watchObservedRunningTime="2025-12-03 22:11:33.902214912 +0000 UTC m=+59.121986919" Dec 03 22:11:34.737807 master-0 kubenswrapper[36504]: I1203 22:11:34.737327 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:34.737807 master-0 kubenswrapper[36504]: I1203 22:11:34.737383 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:11:34.738419 master-0 kubenswrapper[36504]: I1203 22:11:34.738204 36504 patch_prober.go:28] interesting pod/console-567f65dc5d-rbkjf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Dec 03 22:11:34.738419 master-0 kubenswrapper[36504]: I1203 22:11:34.738249 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-567f65dc5d-rbkjf" podUID="01c52731-571d-4eaf-9282-6d662b25102e" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Dec 03 22:11:35.091244 master-0 kubenswrapper[36504]: I1203 22:11:35.091053 36504 scope.go:117] "RemoveContainer" containerID="ebd2204fe249e4ff0cd7edb5edf639c7a4971288017457fe2c22236d90e0112a" Dec 03 22:11:35.131980 master-0 kubenswrapper[36504]: I1203 22:11:35.131926 36504 scope.go:117] "RemoveContainer" 
containerID="e6e266ea8a0d197e496699cd337eabb6f247b692621f965213e4454b3e59b018" Dec 03 22:11:35.154038 master-0 kubenswrapper[36504]: I1203 22:11:35.153989 36504 scope.go:117] "RemoveContainer" containerID="70791961c08656ffa1017f6f966c1f2ba603d16610929ed1a69c881343d1bfec" Dec 03 22:11:35.177949 master-0 kubenswrapper[36504]: I1203 22:11:35.177905 36504 scope.go:117] "RemoveContainer" containerID="6c639bc69c380d96355863b219dedf37b52fea28072ae5913704ecc349c5d8c1" Dec 03 22:11:35.691172 master-0 kubenswrapper[36504]: E1203 22:11:35.691120 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:11:40.124101 master-0 kubenswrapper[36504]: I1203 22:11:40.123927 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Dec 03 22:11:40.128159 master-0 kubenswrapper[36504]: I1203 22:11:40.128104 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.130515 master-0 kubenswrapper[36504]: I1203 22:11:40.130480 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Dec 03 22:11:40.132367 master-0 kubenswrapper[36504]: I1203 22:11:40.132325 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-wkdbt" Dec 03 22:11:40.132557 master-0 kubenswrapper[36504]: I1203 22:11:40.132529 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 22:11:40.244962 master-0 kubenswrapper[36504]: I1203 22:11:40.244912 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.245757 master-0 kubenswrapper[36504]: I1203 22:11:40.245255 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17294011-9784-4261-89e6-e60d36aecf74-kube-api-access\") pod \"installer-6-master-0\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.245757 master-0 kubenswrapper[36504]: I1203 22:11:40.245330 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-var-lock\") pod \"installer-6-master-0\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.347177 master-0 kubenswrapper[36504]: I1203 22:11:40.347100 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " 
pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.347177 master-0 kubenswrapper[36504]: I1203 22:11:40.347181 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17294011-9784-4261-89e6-e60d36aecf74-kube-api-access\") pod \"installer-6-master-0\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.347511 master-0 kubenswrapper[36504]: I1203 22:11:40.347238 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-var-lock\") pod \"installer-6-master-0\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.347511 master-0 kubenswrapper[36504]: I1203 22:11:40.347359 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-var-lock\") pod \"installer-6-master-0\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.347511 master-0 kubenswrapper[36504]: I1203 22:11:40.347405 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-kubelet-dir\") pod \"installer-6-master-0\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.364566 master-0 kubenswrapper[36504]: I1203 22:11:40.364530 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17294011-9784-4261-89e6-e60d36aecf74-kube-api-access\") pod \"installer-6-master-0\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.449816 master-0 kubenswrapper[36504]: I1203 22:11:40.449662 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:40.468375 master-0 kubenswrapper[36504]: I1203 22:11:40.468216 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:40.472280 master-0 kubenswrapper[36504]: I1203 22:11:40.472237 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" Dec 03 22:11:42.660490 master-0 kubenswrapper[36504]: I1203 22:11:42.660416 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:42.661425 master-0 kubenswrapper[36504]: I1203 22:11:42.661393 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:11:42.664105 master-0 kubenswrapper[36504]: I1203 22:11:42.664072 36504 patch_prober.go:28] interesting pod/console-5588d9469d-jxz4w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Dec 03 22:11:42.664187 master-0 kubenswrapper[36504]: I1203 22:11:42.664114 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5588d9469d-jxz4w" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Dec 03 22:11:44.426527 master-0 kubenswrapper[36504]: I1203 22:11:44.426434 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:11:44.433034 master-0 kubenswrapper[36504]: I1203 22:11:44.432986 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " pod="openshift-kube-apiserver/installer-5-master-0" Dec 03 22:11:44.527992 master-0 kubenswrapper[36504]: I1203 22:11:44.527933 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") pod \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\" (UID: \"5c8c7291-3150-46a5-9d14-57a23bb51cc0\") " Dec 03 22:11:44.532689 master-0 kubenswrapper[36504]: I1203 22:11:44.532602 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c8c7291-3150-46a5-9d14-57a23bb51cc0" (UID: "5c8c7291-3150-46a5-9d14-57a23bb51cc0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:44.629541 master-0 kubenswrapper[36504]: I1203 22:11:44.629463 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c8c7291-3150-46a5-9d14-57a23bb51cc0-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:44.738203 master-0 kubenswrapper[36504]: I1203 22:11:44.738060 36504 patch_prober.go:28] interesting pod/console-567f65dc5d-rbkjf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Dec 03 22:11:44.738203 master-0 kubenswrapper[36504]: I1203 22:11:44.738134 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-567f65dc5d-rbkjf" podUID="01c52731-571d-4eaf-9282-6d662b25102e" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Dec 03 22:11:47.759955 master-0 kubenswrapper[36504]: I1203 22:11:47.759838 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" podUID="7d985732-3cd0-4aff-8050-37330d01507a" containerName="oauth-openshift" containerID="cri-o://794c9e6ecc5560cf0401aa0a4ba5348b3fea5003687d288d4c9525259e576067" gracePeriod=15 Dec 03 22:11:48.108646 master-0 kubenswrapper[36504]: I1203 22:11:48.108508 36504 generic.go:334] "Generic (PLEG): container finished" podID="7d985732-3cd0-4aff-8050-37330d01507a" containerID="794c9e6ecc5560cf0401aa0a4ba5348b3fea5003687d288d4c9525259e576067" exitCode=0 Dec 03 22:11:48.108646 master-0 kubenswrapper[36504]: I1203 22:11:48.108560 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" event={"ID":"7d985732-3cd0-4aff-8050-37330d01507a","Type":"ContainerDied","Data":"794c9e6ecc5560cf0401aa0a4ba5348b3fea5003687d288d4c9525259e576067"} Dec 03 22:11:52.662869 master-0 kubenswrapper[36504]: I1203 22:11:52.660868 36504 patch_prober.go:28] interesting pod/console-5588d9469d-jxz4w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Dec 03 22:11:52.662869 master-0 kubenswrapper[36504]: I1203 22:11:52.660944 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5588d9469d-jxz4w" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Dec 03 22:11:53.610545 master-0 kubenswrapper[36504]: I1203 22:11:53.610428 36504 patch_prober.go:28] interesting pod/oauth-openshift-67dbd4669-gxlrj container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.128.0.101:6443/healthz\": dial tcp 10.128.0.101:6443: connect: connection refused" start-of-body= Dec 03 22:11:53.610545 master-0 kubenswrapper[36504]: I1203 22:11:53.610518 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" podUID="7d985732-3cd0-4aff-8050-37330d01507a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.128.0.101:6443/healthz\": dial tcp 10.128.0.101:6443: connect: 
connection refused" Dec 03 22:11:54.738205 master-0 kubenswrapper[36504]: I1203 22:11:54.738109 36504 patch_prober.go:28] interesting pod/console-567f65dc5d-rbkjf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" start-of-body= Dec 03 22:11:54.739039 master-0 kubenswrapper[36504]: I1203 22:11:54.738227 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-567f65dc5d-rbkjf" podUID="01c52731-571d-4eaf-9282-6d662b25102e" containerName="console" probeResult="failure" output="Get \"https://10.128.0.103:8443/health\": dial tcp 10.128.0.103:8443: connect: connection refused" Dec 03 22:11:55.709786 master-0 kubenswrapper[36504]: I1203 22:11:55.709696 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Dec 03 22:11:56.668838 master-0 kubenswrapper[36504]: I1203 22:11:56.668395 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-567f65dc5d-rbkjf"] Dec 03 22:11:56.810799 master-0 kubenswrapper[36504]: I1203 22:11:56.809312 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-55b8756475-lz989"] Dec 03 22:11:56.812128 master-0 kubenswrapper[36504]: I1203 22:11:56.811866 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:56.825632 master-0 kubenswrapper[36504]: I1203 22:11:56.825575 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b8756475-lz989"] Dec 03 22:11:56.965615 master-0 kubenswrapper[36504]: I1203 22:11:56.965511 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-config\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:56.965726 master-0 kubenswrapper[36504]: I1203 22:11:56.965664 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-oauth-config\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:56.965797 master-0 kubenswrapper[36504]: I1203 22:11:56.965754 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-oauth-serving-cert\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:56.965859 master-0 kubenswrapper[36504]: I1203 22:11:56.965823 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-trusted-ca-bundle\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:56.965859 master-0 kubenswrapper[36504]: I1203 22:11:56.965847 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-nrh8c\" (UniqueName: \"kubernetes.io/projected/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-kube-api-access-nrh8c\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:56.965957 master-0 kubenswrapper[36504]: I1203 22:11:56.965895 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-serving-cert\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:56.965957 master-0 kubenswrapper[36504]: I1203 22:11:56.965914 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-service-ca\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.066963 master-0 kubenswrapper[36504]: I1203 22:11:57.066903 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-config\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.067174 master-0 kubenswrapper[36504]: I1203 22:11:57.067062 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-oauth-config\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.067174 master-0 kubenswrapper[36504]: I1203 22:11:57.067126 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-oauth-serving-cert\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.067174 master-0 kubenswrapper[36504]: I1203 22:11:57.067152 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-trusted-ca-bundle\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.067282 master-0 kubenswrapper[36504]: I1203 22:11:57.067179 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrh8c\" (UniqueName: \"kubernetes.io/projected/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-kube-api-access-nrh8c\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.067282 master-0 kubenswrapper[36504]: I1203 22:11:57.067215 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-serving-cert\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " 
pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.067282 master-0 kubenswrapper[36504]: I1203 22:11:57.067241 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-service-ca\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.067984 master-0 kubenswrapper[36504]: I1203 22:11:57.067935 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-config\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.068491 master-0 kubenswrapper[36504]: I1203 22:11:57.068463 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-service-ca\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.070121 master-0 kubenswrapper[36504]: I1203 22:11:57.070088 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-oauth-serving-cert\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.071282 master-0 kubenswrapper[36504]: I1203 22:11:57.071245 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-oauth-config\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.075114 master-0 kubenswrapper[36504]: I1203 22:11:57.075054 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-trusted-ca-bundle\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.080583 master-0 kubenswrapper[36504]: I1203 22:11:57.080549 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-serving-cert\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.086908 master-0 kubenswrapper[36504]: I1203 22:11:57.086831 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrh8c\" (UniqueName: \"kubernetes.io/projected/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-kube-api-access-nrh8c\") pod \"console-55b8756475-lz989\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.139752 master-0 kubenswrapper[36504]: I1203 22:11:57.138106 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b8756475-lz989" Dec 03 22:11:57.451576 master-0 kubenswrapper[36504]: I1203 22:11:57.451533 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:57.485950 master-0 kubenswrapper[36504]: I1203 22:11:57.485832 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn"] Dec 03 22:11:57.486227 master-0 kubenswrapper[36504]: E1203 22:11:57.486204 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d985732-3cd0-4aff-8050-37330d01507a" containerName="oauth-openshift" Dec 03 22:11:57.486227 master-0 kubenswrapper[36504]: I1203 22:11:57.486226 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d985732-3cd0-4aff-8050-37330d01507a" containerName="oauth-openshift" Dec 03 22:11:57.486461 master-0 kubenswrapper[36504]: I1203 22:11:57.486438 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d985732-3cd0-4aff-8050-37330d01507a" containerName="oauth-openshift" Dec 03 22:11:57.487160 master-0 kubenswrapper[36504]: I1203 22:11:57.487126 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.517203 master-0 kubenswrapper[36504]: I1203 22:11:57.516837 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn"] Dec 03 22:11:57.528468 master-0 kubenswrapper[36504]: I1203 22:11:57.528422 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Dec 03 22:11:57.583557 master-0 kubenswrapper[36504]: I1203 22:11:57.583457 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-router-certs\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.583557 master-0 kubenswrapper[36504]: I1203 22:11:57.583540 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-provider-selection\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.583557 master-0 kubenswrapper[36504]: I1203 22:11:57.583576 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-serving-cert\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.584077 master-0 kubenswrapper[36504]: I1203 22:11:57.583645 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-error\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.584077 master-0 kubenswrapper[36504]: I1203 22:11:57.583693 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-login\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.584077 master-0 kubenswrapper[36504]: I1203 22:11:57.583746 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-ocp-branding-template\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.584077 master-0 kubenswrapper[36504]: I1203 22:11:57.583830 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-audit-policies\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.584077 master-0 kubenswrapper[36504]: I1203 22:11:57.583885 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-trusted-ca-bundle\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.584077 master-0 kubenswrapper[36504]: I1203 22:11:57.583962 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-session\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.584077 master-0 kubenswrapper[36504]: I1203 22:11:57.584024 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d985732-3cd0-4aff-8050-37330d01507a-audit-dir\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.584077 master-0 kubenswrapper[36504]: I1203 22:11:57.584071 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-cliconfig\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584100 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blncr\" (UniqueName: \"kubernetes.io/projected/7d985732-3cd0-4aff-8050-37330d01507a-kube-api-access-blncr\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584146 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-service-ca\") pod \"7d985732-3cd0-4aff-8050-37330d01507a\" (UID: \"7d985732-3cd0-4aff-8050-37330d01507a\") " Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584390 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/8ad757cc-9780-4081-89ba-415ce2751b01-audit-dir\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584444 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-user-template-login\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584523 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584567 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-router-certs\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584600 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584628 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-audit-policies\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584654 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584684 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-service-ca\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 
22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584709 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-user-template-error\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584748 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584820 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584851 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-session\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.585508 master-0 kubenswrapper[36504]: I1203 22:11:57.584936 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnn8g\" (UniqueName: \"kubernetes.io/projected/8ad757cc-9780-4081-89ba-415ce2751b01-kube-api-access-xnn8g\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.586614 master-0 kubenswrapper[36504]: I1203 22:11:57.586572 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:57.586668 master-0 kubenswrapper[36504]: I1203 22:11:57.586633 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d985732-3cd0-4aff-8050-37330d01507a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:11:57.586703 master-0 kubenswrapper[36504]: I1203 22:11:57.586654 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:57.587022 master-0 kubenswrapper[36504]: I1203 22:11:57.586958 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:57.587214 master-0 kubenswrapper[36504]: I1203 22:11:57.587113 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:57.587547 master-0 kubenswrapper[36504]: I1203 22:11:57.587504 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:11:57.588577 master-0 kubenswrapper[36504]: I1203 22:11:57.588539 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:57.588836 master-0 kubenswrapper[36504]: I1203 22:11:57.588750 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:57.589746 master-0 kubenswrapper[36504]: I1203 22:11:57.589711 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:57.589937 master-0 kubenswrapper[36504]: I1203 22:11:57.589906 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:57.589987 master-0 kubenswrapper[36504]: I1203 22:11:57.589951 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:57.589987 master-0 kubenswrapper[36504]: I1203 22:11:57.589977 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:11:57.590492 master-0 kubenswrapper[36504]: I1203 22:11:57.590423 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d985732-3cd0-4aff-8050-37330d01507a-kube-api-access-blncr" (OuterVolumeSpecName: "kube-api-access-blncr") pod "7d985732-3cd0-4aff-8050-37330d01507a" (UID: "7d985732-3cd0-4aff-8050-37330d01507a"). InnerVolumeSpecName "kube-api-access-blncr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:57.686380 master-0 kubenswrapper[36504]: I1203 22:11:57.686307 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ad757cc-9780-4081-89ba-415ce2751b01-audit-dir\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686407 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-user-template-login\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686446 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686487 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-router-certs\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686486 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ad757cc-9780-4081-89ba-415ce2751b01-audit-dir\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686523 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686602 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-audit-policies\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686646 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " 
pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686703 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-service-ca\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686760 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-user-template-error\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686875 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.686984 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687028 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-session\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687093 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnn8g\" (UniqueName: \"kubernetes.io/projected/8ad757cc-9780-4081-89ba-415ce2751b01-kube-api-access-xnn8g\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687254 36504 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687278 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blncr\" (UniqueName: \"kubernetes.io/projected/7d985732-3cd0-4aff-8050-37330d01507a-kube-api-access-blncr\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687301 36504 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687323 36504 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687344 36504 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687367 36504 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687389 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-audit-policies\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687392 36504 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687429 36504 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-audit-policies\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687443 36504 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687453 36504 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687465 36504 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687475 36504 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7d985732-3cd0-4aff-8050-37330d01507a-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: 
I1203 22:11:57.687485 36504 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d985732-3cd0-4aff-8050-37330d01507a-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687532 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-service-ca\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687995 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.687992 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.690466 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-user-template-login\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.691509 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.692096 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-session\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.694095 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.694371 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-router-certs\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.694493 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b8756475-lz989"] Dec 03 22:11:57.696230 master-0 kubenswrapper[36504]: I1203 22:11:57.694893 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.698407 master-0 kubenswrapper[36504]: I1203 22:11:57.697017 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8ad757cc-9780-4081-89ba-415ce2751b01-v4-0-config-user-template-error\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.701209 master-0 kubenswrapper[36504]: W1203 22:11:57.701160 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda15b7b7c_48b5_417b_8f41_58306b2f0e9a.slice/crio-b0579da37a183fbc7b3196e6ef131d8b2fe0fcb480f346328b7ae5fd8a70ab60 WatchSource:0}: Error finding container b0579da37a183fbc7b3196e6ef131d8b2fe0fcb480f346328b7ae5fd8a70ab60: Status 404 returned error can't find the container with id b0579da37a183fbc7b3196e6ef131d8b2fe0fcb480f346328b7ae5fd8a70ab60 Dec 03 22:11:57.703484 master-0 kubenswrapper[36504]: I1203 22:11:57.703380 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnn8g\" (UniqueName: \"kubernetes.io/projected/8ad757cc-9780-4081-89ba-415ce2751b01-kube-api-access-xnn8g\") pod \"oauth-openshift-65b5c9bf6b-s2qsn\" (UID: \"8ad757cc-9780-4081-89ba-415ce2751b01\") " pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:57.822899 master-0 kubenswrapper[36504]: I1203 22:11:57.819372 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:58.198105 master-0 kubenswrapper[36504]: I1203 22:11:58.197988 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6f5db8559b-xzb5p" event={"ID":"716a95ab-7d31-41ac-ad14-f756f17b3179","Type":"ContainerStarted","Data":"097b39ee1ca11cd33527a661491b5776628f9841ad5bc743bdd04fae4722ff81"} Dec 03 22:11:58.198442 master-0 kubenswrapper[36504]: I1203 22:11:58.198200 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-6f5db8559b-xzb5p" Dec 03 22:11:58.199957 master-0 kubenswrapper[36504]: I1203 22:11:58.199899 36504 patch_prober.go:28] interesting pod/downloads-6f5db8559b-xzb5p container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.102:8080/\": dial tcp 10.128.0.102:8080: connect: connection refused" start-of-body= Dec 03 22:11:58.199957 master-0 kubenswrapper[36504]: I1203 22:11:58.199941 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6f5db8559b-xzb5p" podUID="716a95ab-7d31-41ac-ad14-f756f17b3179" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.102:8080/\": dial tcp 10.128.0.102:8080: connect: connection refused" Dec 03 22:11:58.210542 master-0 kubenswrapper[36504]: I1203 22:11:58.210422 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" event={"ID":"7d985732-3cd0-4aff-8050-37330d01507a","Type":"ContainerDied","Data":"31c3c6d2ee557d30a0fe4fb166e592600bd64c5ff2040e73591b3be57a3494c4"} Dec 03 22:11:58.210729 master-0 kubenswrapper[36504]: I1203 22:11:58.210560 36504 scope.go:117] "RemoveContainer" containerID="794c9e6ecc5560cf0401aa0a4ba5348b3fea5003687d288d4c9525259e576067" Dec 03 22:11:58.210729 master-0 kubenswrapper[36504]: I1203 22:11:58.210466 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-67dbd4669-gxlrj" Dec 03 22:11:58.219708 master-0 kubenswrapper[36504]: I1203 22:11:58.219595 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6f5db8559b-xzb5p" podStartSLOduration=2.477428125 podStartE2EDuration="45.219570916s" podCreationTimestamp="2025-12-03 22:11:13 +0000 UTC" firstStartedPulling="2025-12-03 22:11:14.424368658 +0000 UTC m=+39.644140665" lastFinishedPulling="2025-12-03 22:11:57.166511449 +0000 UTC m=+82.386283456" observedRunningTime="2025-12-03 22:11:58.214856358 +0000 UTC m=+83.434628395" watchObservedRunningTime="2025-12-03 22:11:58.219570916 +0000 UTC m=+83.439342923" Dec 03 22:11:58.225547 master-0 kubenswrapper[36504]: I1203 22:11:58.225482 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"17294011-9784-4261-89e6-e60d36aecf74","Type":"ContainerStarted","Data":"fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3"} Dec 03 22:11:58.225547 master-0 kubenswrapper[36504]: I1203 22:11:58.225541 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"17294011-9784-4261-89e6-e60d36aecf74","Type":"ContainerStarted","Data":"9632a5acc2d3c4d692e6a513e7c32cfa5632db0898fe0d3ed86806b694a2e7e2"} Dec 03 22:11:58.225830 master-0 kubenswrapper[36504]: I1203 22:11:58.225674 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-6-master-0" podUID="17294011-9784-4261-89e6-e60d36aecf74" containerName="installer" containerID="cri-o://fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3" gracePeriod=30 Dec 03 22:11:58.229697 master-0 kubenswrapper[36504]: I1203 22:11:58.229633 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b8756475-lz989" event={"ID":"a15b7b7c-48b5-417b-8f41-58306b2f0e9a","Type":"ContainerStarted","Data":"4f12e1f86dff5c2700df5dddb011aaeafcef35ccf1913bff972c8c1e62e3eb1c"} Dec 03 22:11:58.229697 master-0 kubenswrapper[36504]: I1203 22:11:58.229696 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b8756475-lz989" event={"ID":"a15b7b7c-48b5-417b-8f41-58306b2f0e9a","Type":"ContainerStarted","Data":"b0579da37a183fbc7b3196e6ef131d8b2fe0fcb480f346328b7ae5fd8a70ab60"} Dec 03 22:11:58.252973 master-0 kubenswrapper[36504]: I1203 22:11:58.252873 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-6-master-0" podStartSLOduration=18.252847932 podStartE2EDuration="18.252847932s" podCreationTimestamp="2025-12-03 22:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:11:58.250267031 +0000 UTC m=+83.470039038" watchObservedRunningTime="2025-12-03 22:11:58.252847932 +0000 UTC m=+83.472619939" Dec 03 22:11:58.271699 master-0 kubenswrapper[36504]: I1203 22:11:58.271617 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn"] Dec 03 22:11:58.292626 master-0 kubenswrapper[36504]: I1203 22:11:58.292553 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55b8756475-lz989" podStartSLOduration=2.292535859 podStartE2EDuration="2.292535859s" podCreationTimestamp="2025-12-03 22:11:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:11:58.277752236 +0000 UTC m=+83.497524243" watchObservedRunningTime="2025-12-03 22:11:58.292535859 +0000 UTC m=+83.512307866" Dec 03 22:11:58.302611 master-0 kubenswrapper[36504]: I1203 22:11:58.300313 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-67dbd4669-gxlrj"] Dec 03 22:11:58.302611 master-0 kubenswrapper[36504]: W1203 22:11:58.301307 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad757cc_9780_4081_89ba_415ce2751b01.slice/crio-888cfafc9b8754c1058433ae069f7d2889fde00db45adcb9bd826d11e7474dbb WatchSource:0}: Error finding container 888cfafc9b8754c1058433ae069f7d2889fde00db45adcb9bd826d11e7474dbb: Status 404 returned error can't find the container with id 888cfafc9b8754c1058433ae069f7d2889fde00db45adcb9bd826d11e7474dbb Dec 03 22:11:58.303111 master-0 kubenswrapper[36504]: I1203 22:11:58.302810 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-67dbd4669-gxlrj"] Dec 03 22:11:58.671160 master-0 kubenswrapper[36504]: I1203 22:11:58.671111 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-6-master-0_17294011-9784-4261-89e6-e60d36aecf74/installer/0.log" Dec 03 22:11:58.671553 master-0 kubenswrapper[36504]: I1203 22:11:58.671191 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:58.806937 master-0 kubenswrapper[36504]: I1203 22:11:58.806739 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17294011-9784-4261-89e6-e60d36aecf74-kube-api-access\") pod \"17294011-9784-4261-89e6-e60d36aecf74\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " Dec 03 22:11:58.806937 master-0 kubenswrapper[36504]: I1203 22:11:58.806925 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-var-lock\") pod \"17294011-9784-4261-89e6-e60d36aecf74\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " Dec 03 22:11:58.807614 master-0 kubenswrapper[36504]: I1203 22:11:58.806991 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-kubelet-dir\") pod \"17294011-9784-4261-89e6-e60d36aecf74\" (UID: \"17294011-9784-4261-89e6-e60d36aecf74\") " Dec 03 22:11:58.807614 master-0 kubenswrapper[36504]: I1203 22:11:58.807031 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-var-lock" (OuterVolumeSpecName: "var-lock") pod "17294011-9784-4261-89e6-e60d36aecf74" (UID: "17294011-9784-4261-89e6-e60d36aecf74"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:11:58.807614 master-0 kubenswrapper[36504]: I1203 22:11:58.807232 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "17294011-9784-4261-89e6-e60d36aecf74" (UID: "17294011-9784-4261-89e6-e60d36aecf74"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:11:58.807614 master-0 kubenswrapper[36504]: I1203 22:11:58.807341 36504 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:58.810784 master-0 kubenswrapper[36504]: I1203 22:11:58.810713 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17294011-9784-4261-89e6-e60d36aecf74-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "17294011-9784-4261-89e6-e60d36aecf74" (UID: "17294011-9784-4261-89e6-e60d36aecf74"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:11:58.869817 master-0 kubenswrapper[36504]: I1203 22:11:58.869708 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-7-master-0"] Dec 03 22:11:58.870161 master-0 kubenswrapper[36504]: E1203 22:11:58.870131 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17294011-9784-4261-89e6-e60d36aecf74" containerName="installer" Dec 03 22:11:58.870161 master-0 kubenswrapper[36504]: I1203 22:11:58.870152 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="17294011-9784-4261-89e6-e60d36aecf74" containerName="installer" Dec 03 22:11:58.870340 master-0 kubenswrapper[36504]: I1203 22:11:58.870299 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="17294011-9784-4261-89e6-e60d36aecf74" containerName="installer" Dec 03 22:11:58.871145 master-0 kubenswrapper[36504]: I1203 22:11:58.871091 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:58.898861 master-0 kubenswrapper[36504]: I1203 22:11:58.898035 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-7-master-0"] Dec 03 22:11:58.912034 master-0 kubenswrapper[36504]: I1203 22:11:58.911979 36504 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17294011-9784-4261-89e6-e60d36aecf74-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:58.912252 master-0 kubenswrapper[36504]: I1203 22:11:58.912030 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17294011-9784-4261-89e6-e60d36aecf74-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:11:59.013512 master-0 kubenswrapper[36504]: I1203 22:11:59.013438 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kube-api-access\") pod \"installer-7-master-0\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.013847 master-0 kubenswrapper[36504]: I1203 22:11:59.013683 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-var-lock\") pod \"installer-7-master-0\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.013847 master-0 kubenswrapper[36504]: I1203 22:11:59.013735 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.104608 master-0 kubenswrapper[36504]: I1203 22:11:59.104471 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d985732-3cd0-4aff-8050-37330d01507a" path="/var/lib/kubelet/pods/7d985732-3cd0-4aff-8050-37330d01507a/volumes" Dec 03 22:11:59.115681 master-0 kubenswrapper[36504]: I1203 22:11:59.115625 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kube-api-access\") pod \"installer-7-master-0\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.115840 master-0 kubenswrapper[36504]: I1203 22:11:59.115702 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-var-lock\") pod \"installer-7-master-0\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.115840 master-0 kubenswrapper[36504]: I1203 22:11:59.115722 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.115943 master-0 kubenswrapper[36504]: I1203 22:11:59.115845 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kubelet-dir\") pod \"installer-7-master-0\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.115943 master-0 kubenswrapper[36504]: I1203 22:11:59.115854 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-var-lock\") pod \"installer-7-master-0\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.139320 master-0 kubenswrapper[36504]: I1203 22:11:59.139268 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kube-api-access\") pod \"installer-7-master-0\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.213802 master-0 kubenswrapper[36504]: I1203 22:11:59.213729 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:11:59.243042 master-0 kubenswrapper[36504]: I1203 22:11:59.242991 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" event={"ID":"8ad757cc-9780-4081-89ba-415ce2751b01","Type":"ContainerStarted","Data":"ddf2a5998a69e6e648424c8c6904bc0072dc9a73dea664dcc90fbb8f1ae5577e"} Dec 03 22:11:59.243042 master-0 kubenswrapper[36504]: I1203 22:11:59.243029 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" event={"ID":"8ad757cc-9780-4081-89ba-415ce2751b01","Type":"ContainerStarted","Data":"888cfafc9b8754c1058433ae069f7d2889fde00db45adcb9bd826d11e7474dbb"} Dec 03 22:11:59.243333 master-0 kubenswrapper[36504]: I1203 22:11:59.243315 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:59.244696 master-0 kubenswrapper[36504]: I1203 22:11:59.244664 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-6-master-0_17294011-9784-4261-89e6-e60d36aecf74/installer/0.log" Dec 03 22:11:59.244758 master-0 kubenswrapper[36504]: I1203 22:11:59.244697 36504 generic.go:334] "Generic (PLEG): container finished" podID="17294011-9784-4261-89e6-e60d36aecf74" containerID="fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3" exitCode=2 Dec 03 22:11:59.244758 master-0 kubenswrapper[36504]: I1203 22:11:59.244747 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-6-master-0" Dec 03 22:11:59.244845 master-0 kubenswrapper[36504]: I1203 22:11:59.244786 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"17294011-9784-4261-89e6-e60d36aecf74","Type":"ContainerDied","Data":"fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3"} Dec 03 22:11:59.244878 master-0 kubenswrapper[36504]: I1203 22:11:59.244850 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-6-master-0" event={"ID":"17294011-9784-4261-89e6-e60d36aecf74","Type":"ContainerDied","Data":"9632a5acc2d3c4d692e6a513e7c32cfa5632db0898fe0d3ed86806b694a2e7e2"} Dec 03 22:11:59.244921 master-0 kubenswrapper[36504]: I1203 22:11:59.244877 36504 scope.go:117] "RemoveContainer" containerID="fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3" Dec 03 22:11:59.245577 master-0 kubenswrapper[36504]: I1203 22:11:59.245546 36504 patch_prober.go:28] interesting pod/downloads-6f5db8559b-xzb5p container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.102:8080/\": dial tcp 10.128.0.102:8080: connect: connection refused" start-of-body= Dec 03 22:11:59.245628 master-0 kubenswrapper[36504]: I1203 22:11:59.245582 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6f5db8559b-xzb5p" podUID="716a95ab-7d31-41ac-ad14-f756f17b3179" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.102:8080/\": dial tcp 10.128.0.102:8080: connect: connection refused" Dec 03 22:11:59.259460 master-0 kubenswrapper[36504]: I1203 22:11:59.259417 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" Dec 03 22:11:59.275601 master-0 
kubenswrapper[36504]: I1203 22:11:59.275295 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-65b5c9bf6b-s2qsn" podStartSLOduration=17.275271336 podStartE2EDuration="17.275271336s" podCreationTimestamp="2025-12-03 22:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:11:59.269507595 +0000 UTC m=+84.489279612" watchObservedRunningTime="2025-12-03 22:11:59.275271336 +0000 UTC m=+84.495043343" Dec 03 22:11:59.304698 master-0 kubenswrapper[36504]: I1203 22:11:59.304648 36504 scope.go:117] "RemoveContainer" containerID="fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3" Dec 03 22:11:59.305724 master-0 kubenswrapper[36504]: E1203 22:11:59.305652 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3\": container with ID starting with fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3 not found: ID does not exist" containerID="fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3" Dec 03 22:11:59.305820 master-0 kubenswrapper[36504]: I1203 22:11:59.305735 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3"} err="failed to get container status \"fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3\": rpc error: code = NotFound desc = could not find container \"fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3\": container with ID starting with fabf44d884e21a24269638304bc819e69e032727612322cc52a302a98d3b0fb3 not found: ID does not exist" Dec 03 22:11:59.386854 master-0 kubenswrapper[36504]: I1203 22:11:59.386557 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Dec 03 22:11:59.390922 master-0 kubenswrapper[36504]: I1203 22:11:59.390863 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-6-master-0"] Dec 03 22:11:59.913509 master-0 kubenswrapper[36504]: I1203 22:11:59.913402 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-7-master-0"] Dec 03 22:11:59.920283 master-0 kubenswrapper[36504]: W1203 22:11:59.920231 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e0ffab3_4f70_4bc8_9464_598c0668a4b8.slice/crio-b947da5065bf745ebbd9eaca8b8328c1762ab378324b0943d0c669546a9364fb WatchSource:0}: Error finding container b947da5065bf745ebbd9eaca8b8328c1762ab378324b0943d0c669546a9364fb: Status 404 returned error can't find the container with id b947da5065bf745ebbd9eaca8b8328c1762ab378324b0943d0c669546a9364fb Dec 03 22:12:00.284212 master-0 kubenswrapper[36504]: I1203 22:12:00.284141 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"0e0ffab3-4f70-4bc8-9464-598c0668a4b8","Type":"ContainerStarted","Data":"b947da5065bf745ebbd9eaca8b8328c1762ab378324b0943d0c669546a9364fb"} Dec 03 22:12:01.124001 master-0 kubenswrapper[36504]: I1203 22:12:01.123910 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17294011-9784-4261-89e6-e60d36aecf74" path="/var/lib/kubelet/pods/17294011-9784-4261-89e6-e60d36aecf74/volumes" Dec 03 22:12:01.422187 master-0 
kubenswrapper[36504]: I1203 22:12:01.422013 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"0e0ffab3-4f70-4bc8-9464-598c0668a4b8","Type":"ContainerStarted","Data":"f475b284f57f52fc0c7eb702e9328890537fd74a9dd50bbac04f195c30c41055"} Dec 03 22:12:01.764712 master-0 kubenswrapper[36504]: I1203 22:12:01.764465 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-7-master-0" podStartSLOduration=3.764431928 podStartE2EDuration="3.764431928s" podCreationTimestamp="2025-12-03 22:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:12:01.763992754 +0000 UTC m=+86.983764771" watchObservedRunningTime="2025-12-03 22:12:01.764431928 +0000 UTC m=+86.984203935" Dec 03 22:12:02.661561 master-0 kubenswrapper[36504]: I1203 22:12:02.661483 36504 patch_prober.go:28] interesting pod/console-5588d9469d-jxz4w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Dec 03 22:12:02.661561 master-0 kubenswrapper[36504]: I1203 22:12:02.661550 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5588d9469d-jxz4w" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Dec 03 22:12:03.546681 master-0 kubenswrapper[36504]: I1203 22:12:03.546622 36504 patch_prober.go:28] interesting pod/downloads-6f5db8559b-xzb5p container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.128.0.102:8080/\": dial tcp 10.128.0.102:8080: connect: connection refused" start-of-body= Dec 03 22:12:03.546988 master-0 kubenswrapper[36504]: I1203 22:12:03.546692 36504 patch_prober.go:28] interesting pod/downloads-6f5db8559b-xzb5p container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.102:8080/\": dial tcp 10.128.0.102:8080: connect: connection refused" start-of-body= Dec 03 22:12:03.546988 master-0 kubenswrapper[36504]: I1203 22:12:03.546697 36504 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-6f5db8559b-xzb5p" podUID="716a95ab-7d31-41ac-ad14-f756f17b3179" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.102:8080/\": dial tcp 10.128.0.102:8080: connect: connection refused" Dec 03 22:12:03.546988 master-0 kubenswrapper[36504]: I1203 22:12:03.546783 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6f5db8559b-xzb5p" podUID="716a95ab-7d31-41ac-ad14-f756f17b3179" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.102:8080/\": dial tcp 10.128.0.102:8080: connect: connection refused" Dec 03 22:12:05.223803 master-0 kubenswrapper[36504]: I1203 22:12:05.223699 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:12:05.267890 master-0 kubenswrapper[36504]: I1203 22:12:05.267817 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:12:05.483459 master-0 kubenswrapper[36504]: I1203 22:12:05.483295 36504 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:12:07.139290 master-0 kubenswrapper[36504]: I1203 22:12:07.139218 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-55b8756475-lz989" Dec 03 22:12:07.139949 master-0 kubenswrapper[36504]: I1203 22:12:07.139370 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55b8756475-lz989" Dec 03 22:12:07.141118 master-0 kubenswrapper[36504]: I1203 22:12:07.141068 36504 patch_prober.go:28] interesting pod/console-55b8756475-lz989 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.106:8443/health\": dial tcp 10.128.0.106:8443: connect: connection refused" start-of-body= Dec 03 22:12:07.141182 master-0 kubenswrapper[36504]: I1203 22:12:07.141138 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-55b8756475-lz989" podUID="a15b7b7c-48b5-417b-8f41-58306b2f0e9a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.106:8443/health\": dial tcp 10.128.0.106:8443: connect: connection refused" Dec 03 22:12:12.095861 master-0 kubenswrapper[36504]: I1203 22:12:12.095665 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:12:12.661122 master-0 kubenswrapper[36504]: I1203 22:12:12.661031 36504 patch_prober.go:28] interesting pod/console-5588d9469d-jxz4w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Dec 03 22:12:12.661442 master-0 kubenswrapper[36504]: I1203 22:12:12.661117 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5588d9469d-jxz4w" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Dec 03 22:12:13.561989 master-0 kubenswrapper[36504]: I1203 22:12:13.561871 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6f5db8559b-xzb5p" Dec 03 22:12:17.139106 master-0 kubenswrapper[36504]: I1203 22:12:17.139035 36504 patch_prober.go:28] interesting pod/console-55b8756475-lz989 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.106:8443/health\": dial tcp 10.128.0.106:8443: connect: connection refused" start-of-body= Dec 03 22:12:17.139628 master-0 kubenswrapper[36504]: I1203 22:12:17.139141 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-55b8756475-lz989" podUID="a15b7b7c-48b5-417b-8f41-58306b2f0e9a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.106:8443/health\": dial tcp 10.128.0.106:8443: connect: connection refused" Dec 03 22:12:21.753669 master-0 kubenswrapper[36504]: I1203 22:12:21.753507 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-567f65dc5d-rbkjf" podUID="01c52731-571d-4eaf-9282-6d662b25102e" containerName="console" containerID="cri-o://37738bf3ff90188b706f6cd5519505fb6ffa14fcd942677e11d536507841aed9" gracePeriod=15 Dec 03 22:12:22.663336 
master-0 kubenswrapper[36504]: I1203 22:12:22.663215 36504 patch_prober.go:28] interesting pod/console-5588d9469d-jxz4w container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" start-of-body= Dec 03 22:12:22.663336 master-0 kubenswrapper[36504]: I1203 22:12:22.663330 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-5588d9469d-jxz4w" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" containerName="console" probeResult="failure" output="Get \"https://10.128.0.104:8443/health\": dial tcp 10.128.0.104:8443: connect: connection refused" Dec 03 22:12:22.700530 master-0 kubenswrapper[36504]: I1203 22:12:22.700443 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-567f65dc5d-rbkjf_01c52731-571d-4eaf-9282-6d662b25102e/console/0.log" Dec 03 22:12:22.700872 master-0 kubenswrapper[36504]: I1203 22:12:22.700557 36504 generic.go:334] "Generic (PLEG): container finished" podID="01c52731-571d-4eaf-9282-6d662b25102e" containerID="37738bf3ff90188b706f6cd5519505fb6ffa14fcd942677e11d536507841aed9" exitCode=2 Dec 03 22:12:22.700872 master-0 kubenswrapper[36504]: I1203 22:12:22.700622 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567f65dc5d-rbkjf" event={"ID":"01c52731-571d-4eaf-9282-6d662b25102e","Type":"ContainerDied","Data":"37738bf3ff90188b706f6cd5519505fb6ffa14fcd942677e11d536507841aed9"} Dec 03 22:12:23.622182 master-0 kubenswrapper[36504]: I1203 22:12:23.622147 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-567f65dc5d-rbkjf_01c52731-571d-4eaf-9282-6d662b25102e/console/0.log" Dec 03 22:12:23.622799 master-0 kubenswrapper[36504]: I1203 22:12:23.622758 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:12:23.710690 master-0 kubenswrapper[36504]: I1203 22:12:23.710617 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-567f65dc5d-rbkjf_01c52731-571d-4eaf-9282-6d662b25102e/console/0.log" Dec 03 22:12:23.711049 master-0 kubenswrapper[36504]: I1203 22:12:23.710699 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-567f65dc5d-rbkjf" event={"ID":"01c52731-571d-4eaf-9282-6d662b25102e","Type":"ContainerDied","Data":"5b2c868b07667944d0bf06d1b1d51cfe486835ba0bf37538466727b63e6ed24e"} Dec 03 22:12:23.711049 master-0 kubenswrapper[36504]: I1203 22:12:23.710753 36504 scope.go:117] "RemoveContainer" containerID="37738bf3ff90188b706f6cd5519505fb6ffa14fcd942677e11d536507841aed9" Dec 03 22:12:23.711049 master-0 kubenswrapper[36504]: I1203 22:12:23.710958 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-567f65dc5d-rbkjf" Dec 03 22:12:23.731536 master-0 kubenswrapper[36504]: I1203 22:12:23.731470 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxzwx\" (UniqueName: \"kubernetes.io/projected/01c52731-571d-4eaf-9282-6d662b25102e-kube-api-access-qxzwx\") pod \"01c52731-571d-4eaf-9282-6d662b25102e\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " Dec 03 22:12:23.731693 master-0 kubenswrapper[36504]: I1203 22:12:23.731542 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-oauth-config\") pod \"01c52731-571d-4eaf-9282-6d662b25102e\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " Dec 03 22:12:23.731693 master-0 kubenswrapper[36504]: I1203 22:12:23.731671 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-console-config\") pod \"01c52731-571d-4eaf-9282-6d662b25102e\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " Dec 03 22:12:23.731809 master-0 kubenswrapper[36504]: I1203 22:12:23.731743 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-serving-cert\") pod \"01c52731-571d-4eaf-9282-6d662b25102e\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " Dec 03 22:12:23.731870 master-0 kubenswrapper[36504]: I1203 22:12:23.731840 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-oauth-serving-cert\") pod \"01c52731-571d-4eaf-9282-6d662b25102e\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " Dec 03 22:12:23.731983 master-0 kubenswrapper[36504]: I1203 22:12:23.731950 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-service-ca\") pod \"01c52731-571d-4eaf-9282-6d662b25102e\" (UID: \"01c52731-571d-4eaf-9282-6d662b25102e\") " Dec 03 22:12:23.732800 master-0 kubenswrapper[36504]: I1203 22:12:23.732698 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "01c52731-571d-4eaf-9282-6d662b25102e" (UID: "01c52731-571d-4eaf-9282-6d662b25102e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:23.732918 master-0 kubenswrapper[36504]: I1203 22:12:23.732834 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-service-ca" (OuterVolumeSpecName: "service-ca") pod "01c52731-571d-4eaf-9282-6d662b25102e" (UID: "01c52731-571d-4eaf-9282-6d662b25102e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:23.732918 master-0 kubenswrapper[36504]: I1203 22:12:23.732846 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-console-config" (OuterVolumeSpecName: "console-config") pod "01c52731-571d-4eaf-9282-6d662b25102e" (UID: "01c52731-571d-4eaf-9282-6d662b25102e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:23.736416 master-0 kubenswrapper[36504]: I1203 22:12:23.736362 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c52731-571d-4eaf-9282-6d662b25102e-kube-api-access-qxzwx" (OuterVolumeSpecName: "kube-api-access-qxzwx") pod "01c52731-571d-4eaf-9282-6d662b25102e" (UID: "01c52731-571d-4eaf-9282-6d662b25102e"). InnerVolumeSpecName "kube-api-access-qxzwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:23.737058 master-0 kubenswrapper[36504]: I1203 22:12:23.736939 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "01c52731-571d-4eaf-9282-6d662b25102e" (UID: "01c52731-571d-4eaf-9282-6d662b25102e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:23.737441 master-0 kubenswrapper[36504]: I1203 22:12:23.737238 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "01c52731-571d-4eaf-9282-6d662b25102e" (UID: "01c52731-571d-4eaf-9282-6d662b25102e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:23.835109 master-0 kubenswrapper[36504]: I1203 22:12:23.834821 36504 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:23.835109 master-0 kubenswrapper[36504]: I1203 22:12:23.834894 36504 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:23.835109 master-0 kubenswrapper[36504]: I1203 22:12:23.834909 36504 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:23.835109 master-0 kubenswrapper[36504]: I1203 22:12:23.834921 36504 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01c52731-571d-4eaf-9282-6d662b25102e-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:23.835109 master-0 kubenswrapper[36504]: I1203 22:12:23.834934 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxzwx\" (UniqueName: \"kubernetes.io/projected/01c52731-571d-4eaf-9282-6d662b25102e-kube-api-access-qxzwx\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:23.835109 master-0 kubenswrapper[36504]: I1203 22:12:23.834948 36504 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01c52731-571d-4eaf-9282-6d662b25102e-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:25.396283 master-0 kubenswrapper[36504]: I1203 22:12:25.396205 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5588d9469d-jxz4w"] Dec 03 22:12:25.443611 master-0 kubenswrapper[36504]: I1203 22:12:25.443526 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-567f65dc5d-rbkjf"] Dec 03 22:12:25.463934 master-0 kubenswrapper[36504]: I1203 22:12:25.461569 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-567f65dc5d-rbkjf"] Dec 03 22:12:25.478805 master-0 kubenswrapper[36504]: I1203 22:12:25.473304 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c4b7b5d77-2z5zb"] Dec 03 22:12:25.478805 master-0 kubenswrapper[36504]: E1203 22:12:25.473631 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01c52731-571d-4eaf-9282-6d662b25102e" containerName="console" Dec 03 22:12:25.478805 master-0 kubenswrapper[36504]: I1203 22:12:25.473647 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c52731-571d-4eaf-9282-6d662b25102e" containerName="console" Dec 03 22:12:25.478805 master-0 kubenswrapper[36504]: I1203 22:12:25.473834 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="01c52731-571d-4eaf-9282-6d662b25102e" containerName="console" Dec 03 22:12:25.478805 master-0 kubenswrapper[36504]: I1203 22:12:25.474310 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.501026 master-0 kubenswrapper[36504]: I1203 22:12:25.500406 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4b7b5d77-2z5zb"] Dec 03 22:12:25.563965 master-0 kubenswrapper[36504]: I1203 22:12:25.563882 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-trusted-ca-bundle\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.564595 master-0 kubenswrapper[36504]: I1203 22:12:25.564036 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv2rr\" (UniqueName: \"kubernetes.io/projected/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-kube-api-access-vv2rr\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.564595 master-0 kubenswrapper[36504]: I1203 22:12:25.564141 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-service-ca\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.564595 master-0 kubenswrapper[36504]: I1203 22:12:25.564255 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-oauth-config\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.564595 master-0 kubenswrapper[36504]: I1203 22:12:25.564310 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-serving-cert\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.564595 master-0 kubenswrapper[36504]: I1203 22:12:25.564356 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-oauth-serving-cert\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.564595 master-0 kubenswrapper[36504]: I1203 22:12:25.564391 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-config\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.665902 master-0 kubenswrapper[36504]: I1203 22:12:25.665745 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-oauth-config\") pod 
\"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.665902 master-0 kubenswrapper[36504]: I1203 22:12:25.665872 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-serving-cert\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.666150 master-0 kubenswrapper[36504]: I1203 22:12:25.665925 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-oauth-serving-cert\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.666150 master-0 kubenswrapper[36504]: I1203 22:12:25.665974 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-config\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.666150 master-0 kubenswrapper[36504]: I1203 22:12:25.666044 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-trusted-ca-bundle\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.666150 master-0 kubenswrapper[36504]: I1203 22:12:25.666086 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv2rr\" (UniqueName: \"kubernetes.io/projected/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-kube-api-access-vv2rr\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.666150 master-0 kubenswrapper[36504]: I1203 22:12:25.666148 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-service-ca\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.667537 master-0 kubenswrapper[36504]: I1203 22:12:25.667003 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-config\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.667537 master-0 kubenswrapper[36504]: I1203 22:12:25.667505 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-trusted-ca-bundle\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.667841 master-0 kubenswrapper[36504]: I1203 22:12:25.667806 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-oauth-serving-cert\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.669505 master-0 kubenswrapper[36504]: I1203 22:12:25.669477 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-oauth-config\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.669562 master-0 kubenswrapper[36504]: I1203 22:12:25.669512 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-serving-cert\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.671267 master-0 kubenswrapper[36504]: I1203 22:12:25.671213 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-service-ca\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.680176 master-0 kubenswrapper[36504]: I1203 22:12:25.680138 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv2rr\" (UniqueName: \"kubernetes.io/projected/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-kube-api-access-vv2rr\") pod \"console-6c4b7b5d77-2z5zb\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:25.828039 master-0 kubenswrapper[36504]: I1203 22:12:25.827987 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:26.249548 master-0 kubenswrapper[36504]: I1203 22:12:26.249474 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4b7b5d77-2z5zb"] Dec 03 22:12:26.548618 master-0 kubenswrapper[36504]: I1203 22:12:26.548423 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 22:12:26.549454 master-0 kubenswrapper[36504]: I1203 22:12:26.548804 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="alertmanager" containerID="cri-o://3991a80b948e18fbc5646eb1553734287ea7281af63687505e0275404e8bafe9" gracePeriod=120 Dec 03 22:12:26.549454 master-0 kubenswrapper[36504]: I1203 22:12:26.548814 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy-metric" containerID="cri-o://3de9ee4eef380bab89092626bf758dffa1d7cea9d48eda9514ff804a4488d430" gracePeriod=120 Dec 03 22:12:26.549454 master-0 kubenswrapper[36504]: I1203 22:12:26.548845 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy" containerID="cri-o://68a7927c51918fe5c4fb8a44357b8b8424076604ed97acaf1ce5cef2372ca6b9" gracePeriod=120 Dec 03 22:12:26.549454 master-0 kubenswrapper[36504]: I1203 22:12:26.548827 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy-web" containerID="cri-o://26ec3b66599dba0723e2e891ff101c0c8c05e81fa006226174021a06e7c6f7c8" gracePeriod=120 Dec 03 22:12:26.549454 master-0 kubenswrapper[36504]: I1203 22:12:26.548938 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="prom-label-proxy" containerID="cri-o://56a2cd13eb7b79c9802629a342435d6c6c35687e7a5e6dc330c18e9a98072842" gracePeriod=120 Dec 03 22:12:26.549454 master-0 kubenswrapper[36504]: I1203 22:12:26.548948 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="config-reloader" containerID="cri-o://bc778f4e9e81307fa7f9857972a109cf59925855518b80afb692131b95f39008" gracePeriod=120 Dec 03 22:12:26.736492 master-0 kubenswrapper[36504]: I1203 22:12:26.736427 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4b7b5d77-2z5zb" event={"ID":"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0","Type":"ContainerStarted","Data":"be290e513743319bbbcc4bff45a64a81baa694de12cf0da73461a9cbd2d70f83"} Dec 03 22:12:26.736492 master-0 kubenswrapper[36504]: I1203 22:12:26.736481 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4b7b5d77-2z5zb" event={"ID":"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0","Type":"ContainerStarted","Data":"e3c3b0a64a288f9a6eb4fb457b73d8e090e6fd73caf9aac80031c9cdbdd68ba1"} Dec 03 22:12:26.742479 master-0 kubenswrapper[36504]: I1203 22:12:26.742427 36504 generic.go:334] "Generic (PLEG): container finished" 
podID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerID="56a2cd13eb7b79c9802629a342435d6c6c35687e7a5e6dc330c18e9a98072842" exitCode=0 Dec 03 22:12:26.742479 master-0 kubenswrapper[36504]: I1203 22:12:26.742453 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerDied","Data":"56a2cd13eb7b79c9802629a342435d6c6c35687e7a5e6dc330c18e9a98072842"} Dec 03 22:12:26.742479 master-0 kubenswrapper[36504]: I1203 22:12:26.742490 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerDied","Data":"3de9ee4eef380bab89092626bf758dffa1d7cea9d48eda9514ff804a4488d430"} Dec 03 22:12:26.742729 master-0 kubenswrapper[36504]: I1203 22:12:26.742465 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerID="3de9ee4eef380bab89092626bf758dffa1d7cea9d48eda9514ff804a4488d430" exitCode=0 Dec 03 22:12:26.742729 master-0 kubenswrapper[36504]: I1203 22:12:26.742514 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerID="68a7927c51918fe5c4fb8a44357b8b8424076604ed97acaf1ce5cef2372ca6b9" exitCode=0 Dec 03 22:12:26.742729 master-0 kubenswrapper[36504]: I1203 22:12:26.742528 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerID="26ec3b66599dba0723e2e891ff101c0c8c05e81fa006226174021a06e7c6f7c8" exitCode=0 Dec 03 22:12:26.742729 master-0 kubenswrapper[36504]: I1203 22:12:26.742535 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerID="bc778f4e9e81307fa7f9857972a109cf59925855518b80afb692131b95f39008" exitCode=0 Dec 03 22:12:26.742729 master-0 kubenswrapper[36504]: I1203 22:12:26.742542 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerID="3991a80b948e18fbc5646eb1553734287ea7281af63687505e0275404e8bafe9" exitCode=0 Dec 03 22:12:26.742729 master-0 kubenswrapper[36504]: I1203 22:12:26.742557 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerDied","Data":"68a7927c51918fe5c4fb8a44357b8b8424076604ed97acaf1ce5cef2372ca6b9"} Dec 03 22:12:26.742729 master-0 kubenswrapper[36504]: I1203 22:12:26.742568 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerDied","Data":"26ec3b66599dba0723e2e891ff101c0c8c05e81fa006226174021a06e7c6f7c8"} Dec 03 22:12:26.742729 master-0 kubenswrapper[36504]: I1203 22:12:26.742577 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerDied","Data":"bc778f4e9e81307fa7f9857972a109cf59925855518b80afb692131b95f39008"} Dec 03 22:12:26.742729 master-0 kubenswrapper[36504]: I1203 22:12:26.742590 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerDied","Data":"3991a80b948e18fbc5646eb1553734287ea7281af63687505e0275404e8bafe9"} Dec 03 22:12:26.764818 master-0 kubenswrapper[36504]: I1203 22:12:26.764719 36504 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c4b7b5d77-2z5zb" podStartSLOduration=1.7646999349999999 podStartE2EDuration="1.764699935s" podCreationTimestamp="2025-12-03 22:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:12:26.763211889 +0000 UTC m=+111.982983906" watchObservedRunningTime="2025-12-03 22:12:26.764699935 +0000 UTC m=+111.984471942" Dec 03 22:12:27.045749 master-0 kubenswrapper[36504]: I1203 22:12:27.045673 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:27.108496 master-0 kubenswrapper[36504]: I1203 22:12:27.107013 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01c52731-571d-4eaf-9282-6d662b25102e" path="/var/lib/kubelet/pods/01c52731-571d-4eaf-9282-6d662b25102e/volumes" Dec 03 22:12:27.141595 master-0 kubenswrapper[36504]: I1203 22:12:27.141500 36504 patch_prober.go:28] interesting pod/console-55b8756475-lz989 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.106:8443/health\": dial tcp 10.128.0.106:8443: connect: connection refused" start-of-body= Dec 03 22:12:27.142013 master-0 kubenswrapper[36504]: I1203 22:12:27.141607 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-55b8756475-lz989" podUID="a15b7b7c-48b5-417b-8f41-58306b2f0e9a" containerName="console" probeResult="failure" output="Get \"https://10.128.0.106:8443/health\": dial tcp 10.128.0.106:8443: connect: connection refused" Dec 03 22:12:27.193530 master-0 kubenswrapper[36504]: I1203 22:12:27.193464 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-out\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193530 master-0 kubenswrapper[36504]: I1203 22:12:27.193535 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193874 master-0 kubenswrapper[36504]: I1203 22:12:27.193574 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-web\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193874 master-0 kubenswrapper[36504]: I1203 22:12:27.193620 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-metrics-client-ca\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193874 master-0 kubenswrapper[36504]: I1203 22:12:27.193643 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-volume\") pod 
\"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193874 master-0 kubenswrapper[36504]: I1203 22:12:27.193673 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-main-db\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193874 master-0 kubenswrapper[36504]: I1203 22:12:27.193716 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-trusted-ca-bundle\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193874 master-0 kubenswrapper[36504]: I1203 22:12:27.193748 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-web-config\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193874 master-0 kubenswrapper[36504]: I1203 22:12:27.193785 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-main-tls\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193874 master-0 kubenswrapper[36504]: I1203 22:12:27.193840 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.193874 master-0 kubenswrapper[36504]: I1203 22:12:27.193867 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-tls-assets\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.194695 master-0 kubenswrapper[36504]: I1203 22:12:27.193888 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddspb\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-kube-api-access-ddspb\") pod \"ef63e073-73a7-4ac5-9077-c6a8491440a8\" (UID: \"ef63e073-73a7-4ac5-9077-c6a8491440a8\") " Dec 03 22:12:27.194695 master-0 kubenswrapper[36504]: I1203 22:12:27.194253 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:27.194993 master-0 kubenswrapper[36504]: I1203 22:12:27.194941 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:27.195150 master-0 kubenswrapper[36504]: I1203 22:12:27.195094 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:12:27.197482 master-0 kubenswrapper[36504]: I1203 22:12:27.197429 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-kube-api-access-ddspb" (OuterVolumeSpecName: "kube-api-access-ddspb") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "kube-api-access-ddspb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:27.197874 master-0 kubenswrapper[36504]: I1203 22:12:27.197797 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:27.198365 master-0 kubenswrapper[36504]: I1203 22:12:27.198324 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:27.198365 master-0 kubenswrapper[36504]: I1203 22:12:27.198345 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:27.198579 master-0 kubenswrapper[36504]: I1203 22:12:27.198536 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:27.198653 master-0 kubenswrapper[36504]: I1203 22:12:27.198617 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-volume" (OuterVolumeSpecName: "config-volume") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:27.198786 master-0 kubenswrapper[36504]: I1203 22:12:27.198720 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-out" (OuterVolumeSpecName: "config-out") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:12:27.200391 master-0 kubenswrapper[36504]: I1203 22:12:27.200352 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:27.248111 master-0 kubenswrapper[36504]: I1203 22:12:27.248018 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-web-config" (OuterVolumeSpecName: "web-config") pod "ef63e073-73a7-4ac5-9077-c6a8491440a8" (UID: "ef63e073-73a7-4ac5-9077-c6a8491440a8"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:27.296652 master-0 kubenswrapper[36504]: I1203 22:12:27.296557 36504 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296652 master-0 kubenswrapper[36504]: I1203 22:12:27.296630 36504 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-web-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296652 master-0 kubenswrapper[36504]: I1203 22:12:27.296655 36504 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296999 master-0 kubenswrapper[36504]: I1203 22:12:27.296676 36504 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296999 master-0 kubenswrapper[36504]: I1203 22:12:27.296696 36504 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-tls-assets\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296999 master-0 kubenswrapper[36504]: I1203 22:12:27.296714 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddspb\" (UniqueName: \"kubernetes.io/projected/ef63e073-73a7-4ac5-9077-c6a8491440a8-kube-api-access-ddspb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296999 master-0 kubenswrapper[36504]: I1203 22:12:27.296733 36504 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-out\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296999 master-0 kubenswrapper[36504]: I1203 22:12:27.296751 36504 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296999 master-0 kubenswrapper[36504]: I1203 22:12:27.296793 36504 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296999 master-0 kubenswrapper[36504]: I1203 22:12:27.296812 36504 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ef63e073-73a7-4ac5-9077-c6a8491440a8-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296999 master-0 kubenswrapper[36504]: I1203 22:12:27.296830 36504 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ef63e073-73a7-4ac5-9077-c6a8491440a8-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.296999 master-0 kubenswrapper[36504]: I1203 22:12:27.296848 36504 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" 
(UniqueName: \"kubernetes.io/empty-dir/ef63e073-73a7-4ac5-9077-c6a8491440a8-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:27.758086 master-0 kubenswrapper[36504]: I1203 22:12:27.758029 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ef63e073-73a7-4ac5-9077-c6a8491440a8","Type":"ContainerDied","Data":"547a7ab480497e319d1d714117c8bc3e504a86f60e3b65f35661301426122e98"} Dec 03 22:12:27.758855 master-0 kubenswrapper[36504]: I1203 22:12:27.758119 36504 scope.go:117] "RemoveContainer" containerID="56a2cd13eb7b79c9802629a342435d6c6c35687e7a5e6dc330c18e9a98072842" Dec 03 22:12:27.759062 master-0 kubenswrapper[36504]: I1203 22:12:27.759034 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:27.780302 master-0 kubenswrapper[36504]: I1203 22:12:27.780265 36504 scope.go:117] "RemoveContainer" containerID="3de9ee4eef380bab89092626bf758dffa1d7cea9d48eda9514ff804a4488d430" Dec 03 22:12:27.814055 master-0 kubenswrapper[36504]: I1203 22:12:27.813673 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 22:12:27.817633 master-0 kubenswrapper[36504]: I1203 22:12:27.817560 36504 scope.go:117] "RemoveContainer" containerID="68a7927c51918fe5c4fb8a44357b8b8424076604ed97acaf1ce5cef2372ca6b9" Dec 03 22:12:27.821593 master-0 kubenswrapper[36504]: I1203 22:12:27.821498 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 22:12:27.840278 master-0 kubenswrapper[36504]: I1203 22:12:27.840228 36504 scope.go:117] "RemoveContainer" containerID="26ec3b66599dba0723e2e891ff101c0c8c05e81fa006226174021a06e7c6f7c8" Dec 03 22:12:27.850590 master-0 kubenswrapper[36504]: I1203 22:12:27.850551 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 22:12:27.851065 master-0 kubenswrapper[36504]: E1203 22:12:27.851049 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="init-config-reloader" Dec 03 22:12:27.851165 master-0 kubenswrapper[36504]: I1203 22:12:27.851155 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="init-config-reloader" Dec 03 22:12:27.851232 master-0 kubenswrapper[36504]: E1203 22:12:27.851223 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy-metric" Dec 03 22:12:27.851287 master-0 kubenswrapper[36504]: I1203 22:12:27.851276 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy-metric" Dec 03 22:12:27.851448 master-0 kubenswrapper[36504]: E1203 22:12:27.851436 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy" Dec 03 22:12:27.851507 master-0 kubenswrapper[36504]: I1203 22:12:27.851497 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy" Dec 03 22:12:27.851577 master-0 kubenswrapper[36504]: E1203 22:12:27.851567 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy-web" Dec 03 22:12:27.851636 master-0 
kubenswrapper[36504]: I1203 22:12:27.851627 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy-web" Dec 03 22:12:27.851698 master-0 kubenswrapper[36504]: E1203 22:12:27.851688 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="alertmanager" Dec 03 22:12:27.851748 master-0 kubenswrapper[36504]: I1203 22:12:27.851740 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="alertmanager" Dec 03 22:12:27.851836 master-0 kubenswrapper[36504]: E1203 22:12:27.851825 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="config-reloader" Dec 03 22:12:27.851897 master-0 kubenswrapper[36504]: I1203 22:12:27.851888 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="config-reloader" Dec 03 22:12:27.851961 master-0 kubenswrapper[36504]: E1203 22:12:27.851951 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="prom-label-proxy" Dec 03 22:12:27.852024 master-0 kubenswrapper[36504]: I1203 22:12:27.852015 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="prom-label-proxy" Dec 03 22:12:27.852216 master-0 kubenswrapper[36504]: I1203 22:12:27.852202 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy" Dec 03 22:12:27.852287 master-0 kubenswrapper[36504]: I1203 22:12:27.852277 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="config-reloader" Dec 03 22:12:27.852361 master-0 kubenswrapper[36504]: I1203 22:12:27.852352 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="alertmanager" Dec 03 22:12:27.852431 master-0 kubenswrapper[36504]: I1203 22:12:27.852422 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="prom-label-proxy" Dec 03 22:12:27.852489 master-0 kubenswrapper[36504]: I1203 22:12:27.852480 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy-web" Dec 03 22:12:27.852556 master-0 kubenswrapper[36504]: I1203 22:12:27.852547 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" containerName="kube-rbac-proxy-metric" Dec 03 22:12:27.858081 master-0 kubenswrapper[36504]: I1203 22:12:27.858048 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:27.863241 master-0 kubenswrapper[36504]: I1203 22:12:27.861697 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 03 22:12:27.863241 master-0 kubenswrapper[36504]: I1203 22:12:27.861975 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-mjccd" Dec 03 22:12:27.863241 master-0 kubenswrapper[36504]: I1203 22:12:27.862389 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 03 22:12:27.863241 master-0 kubenswrapper[36504]: I1203 22:12:27.862530 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 03 22:12:27.863241 master-0 kubenswrapper[36504]: I1203 22:12:27.863153 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 03 22:12:27.864208 master-0 kubenswrapper[36504]: I1203 22:12:27.864096 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 03 22:12:27.864507 master-0 kubenswrapper[36504]: I1203 22:12:27.864393 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 03 22:12:27.865850 master-0 kubenswrapper[36504]: I1203 22:12:27.865098 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 03 22:12:27.875219 master-0 kubenswrapper[36504]: I1203 22:12:27.875183 36504 scope.go:117] "RemoveContainer" containerID="bc778f4e9e81307fa7f9857972a109cf59925855518b80afb692131b95f39008" Dec 03 22:12:27.877526 master-0 kubenswrapper[36504]: I1203 22:12:27.877489 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 03 22:12:27.884840 master-0 kubenswrapper[36504]: I1203 22:12:27.884788 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 22:12:27.891879 master-0 kubenswrapper[36504]: I1203 22:12:27.891133 36504 scope.go:117] "RemoveContainer" containerID="3991a80b948e18fbc5646eb1553734287ea7281af63687505e0275404e8bafe9" Dec 03 22:12:27.907949 master-0 kubenswrapper[36504]: I1203 22:12:27.907866 36504 scope.go:117] "RemoveContainer" containerID="293aa3834074cb9659d77f03111207113b1608183a914bd8c4318a151eaec038" Dec 03 22:12:28.009439 master-0 kubenswrapper[36504]: I1203 22:12:28.009251 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.009439 master-0 kubenswrapper[36504]: I1203 22:12:28.009372 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.009439 master-0 kubenswrapper[36504]: I1203 22:12:28.009422 
36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-config-volume\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.009754 master-0 kubenswrapper[36504]: I1203 22:12:28.009459 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.009754 master-0 kubenswrapper[36504]: I1203 22:12:28.009503 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.009754 master-0 kubenswrapper[36504]: I1203 22:12:28.009530 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-web-config\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.009754 master-0 kubenswrapper[36504]: I1203 22:12:28.009606 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-config-out\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.009754 master-0 kubenswrapper[36504]: I1203 22:12:28.009636 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.009973 master-0 kubenswrapper[36504]: I1203 22:12:28.009850 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.010582 master-0 kubenswrapper[36504]: I1203 22:12:28.010501 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.011348 master-0 kubenswrapper[36504]: I1203 22:12:28.011304 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2k78m\" (UniqueName: \"kubernetes.io/projected/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-kube-api-access-2k78m\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.011431 master-0 kubenswrapper[36504]: I1203 22:12:28.011360 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113075 master-0 kubenswrapper[36504]: I1203 22:12:28.112992 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113364 master-0 kubenswrapper[36504]: I1203 22:12:28.113088 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113364 master-0 kubenswrapper[36504]: I1203 22:12:28.113153 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-config-volume\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113364 master-0 kubenswrapper[36504]: I1203 22:12:28.113204 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113364 master-0 kubenswrapper[36504]: I1203 22:12:28.113270 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113364 master-0 kubenswrapper[36504]: I1203 22:12:28.113312 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-web-config\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113364 master-0 kubenswrapper[36504]: I1203 22:12:28.113354 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-config-out\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 
03 22:12:28.113554 master-0 kubenswrapper[36504]: I1203 22:12:28.113393 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113554 master-0 kubenswrapper[36504]: I1203 22:12:28.113452 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113554 master-0 kubenswrapper[36504]: I1203 22:12:28.113506 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113554 master-0 kubenswrapper[36504]: I1203 22:12:28.113540 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k78m\" (UniqueName: \"kubernetes.io/projected/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-kube-api-access-2k78m\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.113716 master-0 kubenswrapper[36504]: I1203 22:12:28.113575 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.115005 master-0 kubenswrapper[36504]: I1203 22:12:28.114409 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.115005 master-0 kubenswrapper[36504]: I1203 22:12:28.114668 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.115005 master-0 kubenswrapper[36504]: I1203 22:12:28.114938 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.117367 master-0 kubenswrapper[36504]: I1203 22:12:28.117335 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-web-config\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.119111 master-0 kubenswrapper[36504]: I1203 22:12:28.119062 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-config-out\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.119721 master-0 kubenswrapper[36504]: I1203 22:12:28.119672 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.121267 master-0 kubenswrapper[36504]: I1203 22:12:28.121227 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.121696 master-0 kubenswrapper[36504]: I1203 22:12:28.121628 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.121837 master-0 kubenswrapper[36504]: I1203 22:12:28.121752 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.122690 master-0 kubenswrapper[36504]: I1203 22:12:28.122637 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.124044 master-0 kubenswrapper[36504]: I1203 22:12:28.123984 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-config-volume\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.136993 master-0 kubenswrapper[36504]: I1203 22:12:28.136926 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k78m\" (UniqueName: \"kubernetes.io/projected/bdfca75d-cacc-4a25-83c9-a1fe7f2f140a-kube-api-access-2k78m\") pod \"alertmanager-main-0\" (UID: \"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.185112 master-0 kubenswrapper[36504]: I1203 22:12:28.185019 36504 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 22:12:28.658823 master-0 kubenswrapper[36504]: I1203 22:12:28.658738 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 22:12:28.667363 master-0 kubenswrapper[36504]: W1203 22:12:28.667286 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdfca75d_cacc_4a25_83c9_a1fe7f2f140a.slice/crio-c025844261c62a655bb9576d0f99225a699179a1fd6d5a4f975aa9d71a83aca1 WatchSource:0}: Error finding container c025844261c62a655bb9576d0f99225a699179a1fd6d5a4f975aa9d71a83aca1: Status 404 returned error can't find the container with id c025844261c62a655bb9576d0f99225a699179a1fd6d5a4f975aa9d71a83aca1 Dec 03 22:12:28.779304 master-0 kubenswrapper[36504]: I1203 22:12:28.779223 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a","Type":"ContainerStarted","Data":"c025844261c62a655bb9576d0f99225a699179a1fd6d5a4f975aa9d71a83aca1"} Dec 03 22:12:28.857102 master-0 kubenswrapper[36504]: I1203 22:12:28.853757 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-7c696657b7-9p9js"] Dec 03 22:12:28.857102 master-0 kubenswrapper[36504]: I1203 22:12:28.856684 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" Dec 03 22:12:28.860638 master-0 kubenswrapper[36504]: I1203 22:12:28.860220 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 22:12:28.860638 master-0 kubenswrapper[36504]: I1203 22:12:28.860141 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 22:12:28.866379 master-0 kubenswrapper[36504]: I1203 22:12:28.865261 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c696657b7-9p9js"] Dec 03 22:12:29.040465 master-0 kubenswrapper[36504]: I1203 22:12:29.040392 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ff72c202-1084-4ae4-94ed-b71e32105ac5-networking-console-plugin-cert\") pod \"networking-console-plugin-7c696657b7-9p9js\" (UID: \"ff72c202-1084-4ae4-94ed-b71e32105ac5\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" Dec 03 22:12:29.040690 master-0 kubenswrapper[36504]: I1203 22:12:29.040544 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ff72c202-1084-4ae4-94ed-b71e32105ac5-nginx-conf\") pod \"networking-console-plugin-7c696657b7-9p9js\" (UID: \"ff72c202-1084-4ae4-94ed-b71e32105ac5\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" Dec 03 22:12:29.104257 master-0 kubenswrapper[36504]: I1203 22:12:29.104202 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef63e073-73a7-4ac5-9077-c6a8491440a8" path="/var/lib/kubelet/pods/ef63e073-73a7-4ac5-9077-c6a8491440a8/volumes" Dec 03 22:12:29.142035 master-0 kubenswrapper[36504]: I1203 22:12:29.141923 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ff72c202-1084-4ae4-94ed-b71e32105ac5-networking-console-plugin-cert\") pod \"networking-console-plugin-7c696657b7-9p9js\" (UID: \"ff72c202-1084-4ae4-94ed-b71e32105ac5\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" Dec 03 22:12:29.142195 master-0 kubenswrapper[36504]: I1203 22:12:29.142046 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ff72c202-1084-4ae4-94ed-b71e32105ac5-nginx-conf\") pod \"networking-console-plugin-7c696657b7-9p9js\" (UID: \"ff72c202-1084-4ae4-94ed-b71e32105ac5\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" Dec 03 22:12:29.142342 master-0 kubenswrapper[36504]: E1203 22:12:29.142279 36504 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Dec 03 22:12:29.142420 master-0 kubenswrapper[36504]: E1203 22:12:29.142400 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff72c202-1084-4ae4-94ed-b71e32105ac5-networking-console-plugin-cert podName:ff72c202-1084-4ae4-94ed-b71e32105ac5 nodeName:}" failed. No retries permitted until 2025-12-03 22:12:29.642370445 +0000 UTC m=+114.862142482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ff72c202-1084-4ae4-94ed-b71e32105ac5-networking-console-plugin-cert") pod "networking-console-plugin-7c696657b7-9p9js" (UID: "ff72c202-1084-4ae4-94ed-b71e32105ac5") : secret "networking-console-plugin-cert" not found Dec 03 22:12:29.142892 master-0 kubenswrapper[36504]: I1203 22:12:29.142849 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ff72c202-1084-4ae4-94ed-b71e32105ac5-nginx-conf\") pod \"networking-console-plugin-7c696657b7-9p9js\" (UID: \"ff72c202-1084-4ae4-94ed-b71e32105ac5\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" Dec 03 22:12:29.650581 master-0 kubenswrapper[36504]: I1203 22:12:29.650493 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ff72c202-1084-4ae4-94ed-b71e32105ac5-networking-console-plugin-cert\") pod \"networking-console-plugin-7c696657b7-9p9js\" (UID: \"ff72c202-1084-4ae4-94ed-b71e32105ac5\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" Dec 03 22:12:29.656049 master-0 kubenswrapper[36504]: I1203 22:12:29.655856 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ff72c202-1084-4ae4-94ed-b71e32105ac5-networking-console-plugin-cert\") pod \"networking-console-plugin-7c696657b7-9p9js\" (UID: \"ff72c202-1084-4ae4-94ed-b71e32105ac5\") " pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" Dec 03 22:12:29.787195 master-0 kubenswrapper[36504]: I1203 22:12:29.787120 36504 generic.go:334] "Generic (PLEG): container finished" podID="bdfca75d-cacc-4a25-83c9-a1fe7f2f140a" containerID="1afa5d5a32d26acd08d6b8eb96f8678eb97a5a22356139c29f46f30cd24c0992" exitCode=0 Dec 03 22:12:29.787195 master-0 kubenswrapper[36504]: I1203 22:12:29.787182 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a","Type":"ContainerDied","Data":"1afa5d5a32d26acd08d6b8eb96f8678eb97a5a22356139c29f46f30cd24c0992"} Dec 03 22:12:29.793369 master-0 kubenswrapper[36504]: I1203 22:12:29.793299 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" Dec 03 22:12:29.891815 master-0 kubenswrapper[36504]: I1203 22:12:29.891181 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55b8756475-lz989"] Dec 03 22:12:29.963879 master-0 kubenswrapper[36504]: I1203 22:12:29.963781 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-66896f8657-9276x"] Dec 03 22:12:29.971982 master-0 kubenswrapper[36504]: I1203 22:12:29.964731 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:29.993181 master-0 kubenswrapper[36504]: I1203 22:12:29.993100 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66896f8657-9276x"] Dec 03 22:12:30.060649 master-0 kubenswrapper[36504]: I1203 22:12:30.060593 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-oauth-config\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.060803 master-0 kubenswrapper[36504]: I1203 22:12:30.060742 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-trusted-ca-bundle\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.060877 master-0 kubenswrapper[36504]: I1203 22:12:30.060825 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-oauth-serving-cert\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.060924 master-0 kubenswrapper[36504]: I1203 22:12:30.060888 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-serving-cert\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.060924 master-0 kubenswrapper[36504]: I1203 22:12:30.060915 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-service-ca\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.061002 master-0 kubenswrapper[36504]: I1203 22:12:30.060962 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkqvg\" (UniqueName: 
\"kubernetes.io/projected/311227bb-eeea-40dd-b57d-ca50551be7d3-kube-api-access-rkqvg\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.061184 master-0 kubenswrapper[36504]: I1203 22:12:30.061157 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-console-config\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.164069 master-0 kubenswrapper[36504]: I1203 22:12:30.163953 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-oauth-config\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.164069 master-0 kubenswrapper[36504]: I1203 22:12:30.164023 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-trusted-ca-bundle\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.164069 master-0 kubenswrapper[36504]: I1203 22:12:30.164049 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-oauth-serving-cert\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.164294 master-0 kubenswrapper[36504]: I1203 22:12:30.164076 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-serving-cert\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.164294 master-0 kubenswrapper[36504]: I1203 22:12:30.164101 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-service-ca\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.164294 master-0 kubenswrapper[36504]: I1203 22:12:30.164132 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkqvg\" (UniqueName: \"kubernetes.io/projected/311227bb-eeea-40dd-b57d-ca50551be7d3-kube-api-access-rkqvg\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.164294 master-0 kubenswrapper[36504]: I1203 22:12:30.164187 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-console-config\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.165313 master-0 kubenswrapper[36504]: 
I1203 22:12:30.165274 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-console-config\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.165855 master-0 kubenswrapper[36504]: I1203 22:12:30.165813 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-oauth-serving-cert\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.167410 master-0 kubenswrapper[36504]: I1203 22:12:30.167246 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-trusted-ca-bundle\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.169561 master-0 kubenswrapper[36504]: I1203 22:12:30.168443 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-service-ca\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.171056 master-0 kubenswrapper[36504]: I1203 22:12:30.171021 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-serving-cert\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.173465 master-0 kubenswrapper[36504]: I1203 22:12:30.173418 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-oauth-config\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.188309 master-0 kubenswrapper[36504]: I1203 22:12:30.188258 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkqvg\" (UniqueName: \"kubernetes.io/projected/311227bb-eeea-40dd-b57d-ca50551be7d3-kube-api-access-rkqvg\") pod \"console-66896f8657-9276x\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.282103 master-0 kubenswrapper[36504]: I1203 22:12:30.281790 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-7c696657b7-9p9js"] Dec 03 22:12:30.326652 master-0 kubenswrapper[36504]: I1203 22:12:30.326602 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:30.798939 master-0 kubenswrapper[36504]: I1203 22:12:30.798131 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" event={"ID":"ff72c202-1084-4ae4-94ed-b71e32105ac5","Type":"ContainerStarted","Data":"346e2a07ac1e5943ff959f50671e5f8ac2868d78a6d189034cb6f107c8e8af09"} Dec 03 22:12:30.801711 master-0 kubenswrapper[36504]: I1203 22:12:30.801686 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a","Type":"ContainerStarted","Data":"f3e4ee67fa3ad904cb6503644de20b409ccbfb7a101dcfcd4214d92cdb595c2b"} Dec 03 22:12:30.801818 master-0 kubenswrapper[36504]: I1203 22:12:30.801732 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a","Type":"ContainerStarted","Data":"e6c38a32a577999f65176f2cb03d9c62b556299c13520d92ea7ddf7c74a4e240"} Dec 03 22:12:30.801818 master-0 kubenswrapper[36504]: I1203 22:12:30.801798 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a","Type":"ContainerStarted","Data":"663faba2005a5d08d381c8ffde23e7f69f0d29d2a380bf8f413f8a59330ea3aa"} Dec 03 22:12:30.801818 master-0 kubenswrapper[36504]: I1203 22:12:30.801809 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a","Type":"ContainerStarted","Data":"7c944bdfac7b9d172c7c426157fd318fc5207fd243c9fcebb5c483d98dd47830"} Dec 03 22:12:31.089896 master-0 kubenswrapper[36504]: I1203 22:12:31.088663 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66896f8657-9276x"] Dec 03 22:12:31.169901 master-0 kubenswrapper[36504]: I1203 22:12:31.168450 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 22:12:31.169901 master-0 kubenswrapper[36504]: I1203 22:12:31.168885 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="prometheus" containerID="cri-o://d398d1215dba19fc46ec829344f96609f1847f0ad18f4133a39646bfb735fabd" gracePeriod=600 Dec 03 22:12:31.169901 master-0 kubenswrapper[36504]: I1203 22:12:31.168958 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy-thanos" containerID="cri-o://84f355f08fea0ece11dcec4bc171a070cc9b92e6f0db0c9ffcfecf32bd0c5675" gracePeriod=600 Dec 03 22:12:31.169901 master-0 kubenswrapper[36504]: I1203 22:12:31.169062 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="config-reloader" containerID="cri-o://ebedcfc581bb8d540725d068b2c227219da0049c7a1c3f30a5a0fa6d8ae719ff" gracePeriod=600 Dec 03 22:12:31.169901 master-0 kubenswrapper[36504]: I1203 22:12:31.169043 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy-web" 
containerID="cri-o://07b2946b0d38203c0e82d06430687c3d314865f4239cf760f29db65f0176187e" gracePeriod=600 Dec 03 22:12:31.169901 master-0 kubenswrapper[36504]: I1203 22:12:31.169054 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="thanos-sidecar" containerID="cri-o://f0eddaec065f93226a58ae33d3b8000bdd340ba29b6a91fbb8390d9217f4bb8d" gracePeriod=600 Dec 03 22:12:31.169901 master-0 kubenswrapper[36504]: I1203 22:12:31.169028 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy" containerID="cri-o://acc4a3ec94f5519f67f3c9eb71620d6662bb73370e9b401bf3c6e1cfd3e2e591" gracePeriod=600 Dec 03 22:12:31.813077 master-0 kubenswrapper[36504]: I1203 22:12:31.813029 36504 generic.go:334] "Generic (PLEG): container finished" podID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerID="84f355f08fea0ece11dcec4bc171a070cc9b92e6f0db0c9ffcfecf32bd0c5675" exitCode=0 Dec 03 22:12:31.813077 master-0 kubenswrapper[36504]: I1203 22:12:31.813069 36504 generic.go:334] "Generic (PLEG): container finished" podID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerID="07b2946b0d38203c0e82d06430687c3d314865f4239cf760f29db65f0176187e" exitCode=0 Dec 03 22:12:31.813077 master-0 kubenswrapper[36504]: I1203 22:12:31.813079 36504 generic.go:334] "Generic (PLEG): container finished" podID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerID="f0eddaec065f93226a58ae33d3b8000bdd340ba29b6a91fbb8390d9217f4bb8d" exitCode=0 Dec 03 22:12:31.813582 master-0 kubenswrapper[36504]: I1203 22:12:31.813090 36504 generic.go:334] "Generic (PLEG): container finished" podID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerID="ebedcfc581bb8d540725d068b2c227219da0049c7a1c3f30a5a0fa6d8ae719ff" exitCode=0 Dec 03 22:12:31.813582 master-0 kubenswrapper[36504]: I1203 22:12:31.813105 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerDied","Data":"84f355f08fea0ece11dcec4bc171a070cc9b92e6f0db0c9ffcfecf32bd0c5675"} Dec 03 22:12:31.813582 master-0 kubenswrapper[36504]: I1203 22:12:31.813154 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerDied","Data":"07b2946b0d38203c0e82d06430687c3d314865f4239cf760f29db65f0176187e"} Dec 03 22:12:31.813582 master-0 kubenswrapper[36504]: I1203 22:12:31.813171 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerDied","Data":"f0eddaec065f93226a58ae33d3b8000bdd340ba29b6a91fbb8390d9217f4bb8d"} Dec 03 22:12:31.813582 master-0 kubenswrapper[36504]: I1203 22:12:31.813183 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerDied","Data":"ebedcfc581bb8d540725d068b2c227219da0049c7a1c3f30a5a0fa6d8ae719ff"} Dec 03 22:12:31.816491 master-0 kubenswrapper[36504]: I1203 22:12:31.816461 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a","Type":"ContainerStarted","Data":"e87c5a4e4f56a4901f91c8016898956be53dc460d3c1f2932af7d894ae79ea6c"} Dec 03 22:12:31.816552 master-0 kubenswrapper[36504]: I1203 22:12:31.816493 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bdfca75d-cacc-4a25-83c9-a1fe7f2f140a","Type":"ContainerStarted","Data":"e6221d273879d27977f5c2001795095fe097e3aacbea87720411764790a15d01"} Dec 03 22:12:31.818087 master-0 kubenswrapper[36504]: I1203 22:12:31.818051 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66896f8657-9276x" event={"ID":"311227bb-eeea-40dd-b57d-ca50551be7d3","Type":"ContainerStarted","Data":"2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc"} Dec 03 22:12:31.818087 master-0 kubenswrapper[36504]: I1203 22:12:31.818078 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66896f8657-9276x" event={"ID":"311227bb-eeea-40dd-b57d-ca50551be7d3","Type":"ContainerStarted","Data":"cd3c4e79c48822b910d0a1710fddeecb42e1ffcb815d6472b9e2605a23c48de7"} Dec 03 22:12:31.873896 master-0 kubenswrapper[36504]: I1203 22:12:31.872727 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.872694789 podStartE2EDuration="4.872694789s" podCreationTimestamp="2025-12-03 22:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:12:31.844590284 +0000 UTC m=+117.064362311" watchObservedRunningTime="2025-12-03 22:12:31.872694789 +0000 UTC m=+117.092466796" Dec 03 22:12:31.875907 master-0 kubenswrapper[36504]: I1203 22:12:31.875839 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66896f8657-9276x" podStartSLOduration=2.875826567 podStartE2EDuration="2.875826567s" podCreationTimestamp="2025-12-03 22:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:12:31.870457109 +0000 UTC m=+117.090229116" watchObservedRunningTime="2025-12-03 22:12:31.875826567 +0000 UTC m=+117.095598574" Dec 03 22:12:32.834423 master-0 kubenswrapper[36504]: I1203 22:12:32.834345 36504 generic.go:334] "Generic (PLEG): container finished" podID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerID="acc4a3ec94f5519f67f3c9eb71620d6662bb73370e9b401bf3c6e1cfd3e2e591" exitCode=0 Dec 03 22:12:32.834423 master-0 kubenswrapper[36504]: I1203 22:12:32.834426 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerDied","Data":"acc4a3ec94f5519f67f3c9eb71620d6662bb73370e9b401bf3c6e1cfd3e2e591"} Dec 03 22:12:32.847195 master-0 kubenswrapper[36504]: I1203 22:12:32.847093 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" event={"ID":"ff72c202-1084-4ae4-94ed-b71e32105ac5","Type":"ContainerStarted","Data":"66d787f262fc5b14c99fb28c317fe0c12e494f80b0cf5412df7cfaf958398538"} Dec 03 22:12:32.874647 master-0 kubenswrapper[36504]: I1203 22:12:32.874527 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-7c696657b7-9p9js" podStartSLOduration=3.2642978830000002 
podStartE2EDuration="4.874495429s" podCreationTimestamp="2025-12-03 22:12:28 +0000 UTC" firstStartedPulling="2025-12-03 22:12:30.295328647 +0000 UTC m=+115.515100654" lastFinishedPulling="2025-12-03 22:12:31.905526193 +0000 UTC m=+117.125298200" observedRunningTime="2025-12-03 22:12:32.867294854 +0000 UTC m=+118.087066921" watchObservedRunningTime="2025-12-03 22:12:32.874495429 +0000 UTC m=+118.094267476" Dec 03 22:12:35.102541 master-0 kubenswrapper[36504]: I1203 22:12:35.102462 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:12:35.264295 master-0 kubenswrapper[36504]: E1203 22:12:35.264165 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d398d1215dba19fc46ec829344f96609f1847f0ad18f4133a39646bfb735fabd" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 22:12:35.266238 master-0 kubenswrapper[36504]: E1203 22:12:35.266141 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d398d1215dba19fc46ec829344f96609f1847f0ad18f4133a39646bfb735fabd" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 22:12:35.268223 master-0 kubenswrapper[36504]: E1203 22:12:35.268142 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d398d1215dba19fc46ec829344f96609f1847f0ad18f4133a39646bfb735fabd" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 22:12:35.268223 master-0 kubenswrapper[36504]: E1203 22:12:35.268213 36504 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="prometheus" Dec 03 22:12:35.686828 master-0 kubenswrapper[36504]: E1203 22:12:35.686682 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:12:35.828306 master-0 kubenswrapper[36504]: I1203 22:12:35.828233 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 
22:12:35.828598 master-0 kubenswrapper[36504]: I1203 22:12:35.828330 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:12:35.830932 master-0 kubenswrapper[36504]: I1203 22:12:35.830875 36504 patch_prober.go:28] interesting pod/console-6c4b7b5d77-2z5zb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Dec 03 22:12:35.831088 master-0 kubenswrapper[36504]: I1203 22:12:35.830968 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c4b7b5d77-2z5zb" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Dec 03 22:12:38.903862 master-0 kubenswrapper[36504]: I1203 22:12:38.903665 36504 generic.go:334] "Generic (PLEG): container finished" podID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerID="d398d1215dba19fc46ec829344f96609f1847f0ad18f4133a39646bfb735fabd" exitCode=0 Dec 03 22:12:38.903862 master-0 kubenswrapper[36504]: I1203 22:12:38.903712 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerDied","Data":"d398d1215dba19fc46ec829344f96609f1847f0ad18f4133a39646bfb735fabd"} Dec 03 22:12:39.103893 master-0 kubenswrapper[36504]: I1203 22:12:39.103369 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:12:39.223858 master-0 kubenswrapper[36504]: I1203 22:12:39.223704 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-web-config\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.223858 master-0 kubenswrapper[36504]: I1203 22:12:39.223763 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-kube-rbac-proxy\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.223858 master-0 kubenswrapper[36504]: I1203 22:12:39.223833 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-serving-certs-ca-bundle\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.223858 master-0 kubenswrapper[36504]: I1203 22:12:39.223855 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-thanos-prometheus-http-client-file\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.223894 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-tls\") pod 
\"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.223919 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-tls-assets\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.223945 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-trusted-ca-bundle\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.223970 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.223999 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-metrics-client-ca\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.224019 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-metrics-client-certs\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.224065 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-db\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.224089 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.224112 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-config\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.224137 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g7f4\" (UniqueName: \"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-kube-api-access-4g7f4\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: 
\"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.224159 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-grpc-tls\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.224182 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-kubelet-serving-ca-bundle\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.224199 master-0 kubenswrapper[36504]: I1203 22:12:39.224201 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-config-out\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.225023 master-0 kubenswrapper[36504]: I1203 22:12:39.224239 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-rulefiles-0\") pod \"6f9d9300-e389-4941-904a-51dbdcc14c71\" (UID: \"6f9d9300-e389-4941-904a-51dbdcc14c71\") " Dec 03 22:12:39.225596 master-0 kubenswrapper[36504]: I1203 22:12:39.225538 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:39.225733 master-0 kubenswrapper[36504]: I1203 22:12:39.225699 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:39.226631 master-0 kubenswrapper[36504]: I1203 22:12:39.226556 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:12:39.227712 master-0 kubenswrapper[36504]: I1203 22:12:39.227664 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:39.228175 master-0 kubenswrapper[36504]: I1203 22:12:39.228126 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-kube-api-access-4g7f4" (OuterVolumeSpecName: "kube-api-access-4g7f4") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "kube-api-access-4g7f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:39.228549 master-0 kubenswrapper[36504]: I1203 22:12:39.228500 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:39.228696 master-0 kubenswrapper[36504]: I1203 22:12:39.228547 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:39.228696 master-0 kubenswrapper[36504]: I1203 22:12:39.228611 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:39.228945 master-0 kubenswrapper[36504]: I1203 22:12:39.228746 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:39.228945 master-0 kubenswrapper[36504]: I1203 22:12:39.228756 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:39.229131 master-0 kubenswrapper[36504]: I1203 22:12:39.229020 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:39.229318 master-0 kubenswrapper[36504]: I1203 22:12:39.229246 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:39.229445 master-0 kubenswrapper[36504]: I1203 22:12:39.229318 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:39.231086 master-0 kubenswrapper[36504]: I1203 22:12:39.231042 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:39.231662 master-0 kubenswrapper[36504]: I1203 22:12:39.231609 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:39.232206 master-0 kubenswrapper[36504]: I1203 22:12:39.232150 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-config-out" (OuterVolumeSpecName: "config-out") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:12:39.232341 master-0 kubenswrapper[36504]: I1203 22:12:39.232306 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-config" (OuterVolumeSpecName: "config") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:39.285009 master-0 kubenswrapper[36504]: I1203 22:12:39.284951 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-web-config" (OuterVolumeSpecName: "web-config") pod "6f9d9300-e389-4941-904a-51dbdcc14c71" (UID: "6f9d9300-e389-4941-904a-51dbdcc14c71"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:39.325634 master-0 kubenswrapper[36504]: I1203 22:12:39.325571 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325634 master-0 kubenswrapper[36504]: I1203 22:12:39.325615 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g7f4\" (UniqueName: \"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-kube-api-access-4g7f4\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325634 master-0 kubenswrapper[36504]: I1203 22:12:39.325628 36504 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-grpc-tls\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325634 master-0 kubenswrapper[36504]: I1203 22:12:39.325638 36504 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325634 master-0 kubenswrapper[36504]: I1203 22:12:39.325647 36504 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-config-out\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325658 36504 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325668 36504 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-web-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325677 36504 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325688 36504 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325698 36504 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325707 36504 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325716 36504 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6f9d9300-e389-4941-904a-51dbdcc14c71-tls-assets\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325725 36504 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325734 36504 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325744 36504 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f9d9300-e389-4941-904a-51dbdcc14c71-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325754 36504 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325763 36504 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6f9d9300-e389-4941-904a-51dbdcc14c71-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.325972 master-0 kubenswrapper[36504]: I1203 22:12:39.325784 36504 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6f9d9300-e389-4941-904a-51dbdcc14c71-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:39.356833 master-0 kubenswrapper[36504]: I1203 22:12:39.356702 36504 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 22:12:39.357085 master-0 kubenswrapper[36504]: E1203 22:12:39.356856 36504 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Dec 03 22:12:39.357085 master-0 kubenswrapper[36504]: I1203 22:12:39.357041 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver" containerID="cri-o://0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b" gracePeriod=15 Dec 03 22:12:39.357335 master-0 kubenswrapper[36504]: I1203 22:12:39.357103 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-check-endpoints" containerID="cri-o://cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37" gracePeriod=15 Dec 03 22:12:39.357335 master-0 kubenswrapper[36504]: I1203 22:12:39.357171 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e" gracePeriod=15 Dec 03 22:12:39.357335 master-0 kubenswrapper[36504]: I1203 22:12:39.357220 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef" gracePeriod=15 Dec 03 22:12:39.357335 master-0 kubenswrapper[36504]: I1203 22:12:39.357265 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-cert-syncer" containerID="cri-o://55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35" gracePeriod=15 Dec 03 22:12:39.360857 master-0 kubenswrapper[36504]: I1203 22:12:39.360753 36504 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 22:12:39.361436 master-0 kubenswrapper[36504]: E1203 22:12:39.361393 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-insecure-readyz" Dec 03 22:12:39.361948 master-0 kubenswrapper[36504]: I1203 22:12:39.361437 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-insecure-readyz" Dec 03 22:12:39.361948 master-0 kubenswrapper[36504]: E1203 22:12:39.361468 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 22:12:39.361948 master-0 kubenswrapper[36504]: I1203 22:12:39.361487 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 22:12:39.361948 master-0 kubenswrapper[36504]: E1203 22:12:39.361852 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy" Dec 03 22:12:39.361948 master-0 kubenswrapper[36504]: I1203 22:12:39.361877 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy" Dec 03 22:12:39.361948 master-0 kubenswrapper[36504]: E1203 22:12:39.361918 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="init-config-reloader" Dec 03 22:12:39.361948 master-0 kubenswrapper[36504]: I1203 22:12:39.361936 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="init-config-reloader" Dec 03 22:12:39.361948 master-0 kubenswrapper[36504]: E1203 22:12:39.361957 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="setup" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: I1203 22:12:39.361972 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="setup" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: E1203 22:12:39.361995 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" 
containerName="thanos-sidecar" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: I1203 22:12:39.362011 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="thanos-sidecar" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: E1203 22:12:39.362030 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-cert-syncer" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: I1203 22:12:39.362047 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-cert-syncer" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: E1203 22:12:39.362078 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="prometheus" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: I1203 22:12:39.362093 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="prometheus" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: E1203 22:12:39.362123 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="config-reloader" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: I1203 22:12:39.362140 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="config-reloader" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: E1203 22:12:39.362169 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-check-endpoints" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: I1203 22:12:39.362186 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-check-endpoints" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: E1203 22:12:39.362208 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy-thanos" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: I1203 22:12:39.362223 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy-thanos" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: E1203 22:12:39.362265 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: I1203 22:12:39.362285 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: E1203 22:12:39.362327 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy-web" Dec 03 22:12:39.362589 master-0 kubenswrapper[36504]: I1203 22:12:39.362344 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy-web" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362667 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy-web" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362729 36504 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362746 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="config-reloader" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362810 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="thanos-sidecar" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362841 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362862 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-insecure-readyz" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362887 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362912 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="prometheus" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362937 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" containerName="kube-rbac-proxy-thanos" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362957 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-check-endpoints" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.362987 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="setup" Dec 03 22:12:39.364171 master-0 kubenswrapper[36504]: I1203 22:12:39.363010 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a00233b22d19df39b2e1c8ba133b3c2" containerName="kube-apiserver-cert-syncer" Dec 03 22:12:39.369930 master-0 kubenswrapper[36504]: I1203 22:12:39.367837 36504 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 22:12:39.371182 master-0 kubenswrapper[36504]: I1203 22:12:39.370361 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.387419 master-0 kubenswrapper[36504]: I1203 22:12:39.387038 36504 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="8a00233b22d19df39b2e1c8ba133b3c2" podUID="7035982c3bafe5c31d1c2515ed3c2b7b" Dec 03 22:12:39.427285 master-0 kubenswrapper[36504]: I1203 22:12:39.427202 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.427567 master-0 kubenswrapper[36504]: I1203 22:12:39.427302 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7035982c3bafe5c31d1c2515ed3c2b7b-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7035982c3bafe5c31d1c2515ed3c2b7b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:39.427567 master-0 kubenswrapper[36504]: I1203 22:12:39.427371 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.427567 master-0 kubenswrapper[36504]: I1203 22:12:39.427408 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.427567 master-0 kubenswrapper[36504]: I1203 22:12:39.427443 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7035982c3bafe5c31d1c2515ed3c2b7b-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7035982c3bafe5c31d1c2515ed3c2b7b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:39.427702 master-0 kubenswrapper[36504]: I1203 22:12:39.427574 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.427702 master-0 kubenswrapper[36504]: I1203 22:12:39.427676 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7035982c3bafe5c31d1c2515ed3c2b7b-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7035982c3bafe5c31d1c2515ed3c2b7b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:39.427783 master-0 kubenswrapper[36504]: I1203 22:12:39.427729 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529066 master-0 kubenswrapper[36504]: I1203 22:12:39.528975 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7035982c3bafe5c31d1c2515ed3c2b7b-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7035982c3bafe5c31d1c2515ed3c2b7b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:39.529215 master-0 kubenswrapper[36504]: I1203 22:12:39.529085 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529215 master-0 kubenswrapper[36504]: I1203 22:12:39.529115 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529215 master-0 kubenswrapper[36504]: I1203 22:12:39.529138 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7035982c3bafe5c31d1c2515ed3c2b7b-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7035982c3bafe5c31d1c2515ed3c2b7b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:39.529215 master-0 kubenswrapper[36504]: I1203 22:12:39.529085 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/7035982c3bafe5c31d1c2515ed3c2b7b-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"7035982c3bafe5c31d1c2515ed3c2b7b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:39.529215 master-0 kubenswrapper[36504]: I1203 22:12:39.529201 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529487 master-0 kubenswrapper[36504]: I1203 22:12:39.529225 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529487 master-0 kubenswrapper[36504]: I1203 22:12:39.529233 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529487 master-0 kubenswrapper[36504]: I1203 22:12:39.529255 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529487 master-0 kubenswrapper[36504]: I1203 22:12:39.529344 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/7035982c3bafe5c31d1c2515ed3c2b7b-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"7035982c3bafe5c31d1c2515ed3c2b7b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:39.529487 master-0 kubenswrapper[36504]: I1203 22:12:39.529402 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7035982c3bafe5c31d1c2515ed3c2b7b-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7035982c3bafe5c31d1c2515ed3c2b7b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:39.529487 master-0 kubenswrapper[36504]: I1203 22:12:39.529455 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529661 master-0 kubenswrapper[36504]: I1203 22:12:39.529500 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529661 master-0 kubenswrapper[36504]: I1203 22:12:39.529538 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7035982c3bafe5c31d1c2515ed3c2b7b-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"7035982c3bafe5c31d1c2515ed3c2b7b\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:39.529661 master-0 kubenswrapper[36504]: I1203 22:12:39.529592 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.529661 master-0 kubenswrapper[36504]: I1203 22:12:39.529576 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:39.927062 master-0 kubenswrapper[36504]: I1203 22:12:39.926978 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_8a00233b22d19df39b2e1c8ba133b3c2/kube-apiserver-cert-syncer/0.log" Dec 03 22:12:39.928676 master-0 kubenswrapper[36504]: I1203 22:12:39.928608 36504 generic.go:334] "Generic (PLEG): container finished" podID="8a00233b22d19df39b2e1c8ba133b3c2" containerID="cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37" exitCode=0 Dec 03 22:12:39.929075 master-0 kubenswrapper[36504]: I1203 22:12:39.928680 36504 generic.go:334] "Generic (PLEG): container finished" podID="8a00233b22d19df39b2e1c8ba133b3c2" containerID="c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e" exitCode=0 Dec 03 22:12:39.929075 master-0 kubenswrapper[36504]: I1203 22:12:39.928704 36504 generic.go:334] "Generic (PLEG): container finished" podID="8a00233b22d19df39b2e1c8ba133b3c2" containerID="cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef" exitCode=0 Dec 03 22:12:39.929075 master-0 kubenswrapper[36504]: I1203 22:12:39.928725 36504 generic.go:334] "Generic (PLEG): container finished" podID="8a00233b22d19df39b2e1c8ba133b3c2" containerID="55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35" exitCode=2 Dec 03 22:12:39.932377 master-0 kubenswrapper[36504]: I1203 22:12:39.932325 36504 generic.go:334] "Generic (PLEG): container finished" podID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" containerID="f475b284f57f52fc0c7eb702e9328890537fd74a9dd50bbac04f195c30c41055" exitCode=0 Dec 03 22:12:39.932563 master-0 kubenswrapper[36504]: I1203 22:12:39.932452 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"0e0ffab3-4f70-4bc8-9464-598c0668a4b8","Type":"ContainerDied","Data":"f475b284f57f52fc0c7eb702e9328890537fd74a9dd50bbac04f195c30c41055"} Dec 03 22:12:39.934630 master-0 kubenswrapper[36504]: I1203 22:12:39.934541 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:39.939980 master-0 kubenswrapper[36504]: I1203 22:12:39.939913 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f9d9300-e389-4941-904a-51dbdcc14c71","Type":"ContainerDied","Data":"ee5acb547e13c4549526d636347fac56d7c0cad90b7828fd68d9c5e75aaa558c"} Dec 03 22:12:39.940136 master-0 kubenswrapper[36504]: I1203 22:12:39.939992 36504 scope.go:117] "RemoveContainer" containerID="84f355f08fea0ece11dcec4bc171a070cc9b92e6f0db0c9ffcfecf32bd0c5675" Dec 03 22:12:39.940212 master-0 kubenswrapper[36504]: I1203 22:12:39.940179 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:12:39.943466 master-0 kubenswrapper[36504]: I1203 22:12:39.943386 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:39.945325 master-0 kubenswrapper[36504]: I1203 22:12:39.945232 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:39.969685 master-0 kubenswrapper[36504]: I1203 22:12:39.969609 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:39.970062 master-0 kubenswrapper[36504]: I1203 22:12:39.970021 36504 scope.go:117] "RemoveContainer" containerID="acc4a3ec94f5519f67f3c9eb71620d6662bb73370e9b401bf3c6e1cfd3e2e591" Dec 03 22:12:39.970617 master-0 kubenswrapper[36504]: I1203 22:12:39.970532 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:40.327637 master-0 kubenswrapper[36504]: I1203 22:12:40.327542 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:40.327637 master-0 kubenswrapper[36504]: I1203 22:12:40.327640 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-66896f8657-9276x" Dec 03 22:12:40.330727 master-0 kubenswrapper[36504]: I1203 22:12:40.330691 36504 patch_prober.go:28] interesting pod/console-66896f8657-9276x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" start-of-body= Dec 03 22:12:40.330727 master-0 kubenswrapper[36504]: I1203 22:12:40.330738 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66896f8657-9276x" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerName="console" probeResult="failure" output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" Dec 03 22:12:40.331935 master-0 kubenswrapper[36504]: E1203 22:12:40.331637 36504 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Dec 03 22:12:40.331935 master-0 kubenswrapper[36504]: &Event{ObjectMeta:{console-66896f8657-9276x.187dd436af22e1f1 openshift-console 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-66896f8657-9276x,UID:311227bb-eeea-40dd-b57d-ca50551be7d3,APIVersion:v1,ResourceVersion:17623,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.112:8443/health": dial tcp 10.128.0.112:8443: connect: connection refused Dec 03 22:12:40.331935 master-0 kubenswrapper[36504]: body: Dec 03 22:12:40.331935 master-0 kubenswrapper[36504]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 22:12:40.330723825 +0000 UTC m=+125.550495832,LastTimestamp:2025-12-03 22:12:40.330723825 +0000 UTC m=+125.550495832,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Dec 03 22:12:40.331935 master-0 kubenswrapper[36504]: > Dec 03 22:12:41.282258 master-0 kubenswrapper[36504]: I1203 22:12:41.281662 36504 scope.go:117] "RemoveContainer" containerID="07b2946b0d38203c0e82d06430687c3d314865f4239cf760f29db65f0176187e" Dec 03 22:12:41.305489 master-0 kubenswrapper[36504]: I1203 22:12:41.305397 36504 scope.go:117] "RemoveContainer" containerID="f0eddaec065f93226a58ae33d3b8000bdd340ba29b6a91fbb8390d9217f4bb8d" Dec 03 22:12:41.335371 master-0 kubenswrapper[36504]: I1203 22:12:41.335318 36504 scope.go:117] "RemoveContainer" containerID="ebedcfc581bb8d540725d068b2c227219da0049c7a1c3f30a5a0fa6d8ae719ff" Dec 03 22:12:41.359560 master-0 kubenswrapper[36504]: I1203 22:12:41.359508 36504 scope.go:117] "RemoveContainer" containerID="d398d1215dba19fc46ec829344f96609f1847f0ad18f4133a39646bfb735fabd" Dec 03 22:12:41.381573 master-0 kubenswrapper[36504]: I1203 22:12:41.381521 36504 scope.go:117] "RemoveContainer" containerID="d9cd3ea5d36d4b61da3ba9089ff5f3d844951b2df54a6ba1b43fba52d4e57250" Dec 03 22:12:41.626144 master-0 kubenswrapper[36504]: I1203 22:12:41.626086 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:12:41.627499 master-0 kubenswrapper[36504]: I1203 22:12:41.627444 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:41.628174 master-0 kubenswrapper[36504]: I1203 22:12:41.628123 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:41.766314 master-0 kubenswrapper[36504]: I1203 22:12:41.766256 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_8a00233b22d19df39b2e1c8ba133b3c2/kube-apiserver-cert-syncer/0.log" Dec 03 22:12:41.767288 master-0 kubenswrapper[36504]: I1203 22:12:41.767247 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:41.768857 master-0 kubenswrapper[36504]: I1203 22:12:41.768762 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:41.769728 master-0 kubenswrapper[36504]: I1203 22:12:41.769669 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:41.770490 master-0 kubenswrapper[36504]: I1203 22:12:41.770433 36504 status_manager.go:851] "Failed to get status for pod" podUID="8a00233b22d19df39b2e1c8ba133b3c2" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:41.778294 master-0 kubenswrapper[36504]: I1203 22:12:41.778247 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-var-lock\") pod \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " Dec 03 22:12:41.778418 master-0 kubenswrapper[36504]: I1203 22:12:41.778377 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kube-api-access\") pod \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " Dec 03 22:12:41.778464 master-0 kubenswrapper[36504]: I1203 22:12:41.778424 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-var-lock" (OuterVolumeSpecName: "var-lock") pod "0e0ffab3-4f70-4bc8-9464-598c0668a4b8" (UID: "0e0ffab3-4f70-4bc8-9464-598c0668a4b8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:12:41.778499 master-0 kubenswrapper[36504]: I1203 22:12:41.778436 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kubelet-dir\") pod \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\" (UID: \"0e0ffab3-4f70-4bc8-9464-598c0668a4b8\") " Dec 03 22:12:41.778532 master-0 kubenswrapper[36504]: I1203 22:12:41.778484 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0e0ffab3-4f70-4bc8-9464-598c0668a4b8" (UID: "0e0ffab3-4f70-4bc8-9464-598c0668a4b8"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:12:41.779128 master-0 kubenswrapper[36504]: I1203 22:12:41.779084 36504 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:41.779180 master-0 kubenswrapper[36504]: I1203 22:12:41.779130 36504 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:41.783073 master-0 kubenswrapper[36504]: I1203 22:12:41.783010 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0e0ffab3-4f70-4bc8-9464-598c0668a4b8" (UID: "0e0ffab3-4f70-4bc8-9464-598c0668a4b8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:41.879975 master-0 kubenswrapper[36504]: I1203 22:12:41.879891 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-resource-dir\") pod \"8a00233b22d19df39b2e1c8ba133b3c2\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " Dec 03 22:12:41.879975 master-0 kubenswrapper[36504]: I1203 22:12:41.879973 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-audit-dir\") pod \"8a00233b22d19df39b2e1c8ba133b3c2\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " Dec 03 22:12:41.879975 master-0 kubenswrapper[36504]: I1203 22:12:41.879967 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8a00233b22d19df39b2e1c8ba133b3c2" (UID: "8a00233b22d19df39b2e1c8ba133b3c2"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:12:41.880478 master-0 kubenswrapper[36504]: I1203 22:12:41.880083 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-cert-dir\") pod \"8a00233b22d19df39b2e1c8ba133b3c2\" (UID: \"8a00233b22d19df39b2e1c8ba133b3c2\") " Dec 03 22:12:41.880478 master-0 kubenswrapper[36504]: I1203 22:12:41.880106 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8a00233b22d19df39b2e1c8ba133b3c2" (UID: "8a00233b22d19df39b2e1c8ba133b3c2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:12:41.880478 master-0 kubenswrapper[36504]: I1203 22:12:41.880268 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8a00233b22d19df39b2e1c8ba133b3c2" (UID: "8a00233b22d19df39b2e1c8ba133b3c2"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:12:41.880885 master-0 kubenswrapper[36504]: I1203 22:12:41.880840 36504 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:41.880885 master-0 kubenswrapper[36504]: I1203 22:12:41.880874 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e0ffab3-4f70-4bc8-9464-598c0668a4b8-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:41.881052 master-0 kubenswrapper[36504]: I1203 22:12:41.880893 36504 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:41.881052 master-0 kubenswrapper[36504]: I1203 22:12:41.880905 36504 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8a00233b22d19df39b2e1c8ba133b3c2-audit-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:41.972592 master-0 kubenswrapper[36504]: I1203 22:12:41.972474 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_8a00233b22d19df39b2e1c8ba133b3c2/kube-apiserver-cert-syncer/0.log" Dec 03 22:12:41.973797 master-0 kubenswrapper[36504]: I1203 22:12:41.973703 36504 generic.go:334] "Generic (PLEG): container finished" podID="8a00233b22d19df39b2e1c8ba133b3c2" containerID="0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b" exitCode=0 Dec 03 22:12:41.973953 master-0 kubenswrapper[36504]: I1203 22:12:41.973871 36504 scope.go:117] "RemoveContainer" containerID="cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37" Dec 03 22:12:41.974067 master-0 kubenswrapper[36504]: I1203 22:12:41.973942 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:41.976686 master-0 kubenswrapper[36504]: I1203 22:12:41.976627 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-7-master-0" event={"ID":"0e0ffab3-4f70-4bc8-9464-598c0668a4b8","Type":"ContainerDied","Data":"b947da5065bf745ebbd9eaca8b8328c1762ab378324b0943d0c669546a9364fb"} Dec 03 22:12:41.976815 master-0 kubenswrapper[36504]: I1203 22:12:41.976695 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b947da5065bf745ebbd9eaca8b8328c1762ab378324b0943d0c669546a9364fb" Dec 03 22:12:41.976815 master-0 kubenswrapper[36504]: I1203 22:12:41.976699 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-7-master-0" Dec 03 22:12:42.007065 master-0 kubenswrapper[36504]: I1203 22:12:42.007018 36504 scope.go:117] "RemoveContainer" containerID="c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e" Dec 03 22:12:42.008533 master-0 kubenswrapper[36504]: I1203 22:12:42.008467 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:42.009720 master-0 kubenswrapper[36504]: I1203 22:12:42.009635 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:42.010946 master-0 kubenswrapper[36504]: I1203 22:12:42.010860 36504 status_manager.go:851] "Failed to get status for pod" podUID="8a00233b22d19df39b2e1c8ba133b3c2" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:42.012444 master-0 kubenswrapper[36504]: I1203 22:12:42.012374 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:42.013356 master-0 kubenswrapper[36504]: I1203 22:12:42.013311 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:42.014155 master-0 kubenswrapper[36504]: I1203 22:12:42.014075 36504 status_manager.go:851] "Failed to get status for pod" podUID="8a00233b22d19df39b2e1c8ba133b3c2" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:42.030288 master-0 kubenswrapper[36504]: I1203 22:12:42.030228 36504 scope.go:117] "RemoveContainer" containerID="cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef" Dec 03 22:12:42.048099 master-0 kubenswrapper[36504]: I1203 22:12:42.047983 36504 scope.go:117] "RemoveContainer" containerID="55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35" Dec 03 22:12:42.061406 master-0 kubenswrapper[36504]: I1203 22:12:42.061362 36504 scope.go:117] "RemoveContainer" containerID="0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b" Dec 03 22:12:42.077836 master-0 kubenswrapper[36504]: I1203 22:12:42.077807 36504 scope.go:117] "RemoveContainer" containerID="d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402" Dec 03 22:12:42.096925 master-0 kubenswrapper[36504]: 
I1203 22:12:42.096877 36504 scope.go:117] "RemoveContainer" containerID="cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37" Dec 03 22:12:42.097306 master-0 kubenswrapper[36504]: E1203 22:12:42.097265 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37\": container with ID starting with cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37 not found: ID does not exist" containerID="cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37" Dec 03 22:12:42.097383 master-0 kubenswrapper[36504]: I1203 22:12:42.097296 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37"} err="failed to get container status \"cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37\": rpc error: code = NotFound desc = could not find container \"cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37\": container with ID starting with cef81f6d0670f6c2c9d6e15f22b5c9ccdb5cfcc0cd2736c4b0519aea84146a37 not found: ID does not exist" Dec 03 22:12:42.097383 master-0 kubenswrapper[36504]: I1203 22:12:42.097320 36504 scope.go:117] "RemoveContainer" containerID="c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e" Dec 03 22:12:42.097739 master-0 kubenswrapper[36504]: E1203 22:12:42.097669 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e\": container with ID starting with c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e not found: ID does not exist" containerID="c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e" Dec 03 22:12:42.097836 master-0 kubenswrapper[36504]: I1203 22:12:42.097741 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e"} err="failed to get container status \"c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e\": rpc error: code = NotFound desc = could not find container \"c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e\": container with ID starting with c6a5e0cab10221d7f0b829255a325d9adfb97c241fbf33e6641de730b68e040e not found: ID does not exist" Dec 03 22:12:42.097836 master-0 kubenswrapper[36504]: I1203 22:12:42.097791 36504 scope.go:117] "RemoveContainer" containerID="cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef" Dec 03 22:12:42.098199 master-0 kubenswrapper[36504]: E1203 22:12:42.098144 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef\": container with ID starting with cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef not found: ID does not exist" containerID="cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef" Dec 03 22:12:42.098249 master-0 kubenswrapper[36504]: I1203 22:12:42.098209 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef"} err="failed to get container status \"cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef\": rpc error: code = NotFound 
desc = could not find container \"cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef\": container with ID starting with cde28f5ac86c4e1591c9ae45cfb78a803d42a5c2c40340eb8e099be4e4bd50ef not found: ID does not exist" Dec 03 22:12:42.098295 master-0 kubenswrapper[36504]: I1203 22:12:42.098246 36504 scope.go:117] "RemoveContainer" containerID="55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35" Dec 03 22:12:42.098693 master-0 kubenswrapper[36504]: E1203 22:12:42.098658 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35\": container with ID starting with 55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35 not found: ID does not exist" containerID="55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35" Dec 03 22:12:42.098739 master-0 kubenswrapper[36504]: I1203 22:12:42.098687 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35"} err="failed to get container status \"55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35\": rpc error: code = NotFound desc = could not find container \"55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35\": container with ID starting with 55cc9c5a328877be2d43a8af809736f346925e9d545f64ec4d9764844e44ab35 not found: ID does not exist" Dec 03 22:12:42.098739 master-0 kubenswrapper[36504]: I1203 22:12:42.098707 36504 scope.go:117] "RemoveContainer" containerID="0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b" Dec 03 22:12:42.099045 master-0 kubenswrapper[36504]: E1203 22:12:42.099004 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b\": container with ID starting with 0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b not found: ID does not exist" containerID="0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b" Dec 03 22:12:42.099089 master-0 kubenswrapper[36504]: I1203 22:12:42.099043 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b"} err="failed to get container status \"0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b\": rpc error: code = NotFound desc = could not find container \"0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b\": container with ID starting with 0e2a7ca74c5bdd88b6c4ce25a492b937c0545f3634b5d9b5d4194f37418b752b not found: ID does not exist" Dec 03 22:12:42.099089 master-0 kubenswrapper[36504]: I1203 22:12:42.099066 36504 scope.go:117] "RemoveContainer" containerID="d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402" Dec 03 22:12:42.099384 master-0 kubenswrapper[36504]: E1203 22:12:42.099341 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402\": container with ID starting with d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402 not found: ID does not exist" containerID="d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402" Dec 03 22:12:42.099436 master-0 kubenswrapper[36504]: I1203 22:12:42.099379 36504 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402"} err="failed to get container status \"d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402\": rpc error: code = NotFound desc = could not find container \"d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402\": container with ID starting with d88c38d75a4c7ade68501b681392f9b300e48be0f5aec4172afcd0181c089402 not found: ID does not exist" Dec 03 22:12:42.974722 master-0 kubenswrapper[36504]: E1203 22:12:42.974590 36504 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Dec 03 22:12:42.974722 master-0 kubenswrapper[36504]: &Event{ObjectMeta:{console-66896f8657-9276x.187dd436af22e1f1 openshift-console 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-66896f8657-9276x,UID:311227bb-eeea-40dd-b57d-ca50551be7d3,APIVersion:v1,ResourceVersion:17623,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.112:8443/health": dial tcp 10.128.0.112:8443: connect: connection refused Dec 03 22:12:42.974722 master-0 kubenswrapper[36504]: body: Dec 03 22:12:42.974722 master-0 kubenswrapper[36504]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2025-12-03 22:12:40.330723825 +0000 UTC m=+125.550495832,LastTimestamp:2025-12-03 22:12:40.330723825 +0000 UTC m=+125.550495832,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Dec 03 22:12:42.974722 master-0 kubenswrapper[36504]: > Dec 03 22:12:43.106655 master-0 kubenswrapper[36504]: I1203 22:12:43.106603 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a00233b22d19df39b2e1c8ba133b3c2" path="/var/lib/kubelet/pods/8a00233b22d19df39b2e1c8ba133b3c2/volumes" Dec 03 22:12:44.428084 master-0 kubenswrapper[36504]: E1203 22:12:44.427987 36504 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:44.428989 master-0 kubenswrapper[36504]: I1203 22:12:44.428689 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:44.463250 master-0 kubenswrapper[36504]: W1203 22:12:44.463168 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda767ff0baecd8145c3bd0a8b3c1771e4.slice/crio-cdf3aeeebfe5109b67386bf589ab8e8227f278182199e814e3f85943eed8fb63 WatchSource:0}: Error finding container cdf3aeeebfe5109b67386bf589ab8e8227f278182199e814e3f85943eed8fb63: Status 404 returned error can't find the container with id cdf3aeeebfe5109b67386bf589ab8e8227f278182199e814e3f85943eed8fb63 Dec 03 22:12:45.004215 master-0 kubenswrapper[36504]: I1203 22:12:45.004056 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a767ff0baecd8145c3bd0a8b3c1771e4","Type":"ContainerStarted","Data":"38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d"} Dec 03 22:12:45.004215 master-0 kubenswrapper[36504]: I1203 22:12:45.004119 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a767ff0baecd8145c3bd0a8b3c1771e4","Type":"ContainerStarted","Data":"cdf3aeeebfe5109b67386bf589ab8e8227f278182199e814e3f85943eed8fb63"} Dec 03 22:12:45.005015 master-0 kubenswrapper[36504]: I1203 22:12:45.004971 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:45.005105 master-0 kubenswrapper[36504]: E1203 22:12:45.004972 36504 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:12:45.005599 master-0 kubenswrapper[36504]: I1203 22:12:45.005536 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:45.099633 master-0 kubenswrapper[36504]: I1203 22:12:45.099545 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:45.100384 master-0 kubenswrapper[36504]: I1203 22:12:45.100283 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:45.828634 master-0 kubenswrapper[36504]: I1203 22:12:45.828586 36504 patch_prober.go:28] interesting pod/console-6c4b7b5d77-2z5zb container/console namespace/openshift-console: Startup probe status=failure 
output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Dec 03 22:12:45.829569 master-0 kubenswrapper[36504]: I1203 22:12:45.828671 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c4b7b5d77-2z5zb" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Dec 03 22:12:46.190949 master-0 kubenswrapper[36504]: E1203 22:12:46.190847 36504 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:46.191642 master-0 kubenswrapper[36504]: E1203 22:12:46.191588 36504 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:46.192365 master-0 kubenswrapper[36504]: E1203 22:12:46.192292 36504 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:46.193131 master-0 kubenswrapper[36504]: E1203 22:12:46.193079 36504 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:46.193797 master-0 kubenswrapper[36504]: E1203 22:12:46.193718 36504 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:46.193918 master-0 kubenswrapper[36504]: I1203 22:12:46.193855 36504 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 22:12:46.194522 master-0 kubenswrapper[36504]: E1203 22:12:46.194463 36504 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Dec 03 22:12:46.396417 master-0 kubenswrapper[36504]: E1203 22:12:46.396318 36504 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Dec 03 22:12:46.797978 master-0 kubenswrapper[36504]: E1203 22:12:46.797869 36504 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Dec 03 22:12:47.599712 master-0 kubenswrapper[36504]: E1203 22:12:47.599641 36504 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Dec 03 22:12:49.201183 master-0 kubenswrapper[36504]: E1203 22:12:49.201038 36504 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Dec 03 22:12:50.328469 master-0 kubenswrapper[36504]: I1203 22:12:50.328367 36504 patch_prober.go:28] interesting pod/console-66896f8657-9276x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" start-of-body= Dec 03 22:12:50.329336 master-0 kubenswrapper[36504]: I1203 22:12:50.328489 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66896f8657-9276x" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerName="console" probeResult="failure" output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" Dec 03 22:12:50.451417 master-0 kubenswrapper[36504]: I1203 22:12:50.451360 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5588d9469d-jxz4w" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" containerName="console" containerID="cri-o://0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2" gracePeriod=15 Dec 03 22:12:51.005664 master-0 kubenswrapper[36504]: I1203 22:12:51.005592 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5588d9469d-jxz4w_45869470-a94b-4424-a580-3bb4d9e0c675/console/0.log" Dec 03 22:12:51.005918 master-0 kubenswrapper[36504]: I1203 22:12:51.005722 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:12:51.006842 master-0 kubenswrapper[36504]: I1203 22:12:51.006787 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.007192 master-0 kubenswrapper[36504]: I1203 22:12:51.007153 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.007486 master-0 kubenswrapper[36504]: I1203 22:12:51.007451 36504 status_manager.go:851] "Failed to get status for pod" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" pod="openshift-console/console-5588d9469d-jxz4w" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/pods/console-5588d9469d-jxz4w\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.095527 master-0 kubenswrapper[36504]: I1203 22:12:51.095457 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:51.096534 master-0 kubenswrapper[36504]: I1203 22:12:51.096482 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.097406 master-0 kubenswrapper[36504]: I1203 22:12:51.097230 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.098835 master-0 kubenswrapper[36504]: I1203 22:12:51.098734 36504 status_manager.go:851] "Failed to get status for pod" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" pod="openshift-console/console-5588d9469d-jxz4w" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/pods/console-5588d9469d-jxz4w\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.105974 master-0 kubenswrapper[36504]: I1203 22:12:51.105920 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5588d9469d-jxz4w_45869470-a94b-4424-a580-3bb4d9e0c675/console/0.log" Dec 03 22:12:51.106208 master-0 kubenswrapper[36504]: I1203 22:12:51.106026 36504 generic.go:334] "Generic (PLEG): container finished" podID="45869470-a94b-4424-a580-3bb4d9e0c675" containerID="0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2" exitCode=2 Dec 03 22:12:51.106208 master-0 kubenswrapper[36504]: I1203 22:12:51.106094 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5588d9469d-jxz4w" Dec 03 22:12:51.107318 master-0 kubenswrapper[36504]: I1203 22:12:51.107244 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.107894 master-0 kubenswrapper[36504]: I1203 22:12:51.107815 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.108457 master-0 kubenswrapper[36504]: I1203 22:12:51.108415 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5588d9469d-jxz4w" event={"ID":"45869470-a94b-4424-a580-3bb4d9e0c675","Type":"ContainerDied","Data":"0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2"} Dec 03 22:12:51.108576 master-0 kubenswrapper[36504]: I1203 22:12:51.108461 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5588d9469d-jxz4w" event={"ID":"45869470-a94b-4424-a580-3bb4d9e0c675","Type":"ContainerDied","Data":"d2e070e2ff15917e1d78b349c90ca91d64b28b2a86b2b1426d51bccbf65b6ed1"} Dec 03 22:12:51.108576 master-0 kubenswrapper[36504]: I1203 22:12:51.108489 36504 scope.go:117] "RemoveContainer" containerID="0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2" Dec 03 22:12:51.109385 master-0 kubenswrapper[36504]: I1203 22:12:51.109319 36504 status_manager.go:851] "Failed to get status for pod" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" pod="openshift-console/console-5588d9469d-jxz4w" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/pods/console-5588d9469d-jxz4w\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.129023 master-0 kubenswrapper[36504]: I1203 22:12:51.128957 36504 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:12:51.129023 master-0 kubenswrapper[36504]: I1203 22:12:51.129005 36504 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:12:51.130064 master-0 kubenswrapper[36504]: E1203 22:12:51.129992 36504 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:51.130673 master-0 kubenswrapper[36504]: I1203 22:12:51.130629 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:51.135110 master-0 kubenswrapper[36504]: I1203 22:12:51.135062 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-console-config\") pod \"45869470-a94b-4424-a580-3bb4d9e0c675\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " Dec 03 22:12:51.135235 master-0 kubenswrapper[36504]: I1203 22:12:51.135143 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-oauth-serving-cert\") pod \"45869470-a94b-4424-a580-3bb4d9e0c675\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " Dec 03 22:12:51.135235 master-0 kubenswrapper[36504]: I1203 22:12:51.135173 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcwvr\" (UniqueName: \"kubernetes.io/projected/45869470-a94b-4424-a580-3bb4d9e0c675-kube-api-access-vcwvr\") pod \"45869470-a94b-4424-a580-3bb4d9e0c675\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " Dec 03 22:12:51.135369 master-0 kubenswrapper[36504]: I1203 22:12:51.135244 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-oauth-config\") pod \"45869470-a94b-4424-a580-3bb4d9e0c675\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " Dec 03 22:12:51.135369 master-0 kubenswrapper[36504]: I1203 22:12:51.135319 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-service-ca\") pod \"45869470-a94b-4424-a580-3bb4d9e0c675\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " Dec 03 22:12:51.135535 master-0 kubenswrapper[36504]: I1203 22:12:51.135374 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-serving-cert\") pod \"45869470-a94b-4424-a580-3bb4d9e0c675\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " Dec 03 22:12:51.135535 master-0 kubenswrapper[36504]: I1203 22:12:51.135420 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-trusted-ca-bundle\") pod \"45869470-a94b-4424-a580-3bb4d9e0c675\" (UID: \"45869470-a94b-4424-a580-3bb4d9e0c675\") " Dec 03 22:12:51.135942 master-0 kubenswrapper[36504]: I1203 22:12:51.135661 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-console-config" (OuterVolumeSpecName: "console-config") pod "45869470-a94b-4424-a580-3bb4d9e0c675" (UID: "45869470-a94b-4424-a580-3bb4d9e0c675"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:51.135942 master-0 kubenswrapper[36504]: I1203 22:12:51.135876 36504 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:51.135942 master-0 kubenswrapper[36504]: I1203 22:12:51.135871 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "45869470-a94b-4424-a580-3bb4d9e0c675" (UID: "45869470-a94b-4424-a580-3bb4d9e0c675"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:51.136567 master-0 kubenswrapper[36504]: I1203 22:12:51.136326 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "45869470-a94b-4424-a580-3bb4d9e0c675" (UID: "45869470-a94b-4424-a580-3bb4d9e0c675"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:51.136567 master-0 kubenswrapper[36504]: I1203 22:12:51.136319 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-service-ca" (OuterVolumeSpecName: "service-ca") pod "45869470-a94b-4424-a580-3bb4d9e0c675" (UID: "45869470-a94b-4424-a580-3bb4d9e0c675"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:51.138166 master-0 kubenswrapper[36504]: I1203 22:12:51.138116 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45869470-a94b-4424-a580-3bb4d9e0c675-kube-api-access-vcwvr" (OuterVolumeSpecName: "kube-api-access-vcwvr") pod "45869470-a94b-4424-a580-3bb4d9e0c675" (UID: "45869470-a94b-4424-a580-3bb4d9e0c675"). InnerVolumeSpecName "kube-api-access-vcwvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:51.140920 master-0 kubenswrapper[36504]: I1203 22:12:51.140818 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "45869470-a94b-4424-a580-3bb4d9e0c675" (UID: "45869470-a94b-4424-a580-3bb4d9e0c675"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:51.142151 master-0 kubenswrapper[36504]: I1203 22:12:51.142037 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "45869470-a94b-4424-a580-3bb4d9e0c675" (UID: "45869470-a94b-4424-a580-3bb4d9e0c675"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:51.146747 master-0 kubenswrapper[36504]: I1203 22:12:51.146715 36504 scope.go:117] "RemoveContainer" containerID="0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2" Dec 03 22:12:51.147360 master-0 kubenswrapper[36504]: E1203 22:12:51.147287 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2\": container with ID starting with 0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2 not found: ID does not exist" containerID="0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2" Dec 03 22:12:51.147531 master-0 kubenswrapper[36504]: I1203 22:12:51.147365 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2"} err="failed to get container status \"0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2\": rpc error: code = NotFound desc = could not find container \"0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2\": container with ID starting with 0229493925f0167d3d1a5b8ed6d2aac3400456743f674f96fe0a88cc615a59d2 not found: ID does not exist" Dec 03 22:12:51.180423 master-0 kubenswrapper[36504]: W1203 22:12:51.180359 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7035982c3bafe5c31d1c2515ed3c2b7b.slice/crio-38788100e3f6aac0857fc92ac5e55179b0805abca9c9aadb1d56b0458b497ff6 WatchSource:0}: Error finding container 38788100e3f6aac0857fc92ac5e55179b0805abca9c9aadb1d56b0458b497ff6: Status 404 returned error can't find the container with id 38788100e3f6aac0857fc92ac5e55179b0805abca9c9aadb1d56b0458b497ff6 Dec 03 22:12:51.238055 master-0 kubenswrapper[36504]: I1203 22:12:51.237884 36504 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:51.238055 master-0 kubenswrapper[36504]: I1203 22:12:51.237923 36504 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:51.238055 master-0 kubenswrapper[36504]: I1203 22:12:51.237939 36504 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:51.238055 master-0 kubenswrapper[36504]: I1203 22:12:51.237956 36504 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45869470-a94b-4424-a580-3bb4d9e0c675-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:51.238055 master-0 kubenswrapper[36504]: I1203 22:12:51.237969 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcwvr\" (UniqueName: \"kubernetes.io/projected/45869470-a94b-4424-a580-3bb4d9e0c675-kube-api-access-vcwvr\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:51.238055 master-0 kubenswrapper[36504]: I1203 22:12:51.237982 36504 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/45869470-a94b-4424-a580-3bb4d9e0c675-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:51.423234 master-0 kubenswrapper[36504]: I1203 22:12:51.423151 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.423883 master-0 kubenswrapper[36504]: I1203 22:12:51.423831 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:51.424651 master-0 kubenswrapper[36504]: I1203 22:12:51.424595 36504 status_manager.go:851] "Failed to get status for pod" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" pod="openshift-console/console-5588d9469d-jxz4w" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/pods/console-5588d9469d-jxz4w\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:52.120423 master-0 kubenswrapper[36504]: I1203 22:12:52.120322 36504 generic.go:334] "Generic (PLEG): container finished" podID="7035982c3bafe5c31d1c2515ed3c2b7b" containerID="d2f7ce9cba8cf9ed23de77d97df3b962c5afbd4108cacda986a7ddf6d2b22a14" exitCode=0 Dec 03 22:12:52.120706 master-0 kubenswrapper[36504]: I1203 22:12:52.120454 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7035982c3bafe5c31d1c2515ed3c2b7b","Type":"ContainerDied","Data":"d2f7ce9cba8cf9ed23de77d97df3b962c5afbd4108cacda986a7ddf6d2b22a14"} Dec 03 22:12:52.120706 master-0 kubenswrapper[36504]: I1203 22:12:52.120507 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7035982c3bafe5c31d1c2515ed3c2b7b","Type":"ContainerStarted","Data":"38788100e3f6aac0857fc92ac5e55179b0805abca9c9aadb1d56b0458b497ff6"} Dec 03 22:12:52.121025 master-0 kubenswrapper[36504]: I1203 22:12:52.120978 36504 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:12:52.121025 master-0 kubenswrapper[36504]: I1203 22:12:52.121012 36504 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:12:52.122240 master-0 kubenswrapper[36504]: E1203 22:12:52.122166 36504 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:52.123345 master-0 kubenswrapper[36504]: I1203 22:12:52.123265 36504 status_manager.go:851] "Failed to get status for pod" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:52.125072 master-0 kubenswrapper[36504]: 
I1203 22:12:52.124974 36504 status_manager.go:851] "Failed to get status for pod" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" pod="openshift-kube-apiserver/installer-7-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-7-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:52.126073 master-0 kubenswrapper[36504]: I1203 22:12:52.125991 36504 status_manager.go:851] "Failed to get status for pod" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" pod="openshift-console/console-5588d9469d-jxz4w" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/pods/console-5588d9469d-jxz4w\": dial tcp 192.168.32.10:6443: connect: connection refused" Dec 03 22:12:52.402284 master-0 kubenswrapper[36504]: E1203 22:12:52.402173 36504 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Dec 03 22:12:53.160794 master-0 kubenswrapper[36504]: I1203 22:12:53.160632 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7035982c3bafe5c31d1c2515ed3c2b7b","Type":"ContainerStarted","Data":"65b5f79ca7b96ed93db29aba3ae8bc241455646d2e1529bdd643eae30fe1f668"} Dec 03 22:12:53.160794 master-0 kubenswrapper[36504]: I1203 22:12:53.160680 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7035982c3bafe5c31d1c2515ed3c2b7b","Type":"ContainerStarted","Data":"8c7639960cf9c49e9622092221b6170e88fd2b93ed11b2c4ea659569ecee395e"} Dec 03 22:12:53.160794 master-0 kubenswrapper[36504]: I1203 22:12:53.160690 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7035982c3bafe5c31d1c2515ed3c2b7b","Type":"ContainerStarted","Data":"3272cc6886cf112de9d20011e390a8a35fac24de99f8ded9102adf6bdcb3c6c9"} Dec 03 22:12:54.175490 master-0 kubenswrapper[36504]: I1203 22:12:54.175416 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7035982c3bafe5c31d1c2515ed3c2b7b","Type":"ContainerStarted","Data":"955271c7013622323152b1ed3e8ef6837a03e28fba160cb3700fc501e00be066"} Dec 03 22:12:54.175490 master-0 kubenswrapper[36504]: I1203 22:12:54.175494 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"7035982c3bafe5c31d1c2515ed3c2b7b","Type":"ContainerStarted","Data":"42c10f8f264ec3e98ef146e8a892100d9ac508c4e49eb9423e0fb5e62132834c"} Dec 03 22:12:54.176100 master-0 kubenswrapper[36504]: I1203 22:12:54.176068 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:54.176607 master-0 kubenswrapper[36504]: I1203 22:12:54.176523 36504 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:12:54.176607 master-0 kubenswrapper[36504]: I1203 22:12:54.176555 36504 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:12:54.178730 master-0 kubenswrapper[36504]: I1203 22:12:54.178699 36504 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_287a70e59cc5430b23b208b9a03b5ac7/kube-controller-manager/0.log" Dec 03 22:12:54.178811 master-0 kubenswrapper[36504]: I1203 22:12:54.178750 36504 generic.go:334] "Generic (PLEG): container finished" podID="287a70e59cc5430b23b208b9a03b5ac7" containerID="7f76635edb60bf491a762b76e0a7b8207d158079fd706ba1977dae6cce9c45ed" exitCode=1 Dec 03 22:12:54.178855 master-0 kubenswrapper[36504]: I1203 22:12:54.178804 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerDied","Data":"7f76635edb60bf491a762b76e0a7b8207d158079fd706ba1977dae6cce9c45ed"} Dec 03 22:12:54.179563 master-0 kubenswrapper[36504]: I1203 22:12:54.179524 36504 scope.go:117] "RemoveContainer" containerID="7f76635edb60bf491a762b76e0a7b8207d158079fd706ba1977dae6cce9c45ed" Dec 03 22:12:54.947687 master-0 kubenswrapper[36504]: I1203 22:12:54.947610 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-55b8756475-lz989" podUID="a15b7b7c-48b5-417b-8f41-58306b2f0e9a" containerName="console" containerID="cri-o://4f12e1f86dff5c2700df5dddb011aaeafcef35ccf1913bff972c8c1e62e3eb1c" gracePeriod=15 Dec 03 22:12:55.195302 master-0 kubenswrapper[36504]: I1203 22:12:55.194998 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b8756475-lz989_a15b7b7c-48b5-417b-8f41-58306b2f0e9a/console/0.log" Dec 03 22:12:55.195302 master-0 kubenswrapper[36504]: I1203 22:12:55.195081 36504 generic.go:334] "Generic (PLEG): container finished" podID="a15b7b7c-48b5-417b-8f41-58306b2f0e9a" containerID="4f12e1f86dff5c2700df5dddb011aaeafcef35ccf1913bff972c8c1e62e3eb1c" exitCode=2 Dec 03 22:12:55.195302 master-0 kubenswrapper[36504]: I1203 22:12:55.195207 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b8756475-lz989" event={"ID":"a15b7b7c-48b5-417b-8f41-58306b2f0e9a","Type":"ContainerDied","Data":"4f12e1f86dff5c2700df5dddb011aaeafcef35ccf1913bff972c8c1e62e3eb1c"} Dec 03 22:12:55.204991 master-0 kubenswrapper[36504]: I1203 22:12:55.204919 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_287a70e59cc5430b23b208b9a03b5ac7/kube-controller-manager/0.log" Dec 03 22:12:55.205188 master-0 kubenswrapper[36504]: I1203 22:12:55.205035 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"287a70e59cc5430b23b208b9a03b5ac7","Type":"ContainerStarted","Data":"43a0ef9d17ea7612b3d56ad1047e202234ab9843605fa56bd910e99667c96ddf"} Dec 03 22:12:55.581338 master-0 kubenswrapper[36504]: I1203 22:12:55.581290 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b8756475-lz989_a15b7b7c-48b5-417b-8f41-58306b2f0e9a/console/0.log" Dec 03 22:12:55.581514 master-0 kubenswrapper[36504]: I1203 22:12:55.581361 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b8756475-lz989" Dec 03 22:12:55.622094 master-0 kubenswrapper[36504]: I1203 22:12:55.622010 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-oauth-serving-cert\") pod \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " Dec 03 22:12:55.622094 master-0 kubenswrapper[36504]: I1203 22:12:55.622097 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-config\") pod \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " Dec 03 22:12:55.625168 master-0 kubenswrapper[36504]: I1203 22:12:55.622149 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-service-ca\") pod \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " Dec 03 22:12:55.625168 master-0 kubenswrapper[36504]: I1203 22:12:55.622205 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-oauth-config\") pod \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " Dec 03 22:12:55.625168 master-0 kubenswrapper[36504]: I1203 22:12:55.622276 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-trusted-ca-bundle\") pod \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " Dec 03 22:12:55.625168 master-0 kubenswrapper[36504]: I1203 22:12:55.622319 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrh8c\" (UniqueName: \"kubernetes.io/projected/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-kube-api-access-nrh8c\") pod \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " Dec 03 22:12:55.625168 master-0 kubenswrapper[36504]: I1203 22:12:55.622372 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-serving-cert\") pod \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\" (UID: \"a15b7b7c-48b5-417b-8f41-58306b2f0e9a\") " Dec 03 22:12:55.625168 master-0 kubenswrapper[36504]: I1203 22:12:55.623969 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a15b7b7c-48b5-417b-8f41-58306b2f0e9a" (UID: "a15b7b7c-48b5-417b-8f41-58306b2f0e9a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:55.625168 master-0 kubenswrapper[36504]: I1203 22:12:55.623999 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-service-ca" (OuterVolumeSpecName: "service-ca") pod "a15b7b7c-48b5-417b-8f41-58306b2f0e9a" (UID: "a15b7b7c-48b5-417b-8f41-58306b2f0e9a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:55.625168 master-0 kubenswrapper[36504]: I1203 22:12:55.624006 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a15b7b7c-48b5-417b-8f41-58306b2f0e9a" (UID: "a15b7b7c-48b5-417b-8f41-58306b2f0e9a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:55.625168 master-0 kubenswrapper[36504]: I1203 22:12:55.624112 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-config" (OuterVolumeSpecName: "console-config") pod "a15b7b7c-48b5-417b-8f41-58306b2f0e9a" (UID: "a15b7b7c-48b5-417b-8f41-58306b2f0e9a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:12:55.626062 master-0 kubenswrapper[36504]: I1203 22:12:55.625975 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-kube-api-access-nrh8c" (OuterVolumeSpecName: "kube-api-access-nrh8c") pod "a15b7b7c-48b5-417b-8f41-58306b2f0e9a" (UID: "a15b7b7c-48b5-417b-8f41-58306b2f0e9a"). InnerVolumeSpecName "kube-api-access-nrh8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:12:55.626798 master-0 kubenswrapper[36504]: I1203 22:12:55.626739 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a15b7b7c-48b5-417b-8f41-58306b2f0e9a" (UID: "a15b7b7c-48b5-417b-8f41-58306b2f0e9a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:55.627171 master-0 kubenswrapper[36504]: I1203 22:12:55.627136 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a15b7b7c-48b5-417b-8f41-58306b2f0e9a" (UID: "a15b7b7c-48b5-417b-8f41-58306b2f0e9a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:12:55.724493 master-0 kubenswrapper[36504]: I1203 22:12:55.724425 36504 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:55.724493 master-0 kubenswrapper[36504]: I1203 22:12:55.724469 36504 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:55.724493 master-0 kubenswrapper[36504]: I1203 22:12:55.724479 36504 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:55.724493 master-0 kubenswrapper[36504]: I1203 22:12:55.724491 36504 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:55.724493 master-0 kubenswrapper[36504]: I1203 22:12:55.724501 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrh8c\" (UniqueName: \"kubernetes.io/projected/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-kube-api-access-nrh8c\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:55.724493 master-0 kubenswrapper[36504]: I1203 22:12:55.724514 36504 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:55.724493 master-0 kubenswrapper[36504]: I1203 22:12:55.724522 36504 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a15b7b7c-48b5-417b-8f41-58306b2f0e9a-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:12:55.829925 master-0 kubenswrapper[36504]: I1203 22:12:55.829692 36504 patch_prober.go:28] interesting pod/console-6c4b7b5d77-2z5zb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Dec 03 22:12:55.829925 master-0 kubenswrapper[36504]: I1203 22:12:55.829757 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c4b7b5d77-2z5zb" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Dec 03 22:12:56.131468 master-0 kubenswrapper[36504]: I1203 22:12:56.131364 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:56.131468 master-0 kubenswrapper[36504]: I1203 22:12:56.131466 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:56.139870 master-0 kubenswrapper[36504]: I1203 22:12:56.139803 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:56.218496 master-0 kubenswrapper[36504]: I1203 22:12:56.218434 36504 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console_console-55b8756475-lz989_a15b7b7c-48b5-417b-8f41-58306b2f0e9a/console/0.log" Dec 03 22:12:56.219012 master-0 kubenswrapper[36504]: I1203 22:12:56.218514 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b8756475-lz989" event={"ID":"a15b7b7c-48b5-417b-8f41-58306b2f0e9a","Type":"ContainerDied","Data":"b0579da37a183fbc7b3196e6ef131d8b2fe0fcb480f346328b7ae5fd8a70ab60"} Dec 03 22:12:56.219012 master-0 kubenswrapper[36504]: I1203 22:12:56.218556 36504 scope.go:117] "RemoveContainer" containerID="4f12e1f86dff5c2700df5dddb011aaeafcef35ccf1913bff972c8c1e62e3eb1c" Dec 03 22:12:56.219012 master-0 kubenswrapper[36504]: I1203 22:12:56.218625 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55b8756475-lz989" Dec 03 22:12:58.399425 master-0 kubenswrapper[36504]: I1203 22:12:58.398675 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:12:58.402543 master-0 kubenswrapper[36504]: I1203 22:12:58.402485 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:12:59.192614 master-0 kubenswrapper[36504]: I1203 22:12:59.192554 36504 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:59.243914 master-0 kubenswrapper[36504]: I1203 22:12:59.243849 36504 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:12:59.243914 master-0 kubenswrapper[36504]: I1203 22:12:59.243909 36504 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:12:59.244172 master-0 kubenswrapper[36504]: I1203 22:12:59.244127 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:12:59.249081 master-0 kubenswrapper[36504]: I1203 22:12:59.249026 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:12:59.336074 master-0 kubenswrapper[36504]: I1203 22:12:59.335996 36504 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7035982c3bafe5c31d1c2515ed3c2b7b" podUID="38b28adc-5b89-41ff-a922-3f7bee5f8b6f" Dec 03 22:13:00.254123 master-0 kubenswrapper[36504]: I1203 22:13:00.254038 36504 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:13:00.254123 master-0 kubenswrapper[36504]: I1203 22:13:00.254094 36504 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="3d5d22c6-b01c-495a-8165-9dfd3c94712b" Dec 03 22:13:00.258578 master-0 kubenswrapper[36504]: I1203 22:13:00.258305 36504 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="7035982c3bafe5c31d1c2515ed3c2b7b" podUID="38b28adc-5b89-41ff-a922-3f7bee5f8b6f" Dec 03 22:13:00.327922 master-0 kubenswrapper[36504]: I1203 22:13:00.327832 36504 patch_prober.go:28] 
interesting pod/console-66896f8657-9276x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" start-of-body= Dec 03 22:13:00.328350 master-0 kubenswrapper[36504]: I1203 22:13:00.327935 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66896f8657-9276x" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerName="console" probeResult="failure" output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" Dec 03 22:13:05.828851 master-0 kubenswrapper[36504]: I1203 22:13:05.828743 36504 patch_prober.go:28] interesting pod/console-6c4b7b5d77-2z5zb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Dec 03 22:13:05.829574 master-0 kubenswrapper[36504]: I1203 22:13:05.828852 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c4b7b5d77-2z5zb" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Dec 03 22:13:09.755064 master-0 kubenswrapper[36504]: I1203 22:13:09.754961 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-xm8lg" Dec 03 22:13:09.979077 master-0 kubenswrapper[36504]: I1203 22:13:09.979014 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 22:13:10.011617 master-0 kubenswrapper[36504]: I1203 22:13:10.011450 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:13:10.105222 master-0 kubenswrapper[36504]: I1203 22:13:10.105141 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 22:13:10.328152 master-0 kubenswrapper[36504]: I1203 22:13:10.327965 36504 patch_prober.go:28] interesting pod/console-66896f8657-9276x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" start-of-body= Dec 03 22:13:10.328152 master-0 kubenswrapper[36504]: I1203 22:13:10.328129 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66896f8657-9276x" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerName="console" probeResult="failure" output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" Dec 03 22:13:10.955365 master-0 kubenswrapper[36504]: I1203 22:13:10.955251 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 22:13:11.175882 master-0 kubenswrapper[36504]: I1203 22:13:11.175824 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 22:13:11.271807 master-0 kubenswrapper[36504]: I1203 22:13:11.271612 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 22:13:11.600073 master-0 kubenswrapper[36504]: 
I1203 22:13:11.599878 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 22:13:11.635410 master-0 kubenswrapper[36504]: I1203 22:13:11.635331 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 03 22:13:11.792319 master-0 kubenswrapper[36504]: I1203 22:13:11.792272 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 22:13:11.813173 master-0 kubenswrapper[36504]: I1203 22:13:11.813133 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 22:13:11.971708 master-0 kubenswrapper[36504]: I1203 22:13:11.971642 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 22:13:12.001708 master-0 kubenswrapper[36504]: I1203 22:13:12.001656 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 22:13:12.093736 master-0 kubenswrapper[36504]: I1203 22:13:12.093670 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 22:13:12.185221 master-0 kubenswrapper[36504]: I1203 22:13:12.185125 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Dec 03 22:13:12.229243 master-0 kubenswrapper[36504]: I1203 22:13:12.229076 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 22:13:12.239386 master-0 kubenswrapper[36504]: I1203 22:13:12.239323 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 22:13:12.299066 master-0 kubenswrapper[36504]: I1203 22:13:12.280659 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 22:13:12.377947 master-0 kubenswrapper[36504]: I1203 22:13:12.377891 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 22:13:12.477673 master-0 kubenswrapper[36504]: I1203 22:13:12.477615 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Dec 03 22:13:12.624367 master-0 kubenswrapper[36504]: I1203 22:13:12.624228 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 22:13:12.906839 master-0 kubenswrapper[36504]: I1203 22:13:12.906627 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-k4746" Dec 03 22:13:12.955687 master-0 kubenswrapper[36504]: I1203 22:13:12.955598 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Dec 03 22:13:12.967694 master-0 kubenswrapper[36504]: I1203 22:13:12.967644 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-gfs59jbdhk2g" Dec 03 22:13:13.025693 master-0 kubenswrapper[36504]: I1203 22:13:13.025622 36504 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Dec 03 22:13:13.041842 master-0 kubenswrapper[36504]: I1203 22:13:13.041724 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 03 22:13:13.056906 master-0 kubenswrapper[36504]: I1203 22:13:13.056806 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-l4d7p" Dec 03 22:13:13.097074 master-0 kubenswrapper[36504]: I1203 22:13:13.097003 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 22:13:13.097615 master-0 kubenswrapper[36504]: I1203 22:13:13.097556 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 22:13:13.193188 master-0 kubenswrapper[36504]: I1203 22:13:13.193042 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 22:13:13.353340 master-0 kubenswrapper[36504]: I1203 22:13:13.353242 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-2jjj9" Dec 03 22:13:13.441622 master-0 kubenswrapper[36504]: I1203 22:13:13.441513 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 22:13:13.464987 master-0 kubenswrapper[36504]: I1203 22:13:13.464765 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 22:13:13.534307 master-0 kubenswrapper[36504]: I1203 22:13:13.534235 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 22:13:13.575402 master-0 kubenswrapper[36504]: I1203 22:13:13.575317 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 22:13:13.710459 master-0 kubenswrapper[36504]: I1203 22:13:13.710346 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 22:13:13.936410 master-0 kubenswrapper[36504]: I1203 22:13:13.936345 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-88jcx" Dec 03 22:13:13.961994 master-0 kubenswrapper[36504]: I1203 22:13:13.961913 36504 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 22:13:13.969396 master-0 kubenswrapper[36504]: I1203 22:13:13.969318 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0","openshift-console/console-5588d9469d-jxz4w","openshift-kube-apiserver/kube-apiserver-master-0","openshift-console/console-55b8756475-lz989"] Dec 03 22:13:13.969515 master-0 kubenswrapper[36504]: I1203 22:13:13.969433 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Dec 03 22:13:13.974747 master-0 kubenswrapper[36504]: I1203 22:13:13.974714 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Dec 03 22:13:14.006470 master-0 kubenswrapper[36504]: I1203 22:13:14.006261 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podStartSLOduration=15.006228611 podStartE2EDuration="15.006228611s" podCreationTimestamp="2025-12-03 22:12:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:13:13.997940632 +0000 UTC m=+159.217712649" watchObservedRunningTime="2025-12-03 22:13:14.006228611 +0000 UTC m=+159.226000648" Dec 03 22:13:14.036994 master-0 kubenswrapper[36504]: I1203 22:13:14.036954 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:13:14.086906 master-0 kubenswrapper[36504]: I1203 22:13:14.086859 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 22:13:14.095148 master-0 kubenswrapper[36504]: I1203 22:13:14.095066 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 22:13:14.142514 master-0 kubenswrapper[36504]: I1203 22:13:14.142441 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 22:13:14.145235 master-0 kubenswrapper[36504]: I1203 22:13:14.145190 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 22:13:14.156909 master-0 kubenswrapper[36504]: I1203 22:13:14.156851 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 22:13:14.286183 master-0 kubenswrapper[36504]: I1203 22:13:14.283009 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 22:13:14.312264 master-0 kubenswrapper[36504]: I1203 22:13:14.312184 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 22:13:14.337283 master-0 kubenswrapper[36504]: I1203 22:13:14.337199 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 22:13:14.343716 master-0 kubenswrapper[36504]: I1203 22:13:14.343683 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 22:13:14.351965 master-0 kubenswrapper[36504]: I1203 22:13:14.351933 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 22:13:14.371360 master-0 kubenswrapper[36504]: I1203 22:13:14.371270 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 03 22:13:14.451620 master-0 kubenswrapper[36504]: I1203 22:13:14.451556 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 22:13:14.569816 master-0 kubenswrapper[36504]: I1203 22:13:14.569543 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 22:13:14.577047 master-0 kubenswrapper[36504]: I1203 22:13:14.576975 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-lm87f" Dec 03 22:13:14.690316 master-0 kubenswrapper[36504]: I1203 22:13:14.690232 36504 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Dec 03 22:13:14.763765 master-0 kubenswrapper[36504]: I1203 22:13:14.763644 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 22:13:14.808249 master-0 kubenswrapper[36504]: I1203 22:13:14.808181 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 22:13:14.849910 master-0 kubenswrapper[36504]: I1203 22:13:14.849682 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-x894t" Dec 03 22:13:14.850705 master-0 kubenswrapper[36504]: I1203 22:13:14.850598 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 22:13:14.945473 master-0 kubenswrapper[36504]: I1203 22:13:14.945439 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 22:13:14.949357 master-0 kubenswrapper[36504]: I1203 22:13:14.949259 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 22:13:14.998956 master-0 kubenswrapper[36504]: I1203 22:13:14.998884 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Dec 03 22:13:15.005095 master-0 kubenswrapper[36504]: I1203 22:13:15.005046 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-tphv8" Dec 03 22:13:15.013960 master-0 kubenswrapper[36504]: I1203 22:13:15.013929 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Dec 03 22:13:15.110143 master-0 kubenswrapper[36504]: I1203 22:13:15.109945 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" path="/var/lib/kubelet/pods/45869470-a94b-4424-a580-3bb4d9e0c675/volumes" Dec 03 22:13:15.111178 master-0 kubenswrapper[36504]: I1203 22:13:15.110985 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9d9300-e389-4941-904a-51dbdcc14c71" path="/var/lib/kubelet/pods/6f9d9300-e389-4941-904a-51dbdcc14c71/volumes" Dec 03 22:13:15.112766 master-0 kubenswrapper[36504]: I1203 22:13:15.112720 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-mjccd" Dec 03 22:13:15.113196 master-0 kubenswrapper[36504]: I1203 22:13:15.113155 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15b7b7c-48b5-417b-8f41-58306b2f0e9a" path="/var/lib/kubelet/pods/a15b7b7c-48b5-417b-8f41-58306b2f0e9a/volumes" Dec 03 22:13:15.205225 master-0 kubenswrapper[36504]: I1203 22:13:15.205162 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 22:13:15.242574 master-0 kubenswrapper[36504]: I1203 22:13:15.242516 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 03 22:13:15.309032 master-0 kubenswrapper[36504]: I1203 22:13:15.308987 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 
22:13:15.315808 master-0 kubenswrapper[36504]: I1203 22:13:15.313140 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 22:13:15.381523 master-0 kubenswrapper[36504]: I1203 22:13:15.381462 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 22:13:15.411040 master-0 kubenswrapper[36504]: I1203 22:13:15.410975 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 22:13:15.482957 master-0 kubenswrapper[36504]: I1203 22:13:15.482878 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 22:13:15.507222 master-0 kubenswrapper[36504]: I1203 22:13:15.507162 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 22:13:15.545307 master-0 kubenswrapper[36504]: I1203 22:13:15.545246 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 22:13:15.555146 master-0 kubenswrapper[36504]: I1203 22:13:15.555079 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-9g5tm" Dec 03 22:13:15.574123 master-0 kubenswrapper[36504]: I1203 22:13:15.574038 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 03 22:13:15.612085 master-0 kubenswrapper[36504]: I1203 22:13:15.612012 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 03 22:13:15.646279 master-0 kubenswrapper[36504]: I1203 22:13:15.646130 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 22:13:15.666301 master-0 kubenswrapper[36504]: I1203 22:13:15.666237 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 22:13:15.678903 master-0 kubenswrapper[36504]: I1203 22:13:15.678835 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 22:13:15.714750 master-0 kubenswrapper[36504]: I1203 22:13:15.714617 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 22:13:15.803451 master-0 kubenswrapper[36504]: I1203 22:13:15.803414 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 22:13:15.828246 master-0 kubenswrapper[36504]: I1203 22:13:15.828212 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 22:13:15.828476 master-0 kubenswrapper[36504]: I1203 22:13:15.828420 36504 patch_prober.go:28] interesting pod/console-6c4b7b5d77-2z5zb container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" start-of-body= Dec 03 22:13:15.828476 master-0 kubenswrapper[36504]: I1203 22:13:15.828457 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6c4b7b5d77-2z5zb" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" 
probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": dial tcp 10.128.0.109:8443: connect: connection refused" Dec 03 22:13:15.859954 master-0 kubenswrapper[36504]: I1203 22:13:15.859903 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 22:13:15.864073 master-0 kubenswrapper[36504]: I1203 22:13:15.864044 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 03 22:13:15.938662 master-0 kubenswrapper[36504]: I1203 22:13:15.938507 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Dec 03 22:13:16.025993 master-0 kubenswrapper[36504]: I1203 22:13:16.025921 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 03 22:13:16.051664 master-0 kubenswrapper[36504]: I1203 22:13:16.051615 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 22:13:16.088438 master-0 kubenswrapper[36504]: I1203 22:13:16.088398 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 03 22:13:16.227676 master-0 kubenswrapper[36504]: I1203 22:13:16.227525 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 22:13:16.237848 master-0 kubenswrapper[36504]: I1203 22:13:16.237715 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 22:13:16.238738 master-0 kubenswrapper[36504]: I1203 22:13:16.238701 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 22:13:16.279812 master-0 kubenswrapper[36504]: I1203 22:13:16.279704 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 03 22:13:16.314984 master-0 kubenswrapper[36504]: I1203 22:13:16.314915 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 22:13:16.367309 master-0 kubenswrapper[36504]: I1203 22:13:16.367230 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 22:13:16.386030 master-0 kubenswrapper[36504]: I1203 22:13:16.385968 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 22:13:16.473134 master-0 kubenswrapper[36504]: I1203 22:13:16.473064 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 22:13:16.499258 master-0 kubenswrapper[36504]: I1203 22:13:16.499121 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Dec 03 22:13:16.528397 master-0 kubenswrapper[36504]: I1203 22:13:16.528356 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Dec 03 22:13:16.549148 master-0 kubenswrapper[36504]: I1203 22:13:16.549070 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 
22:13:16.609258 master-0 kubenswrapper[36504]: I1203 22:13:16.609179 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 03 22:13:16.640097 master-0 kubenswrapper[36504]: I1203 22:13:16.640056 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 22:13:16.659738 master-0 kubenswrapper[36504]: I1203 22:13:16.659679 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 22:13:16.684280 master-0 kubenswrapper[36504]: I1203 22:13:16.684226 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-cwgtj" Dec 03 22:13:16.694130 master-0 kubenswrapper[36504]: I1203 22:13:16.694095 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Dec 03 22:13:16.867985 master-0 kubenswrapper[36504]: I1203 22:13:16.867926 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 22:13:16.872755 master-0 kubenswrapper[36504]: I1203 22:13:16.872703 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-52h2hp3iuputk" Dec 03 22:13:16.873799 master-0 kubenswrapper[36504]: I1203 22:13:16.873733 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 22:13:16.908645 master-0 kubenswrapper[36504]: I1203 22:13:16.908594 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-5skvl" Dec 03 22:13:16.977147 master-0 kubenswrapper[36504]: I1203 22:13:16.977090 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 22:13:17.024620 master-0 kubenswrapper[36504]: I1203 22:13:17.024568 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 22:13:17.068619 master-0 kubenswrapper[36504]: I1203 22:13:17.068534 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 22:13:17.125284 master-0 kubenswrapper[36504]: I1203 22:13:17.125085 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 22:13:17.159354 master-0 kubenswrapper[36504]: I1203 22:13:17.159291 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-w6qk9" Dec 03 22:13:17.160330 master-0 kubenswrapper[36504]: I1203 22:13:17.160282 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 22:13:17.195763 master-0 kubenswrapper[36504]: I1203 22:13:17.195716 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 22:13:17.224899 master-0 kubenswrapper[36504]: I1203 22:13:17.224857 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 22:13:17.273475 master-0 kubenswrapper[36504]: I1203 22:13:17.273413 36504 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 03 22:13:17.281974 master-0 kubenswrapper[36504]: I1203 22:13:17.281925 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 03 22:13:17.344232 master-0 kubenswrapper[36504]: I1203 22:13:17.344189 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-1c09f6l24b5bu" Dec 03 22:13:17.458345 master-0 kubenswrapper[36504]: I1203 22:13:17.458183 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 22:13:17.494129 master-0 kubenswrapper[36504]: I1203 22:13:17.494038 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 22:13:17.643678 master-0 kubenswrapper[36504]: I1203 22:13:17.643628 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 22:13:17.676886 master-0 kubenswrapper[36504]: I1203 22:13:17.676841 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 22:13:17.741581 master-0 kubenswrapper[36504]: I1203 22:13:17.741126 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Dec 03 22:13:17.793227 master-0 kubenswrapper[36504]: I1203 22:13:17.793166 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 22:13:17.862891 master-0 kubenswrapper[36504]: I1203 22:13:17.862834 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 22:13:17.916052 master-0 kubenswrapper[36504]: I1203 22:13:17.915980 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Dec 03 22:13:17.934238 master-0 kubenswrapper[36504]: I1203 22:13:17.934167 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Dec 03 22:13:18.231860 master-0 kubenswrapper[36504]: I1203 22:13:18.231805 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 03 22:13:18.248378 master-0 kubenswrapper[36504]: I1203 22:13:18.248278 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 22:13:18.263638 master-0 kubenswrapper[36504]: I1203 22:13:18.263582 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-9tlz7" Dec 03 22:13:18.265955 master-0 kubenswrapper[36504]: I1203 22:13:18.265924 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 22:13:18.314000 master-0 kubenswrapper[36504]: I1203 22:13:18.313948 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 03 22:13:18.350064 master-0 kubenswrapper[36504]: I1203 22:13:18.350018 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Dec 03 22:13:18.401297 master-0 kubenswrapper[36504]: I1203 
22:13:18.401234 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 22:13:18.507856 master-0 kubenswrapper[36504]: I1203 22:13:18.507639 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 03 22:13:18.560088 master-0 kubenswrapper[36504]: I1203 22:13:18.560034 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-5qkbc" Dec 03 22:13:18.596533 master-0 kubenswrapper[36504]: I1203 22:13:18.596464 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 22:13:18.597465 master-0 kubenswrapper[36504]: I1203 22:13:18.597415 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Dec 03 22:13:18.608258 master-0 kubenswrapper[36504]: I1203 22:13:18.608199 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 03 22:13:18.644820 master-0 kubenswrapper[36504]: I1203 22:13:18.644674 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 22:13:18.710537 master-0 kubenswrapper[36504]: I1203 22:13:18.710462 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Dec 03 22:13:18.733055 master-0 kubenswrapper[36504]: I1203 22:13:18.733000 36504 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 22:13:18.814286 master-0 kubenswrapper[36504]: I1203 22:13:18.814151 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 22:13:18.876274 master-0 kubenswrapper[36504]: I1203 22:13:18.876198 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 22:13:18.878323 master-0 kubenswrapper[36504]: I1203 22:13:18.878254 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 22:13:18.898600 master-0 kubenswrapper[36504]: I1203 22:13:18.898532 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Dec 03 22:13:18.918217 master-0 kubenswrapper[36504]: I1203 22:13:18.918154 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 22:13:18.975883 master-0 kubenswrapper[36504]: I1203 22:13:18.975665 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 22:13:19.043601 master-0 kubenswrapper[36504]: I1203 22:13:19.043350 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 22:13:19.078310 master-0 kubenswrapper[36504]: I1203 22:13:19.078115 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Dec 03 22:13:19.116950 master-0 kubenswrapper[36504]: I1203 22:13:19.116860 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 22:13:19.170901 master-0 kubenswrapper[36504]: I1203 22:13:19.170853 
36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 22:13:19.171501 master-0 kubenswrapper[36504]: I1203 22:13:19.171456 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 22:13:19.226036 master-0 kubenswrapper[36504]: I1203 22:13:19.225948 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 22:13:19.229225 master-0 kubenswrapper[36504]: I1203 22:13:19.229176 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 22:13:19.354037 master-0 kubenswrapper[36504]: I1203 22:13:19.346910 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Dec 03 22:13:19.376742 master-0 kubenswrapper[36504]: I1203 22:13:19.376646 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-c2z8w" Dec 03 22:13:19.379427 master-0 kubenswrapper[36504]: I1203 22:13:19.379375 36504 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 22:13:19.410617 master-0 kubenswrapper[36504]: I1203 22:13:19.410562 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 22:13:19.417426 master-0 kubenswrapper[36504]: I1203 22:13:19.417383 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 22:13:19.456144 master-0 kubenswrapper[36504]: I1203 22:13:19.454858 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 22:13:19.591367 master-0 kubenswrapper[36504]: I1203 22:13:19.591309 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 22:13:19.628975 master-0 kubenswrapper[36504]: I1203 22:13:19.628884 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 22:13:19.647761 master-0 kubenswrapper[36504]: I1203 22:13:19.647713 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 22:13:19.739404 master-0 kubenswrapper[36504]: I1203 22:13:19.737987 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 22:13:19.739404 master-0 kubenswrapper[36504]: E1203 22:13:19.738333 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" containerName="console" Dec 03 22:13:19.739404 master-0 kubenswrapper[36504]: I1203 22:13:19.738348 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" containerName="console" Dec 03 22:13:19.739404 master-0 kubenswrapper[36504]: E1203 22:13:19.738357 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" containerName="installer" Dec 03 22:13:19.739404 master-0 kubenswrapper[36504]: I1203 22:13:19.738364 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" containerName="installer" Dec 03 22:13:19.739404 
master-0 kubenswrapper[36504]: E1203 22:13:19.738380 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15b7b7c-48b5-417b-8f41-58306b2f0e9a" containerName="console" Dec 03 22:13:19.739404 master-0 kubenswrapper[36504]: I1203 22:13:19.738387 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15b7b7c-48b5-417b-8f41-58306b2f0e9a" containerName="console" Dec 03 22:13:19.739404 master-0 kubenswrapper[36504]: I1203 22:13:19.738510 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15b7b7c-48b5-417b-8f41-58306b2f0e9a" containerName="console" Dec 03 22:13:19.739404 master-0 kubenswrapper[36504]: I1203 22:13:19.738539 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0ffab3-4f70-4bc8-9464-598c0668a4b8" containerName="installer" Dec 03 22:13:19.739404 master-0 kubenswrapper[36504]: I1203 22:13:19.738551 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="45869470-a94b-4424-a580-3bb4d9e0c675" containerName="console" Dec 03 22:13:19.741276 master-0 kubenswrapper[36504]: I1203 22:13:19.740498 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.741276 master-0 kubenswrapper[36504]: I1203 22:13:19.741144 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-gk9qh" Dec 03 22:13:19.744603 master-0 kubenswrapper[36504]: I1203 22:13:19.743399 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 03 22:13:19.744603 master-0 kubenswrapper[36504]: I1203 22:13:19.743583 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 03 22:13:19.744603 master-0 kubenswrapper[36504]: I1203 22:13:19.743914 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 03 22:13:19.744603 master-0 kubenswrapper[36504]: I1203 22:13:19.743987 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 03 22:13:19.744603 master-0 kubenswrapper[36504]: I1203 22:13:19.744396 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 03 22:13:19.745850 master-0 kubenswrapper[36504]: I1203 22:13:19.745710 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 03 22:13:19.745850 master-0 kubenswrapper[36504]: I1203 22:13:19.745786 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 03 22:13:19.746131 master-0 kubenswrapper[36504]: I1203 22:13:19.745945 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 03 22:13:19.746297 master-0 kubenswrapper[36504]: I1203 22:13:19.746252 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-3uob4dessmatv" Dec 03 22:13:19.746503 master-0 kubenswrapper[36504]: I1203 22:13:19.746477 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 03 22:13:19.748421 master-0 kubenswrapper[36504]: I1203 22:13:19.748394 36504 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"prometheus-k8s-dockercfg-26pw7" Dec 03 22:13:19.751475 master-0 kubenswrapper[36504]: I1203 22:13:19.751438 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 03 22:13:19.754389 master-0 kubenswrapper[36504]: I1203 22:13:19.754357 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 03 22:13:19.757932 master-0 kubenswrapper[36504]: I1203 22:13:19.757892 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 03 22:13:19.763182 master-0 kubenswrapper[36504]: I1203 22:13:19.763136 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 22:13:19.828129 master-0 kubenswrapper[36504]: I1203 22:13:19.828058 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-config\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.828129 master-0 kubenswrapper[36504]: I1203 22:13:19.828122 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.828445 master-0 kubenswrapper[36504]: I1203 22:13:19.828150 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.828492 master-0 kubenswrapper[36504]: I1203 22:13:19.828411 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.828572 master-0 kubenswrapper[36504]: I1203 22:13:19.828533 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.828680 master-0 kubenswrapper[36504]: I1203 22:13:19.828651 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829208 master-0 kubenswrapper[36504]: I1203 22:13:19.828744 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ac5d9240-cb7d-4714-9891-dd602555b7c1-config-out\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829208 master-0 kubenswrapper[36504]: I1203 22:13:19.828869 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-web-config\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829208 master-0 kubenswrapper[36504]: I1203 22:13:19.828955 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g264d\" (UniqueName: \"kubernetes.io/projected/ac5d9240-cb7d-4714-9891-dd602555b7c1-kube-api-access-g264d\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829208 master-0 kubenswrapper[36504]: I1203 22:13:19.828991 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ac5d9240-cb7d-4714-9891-dd602555b7c1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829208 master-0 kubenswrapper[36504]: I1203 22:13:19.829020 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ac5d9240-cb7d-4714-9891-dd602555b7c1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829208 master-0 kubenswrapper[36504]: I1203 22:13:19.829138 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829502 master-0 kubenswrapper[36504]: I1203 22:13:19.829215 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829502 master-0 kubenswrapper[36504]: I1203 22:13:19.829265 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829502 master-0 kubenswrapper[36504]: I1203 22:13:19.829337 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-grpc-tls\") pod 
\"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829502 master-0 kubenswrapper[36504]: I1203 22:13:19.829381 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829502 master-0 kubenswrapper[36504]: I1203 22:13:19.829411 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.829502 master-0 kubenswrapper[36504]: I1203 22:13:19.829450 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.876387 master-0 kubenswrapper[36504]: I1203 22:13:19.876313 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 03 22:13:19.888618 master-0 kubenswrapper[36504]: I1203 22:13:19.888489 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 22:13:19.904405 master-0 kubenswrapper[36504]: I1203 22:13:19.904132 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 22:13:19.922433 master-0 kubenswrapper[36504]: I1203 22:13:19.922333 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 22:13:19.931032 master-0 kubenswrapper[36504]: I1203 22:13:19.930959 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931315 master-0 kubenswrapper[36504]: I1203 22:13:19.931051 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-config\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931315 master-0 kubenswrapper[36504]: I1203 22:13:19.931099 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931315 master-0 kubenswrapper[36504]: I1203 22:13:19.931133 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931315 master-0 kubenswrapper[36504]: I1203 22:13:19.931205 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931315 master-0 kubenswrapper[36504]: I1203 22:13:19.931249 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931315 master-0 kubenswrapper[36504]: I1203 22:13:19.931308 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931639 master-0 kubenswrapper[36504]: I1203 22:13:19.931345 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ac5d9240-cb7d-4714-9891-dd602555b7c1-config-out\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931639 master-0 kubenswrapper[36504]: I1203 22:13:19.931387 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-web-config\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931639 master-0 kubenswrapper[36504]: I1203 22:13:19.931458 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g264d\" (UniqueName: \"kubernetes.io/projected/ac5d9240-cb7d-4714-9891-dd602555b7c1-kube-api-access-g264d\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931639 master-0 kubenswrapper[36504]: I1203 22:13:19.931492 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ac5d9240-cb7d-4714-9891-dd602555b7c1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931639 master-0 kubenswrapper[36504]: I1203 22:13:19.931524 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ac5d9240-cb7d-4714-9891-dd602555b7c1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931639 
master-0 kubenswrapper[36504]: I1203 22:13:19.931572 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.931639 master-0 kubenswrapper[36504]: I1203 22:13:19.931606 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.932195 master-0 kubenswrapper[36504]: I1203 22:13:19.931647 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.932195 master-0 kubenswrapper[36504]: I1203 22:13:19.931694 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.932195 master-0 kubenswrapper[36504]: I1203 22:13:19.931726 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.932195 master-0 kubenswrapper[36504]: I1203 22:13:19.931758 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.933287 master-0 kubenswrapper[36504]: I1203 22:13:19.933211 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.933435 master-0 kubenswrapper[36504]: I1203 22:13:19.933395 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.933949 master-0 kubenswrapper[36504]: I1203 22:13:19.933901 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/ac5d9240-cb7d-4714-9891-dd602555b7c1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.934920 master-0 kubenswrapper[36504]: I1203 22:13:19.934882 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ac5d9240-cb7d-4714-9891-dd602555b7c1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.935422 master-0 kubenswrapper[36504]: I1203 22:13:19.935367 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-config\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.936553 master-0 kubenswrapper[36504]: I1203 22:13:19.936242 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.937675 master-0 kubenswrapper[36504]: I1203 22:13:19.937623 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-web-config\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.938747 master-0 kubenswrapper[36504]: I1203 22:13:19.938680 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.938935 master-0 kubenswrapper[36504]: I1203 22:13:19.938693 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.940693 master-0 kubenswrapper[36504]: I1203 22:13:19.939240 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ac5d9240-cb7d-4714-9891-dd602555b7c1-config-out\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.940693 master-0 kubenswrapper[36504]: I1203 22:13:19.940141 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.940693 master-0 kubenswrapper[36504]: I1203 22:13:19.940633 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.941607 master-0 kubenswrapper[36504]: I1203 22:13:19.941571 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.943186 master-0 kubenswrapper[36504]: I1203 22:13:19.942331 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.943186 master-0 kubenswrapper[36504]: I1203 22:13:19.943048 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.943186 master-0 kubenswrapper[36504]: I1203 22:13:19.943174 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ac5d9240-cb7d-4714-9891-dd602555b7c1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.951351 master-0 kubenswrapper[36504]: I1203 22:13:19.951296 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ac5d9240-cb7d-4714-9891-dd602555b7c1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:19.959382 master-0 kubenswrapper[36504]: I1203 22:13:19.959310 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 22:13:19.961618 master-0 kubenswrapper[36504]: I1203 22:13:19.961576 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-dg9ph" Dec 03 22:13:19.961791 master-0 kubenswrapper[36504]: I1203 22:13:19.961698 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 22:13:19.966029 master-0 kubenswrapper[36504]: I1203 22:13:19.965973 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g264d\" (UniqueName: \"kubernetes.io/projected/ac5d9240-cb7d-4714-9891-dd602555b7c1-kube-api-access-g264d\") pod \"prometheus-k8s-0\" (UID: \"ac5d9240-cb7d-4714-9891-dd602555b7c1\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:20.060942 master-0 kubenswrapper[36504]: I1203 22:13:20.060877 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 03 22:13:20.065237 master-0 kubenswrapper[36504]: I1203 22:13:20.065187 36504 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:20.122627 master-0 kubenswrapper[36504]: I1203 22:13:20.120501 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 22:13:20.203332 master-0 kubenswrapper[36504]: I1203 22:13:20.203210 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 22:13:20.237816 master-0 kubenswrapper[36504]: I1203 22:13:20.233165 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 22:13:20.250092 master-0 kubenswrapper[36504]: I1203 22:13:20.250023 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 22:13:20.254045 master-0 kubenswrapper[36504]: I1203 22:13:20.254003 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-t6rs7" Dec 03 22:13:20.312632 master-0 kubenswrapper[36504]: E1203 22:13:20.312558 36504 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 22:13:20.312632 master-0 kubenswrapper[36504]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_prometheus-k8s-0_openshift-monitoring_ac5d9240-cb7d-4714-9891-dd602555b7c1_0(901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e): error adding pod openshift-monitoring_prometheus-k8s-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e" Netns:"/var/run/netns/e918436b-8821-4c19-9d6a-e4d5580fee16" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=prometheus-k8s-0;K8S_POD_INFRA_CONTAINER_ID=901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e;K8S_POD_UID=ac5d9240-cb7d-4714-9891-dd602555b7c1" Path:"" ERRORED: error configuring pod [openshift-monitoring/prometheus-k8s-0] networking: [openshift-monitoring/prometheus-k8s-0/ac5d9240-cb7d-4714-9891-dd602555b7c1:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openshift-monitoring/prometheus-k8s-0 901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e network default NAD default] [openshift-monitoring/prometheus-k8s-0 901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e network default NAD default] pod deleted before sandbox ADD operation began. 
Request Pod UID ac5d9240-cb7d-4714-9891-dd602555b7c1 is different from the Pod UID (6f9d9300-e389-4941-904a-51dbdcc14c71) retrieved from the informer/API Dec 03 22:13:20.312632 master-0 kubenswrapper[36504]: ' Dec 03 22:13:20.312632 master-0 kubenswrapper[36504]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 22:13:20.312632 master-0 kubenswrapper[36504]: > Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: E1203 22:13:20.312650 36504 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_prometheus-k8s-0_openshift-monitoring_ac5d9240-cb7d-4714-9891-dd602555b7c1_0(901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e): error adding pod openshift-monitoring_prometheus-k8s-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e" Netns:"/var/run/netns/e918436b-8821-4c19-9d6a-e4d5580fee16" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=prometheus-k8s-0;K8S_POD_INFRA_CONTAINER_ID=901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e;K8S_POD_UID=ac5d9240-cb7d-4714-9891-dd602555b7c1" Path:"" ERRORED: error configuring pod [openshift-monitoring/prometheus-k8s-0] networking: [openshift-monitoring/prometheus-k8s-0/ac5d9240-cb7d-4714-9891-dd602555b7c1:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openshift-monitoring/prometheus-k8s-0 901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e network default NAD default] [openshift-monitoring/prometheus-k8s-0 901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e network default NAD default] pod deleted before sandbox ADD operation began. 
Request Pod UID ac5d9240-cb7d-4714-9891-dd602555b7c1 is different from the Pod UID (6f9d9300-e389-4941-904a-51dbdcc14c71) retrieved from the informer/API Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: ' Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: > pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: E1203 22:13:20.312674 36504 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_prometheus-k8s-0_openshift-monitoring_ac5d9240-cb7d-4714-9891-dd602555b7c1_0(901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e): error adding pod openshift-monitoring_prometheus-k8s-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e" Netns:"/var/run/netns/e918436b-8821-4c19-9d6a-e4d5580fee16" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=prometheus-k8s-0;K8S_POD_INFRA_CONTAINER_ID=901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e;K8S_POD_UID=ac5d9240-cb7d-4714-9891-dd602555b7c1" Path:"" ERRORED: error configuring pod [openshift-monitoring/prometheus-k8s-0] networking: [openshift-monitoring/prometheus-k8s-0/ac5d9240-cb7d-4714-9891-dd602555b7c1:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openshift-monitoring/prometheus-k8s-0 901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e network default NAD default] [openshift-monitoring/prometheus-k8s-0 901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e network default NAD default] pod deleted before sandbox ADD operation began. 
Request Pod UID ac5d9240-cb7d-4714-9891-dd602555b7c1 is different from the Pod UID (6f9d9300-e389-4941-904a-51dbdcc14c71) retrieved from the informer/API Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: ' Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: > pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:20.313145 master-0 kubenswrapper[36504]: E1203 22:13:20.312746 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"prometheus-k8s-0_openshift-monitoring(ac5d9240-cb7d-4714-9891-dd602555b7c1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"prometheus-k8s-0_openshift-monitoring(ac5d9240-cb7d-4714-9891-dd602555b7c1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_prometheus-k8s-0_openshift-monitoring_ac5d9240-cb7d-4714-9891-dd602555b7c1_0(901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e): error adding pod openshift-monitoring_prometheus-k8s-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e\\\" Netns:\\\"/var/run/netns/e918436b-8821-4c19-9d6a-e4d5580fee16\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-monitoring;K8S_POD_NAME=prometheus-k8s-0;K8S_POD_INFRA_CONTAINER_ID=901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e;K8S_POD_UID=ac5d9240-cb7d-4714-9891-dd602555b7c1\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-monitoring/prometheus-k8s-0] networking: [openshift-monitoring/prometheus-k8s-0/ac5d9240-cb7d-4714-9891-dd602555b7c1:ovn-kubernetes]: error adding container to network \\\"ovn-kubernetes\\\": CNI request failed with status 400: '[openshift-monitoring/prometheus-k8s-0 901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e network default NAD default] [openshift-monitoring/prometheus-k8s-0 901b700654084ce609dd36db87ae91dceb1a2e71a179ef6156f1483402521f8e network default NAD default] pod deleted before sandbox ADD operation began. 
Request Pod UID ac5d9240-cb7d-4714-9891-dd602555b7c1 is different from the Pod UID (6f9d9300-e389-4941-904a-51dbdcc14c71) retrieved from the informer/API\\n'\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-monitoring/prometheus-k8s-0" podUID="ac5d9240-cb7d-4714-9891-dd602555b7c1" Dec 03 22:13:20.329514 master-0 kubenswrapper[36504]: I1203 22:13:20.328929 36504 patch_prober.go:28] interesting pod/console-66896f8657-9276x container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" start-of-body= Dec 03 22:13:20.329514 master-0 kubenswrapper[36504]: I1203 22:13:20.329010 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-66896f8657-9276x" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerName="console" probeResult="failure" output="Get \"https://10.128.0.112:8443/health\": dial tcp 10.128.0.112:8443: connect: connection refused" Dec 03 22:13:20.411649 master-0 kubenswrapper[36504]: I1203 22:13:20.411593 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-dhpt2" Dec 03 22:13:20.428204 master-0 kubenswrapper[36504]: I1203 22:13:20.428153 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 22:13:20.444476 master-0 kubenswrapper[36504]: I1203 22:13:20.444362 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 22:13:20.465984 master-0 kubenswrapper[36504]: I1203 22:13:20.465503 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 22:13:20.519842 master-0 kubenswrapper[36504]: I1203 22:13:20.519786 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Dec 03 22:13:20.546458 master-0 kubenswrapper[36504]: I1203 22:13:20.546394 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 22:13:20.600823 master-0 kubenswrapper[36504]: I1203 22:13:20.600741 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 22:13:20.636295 master-0 kubenswrapper[36504]: I1203 22:13:20.636247 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-8whkp" Dec 03 22:13:20.648060 master-0 kubenswrapper[36504]: I1203 22:13:20.647904 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Dec 03 22:13:20.679410 master-0 kubenswrapper[36504]: I1203 22:13:20.679361 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 22:13:20.685297 
master-0 kubenswrapper[36504]: I1203 22:13:20.685245 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 03 22:13:20.688044 master-0 kubenswrapper[36504]: I1203 22:13:20.688013 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 22:13:20.688374 master-0 kubenswrapper[36504]: I1203 22:13:20.688350 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 22:13:20.699484 master-0 kubenswrapper[36504]: I1203 22:13:20.699449 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 22:13:20.768000 master-0 kubenswrapper[36504]: I1203 22:13:20.767869 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 22:13:20.797305 master-0 kubenswrapper[36504]: I1203 22:13:20.797263 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 22:13:20.809026 master-0 kubenswrapper[36504]: I1203 22:13:20.808996 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 22:13:20.879739 master-0 kubenswrapper[36504]: I1203 22:13:20.879685 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 03 22:13:20.939069 master-0 kubenswrapper[36504]: I1203 22:13:20.939023 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 03 22:13:21.051889 master-0 kubenswrapper[36504]: I1203 22:13:21.051729 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 22:13:21.066359 master-0 kubenswrapper[36504]: I1203 22:13:21.066294 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-56c9b9fa8d9gs" Dec 03 22:13:21.149041 master-0 kubenswrapper[36504]: I1203 22:13:21.148854 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 22:13:21.160741 master-0 kubenswrapper[36504]: I1203 22:13:21.160605 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 22:13:21.173985 master-0 kubenswrapper[36504]: I1203 22:13:21.173720 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 03 22:13:21.176656 master-0 kubenswrapper[36504]: I1203 22:13:21.176529 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 22:13:21.208986 master-0 kubenswrapper[36504]: I1203 22:13:21.208935 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Dec 03 22:13:21.246671 master-0 kubenswrapper[36504]: I1203 22:13:21.246508 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 22:13:21.276805 master-0 kubenswrapper[36504]: I1203 22:13:21.276735 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-5qqsp" Dec 
03 22:13:21.371336 master-0 kubenswrapper[36504]: I1203 22:13:21.371271 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 22:13:21.371336 master-0 kubenswrapper[36504]: I1203 22:13:21.371320 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 22:13:21.390933 master-0 kubenswrapper[36504]: I1203 22:13:21.390833 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-vbccl" Dec 03 22:13:21.402050 master-0 kubenswrapper[36504]: I1203 22:13:21.402003 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 22:13:21.444531 master-0 kubenswrapper[36504]: I1203 22:13:21.444413 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Dec 03 22:13:21.505544 master-0 kubenswrapper[36504]: I1203 22:13:21.505447 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 22:13:21.511991 master-0 kubenswrapper[36504]: I1203 22:13:21.511914 36504 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Dec 03 22:13:21.512307 master-0 kubenswrapper[36504]: I1203 22:13:21.512226 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="a767ff0baecd8145c3bd0a8b3c1771e4" containerName="startup-monitor" containerID="cri-o://38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d" gracePeriod=5 Dec 03 22:13:21.534050 master-0 kubenswrapper[36504]: I1203 22:13:21.533960 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:13:21.640238 master-0 kubenswrapper[36504]: I1203 22:13:21.640124 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Dec 03 22:13:21.643862 master-0 kubenswrapper[36504]: I1203 22:13:21.643841 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 22:13:21.651670 master-0 kubenswrapper[36504]: I1203 22:13:21.651627 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Dec 03 22:13:21.804749 master-0 kubenswrapper[36504]: I1203 22:13:21.804686 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 22:13:21.819422 master-0 kubenswrapper[36504]: I1203 22:13:21.819382 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 22:13:21.847546 master-0 kubenswrapper[36504]: I1203 22:13:21.847508 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 03 22:13:21.934457 master-0 kubenswrapper[36504]: I1203 22:13:21.934347 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 22:13:21.962265 master-0 kubenswrapper[36504]: I1203 22:13:21.962229 36504 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 22:13:22.059908 master-0 kubenswrapper[36504]: I1203 22:13:22.059855 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-gzjfz" Dec 03 22:13:22.065046 master-0 kubenswrapper[36504]: I1203 22:13:22.065003 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 22:13:22.068278 master-0 kubenswrapper[36504]: I1203 22:13:22.068250 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Dec 03 22:13:22.068774 master-0 kubenswrapper[36504]: I1203 22:13:22.068742 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 22:13:22.072075 master-0 kubenswrapper[36504]: I1203 22:13:22.072046 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-ml7r7" Dec 03 22:13:22.171925 master-0 kubenswrapper[36504]: I1203 22:13:22.171752 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 22:13:22.192152 master-0 kubenswrapper[36504]: I1203 22:13:22.191164 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 22:13:22.234385 master-0 kubenswrapper[36504]: I1203 22:13:22.234334 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 22:13:22.272266 master-0 kubenswrapper[36504]: I1203 22:13:22.272207 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Dec 03 22:13:22.336276 master-0 kubenswrapper[36504]: I1203 22:13:22.336223 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Dec 03 22:13:22.359733 master-0 kubenswrapper[36504]: I1203 22:13:22.359500 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 22:13:22.400604 master-0 kubenswrapper[36504]: I1203 22:13:22.400499 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 22:13:22.433733 master-0 kubenswrapper[36504]: I1203 22:13:22.433487 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 03 22:13:22.450782 master-0 kubenswrapper[36504]: I1203 22:13:22.450743 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 22:13:22.506444 master-0 kubenswrapper[36504]: I1203 22:13:22.506375 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 22:13:22.529829 master-0 kubenswrapper[36504]: I1203 22:13:22.529536 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Dec 03 22:13:22.557057 master-0 kubenswrapper[36504]: I1203 22:13:22.557005 36504 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Dec 03 22:13:22.570125 master-0 kubenswrapper[36504]: I1203 22:13:22.570063 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 22:13:22.591575 master-0 kubenswrapper[36504]: I1203 22:13:22.591524 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Dec 03 22:13:22.609947 master-0 kubenswrapper[36504]: I1203 22:13:22.609891 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 22:13:22.625699 master-0 kubenswrapper[36504]: I1203 22:13:22.625632 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 22:13:22.627608 master-0 kubenswrapper[36504]: I1203 22:13:22.627266 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 03 22:13:22.693288 master-0 kubenswrapper[36504]: I1203 22:13:22.693216 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 22:13:22.717651 master-0 kubenswrapper[36504]: I1203 22:13:22.717435 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 22:13:22.719907 master-0 kubenswrapper[36504]: I1203 22:13:22.719306 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Dec 03 22:13:22.721025 master-0 kubenswrapper[36504]: I1203 22:13:22.720968 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Dec 03 22:13:22.805172 master-0 kubenswrapper[36504]: I1203 22:13:22.805086 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 22:13:22.824005 master-0 kubenswrapper[36504]: I1203 22:13:22.823934 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 22:13:22.999249 master-0 kubenswrapper[36504]: I1203 22:13:22.999097 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 22:13:23.069797 master-0 kubenswrapper[36504]: I1203 22:13:23.069722 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 22:13:23.102672 master-0 kubenswrapper[36504]: I1203 22:13:23.102596 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 22:13:23.109372 master-0 kubenswrapper[36504]: I1203 22:13:23.109330 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-7rlph" Dec 03 22:13:23.115287 master-0 kubenswrapper[36504]: I1203 22:13:23.115251 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 22:13:23.162150 master-0 kubenswrapper[36504]: I1203 22:13:23.162091 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Dec 03 22:13:23.247033 master-0 kubenswrapper[36504]: I1203 22:13:23.246967 36504 reflector.go:368] Caches populated 
for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 22:13:23.251518 master-0 kubenswrapper[36504]: I1203 22:13:23.251428 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 22:13:23.252637 master-0 kubenswrapper[36504]: I1203 22:13:23.252599 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 22:13:23.285898 master-0 kubenswrapper[36504]: I1203 22:13:23.285839 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 22:13:23.299024 master-0 kubenswrapper[36504]: I1203 22:13:23.298987 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-hf9lh" Dec 03 22:13:23.319894 master-0 kubenswrapper[36504]: I1203 22:13:23.319844 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 22:13:23.331548 master-0 kubenswrapper[36504]: I1203 22:13:23.331513 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 22:13:23.332387 master-0 kubenswrapper[36504]: I1203 22:13:23.332367 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 22:13:23.333489 master-0 kubenswrapper[36504]: I1203 22:13:23.333465 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 22:13:23.346688 master-0 kubenswrapper[36504]: I1203 22:13:23.346642 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 03 22:13:23.466476 master-0 kubenswrapper[36504]: I1203 22:13:23.466430 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 22:13:23.474160 master-0 kubenswrapper[36504]: I1203 22:13:23.474129 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 22:13:23.489736 master-0 kubenswrapper[36504]: I1203 22:13:23.489646 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 03 22:13:23.518875 master-0 kubenswrapper[36504]: I1203 22:13:23.518681 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 22:13:23.632887 master-0 kubenswrapper[36504]: I1203 22:13:23.632829 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 22:13:23.680515 master-0 kubenswrapper[36504]: I1203 22:13:23.680445 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 22:13:24.226697 master-0 kubenswrapper[36504]: I1203 22:13:24.226634 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 22:13:24.232263 master-0 kubenswrapper[36504]: I1203 22:13:24.232216 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-5q257" Dec 03 22:13:24.243772 master-0 kubenswrapper[36504]: I1203 
22:13:24.243700 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 22:13:24.461160 master-0 kubenswrapper[36504]: I1203 22:13:24.461073 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-zj6wp" Dec 03 22:13:24.524655 master-0 kubenswrapper[36504]: I1203 22:13:24.524523 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Dec 03 22:13:24.525365 master-0 kubenswrapper[36504]: I1203 22:13:24.524675 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 22:13:24.560955 master-0 kubenswrapper[36504]: I1203 22:13:24.560899 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 22:13:24.578506 master-0 kubenswrapper[36504]: I1203 22:13:24.578446 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Dec 03 22:13:24.614344 master-0 kubenswrapper[36504]: I1203 22:13:24.614288 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-6lwtj" Dec 03 22:13:24.618988 master-0 kubenswrapper[36504]: I1203 22:13:24.618942 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 22:13:24.680098 master-0 kubenswrapper[36504]: I1203 22:13:24.680049 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 22:13:24.745758 master-0 kubenswrapper[36504]: I1203 22:13:24.745665 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 22:13:24.862339 master-0 kubenswrapper[36504]: I1203 22:13:24.862211 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 22:13:24.936717 master-0 kubenswrapper[36504]: I1203 22:13:24.936652 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 22:13:24.948228 master-0 kubenswrapper[36504]: I1203 22:13:24.948157 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-7n9xc" Dec 03 22:13:25.101736 master-0 kubenswrapper[36504]: I1203 22:13:25.101547 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Dec 03 22:13:25.105243 master-0 kubenswrapper[36504]: I1203 22:13:25.105192 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:13:25.266133 master-0 kubenswrapper[36504]: I1203 22:13:25.266063 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 22:13:25.409568 master-0 kubenswrapper[36504]: I1203 22:13:25.409471 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 22:13:25.555922 master-0 kubenswrapper[36504]: I1203 22:13:25.555811 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 03 22:13:25.685854 master-0 kubenswrapper[36504]: I1203 22:13:25.685746 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-lrw2p" Dec 03 22:13:25.756083 master-0 kubenswrapper[36504]: I1203 22:13:25.756011 36504 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 22:13:25.833436 master-0 kubenswrapper[36504]: I1203 22:13:25.833242 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:13:25.837276 master-0 kubenswrapper[36504]: I1203 22:13:25.837235 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:13:25.844861 master-0 kubenswrapper[36504]: I1203 22:13:25.844776 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 03 22:13:25.950401 master-0 kubenswrapper[36504]: I1203 22:13:25.950211 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Dec 03 22:13:25.953152 master-0 kubenswrapper[36504]: I1203 22:13:25.953117 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-mss7s" Dec 03 22:13:25.962415 master-0 kubenswrapper[36504]: I1203 22:13:25.962376 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 22:13:26.001332 master-0 kubenswrapper[36504]: I1203 22:13:26.001252 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 03 22:13:26.032416 master-0 kubenswrapper[36504]: I1203 22:13:26.032360 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Dec 03 22:13:26.051369 master-0 kubenswrapper[36504]: I1203 22:13:26.051296 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 22:13:26.053390 master-0 kubenswrapper[36504]: I1203 22:13:26.053351 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 22:13:26.101187 master-0 kubenswrapper[36504]: I1203 22:13:26.101072 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 22:13:26.129671 master-0 kubenswrapper[36504]: I1203 22:13:26.129615 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Dec 03 22:13:26.185595 master-0 kubenswrapper[36504]: 
I1203 22:13:26.185517 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 22:13:26.753194 master-0 kubenswrapper[36504]: I1203 22:13:26.753128 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 22:13:26.890193 master-0 kubenswrapper[36504]: I1203 22:13:26.890120 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 22:13:26.890414 master-0 kubenswrapper[36504]: I1203 22:13:26.890354 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:26.891470 master-0 kubenswrapper[36504]: I1203 22:13:26.891167 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:26.987027 master-0 kubenswrapper[36504]: I1203 22:13:26.984445 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Dec 03 22:13:26.992826 master-0 kubenswrapper[36504]: I1203 22:13:26.992724 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 22:13:27.121370 master-0 kubenswrapper[36504]: I1203 22:13:27.119053 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a767ff0baecd8145c3bd0a8b3c1771e4/startup-monitor/0.log" Dec 03 22:13:27.121370 master-0 kubenswrapper[36504]: I1203 22:13:27.119149 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:13:27.254461 master-0 kubenswrapper[36504]: I1203 22:13:27.254310 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-pod-resource-dir\") pod \"a767ff0baecd8145c3bd0a8b3c1771e4\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " Dec 03 22:13:27.254461 master-0 kubenswrapper[36504]: I1203 22:13:27.254420 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-lock\") pod \"a767ff0baecd8145c3bd0a8b3c1771e4\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " Dec 03 22:13:27.254461 master-0 kubenswrapper[36504]: I1203 22:13:27.254457 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-manifests\") pod \"a767ff0baecd8145c3bd0a8b3c1771e4\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " Dec 03 22:13:27.254811 master-0 kubenswrapper[36504]: I1203 22:13:27.254569 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-resource-dir\") pod \"a767ff0baecd8145c3bd0a8b3c1771e4\" (UID: \"a767ff0baecd8145c3bd0a8b3c1771e4\") " Dec 03 22:13:27.254811 master-0 kubenswrapper[36504]: I1203 22:13:27.254619 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-log\") pod \"a767ff0baecd8145c3bd0a8b3c1771e4\" (UID: 
\"a767ff0baecd8145c3bd0a8b3c1771e4\") " Dec 03 22:13:27.254811 master-0 kubenswrapper[36504]: I1203 22:13:27.254558 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-lock" (OuterVolumeSpecName: "var-lock") pod "a767ff0baecd8145c3bd0a8b3c1771e4" (UID: "a767ff0baecd8145c3bd0a8b3c1771e4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:13:27.254811 master-0 kubenswrapper[36504]: I1203 22:13:27.254654 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a767ff0baecd8145c3bd0a8b3c1771e4" (UID: "a767ff0baecd8145c3bd0a8b3c1771e4"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:13:27.254811 master-0 kubenswrapper[36504]: I1203 22:13:27.254654 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-manifests" (OuterVolumeSpecName: "manifests") pod "a767ff0baecd8145c3bd0a8b3c1771e4" (UID: "a767ff0baecd8145c3bd0a8b3c1771e4"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:13:27.255256 master-0 kubenswrapper[36504]: I1203 22:13:27.254816 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-log" (OuterVolumeSpecName: "var-log") pod "a767ff0baecd8145c3bd0a8b3c1771e4" (UID: "a767ff0baecd8145c3bd0a8b3c1771e4"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:13:27.255256 master-0 kubenswrapper[36504]: I1203 22:13:27.255056 36504 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:27.255256 master-0 kubenswrapper[36504]: I1203 22:13:27.255080 36504 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-log\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:27.255256 master-0 kubenswrapper[36504]: I1203 22:13:27.255092 36504 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:27.255256 master-0 kubenswrapper[36504]: I1203 22:13:27.255107 36504 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-manifests\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:27.261481 master-0 kubenswrapper[36504]: I1203 22:13:27.261444 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "a767ff0baecd8145c3bd0a8b3c1771e4" (UID: "a767ff0baecd8145c3bd0a8b3c1771e4"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:13:27.357089 master-0 kubenswrapper[36504]: I1203 22:13:27.357029 36504 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a767ff0baecd8145c3bd0a8b3c1771e4-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:27.396131 master-0 kubenswrapper[36504]: I1203 22:13:27.396060 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 22:13:27.402534 master-0 kubenswrapper[36504]: W1203 22:13:27.402444 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5d9240_cb7d_4714_9891_dd602555b7c1.slice/crio-7528397e9c877fdf013afed294261d850967d96b126c2df8aac346df4f1d2619 WatchSource:0}: Error finding container 7528397e9c877fdf013afed294261d850967d96b126c2df8aac346df4f1d2619: Status 404 returned error can't find the container with id 7528397e9c877fdf013afed294261d850967d96b126c2df8aac346df4f1d2619 Dec 03 22:13:27.501395 master-0 kubenswrapper[36504]: I1203 22:13:27.501313 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a767ff0baecd8145c3bd0a8b3c1771e4/startup-monitor/0.log" Dec 03 22:13:27.501395 master-0 kubenswrapper[36504]: I1203 22:13:27.501381 36504 generic.go:334] "Generic (PLEG): container finished" podID="a767ff0baecd8145c3bd0a8b3c1771e4" containerID="38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d" exitCode=137 Dec 03 22:13:27.501613 master-0 kubenswrapper[36504]: I1203 22:13:27.501469 36504 scope.go:117] "RemoveContainer" containerID="38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d" Dec 03 22:13:27.501613 master-0 kubenswrapper[36504]: I1203 22:13:27.501504 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Dec 03 22:13:27.503509 master-0 kubenswrapper[36504]: I1203 22:13:27.503421 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac5d9240-cb7d-4714-9891-dd602555b7c1","Type":"ContainerStarted","Data":"7528397e9c877fdf013afed294261d850967d96b126c2df8aac346df4f1d2619"} Dec 03 22:13:27.537273 master-0 kubenswrapper[36504]: I1203 22:13:27.537208 36504 scope.go:117] "RemoveContainer" containerID="38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d" Dec 03 22:13:27.538047 master-0 kubenswrapper[36504]: E1203 22:13:27.537997 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d\": container with ID starting with 38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d not found: ID does not exist" containerID="38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d" Dec 03 22:13:27.538170 master-0 kubenswrapper[36504]: I1203 22:13:27.538050 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d"} err="failed to get container status \"38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d\": rpc error: code = NotFound desc = could not find container \"38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d\": container with ID starting with 38e1cc17f3df4a2acff843a2a9318a872272fc625680b9bf443ddf5aa8f7f70d not found: ID does not exist" Dec 03 22:13:27.602511 master-0 kubenswrapper[36504]: I1203 22:13:27.602465 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 03 22:13:27.798345 master-0 kubenswrapper[36504]: I1203 22:13:27.798137 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 22:13:28.524899 master-0 kubenswrapper[36504]: I1203 22:13:28.524606 36504 generic.go:334] "Generic (PLEG): container finished" podID="ac5d9240-cb7d-4714-9891-dd602555b7c1" containerID="913e496f36f68e84b79f7d5cf37f4644811acd5b2dee7cdde286b8ffd17a70c3" exitCode=0 Dec 03 22:13:28.524899 master-0 kubenswrapper[36504]: I1203 22:13:28.524664 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac5d9240-cb7d-4714-9891-dd602555b7c1","Type":"ContainerDied","Data":"913e496f36f68e84b79f7d5cf37f4644811acd5b2dee7cdde286b8ffd17a70c3"} Dec 03 22:13:29.110272 master-0 kubenswrapper[36504]: I1203 22:13:29.110129 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a767ff0baecd8145c3bd0a8b3c1771e4" path="/var/lib/kubelet/pods/a767ff0baecd8145c3bd0a8b3c1771e4/volumes" Dec 03 22:13:29.551182 master-0 kubenswrapper[36504]: I1203 22:13:29.548807 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac5d9240-cb7d-4714-9891-dd602555b7c1","Type":"ContainerStarted","Data":"724b3c64264da542f4eb5c9d96fddeb62e6b19db48556d54f7036ae15ce87c04"} Dec 03 22:13:29.551182 master-0 kubenswrapper[36504]: I1203 22:13:29.548865 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ac5d9240-cb7d-4714-9891-dd602555b7c1","Type":"ContainerStarted","Data":"45ec0ceaccdb9f56b462ea45b54bfe01678ba4f416bf1eb2c313888286146d13"} Dec 03 22:13:29.551182 master-0 kubenswrapper[36504]: I1203 22:13:29.548880 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac5d9240-cb7d-4714-9891-dd602555b7c1","Type":"ContainerStarted","Data":"131ed6713741f238f16165172d59709aea7b06cc6545948299963ac5e2ed4ec7"} Dec 03 22:13:29.551182 master-0 kubenswrapper[36504]: I1203 22:13:29.548893 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac5d9240-cb7d-4714-9891-dd602555b7c1","Type":"ContainerStarted","Data":"36aef5a05840389d5cc33a91bdc43ba3057fedba391c8c2b932119f850450a5d"} Dec 03 22:13:29.551182 master-0 kubenswrapper[36504]: I1203 22:13:29.548907 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac5d9240-cb7d-4714-9891-dd602555b7c1","Type":"ContainerStarted","Data":"942567b5e2c3e857e8f5e93e61f38ae8899b5e03d376425a3744785e54d2ebd9"} Dec 03 22:13:29.551182 master-0 kubenswrapper[36504]: I1203 22:13:29.548919 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ac5d9240-cb7d-4714-9891-dd602555b7c1","Type":"ContainerStarted","Data":"bc4eaa9e4352696a6514a78c6e0e02aa19ae37146ab44e7bbcf3339d6f8575d7"} Dec 03 22:13:30.065564 master-0 kubenswrapper[36504]: I1203 22:13:30.065462 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:13:30.333695 master-0 kubenswrapper[36504]: I1203 22:13:30.333546 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66896f8657-9276x" Dec 03 22:13:30.339747 master-0 kubenswrapper[36504]: I1203 22:13:30.339706 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66896f8657-9276x" Dec 03 22:13:30.366620 master-0 kubenswrapper[36504]: I1203 22:13:30.366535 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=11.366516389 podStartE2EDuration="11.366516389s" podCreationTimestamp="2025-12-03 22:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:13:29.579231726 +0000 UTC m=+174.799003763" watchObservedRunningTime="2025-12-03 22:13:30.366516389 +0000 UTC m=+175.586288396" Dec 03 22:13:30.437648 master-0 kubenswrapper[36504]: I1203 22:13:30.437596 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c4b7b5d77-2z5zb"] Dec 03 22:13:30.557456 master-0 kubenswrapper[36504]: I1203 22:13:30.557332 36504 generic.go:334] "Generic (PLEG): container finished" podID="40f8e70d-5f98-47f1-afa8-ea67242981fc" containerID="eac3faec501ffdc007c07d93a9508e47b671bbce7cc0a7b3a4970c2ac98f0e4b" exitCode=0 Dec 03 22:13:30.558004 master-0 kubenswrapper[36504]: I1203 22:13:30.557946 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" event={"ID":"40f8e70d-5f98-47f1-afa8-ea67242981fc","Type":"ContainerDied","Data":"eac3faec501ffdc007c07d93a9508e47b671bbce7cc0a7b3a4970c2ac98f0e4b"} Dec 03 22:13:31.113133 master-0 kubenswrapper[36504]: I1203 22:13:31.113086 36504 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:13:31.229412 master-0 kubenswrapper[36504]: I1203 22:13:31.229359 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npkww\" (UniqueName: \"kubernetes.io/projected/40f8e70d-5f98-47f1-afa8-ea67242981fc-kube-api-access-npkww\") pod \"40f8e70d-5f98-47f1-afa8-ea67242981fc\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " Dec 03 22:13:31.229858 master-0 kubenswrapper[36504]: I1203 22:13:31.229832 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle\") pod \"40f8e70d-5f98-47f1-afa8-ea67242981fc\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " Dec 03 22:13:31.230006 master-0 kubenswrapper[36504]: I1203 22:13:31.229988 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs\") pod \"40f8e70d-5f98-47f1-afa8-ea67242981fc\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " Dec 03 22:13:31.230178 master-0 kubenswrapper[36504]: I1203 22:13:31.230161 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle\") pod \"40f8e70d-5f98-47f1-afa8-ea67242981fc\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " Dec 03 22:13:31.230319 master-0 kubenswrapper[36504]: I1203 22:13:31.230304 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/40f8e70d-5f98-47f1-afa8-ea67242981fc-audit-log\") pod \"40f8e70d-5f98-47f1-afa8-ea67242981fc\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " Dec 03 22:13:31.230453 master-0 kubenswrapper[36504]: I1203 22:13:31.230438 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls\") pod \"40f8e70d-5f98-47f1-afa8-ea67242981fc\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " Dec 03 22:13:31.230569 master-0 kubenswrapper[36504]: I1203 22:13:31.230554 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles\") pod \"40f8e70d-5f98-47f1-afa8-ea67242981fc\" (UID: \"40f8e70d-5f98-47f1-afa8-ea67242981fc\") " Dec 03 22:13:31.230760 master-0 kubenswrapper[36504]: I1203 22:13:31.230678 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "40f8e70d-5f98-47f1-afa8-ea67242981fc" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:13:31.230995 master-0 kubenswrapper[36504]: I1203 22:13:31.230963 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40f8e70d-5f98-47f1-afa8-ea67242981fc-audit-log" (OuterVolumeSpecName: "audit-log") pod "40f8e70d-5f98-47f1-afa8-ea67242981fc" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:13:31.231147 master-0 kubenswrapper[36504]: I1203 22:13:31.231119 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "40f8e70d-5f98-47f1-afa8-ea67242981fc" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:13:31.231506 master-0 kubenswrapper[36504]: I1203 22:13:31.231484 36504 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:31.231610 master-0 kubenswrapper[36504]: I1203 22:13:31.231594 36504 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/40f8e70d-5f98-47f1-afa8-ea67242981fc-audit-log\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:31.231729 master-0 kubenswrapper[36504]: I1203 22:13:31.231698 36504 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/40f8e70d-5f98-47f1-afa8-ea67242981fc-metrics-server-audit-profiles\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:31.233176 master-0 kubenswrapper[36504]: I1203 22:13:31.233101 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "40f8e70d-5f98-47f1-afa8-ea67242981fc" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:13:31.233621 master-0 kubenswrapper[36504]: I1203 22:13:31.233567 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "40f8e70d-5f98-47f1-afa8-ea67242981fc" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc"). InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:13:31.235295 master-0 kubenswrapper[36504]: I1203 22:13:31.235239 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f8e70d-5f98-47f1-afa8-ea67242981fc-kube-api-access-npkww" (OuterVolumeSpecName: "kube-api-access-npkww") pod "40f8e70d-5f98-47f1-afa8-ea67242981fc" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc"). InnerVolumeSpecName "kube-api-access-npkww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:13:31.236953 master-0 kubenswrapper[36504]: I1203 22:13:31.236900 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "40f8e70d-5f98-47f1-afa8-ea67242981fc" (UID: "40f8e70d-5f98-47f1-afa8-ea67242981fc"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:13:31.333573 master-0 kubenswrapper[36504]: I1203 22:13:31.333403 36504 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:31.333573 master-0 kubenswrapper[36504]: I1203 22:13:31.333456 36504 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-client-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:31.333573 master-0 kubenswrapper[36504]: I1203 22:13:31.333467 36504 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/40f8e70d-5f98-47f1-afa8-ea67242981fc-secret-metrics-server-tls\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:31.333573 master-0 kubenswrapper[36504]: I1203 22:13:31.333477 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npkww\" (UniqueName: \"kubernetes.io/projected/40f8e70d-5f98-47f1-afa8-ea67242981fc-kube-api-access-npkww\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:31.569452 master-0 kubenswrapper[36504]: I1203 22:13:31.569304 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" event={"ID":"40f8e70d-5f98-47f1-afa8-ea67242981fc","Type":"ContainerDied","Data":"01563ec1e9dc14e44ec2552850bf0825026fe0a03259d8a4f99d746be494b365"} Dec 03 22:13:31.569452 master-0 kubenswrapper[36504]: I1203 22:13:31.569421 36504 scope.go:117] "RemoveContainer" containerID="eac3faec501ffdc007c07d93a9508e47b671bbce7cc0a7b3a4970c2ac98f0e4b" Dec 03 22:13:31.569948 master-0 kubenswrapper[36504]: I1203 22:13:31.569465 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-b9f5dccb6-4h4jv" Dec 03 22:13:31.682677 master-0 kubenswrapper[36504]: I1203 22:13:31.682509 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-b9f5dccb6-4h4jv"] Dec 03 22:13:31.692899 master-0 kubenswrapper[36504]: I1203 22:13:31.692810 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-b9f5dccb6-4h4jv"] Dec 03 22:13:33.111646 master-0 kubenswrapper[36504]: I1203 22:13:33.111532 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40f8e70d-5f98-47f1-afa8-ea67242981fc" path="/var/lib/kubelet/pods/40f8e70d-5f98-47f1-afa8-ea67242981fc/volumes" Dec 03 22:13:35.679571 master-0 kubenswrapper[36504]: E1203 22:13:35.679335 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:13:36.095493 master-0 kubenswrapper[36504]: I1203 22:13:36.095369 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:13:48.727241 master-0 kubenswrapper[36504]: I1203 22:13:48.727145 36504 generic.go:334] "Generic (PLEG): container finished" podID="a4399d20-f9a6-4ab1-86be-e2845394eaba" containerID="1f34be941b3fc23e5df25dd00be7228b904591ae9bbedc66623d73e0baeab2fe" exitCode=0 Dec 03 22:13:48.727241 master-0 kubenswrapper[36504]: I1203 22:13:48.727208 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" event={"ID":"a4399d20-f9a6-4ab1-86be-e2845394eaba","Type":"ContainerDied","Data":"1f34be941b3fc23e5df25dd00be7228b904591ae9bbedc66623d73e0baeab2fe"} Dec 03 22:13:48.727241 master-0 kubenswrapper[36504]: I1203 22:13:48.727244 36504 scope.go:117] "RemoveContainer" containerID="8d502fe6a4b80a8d86ec823cb141773d9b14b3a8b0f0a224e12fcddbb0204483" Dec 03 22:13:48.728678 master-0 kubenswrapper[36504]: I1203 22:13:48.728081 36504 scope.go:117] "RemoveContainer" containerID="1f34be941b3fc23e5df25dd00be7228b904591ae9bbedc66623d73e0baeab2fe" Dec 03 22:13:49.740484 master-0 kubenswrapper[36504]: I1203 22:13:49.740379 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" event={"ID":"a4399d20-f9a6-4ab1-86be-e2845394eaba","Type":"ContainerStarted","Data":"2f5fbb693ca900587fd1615092d0fb99f148bef3ba8c0dd10d897956ce2c63c6"} Dec 03 22:13:49.741426 master-0 kubenswrapper[36504]: I1203 22:13:49.740896 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:13:49.744100 master-0 kubenswrapper[36504]: I1203 22:13:49.744037 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-7d67745bb7-4jd6d" Dec 03 22:13:55.488107 master-0 kubenswrapper[36504]: I1203 22:13:55.488009 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6c4b7b5d77-2z5zb" 
podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" containerID="cri-o://be290e513743319bbbcc4bff45a64a81baa694de12cf0da73461a9cbd2d70f83" gracePeriod=15 Dec 03 22:13:55.795962 master-0 kubenswrapper[36504]: I1203 22:13:55.795846 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c4b7b5d77-2z5zb_bf7c999c-2a7f-49e9-8b49-3372fcaf59b0/console/0.log" Dec 03 22:13:55.795962 master-0 kubenswrapper[36504]: I1203 22:13:55.795892 36504 generic.go:334] "Generic (PLEG): container finished" podID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerID="be290e513743319bbbcc4bff45a64a81baa694de12cf0da73461a9cbd2d70f83" exitCode=2 Dec 03 22:13:55.795962 master-0 kubenswrapper[36504]: I1203 22:13:55.795923 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4b7b5d77-2z5zb" event={"ID":"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0","Type":"ContainerDied","Data":"be290e513743319bbbcc4bff45a64a81baa694de12cf0da73461a9cbd2d70f83"} Dec 03 22:13:55.988791 master-0 kubenswrapper[36504]: I1203 22:13:55.988748 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c4b7b5d77-2z5zb_bf7c999c-2a7f-49e9-8b49-3372fcaf59b0/console/0.log" Dec 03 22:13:55.989135 master-0 kubenswrapper[36504]: I1203 22:13:55.988872 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:13:56.052034 master-0 kubenswrapper[36504]: I1203 22:13:56.051866 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-config\") pod \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " Dec 03 22:13:56.052034 master-0 kubenswrapper[36504]: I1203 22:13:56.051970 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-oauth-config\") pod \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " Dec 03 22:13:56.052034 master-0 kubenswrapper[36504]: I1203 22:13:56.051992 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-oauth-serving-cert\") pod \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " Dec 03 22:13:56.052034 master-0 kubenswrapper[36504]: I1203 22:13:56.052011 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv2rr\" (UniqueName: \"kubernetes.io/projected/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-kube-api-access-vv2rr\") pod \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " Dec 03 22:13:56.052034 master-0 kubenswrapper[36504]: I1203 22:13:56.052030 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-service-ca\") pod \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " Dec 03 22:13:56.052599 master-0 kubenswrapper[36504]: I1203 22:13:56.052104 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-serving-cert\") pod \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " Dec 03 22:13:56.052599 master-0 kubenswrapper[36504]: I1203 22:13:56.052129 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-trusted-ca-bundle\") pod \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\" (UID: \"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0\") " Dec 03 22:13:56.052987 master-0 kubenswrapper[36504]: I1203 22:13:56.052836 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" (UID: "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:13:56.052987 master-0 kubenswrapper[36504]: I1203 22:13:56.052829 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-config" (OuterVolumeSpecName: "console-config") pod "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" (UID: "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:13:56.054202 master-0 kubenswrapper[36504]: I1203 22:13:56.054043 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" (UID: "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:13:56.055095 master-0 kubenswrapper[36504]: I1203 22:13:56.055014 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-service-ca" (OuterVolumeSpecName: "service-ca") pod "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" (UID: "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:13:56.056638 master-0 kubenswrapper[36504]: I1203 22:13:56.056558 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" (UID: "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:13:56.057125 master-0 kubenswrapper[36504]: I1203 22:13:56.057084 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-kube-api-access-vv2rr" (OuterVolumeSpecName: "kube-api-access-vv2rr") pod "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" (UID: "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0"). InnerVolumeSpecName "kube-api-access-vv2rr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:13:56.058038 master-0 kubenswrapper[36504]: I1203 22:13:56.057996 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" (UID: "bf7c999c-2a7f-49e9-8b49-3372fcaf59b0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:13:56.154280 master-0 kubenswrapper[36504]: I1203 22:13:56.154147 36504 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:56.154280 master-0 kubenswrapper[36504]: I1203 22:13:56.154240 36504 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:56.154280 master-0 kubenswrapper[36504]: I1203 22:13:56.154255 36504 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:56.154280 master-0 kubenswrapper[36504]: I1203 22:13:56.154266 36504 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:56.154280 master-0 kubenswrapper[36504]: I1203 22:13:56.154278 36504 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:56.154280 master-0 kubenswrapper[36504]: I1203 22:13:56.154291 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv2rr\" (UniqueName: \"kubernetes.io/projected/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-kube-api-access-vv2rr\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:56.154280 master-0 kubenswrapper[36504]: I1203 22:13:56.154302 36504 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:13:56.804625 master-0 kubenswrapper[36504]: I1203 22:13:56.804562 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c4b7b5d77-2z5zb_bf7c999c-2a7f-49e9-8b49-3372fcaf59b0/console/0.log" Dec 03 22:13:56.805258 master-0 kubenswrapper[36504]: I1203 22:13:56.804637 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4b7b5d77-2z5zb" event={"ID":"bf7c999c-2a7f-49e9-8b49-3372fcaf59b0","Type":"ContainerDied","Data":"e3c3b0a64a288f9a6eb4fb457b73d8e090e6fd73caf9aac80031c9cdbdd68ba1"} Dec 03 22:13:56.805258 master-0 kubenswrapper[36504]: I1203 22:13:56.804672 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c4b7b5d77-2z5zb" Dec 03 22:13:56.805258 master-0 kubenswrapper[36504]: I1203 22:13:56.804691 36504 scope.go:117] "RemoveContainer" containerID="be290e513743319bbbcc4bff45a64a81baa694de12cf0da73461a9cbd2d70f83" Dec 03 22:13:56.830954 master-0 kubenswrapper[36504]: I1203 22:13:56.830903 36504 patch_prober.go:28] interesting pod/console-6c4b7b5d77-2z5zb container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.128.0.109:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:13:56.831144 master-0 kubenswrapper[36504]: I1203 22:13:56.830986 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6c4b7b5d77-2z5zb" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" probeResult="failure" output="Get \"https://10.128.0.109:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:13:56.846068 master-0 kubenswrapper[36504]: I1203 22:13:56.845996 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c4b7b5d77-2z5zb"] Dec 03 22:13:56.852141 master-0 kubenswrapper[36504]: I1203 22:13:56.852084 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c4b7b5d77-2z5zb"] Dec 03 22:13:57.103062 master-0 kubenswrapper[36504]: I1203 22:13:57.102932 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" path="/var/lib/kubelet/pods/bf7c999c-2a7f-49e9-8b49-3372fcaf59b0/volumes" Dec 03 22:14:20.066084 master-0 kubenswrapper[36504]: I1203 22:14:20.065989 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:14:20.121396 master-0 kubenswrapper[36504]: I1203 22:14:20.121298 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:14:21.033058 master-0 kubenswrapper[36504]: I1203 22:14:21.032976 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 22:14:35.688468 master-0 kubenswrapper[36504]: E1203 22:14:35.688395 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:14:36.095450 master-0 kubenswrapper[36504]: I1203 22:14:36.095397 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:14:41.463306 master-0 kubenswrapper[36504]: I1203 22:14:41.463224 36504 patch_prober.go:28] interesting pod/metrics-server-576975bfdb-gd855 container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.128.0.92:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:14:41.464071 master-0 kubenswrapper[36504]: I1203 22:14:41.463268 36504 patch_prober.go:28] interesting pod/metrics-server-576975bfdb-gd855 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.128.0.92:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 22:14:41.464071 master-0 kubenswrapper[36504]: I1203 22:14:41.463314 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" podUID="1993ad4d-95fc-4af9-9ce3-e29112a1a3a4" containerName="metrics-server" probeResult="failure" output="Get \"https://10.128.0.92:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:14:41.464071 master-0 kubenswrapper[36504]: I1203 22:14:41.463429 36504 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-576975bfdb-gd855" podUID="1993ad4d-95fc-4af9-9ce3-e29112a1a3a4" containerName="metrics-server" probeResult="failure" output="Get \"https://10.128.0.92:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 22:14:48.095620 master-0 kubenswrapper[36504]: I1203 22:14:48.095544 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:15:00.346802 master-0 kubenswrapper[36504]: I1203 22:15:00.346723 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l"] Dec 03 22:15:00.347878 master-0 kubenswrapper[36504]: E1203 22:15:00.347861 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a767ff0baecd8145c3bd0a8b3c1771e4" containerName="startup-monitor" Dec 03 22:15:00.347964 master-0 kubenswrapper[36504]: I1203 22:15:00.347954 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a767ff0baecd8145c3bd0a8b3c1771e4" containerName="startup-monitor" Dec 03 22:15:00.348036 master-0 kubenswrapper[36504]: E1203 22:15:00.348026 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" Dec 03 22:15:00.348103 master-0 kubenswrapper[36504]: I1203 22:15:00.348093 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" Dec 03 22:15:00.348163 master-0 kubenswrapper[36504]: E1203 22:15:00.348154 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f8e70d-5f98-47f1-afa8-ea67242981fc" containerName="metrics-server" Dec 03 22:15:00.348220 master-0 kubenswrapper[36504]: I1203 22:15:00.348211 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f8e70d-5f98-47f1-afa8-ea67242981fc" containerName="metrics-server" Dec 03 22:15:00.348454 master-0 kubenswrapper[36504]: I1203 22:15:00.348439 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="a767ff0baecd8145c3bd0a8b3c1771e4" containerName="startup-monitor" Dec 03 22:15:00.348576 master-0 kubenswrapper[36504]: I1203 22:15:00.348564 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf7c999c-2a7f-49e9-8b49-3372fcaf59b0" containerName="console" Dec 03 22:15:00.348671 master-0 kubenswrapper[36504]: I1203 22:15:00.348657 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="40f8e70d-5f98-47f1-afa8-ea67242981fc" containerName="metrics-server" Dec 03 22:15:00.349321 master-0 kubenswrapper[36504]: I1203 22:15:00.349275 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.351591 master-0 kubenswrapper[36504]: I1203 22:15:00.351000 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0aa99791-2fca-46c5-98f7-206de72957bb-secret-volume\") pod \"collect-profiles-29413335-9cf9l\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.351591 master-0 kubenswrapper[36504]: I1203 22:15:00.351228 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0aa99791-2fca-46c5-98f7-206de72957bb-config-volume\") pod \"collect-profiles-29413335-9cf9l\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.351591 master-0 kubenswrapper[36504]: I1203 22:15:00.351299 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-nwpql" Dec 03 22:15:00.351591 master-0 kubenswrapper[36504]: I1203 22:15:00.351410 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfq4\" (UniqueName: \"kubernetes.io/projected/0aa99791-2fca-46c5-98f7-206de72957bb-kube-api-access-mmfq4\") pod \"collect-profiles-29413335-9cf9l\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.351591 master-0 kubenswrapper[36504]: I1203 22:15:00.351595 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 22:15:00.353746 master-0 kubenswrapper[36504]: I1203 22:15:00.353703 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l"] Dec 03 22:15:00.452404 master-0 kubenswrapper[36504]: I1203 22:15:00.452326 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfq4\" (UniqueName: \"kubernetes.io/projected/0aa99791-2fca-46c5-98f7-206de72957bb-kube-api-access-mmfq4\") pod \"collect-profiles-29413335-9cf9l\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.452769 master-0 kubenswrapper[36504]: I1203 22:15:00.452471 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0aa99791-2fca-46c5-98f7-206de72957bb-secret-volume\") pod \"collect-profiles-29413335-9cf9l\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.452769 master-0 kubenswrapper[36504]: I1203 22:15:00.452515 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0aa99791-2fca-46c5-98f7-206de72957bb-config-volume\") pod \"collect-profiles-29413335-9cf9l\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.453540 master-0 kubenswrapper[36504]: I1203 22:15:00.453494 36504 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0aa99791-2fca-46c5-98f7-206de72957bb-config-volume\") pod \"collect-profiles-29413335-9cf9l\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.455581 master-0 kubenswrapper[36504]: I1203 22:15:00.455539 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0aa99791-2fca-46c5-98f7-206de72957bb-secret-volume\") pod \"collect-profiles-29413335-9cf9l\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.472278 master-0 kubenswrapper[36504]: I1203 22:15:00.472226 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfq4\" (UniqueName: \"kubernetes.io/projected/0aa99791-2fca-46c5-98f7-206de72957bb-kube-api-access-mmfq4\") pod \"collect-profiles-29413335-9cf9l\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:00.674333 master-0 kubenswrapper[36504]: I1203 22:15:00.674244 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:01.230199 master-0 kubenswrapper[36504]: I1203 22:15:01.230138 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l"] Dec 03 22:15:01.231823 master-0 kubenswrapper[36504]: W1203 22:15:01.231743 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa99791_2fca_46c5_98f7_206de72957bb.slice/crio-095f59fc7ab1ed34148e71b74a973d1caa27b08a2d591fa35d1877e27eda4b4a WatchSource:0}: Error finding container 095f59fc7ab1ed34148e71b74a973d1caa27b08a2d591fa35d1877e27eda4b4a: Status 404 returned error can't find the container with id 095f59fc7ab1ed34148e71b74a973d1caa27b08a2d591fa35d1877e27eda4b4a Dec 03 22:15:01.349777 master-0 kubenswrapper[36504]: I1203 22:15:01.349659 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" event={"ID":"0aa99791-2fca-46c5-98f7-206de72957bb","Type":"ContainerStarted","Data":"095f59fc7ab1ed34148e71b74a973d1caa27b08a2d591fa35d1877e27eda4b4a"} Dec 03 22:15:02.358792 master-0 kubenswrapper[36504]: I1203 22:15:02.358667 36504 generic.go:334] "Generic (PLEG): container finished" podID="0aa99791-2fca-46c5-98f7-206de72957bb" containerID="ef6203f31725415f0d2b8868e71e83f9ab91a310fd0c29b08688106ab0bc8c60" exitCode=0 Dec 03 22:15:02.358792 master-0 kubenswrapper[36504]: I1203 22:15:02.358727 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" event={"ID":"0aa99791-2fca-46c5-98f7-206de72957bb","Type":"ContainerDied","Data":"ef6203f31725415f0d2b8868e71e83f9ab91a310fd0c29b08688106ab0bc8c60"} Dec 03 22:15:03.716821 master-0 kubenswrapper[36504]: I1203 22:15:03.716752 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:03.903052 master-0 kubenswrapper[36504]: I1203 22:15:03.902950 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmfq4\" (UniqueName: \"kubernetes.io/projected/0aa99791-2fca-46c5-98f7-206de72957bb-kube-api-access-mmfq4\") pod \"0aa99791-2fca-46c5-98f7-206de72957bb\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " Dec 03 22:15:03.903052 master-0 kubenswrapper[36504]: I1203 22:15:03.903095 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0aa99791-2fca-46c5-98f7-206de72957bb-config-volume\") pod \"0aa99791-2fca-46c5-98f7-206de72957bb\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " Dec 03 22:15:03.903052 master-0 kubenswrapper[36504]: I1203 22:15:03.903311 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0aa99791-2fca-46c5-98f7-206de72957bb-secret-volume\") pod \"0aa99791-2fca-46c5-98f7-206de72957bb\" (UID: \"0aa99791-2fca-46c5-98f7-206de72957bb\") " Dec 03 22:15:03.904361 master-0 kubenswrapper[36504]: I1203 22:15:03.903987 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa99791-2fca-46c5-98f7-206de72957bb-config-volume" (OuterVolumeSpecName: "config-volume") pod "0aa99791-2fca-46c5-98f7-206de72957bb" (UID: "0aa99791-2fca-46c5-98f7-206de72957bb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:15:03.908598 master-0 kubenswrapper[36504]: I1203 22:15:03.908530 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa99791-2fca-46c5-98f7-206de72957bb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0aa99791-2fca-46c5-98f7-206de72957bb" (UID: "0aa99791-2fca-46c5-98f7-206de72957bb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:15:03.909111 master-0 kubenswrapper[36504]: I1203 22:15:03.909027 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa99791-2fca-46c5-98f7-206de72957bb-kube-api-access-mmfq4" (OuterVolumeSpecName: "kube-api-access-mmfq4") pod "0aa99791-2fca-46c5-98f7-206de72957bb" (UID: "0aa99791-2fca-46c5-98f7-206de72957bb"). InnerVolumeSpecName "kube-api-access-mmfq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:15:04.005613 master-0 kubenswrapper[36504]: I1203 22:15:04.005504 36504 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0aa99791-2fca-46c5-98f7-206de72957bb-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 22:15:04.005613 master-0 kubenswrapper[36504]: I1203 22:15:04.005590 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmfq4\" (UniqueName: \"kubernetes.io/projected/0aa99791-2fca-46c5-98f7-206de72957bb-kube-api-access-mmfq4\") on node \"master-0\" DevicePath \"\"" Dec 03 22:15:04.005613 master-0 kubenswrapper[36504]: I1203 22:15:04.005617 36504 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0aa99791-2fca-46c5-98f7-206de72957bb-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 22:15:04.378292 master-0 kubenswrapper[36504]: I1203 22:15:04.378231 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" event={"ID":"0aa99791-2fca-46c5-98f7-206de72957bb","Type":"ContainerDied","Data":"095f59fc7ab1ed34148e71b74a973d1caa27b08a2d591fa35d1877e27eda4b4a"} Dec 03 22:15:04.378292 master-0 kubenswrapper[36504]: I1203 22:15:04.378281 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="095f59fc7ab1ed34148e71b74a973d1caa27b08a2d591fa35d1877e27eda4b4a" Dec 03 22:15:04.378292 master-0 kubenswrapper[36504]: I1203 22:15:04.378281 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l" Dec 03 22:15:30.974217 master-0 kubenswrapper[36504]: I1203 22:15:30.974115 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-57g9j"] Dec 03 22:15:30.975062 master-0 kubenswrapper[36504]: E1203 22:15:30.974793 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aa99791-2fca-46c5-98f7-206de72957bb" containerName="collect-profiles" Dec 03 22:15:30.975062 master-0 kubenswrapper[36504]: I1203 22:15:30.974818 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa99791-2fca-46c5-98f7-206de72957bb" containerName="collect-profiles" Dec 03 22:15:30.975145 master-0 kubenswrapper[36504]: I1203 22:15:30.975080 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aa99791-2fca-46c5-98f7-206de72957bb" containerName="collect-profiles" Dec 03 22:15:30.981792 master-0 kubenswrapper[36504]: I1203 22:15:30.981716 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:30.993443 master-0 kubenswrapper[36504]: I1203 22:15:30.993327 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-57g9j"] Dec 03 22:15:31.116340 master-0 kubenswrapper[36504]: I1203 22:15:31.116277 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt7tn\" (UniqueName: \"kubernetes.io/projected/92d46019-3965-4845-a265-1287773cc7fe-kube-api-access-tt7tn\") pod \"certified-operators-57g9j\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.116656 master-0 kubenswrapper[36504]: I1203 22:15:31.116567 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-utilities\") pod \"certified-operators-57g9j\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.116747 master-0 kubenswrapper[36504]: I1203 22:15:31.116669 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-catalog-content\") pod \"certified-operators-57g9j\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.218335 master-0 kubenswrapper[36504]: I1203 22:15:31.218241 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt7tn\" (UniqueName: \"kubernetes.io/projected/92d46019-3965-4845-a265-1287773cc7fe-kube-api-access-tt7tn\") pod \"certified-operators-57g9j\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.218696 master-0 kubenswrapper[36504]: I1203 22:15:31.218438 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-utilities\") pod \"certified-operators-57g9j\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.218696 master-0 kubenswrapper[36504]: I1203 22:15:31.218464 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-catalog-content\") pod \"certified-operators-57g9j\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.219271 master-0 kubenswrapper[36504]: I1203 22:15:31.219223 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-catalog-content\") pod \"certified-operators-57g9j\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.220539 master-0 kubenswrapper[36504]: I1203 22:15:31.220474 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-utilities\") pod \"certified-operators-57g9j\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " 
pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.242358 master-0 kubenswrapper[36504]: I1203 22:15:31.242172 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt7tn\" (UniqueName: \"kubernetes.io/projected/92d46019-3965-4845-a265-1287773cc7fe-kube-api-access-tt7tn\") pod \"certified-operators-57g9j\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.315183 master-0 kubenswrapper[36504]: I1203 22:15:31.315108 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:31.809677 master-0 kubenswrapper[36504]: I1203 22:15:31.809616 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-57g9j"] Dec 03 22:15:31.812624 master-0 kubenswrapper[36504]: W1203 22:15:31.812559 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d46019_3965_4845_a265_1287773cc7fe.slice/crio-60a729c56608d8867caa067bfc3bd2a8d0111e82796c3a0fea25002afd3ec100 WatchSource:0}: Error finding container 60a729c56608d8867caa067bfc3bd2a8d0111e82796c3a0fea25002afd3ec100: Status 404 returned error can't find the container with id 60a729c56608d8867caa067bfc3bd2a8d0111e82796c3a0fea25002afd3ec100 Dec 03 22:15:32.694810 master-0 kubenswrapper[36504]: I1203 22:15:32.694725 36504 generic.go:334] "Generic (PLEG): container finished" podID="92d46019-3965-4845-a265-1287773cc7fe" containerID="d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c" exitCode=0 Dec 03 22:15:32.694810 master-0 kubenswrapper[36504]: I1203 22:15:32.694793 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57g9j" event={"ID":"92d46019-3965-4845-a265-1287773cc7fe","Type":"ContainerDied","Data":"d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c"} Dec 03 22:15:32.694810 master-0 kubenswrapper[36504]: I1203 22:15:32.694825 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57g9j" event={"ID":"92d46019-3965-4845-a265-1287773cc7fe","Type":"ContainerStarted","Data":"60a729c56608d8867caa067bfc3bd2a8d0111e82796c3a0fea25002afd3ec100"} Dec 03 22:15:33.708075 master-0 kubenswrapper[36504]: I1203 22:15:33.707995 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57g9j" event={"ID":"92d46019-3965-4845-a265-1287773cc7fe","Type":"ContainerStarted","Data":"ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5"} Dec 03 22:15:34.732670 master-0 kubenswrapper[36504]: I1203 22:15:34.732598 36504 generic.go:334] "Generic (PLEG): container finished" podID="92d46019-3965-4845-a265-1287773cc7fe" containerID="ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5" exitCode=0 Dec 03 22:15:34.733645 master-0 kubenswrapper[36504]: I1203 22:15:34.733585 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57g9j" event={"ID":"92d46019-3965-4845-a265-1287773cc7fe","Type":"ContainerDied","Data":"ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5"} Dec 03 22:15:35.071819 master-0 kubenswrapper[36504]: I1203 22:15:35.071647 36504 kubelet.go:1505] "Image garbage collection succeeded" Dec 03 22:15:35.683037 master-0 kubenswrapper[36504]: E1203 22:15:35.682961 36504 manager.go:1116] Failed to create 
existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:15:35.745156 master-0 kubenswrapper[36504]: I1203 22:15:35.745088 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57g9j" event={"ID":"92d46019-3965-4845-a265-1287773cc7fe","Type":"ContainerStarted","Data":"f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef"} Dec 03 22:15:35.783031 master-0 kubenswrapper[36504]: I1203 22:15:35.782926 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-57g9j" podStartSLOduration=3.205671272 podStartE2EDuration="5.78290258s" podCreationTimestamp="2025-12-03 22:15:30 +0000 UTC" firstStartedPulling="2025-12-03 22:15:32.696209999 +0000 UTC m=+297.915982006" lastFinishedPulling="2025-12-03 22:15:35.273441307 +0000 UTC m=+300.493213314" observedRunningTime="2025-12-03 22:15:35.774292261 +0000 UTC m=+300.994064268" watchObservedRunningTime="2025-12-03 22:15:35.78290258 +0000 UTC m=+301.002674597" Dec 03 22:15:37.096394 master-0 kubenswrapper[36504]: I1203 22:15:37.096312 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:15:41.316037 master-0 kubenswrapper[36504]: I1203 22:15:41.315910 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:41.316865 master-0 kubenswrapper[36504]: I1203 22:15:41.316227 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:41.388881 master-0 kubenswrapper[36504]: I1203 22:15:41.388754 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:41.887860 master-0 kubenswrapper[36504]: I1203 22:15:41.887792 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:41.981301 master-0 kubenswrapper[36504]: I1203 22:15:41.981226 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-57g9j"] Dec 03 22:15:43.832287 master-0 kubenswrapper[36504]: I1203 22:15:43.832144 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-57g9j" podUID="92d46019-3965-4845-a265-1287773cc7fe" containerName="registry-server" containerID="cri-o://f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef" gracePeriod=2 Dec 03 22:15:44.368095 master-0 kubenswrapper[36504]: I1203 22:15:44.368040 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:44.552864 master-0 kubenswrapper[36504]: I1203 22:15:44.552808 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-catalog-content\") pod \"92d46019-3965-4845-a265-1287773cc7fe\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " Dec 03 22:15:44.553107 master-0 kubenswrapper[36504]: I1203 22:15:44.552979 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-utilities\") pod \"92d46019-3965-4845-a265-1287773cc7fe\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " Dec 03 22:15:44.553107 master-0 kubenswrapper[36504]: I1203 22:15:44.553070 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt7tn\" (UniqueName: \"kubernetes.io/projected/92d46019-3965-4845-a265-1287773cc7fe-kube-api-access-tt7tn\") pod \"92d46019-3965-4845-a265-1287773cc7fe\" (UID: \"92d46019-3965-4845-a265-1287773cc7fe\") " Dec 03 22:15:44.553939 master-0 kubenswrapper[36504]: I1203 22:15:44.553883 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-utilities" (OuterVolumeSpecName: "utilities") pod "92d46019-3965-4845-a265-1287773cc7fe" (UID: "92d46019-3965-4845-a265-1287773cc7fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:15:44.556214 master-0 kubenswrapper[36504]: I1203 22:15:44.556152 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d46019-3965-4845-a265-1287773cc7fe-kube-api-access-tt7tn" (OuterVolumeSpecName: "kube-api-access-tt7tn") pod "92d46019-3965-4845-a265-1287773cc7fe" (UID: "92d46019-3965-4845-a265-1287773cc7fe"). InnerVolumeSpecName "kube-api-access-tt7tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:15:44.655127 master-0 kubenswrapper[36504]: I1203 22:15:44.655053 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:15:44.655127 master-0 kubenswrapper[36504]: I1203 22:15:44.655108 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt7tn\" (UniqueName: \"kubernetes.io/projected/92d46019-3965-4845-a265-1287773cc7fe-kube-api-access-tt7tn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:15:44.765457 master-0 kubenswrapper[36504]: I1203 22:15:44.765407 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92d46019-3965-4845-a265-1287773cc7fe" (UID: "92d46019-3965-4845-a265-1287773cc7fe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:15:44.860475 master-0 kubenswrapper[36504]: I1203 22:15:44.858042 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92d46019-3965-4845-a265-1287773cc7fe-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:15:44.880242 master-0 kubenswrapper[36504]: I1203 22:15:44.879960 36504 generic.go:334] "Generic (PLEG): container finished" podID="92d46019-3965-4845-a265-1287773cc7fe" containerID="f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef" exitCode=0 Dec 03 22:15:44.880700 master-0 kubenswrapper[36504]: I1203 22:15:44.880647 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57g9j" event={"ID":"92d46019-3965-4845-a265-1287773cc7fe","Type":"ContainerDied","Data":"f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef"} Dec 03 22:15:44.880821 master-0 kubenswrapper[36504]: I1203 22:15:44.880806 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57g9j" event={"ID":"92d46019-3965-4845-a265-1287773cc7fe","Type":"ContainerDied","Data":"60a729c56608d8867caa067bfc3bd2a8d0111e82796c3a0fea25002afd3ec100"} Dec 03 22:15:44.880896 master-0 kubenswrapper[36504]: I1203 22:15:44.880808 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-57g9j" Dec 03 22:15:44.881057 master-0 kubenswrapper[36504]: I1203 22:15:44.880828 36504 scope.go:117] "RemoveContainer" containerID="f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef" Dec 03 22:15:44.910236 master-0 kubenswrapper[36504]: I1203 22:15:44.910198 36504 scope.go:117] "RemoveContainer" containerID="ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5" Dec 03 22:15:44.928060 master-0 kubenswrapper[36504]: I1203 22:15:44.927999 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-57g9j"] Dec 03 22:15:44.935180 master-0 kubenswrapper[36504]: I1203 22:15:44.935130 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-57g9j"] Dec 03 22:15:44.937828 master-0 kubenswrapper[36504]: I1203 22:15:44.937803 36504 scope.go:117] "RemoveContainer" containerID="d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c" Dec 03 22:15:44.961555 master-0 kubenswrapper[36504]: I1203 22:15:44.961193 36504 scope.go:117] "RemoveContainer" containerID="f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef" Dec 03 22:15:44.962340 master-0 kubenswrapper[36504]: E1203 22:15:44.962018 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef\": container with ID starting with f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef not found: ID does not exist" containerID="f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef" Dec 03 22:15:44.962626 master-0 kubenswrapper[36504]: I1203 22:15:44.962580 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef"} err="failed to get container status \"f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef\": rpc error: code = NotFound desc = could not find container 
\"f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef\": container with ID starting with f85402b35e5cb43243c3831c4050635c6a56bc51e04a1775de8e7991522c04ef not found: ID does not exist" Dec 03 22:15:44.962626 master-0 kubenswrapper[36504]: I1203 22:15:44.962620 36504 scope.go:117] "RemoveContainer" containerID="ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5" Dec 03 22:15:44.963164 master-0 kubenswrapper[36504]: E1203 22:15:44.963097 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5\": container with ID starting with ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5 not found: ID does not exist" containerID="ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5" Dec 03 22:15:44.963253 master-0 kubenswrapper[36504]: I1203 22:15:44.963177 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5"} err="failed to get container status \"ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5\": rpc error: code = NotFound desc = could not find container \"ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5\": container with ID starting with ae7ff6c147cc94236c9bee09dcd8cb8cae86cfbda1e88eae9efee3f3378d15f5 not found: ID does not exist" Dec 03 22:15:44.963253 master-0 kubenswrapper[36504]: I1203 22:15:44.963225 36504 scope.go:117] "RemoveContainer" containerID="d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c" Dec 03 22:15:44.963657 master-0 kubenswrapper[36504]: E1203 22:15:44.963615 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c\": container with ID starting with d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c not found: ID does not exist" containerID="d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c" Dec 03 22:15:44.963745 master-0 kubenswrapper[36504]: I1203 22:15:44.963667 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c"} err="failed to get container status \"d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c\": rpc error: code = NotFound desc = could not find container \"d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c\": container with ID starting with d3f34ffb6692870a3f3d5468d7b66636bc11a222e67e663be59293d93bc0a16c not found: ID does not exist" Dec 03 22:15:45.113979 master-0 kubenswrapper[36504]: I1203 22:15:45.113702 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d46019-3965-4845-a265-1287773cc7fe" path="/var/lib/kubelet/pods/92d46019-3965-4845-a265-1287773cc7fe/volumes" Dec 03 22:15:45.922108 master-0 kubenswrapper[36504]: I1203 22:15:45.921928 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r"] Dec 03 22:15:45.924965 master-0 kubenswrapper[36504]: E1203 22:15:45.924820 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d46019-3965-4845-a265-1287773cc7fe" containerName="registry-server" Dec 03 22:15:45.924965 master-0 kubenswrapper[36504]: I1203 22:15:45.924906 36504 
state_mem.go:107] "Deleted CPUSet assignment" podUID="92d46019-3965-4845-a265-1287773cc7fe" containerName="registry-server" Dec 03 22:15:45.924965 master-0 kubenswrapper[36504]: E1203 22:15:45.924949 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d46019-3965-4845-a265-1287773cc7fe" containerName="extract-content" Dec 03 22:15:45.924965 master-0 kubenswrapper[36504]: I1203 22:15:45.924968 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d46019-3965-4845-a265-1287773cc7fe" containerName="extract-content" Dec 03 22:15:45.925331 master-0 kubenswrapper[36504]: E1203 22:15:45.925056 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d46019-3965-4845-a265-1287773cc7fe" containerName="extract-utilities" Dec 03 22:15:45.925331 master-0 kubenswrapper[36504]: I1203 22:15:45.925077 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d46019-3965-4845-a265-1287773cc7fe" containerName="extract-utilities" Dec 03 22:15:45.925807 master-0 kubenswrapper[36504]: I1203 22:15:45.925431 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d46019-3965-4845-a265-1287773cc7fe" containerName="registry-server" Dec 03 22:15:45.927678 master-0 kubenswrapper[36504]: I1203 22:15:45.927572 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:45.937278 master-0 kubenswrapper[36504]: I1203 22:15:45.937186 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r"] Dec 03 22:15:46.077112 master-0 kubenswrapper[36504]: I1203 22:15:46.077042 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r94rg\" (UniqueName: \"kubernetes.io/projected/34aec373-c49c-4419-a48d-cff893920bef-kube-api-access-r94rg\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.077112 master-0 kubenswrapper[36504]: I1203 22:15:46.077118 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.077441 master-0 kubenswrapper[36504]: I1203 22:15:46.077192 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.178520 master-0 kubenswrapper[36504]: I1203 22:15:46.178357 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " 
pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.178520 master-0 kubenswrapper[36504]: I1203 22:15:46.178445 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r94rg\" (UniqueName: \"kubernetes.io/projected/34aec373-c49c-4419-a48d-cff893920bef-kube-api-access-r94rg\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.178520 master-0 kubenswrapper[36504]: I1203 22:15:46.178480 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.179038 master-0 kubenswrapper[36504]: I1203 22:15:46.179003 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.179285 master-0 kubenswrapper[36504]: I1203 22:15:46.179217 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.196572 master-0 kubenswrapper[36504]: I1203 22:15:46.196516 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r94rg\" (UniqueName: \"kubernetes.io/projected/34aec373-c49c-4419-a48d-cff893920bef-kube-api-access-r94rg\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.259985 master-0 kubenswrapper[36504]: I1203 22:15:46.259885 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:46.743196 master-0 kubenswrapper[36504]: I1203 22:15:46.740549 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r"] Dec 03 22:15:46.905147 master-0 kubenswrapper[36504]: I1203 22:15:46.905090 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" event={"ID":"34aec373-c49c-4419-a48d-cff893920bef","Type":"ContainerStarted","Data":"0b3a277d83f81cf13fd2c11d78b3eab11f13e4ebccbfd9c0c9117b5578771fb0"} Dec 03 22:15:47.919830 master-0 kubenswrapper[36504]: I1203 22:15:47.919675 36504 generic.go:334] "Generic (PLEG): container finished" podID="34aec373-c49c-4419-a48d-cff893920bef" containerID="2de20656424aee55e1c693f50ebda314607d428184dcf7182cef22eaa6a982dd" exitCode=0 Dec 03 22:15:47.919830 master-0 kubenswrapper[36504]: I1203 22:15:47.919763 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" event={"ID":"34aec373-c49c-4419-a48d-cff893920bef","Type":"ContainerDied","Data":"2de20656424aee55e1c693f50ebda314607d428184dcf7182cef22eaa6a982dd"} Dec 03 22:15:49.474107 master-0 kubenswrapper[36504]: I1203 22:15:49.472250 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zwtdd"] Dec 03 22:15:49.474107 master-0 kubenswrapper[36504]: I1203 22:15:49.473975 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.507389 master-0 kubenswrapper[36504]: I1203 22:15:49.507084 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwtdd"] Dec 03 22:15:49.643396 master-0 kubenswrapper[36504]: I1203 22:15:49.643331 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-utilities\") pod \"redhat-operators-zwtdd\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.643673 master-0 kubenswrapper[36504]: I1203 22:15:49.643445 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6dtk\" (UniqueName: \"kubernetes.io/projected/206689a9-2233-44a1-8070-abdf8adf2f86-kube-api-access-x6dtk\") pod \"redhat-operators-zwtdd\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.643673 master-0 kubenswrapper[36504]: I1203 22:15:49.643538 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-catalog-content\") pod \"redhat-operators-zwtdd\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.745251 master-0 kubenswrapper[36504]: I1203 22:15:49.745191 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6dtk\" (UniqueName: \"kubernetes.io/projected/206689a9-2233-44a1-8070-abdf8adf2f86-kube-api-access-x6dtk\") pod \"redhat-operators-zwtdd\" (UID: 
\"206689a9-2233-44a1-8070-abdf8adf2f86\") " pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.745485 master-0 kubenswrapper[36504]: I1203 22:15:49.745342 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-catalog-content\") pod \"redhat-operators-zwtdd\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.745485 master-0 kubenswrapper[36504]: I1203 22:15:49.745404 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-utilities\") pod \"redhat-operators-zwtdd\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.746085 master-0 kubenswrapper[36504]: I1203 22:15:49.746037 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-utilities\") pod \"redhat-operators-zwtdd\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.746152 master-0 kubenswrapper[36504]: I1203 22:15:49.746120 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-catalog-content\") pod \"redhat-operators-zwtdd\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.760993 master-0 kubenswrapper[36504]: I1203 22:15:49.760926 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6dtk\" (UniqueName: \"kubernetes.io/projected/206689a9-2233-44a1-8070-abdf8adf2f86-kube-api-access-x6dtk\") pod \"redhat-operators-zwtdd\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.797074 master-0 kubenswrapper[36504]: I1203 22:15:49.796993 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:49.948836 master-0 kubenswrapper[36504]: I1203 22:15:49.948749 36504 generic.go:334] "Generic (PLEG): container finished" podID="34aec373-c49c-4419-a48d-cff893920bef" containerID="6714a27f374e7cb627f62051c86fb471e26038b11a2da8fdb74fc937e286497d" exitCode=0 Dec 03 22:15:49.948836 master-0 kubenswrapper[36504]: I1203 22:15:49.948827 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" event={"ID":"34aec373-c49c-4419-a48d-cff893920bef","Type":"ContainerDied","Data":"6714a27f374e7cb627f62051c86fb471e26038b11a2da8fdb74fc937e286497d"} Dec 03 22:15:50.251761 master-0 kubenswrapper[36504]: I1203 22:15:50.251718 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zwtdd"] Dec 03 22:15:50.272023 master-0 kubenswrapper[36504]: W1203 22:15:50.271959 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod206689a9_2233_44a1_8070_abdf8adf2f86.slice/crio-4261f1ae4efebfbff026a012f83019226ebba84bcc9bff9f856f012a2d183516 WatchSource:0}: Error finding container 4261f1ae4efebfbff026a012f83019226ebba84bcc9bff9f856f012a2d183516: Status 404 returned error can't find the container with id 4261f1ae4efebfbff026a012f83019226ebba84bcc9bff9f856f012a2d183516 Dec 03 22:15:50.959379 master-0 kubenswrapper[36504]: I1203 22:15:50.959289 36504 generic.go:334] "Generic (PLEG): container finished" podID="34aec373-c49c-4419-a48d-cff893920bef" containerID="193c4593291633cf767a2d5bcb839bfcf51fad01160e0cbe55a849852c7aceb1" exitCode=0 Dec 03 22:15:50.960028 master-0 kubenswrapper[36504]: I1203 22:15:50.959384 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" event={"ID":"34aec373-c49c-4419-a48d-cff893920bef","Type":"ContainerDied","Data":"193c4593291633cf767a2d5bcb839bfcf51fad01160e0cbe55a849852c7aceb1"} Dec 03 22:15:50.961502 master-0 kubenswrapper[36504]: I1203 22:15:50.961472 36504 generic.go:334] "Generic (PLEG): container finished" podID="206689a9-2233-44a1-8070-abdf8adf2f86" containerID="9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc" exitCode=0 Dec 03 22:15:50.961647 master-0 kubenswrapper[36504]: I1203 22:15:50.961629 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwtdd" event={"ID":"206689a9-2233-44a1-8070-abdf8adf2f86","Type":"ContainerDied","Data":"9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc"} Dec 03 22:15:50.961742 master-0 kubenswrapper[36504]: I1203 22:15:50.961727 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwtdd" event={"ID":"206689a9-2233-44a1-8070-abdf8adf2f86","Type":"ContainerStarted","Data":"4261f1ae4efebfbff026a012f83019226ebba84bcc9bff9f856f012a2d183516"} Dec 03 22:15:50.964044 master-0 kubenswrapper[36504]: I1203 22:15:50.963999 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:15:51.973561 master-0 kubenswrapper[36504]: I1203 22:15:51.973481 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwtdd" event={"ID":"206689a9-2233-44a1-8070-abdf8adf2f86","Type":"ContainerStarted","Data":"bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941"} Dec 
03 22:15:52.560479 master-0 kubenswrapper[36504]: I1203 22:15:52.560337 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:52.699682 master-0 kubenswrapper[36504]: I1203 22:15:52.699590 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r94rg\" (UniqueName: \"kubernetes.io/projected/34aec373-c49c-4419-a48d-cff893920bef-kube-api-access-r94rg\") pod \"34aec373-c49c-4419-a48d-cff893920bef\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " Dec 03 22:15:52.700065 master-0 kubenswrapper[36504]: I1203 22:15:52.699748 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-util\") pod \"34aec373-c49c-4419-a48d-cff893920bef\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " Dec 03 22:15:52.700065 master-0 kubenswrapper[36504]: I1203 22:15:52.699788 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-bundle\") pod \"34aec373-c49c-4419-a48d-cff893920bef\" (UID: \"34aec373-c49c-4419-a48d-cff893920bef\") " Dec 03 22:15:52.704448 master-0 kubenswrapper[36504]: I1203 22:15:52.701573 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-bundle" (OuterVolumeSpecName: "bundle") pod "34aec373-c49c-4419-a48d-cff893920bef" (UID: "34aec373-c49c-4419-a48d-cff893920bef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:15:52.704448 master-0 kubenswrapper[36504]: I1203 22:15:52.704310 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34aec373-c49c-4419-a48d-cff893920bef-kube-api-access-r94rg" (OuterVolumeSpecName: "kube-api-access-r94rg") pod "34aec373-c49c-4419-a48d-cff893920bef" (UID: "34aec373-c49c-4419-a48d-cff893920bef"). InnerVolumeSpecName "kube-api-access-r94rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:15:52.734434 master-0 kubenswrapper[36504]: I1203 22:15:52.734212 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-util" (OuterVolumeSpecName: "util") pod "34aec373-c49c-4419-a48d-cff893920bef" (UID: "34aec373-c49c-4419-a48d-cff893920bef"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:15:52.803625 master-0 kubenswrapper[36504]: I1203 22:15:52.803423 36504 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-util\") on node \"master-0\" DevicePath \"\"" Dec 03 22:15:52.803625 master-0 kubenswrapper[36504]: I1203 22:15:52.803535 36504 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34aec373-c49c-4419-a48d-cff893920bef-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:15:52.803625 master-0 kubenswrapper[36504]: I1203 22:15:52.803612 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r94rg\" (UniqueName: \"kubernetes.io/projected/34aec373-c49c-4419-a48d-cff893920bef-kube-api-access-r94rg\") on node \"master-0\" DevicePath \"\"" Dec 03 22:15:52.993781 master-0 kubenswrapper[36504]: I1203 22:15:52.993690 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" Dec 03 22:15:52.994375 master-0 kubenswrapper[36504]: I1203 22:15:52.993935 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4gfm2r" event={"ID":"34aec373-c49c-4419-a48d-cff893920bef","Type":"ContainerDied","Data":"0b3a277d83f81cf13fd2c11d78b3eab11f13e4ebccbfd9c0c9117b5578771fb0"} Dec 03 22:15:52.994375 master-0 kubenswrapper[36504]: I1203 22:15:52.994015 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b3a277d83f81cf13fd2c11d78b3eab11f13e4ebccbfd9c0c9117b5578771fb0" Dec 03 22:15:53.457901 master-0 kubenswrapper[36504]: I1203 22:15:53.452923 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r4d2d"] Dec 03 22:15:53.457901 master-0 kubenswrapper[36504]: E1203 22:15:53.453319 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aec373-c49c-4419-a48d-cff893920bef" containerName="extract" Dec 03 22:15:53.457901 master-0 kubenswrapper[36504]: I1203 22:15:53.453336 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aec373-c49c-4419-a48d-cff893920bef" containerName="extract" Dec 03 22:15:53.457901 master-0 kubenswrapper[36504]: E1203 22:15:53.453382 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aec373-c49c-4419-a48d-cff893920bef" containerName="util" Dec 03 22:15:53.457901 master-0 kubenswrapper[36504]: I1203 22:15:53.453392 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aec373-c49c-4419-a48d-cff893920bef" containerName="util" Dec 03 22:15:53.457901 master-0 kubenswrapper[36504]: E1203 22:15:53.453404 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34aec373-c49c-4419-a48d-cff893920bef" containerName="pull" Dec 03 22:15:53.457901 master-0 kubenswrapper[36504]: I1203 22:15:53.453413 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aec373-c49c-4419-a48d-cff893920bef" containerName="pull" Dec 03 22:15:53.457901 master-0 kubenswrapper[36504]: I1203 22:15:53.453607 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="34aec373-c49c-4419-a48d-cff893920bef" containerName="extract" Dec 03 22:15:53.457901 master-0 kubenswrapper[36504]: I1203 22:15:53.454968 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.487984 master-0 kubenswrapper[36504]: I1203 22:15:53.487861 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4d2d"] Dec 03 22:15:53.617729 master-0 kubenswrapper[36504]: I1203 22:15:53.617655 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gplt2\" (UniqueName: \"kubernetes.io/projected/dd6c47ce-dd4e-415b-a497-cf019e16a778-kube-api-access-gplt2\") pod \"community-operators-r4d2d\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.617981 master-0 kubenswrapper[36504]: I1203 22:15:53.617782 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-catalog-content\") pod \"community-operators-r4d2d\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.617981 master-0 kubenswrapper[36504]: I1203 22:15:53.617833 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-utilities\") pod \"community-operators-r4d2d\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.799681 master-0 kubenswrapper[36504]: I1203 22:15:53.720049 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-utilities\") pod \"community-operators-r4d2d\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.799681 master-0 kubenswrapper[36504]: I1203 22:15:53.720197 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gplt2\" (UniqueName: \"kubernetes.io/projected/dd6c47ce-dd4e-415b-a497-cf019e16a778-kube-api-access-gplt2\") pod \"community-operators-r4d2d\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.799681 master-0 kubenswrapper[36504]: I1203 22:15:53.720286 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-catalog-content\") pod \"community-operators-r4d2d\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.799681 master-0 kubenswrapper[36504]: I1203 22:15:53.721016 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-catalog-content\") pod \"community-operators-r4d2d\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.799681 master-0 kubenswrapper[36504]: I1203 22:15:53.721302 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-utilities\") pod \"community-operators-r4d2d\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " 
pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.799681 master-0 kubenswrapper[36504]: I1203 22:15:53.741574 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gplt2\" (UniqueName: \"kubernetes.io/projected/dd6c47ce-dd4e-415b-a497-cf019e16a778-kube-api-access-gplt2\") pod \"community-operators-r4d2d\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:53.805258 master-0 kubenswrapper[36504]: I1203 22:15:53.805205 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:15:54.065796 master-0 kubenswrapper[36504]: I1203 22:15:54.061450 36504 generic.go:334] "Generic (PLEG): container finished" podID="206689a9-2233-44a1-8070-abdf8adf2f86" containerID="bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941" exitCode=0 Dec 03 22:15:54.065796 master-0 kubenswrapper[36504]: I1203 22:15:54.061504 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwtdd" event={"ID":"206689a9-2233-44a1-8070-abdf8adf2f86","Type":"ContainerDied","Data":"bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941"} Dec 03 22:15:54.363538 master-0 kubenswrapper[36504]: I1203 22:15:54.363379 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4d2d"] Dec 03 22:15:54.369447 master-0 kubenswrapper[36504]: W1203 22:15:54.369424 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd6c47ce_dd4e_415b_a497_cf019e16a778.slice/crio-683451a2d091bcf6fa1dc9eaf0f2c83b429231cd88223269a3702c28f2bc7efe WatchSource:0}: Error finding container 683451a2d091bcf6fa1dc9eaf0f2c83b429231cd88223269a3702c28f2bc7efe: Status 404 returned error can't find the container with id 683451a2d091bcf6fa1dc9eaf0f2c83b429231cd88223269a3702c28f2bc7efe Dec 03 22:15:55.087346 master-0 kubenswrapper[36504]: I1203 22:15:55.087170 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwtdd" event={"ID":"206689a9-2233-44a1-8070-abdf8adf2f86","Type":"ContainerStarted","Data":"85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc"} Dec 03 22:15:55.089987 master-0 kubenswrapper[36504]: I1203 22:15:55.089927 36504 generic.go:334] "Generic (PLEG): container finished" podID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerID="fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336" exitCode=0 Dec 03 22:15:55.090104 master-0 kubenswrapper[36504]: I1203 22:15:55.089987 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4d2d" event={"ID":"dd6c47ce-dd4e-415b-a497-cf019e16a778","Type":"ContainerDied","Data":"fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336"} Dec 03 22:15:55.090104 master-0 kubenswrapper[36504]: I1203 22:15:55.090020 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4d2d" event={"ID":"dd6c47ce-dd4e-415b-a497-cf019e16a778","Type":"ContainerStarted","Data":"683451a2d091bcf6fa1dc9eaf0f2c83b429231cd88223269a3702c28f2bc7efe"} Dec 03 22:15:55.110869 master-0 kubenswrapper[36504]: I1203 22:15:55.110743 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zwtdd" podStartSLOduration=2.49831132 
podStartE2EDuration="6.11071867s" podCreationTimestamp="2025-12-03 22:15:49 +0000 UTC" firstStartedPulling="2025-12-03 22:15:50.963919809 +0000 UTC m=+316.183691816" lastFinishedPulling="2025-12-03 22:15:54.576327159 +0000 UTC m=+319.796099166" observedRunningTime="2025-12-03 22:15:55.110693009 +0000 UTC m=+320.330465026" watchObservedRunningTime="2025-12-03 22:15:55.11071867 +0000 UTC m=+320.330490677" Dec 03 22:15:56.099586 master-0 kubenswrapper[36504]: I1203 22:15:56.099430 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4d2d" event={"ID":"dd6c47ce-dd4e-415b-a497-cf019e16a778","Type":"ContainerStarted","Data":"dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10"} Dec 03 22:15:58.118407 master-0 kubenswrapper[36504]: I1203 22:15:58.118345 36504 generic.go:334] "Generic (PLEG): container finished" podID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerID="dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10" exitCode=0 Dec 03 22:15:58.118407 master-0 kubenswrapper[36504]: I1203 22:15:58.118396 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4d2d" event={"ID":"dd6c47ce-dd4e-415b-a497-cf019e16a778","Type":"ContainerDied","Data":"dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10"} Dec 03 22:15:58.532720 master-0 kubenswrapper[36504]: I1203 22:15:58.532659 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-68cfd8bfc5-t7rsr"] Dec 03 22:15:58.533620 master-0 kubenswrapper[36504]: I1203 22:15:58.533598 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.540901 master-0 kubenswrapper[36504]: I1203 22:15:58.540853 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Dec 03 22:15:58.541332 master-0 kubenswrapper[36504]: I1203 22:15:58.541308 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Dec 03 22:15:58.541700 master-0 kubenswrapper[36504]: I1203 22:15:58.541675 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Dec 03 22:15:58.542160 master-0 kubenswrapper[36504]: I1203 22:15:58.542127 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Dec 03 22:15:58.543219 master-0 kubenswrapper[36504]: I1203 22:15:58.543195 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Dec 03 22:15:58.559910 master-0 kubenswrapper[36504]: I1203 22:15:58.559847 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-68cfd8bfc5-t7rsr"] Dec 03 22:15:58.609216 master-0 kubenswrapper[36504]: I1203 22:15:58.608646 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q964s\" (UniqueName: \"kubernetes.io/projected/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-kube-api-access-q964s\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.609216 master-0 kubenswrapper[36504]: I1203 22:15:58.608748 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-webhook-cert\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.609216 master-0 kubenswrapper[36504]: I1203 22:15:58.608785 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-metrics-cert\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.609216 master-0 kubenswrapper[36504]: I1203 22:15:58.608802 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-socket-dir\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.609216 master-0 kubenswrapper[36504]: I1203 22:15:58.608835 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-apiservice-cert\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.711797 master-0 kubenswrapper[36504]: I1203 22:15:58.710515 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-apiservice-cert\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.711797 master-0 kubenswrapper[36504]: I1203 22:15:58.710651 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q964s\" (UniqueName: \"kubernetes.io/projected/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-kube-api-access-q964s\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.711797 master-0 kubenswrapper[36504]: I1203 22:15:58.710756 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-webhook-cert\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.711797 master-0 kubenswrapper[36504]: I1203 22:15:58.710795 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-metrics-cert\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.711797 master-0 kubenswrapper[36504]: I1203 22:15:58.710815 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-socket-dir\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " 
pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.711797 master-0 kubenswrapper[36504]: I1203 22:15:58.711442 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-socket-dir\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.720790 master-0 kubenswrapper[36504]: I1203 22:15:58.718934 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-metrics-cert\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.720790 master-0 kubenswrapper[36504]: I1203 22:15:58.719067 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-apiservice-cert\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.720790 master-0 kubenswrapper[36504]: I1203 22:15:58.720146 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-webhook-cert\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.731788 master-0 kubenswrapper[36504]: I1203 22:15:58.730805 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q964s\" (UniqueName: \"kubernetes.io/projected/8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb-kube-api-access-q964s\") pod \"lvms-operator-68cfd8bfc5-t7rsr\" (UID: \"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb\") " pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:58.853246 master-0 kubenswrapper[36504]: I1203 22:15:58.853103 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:15:59.130881 master-0 kubenswrapper[36504]: I1203 22:15:59.130820 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4d2d" event={"ID":"dd6c47ce-dd4e-415b-a497-cf019e16a778","Type":"ContainerStarted","Data":"fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e"} Dec 03 22:15:59.161104 master-0 kubenswrapper[36504]: I1203 22:15:59.159750 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r4d2d" podStartSLOduration=2.730929188 podStartE2EDuration="6.159720821s" podCreationTimestamp="2025-12-03 22:15:53 +0000 UTC" firstStartedPulling="2025-12-03 22:15:55.091749148 +0000 UTC m=+320.311521155" lastFinishedPulling="2025-12-03 22:15:58.520540781 +0000 UTC m=+323.740312788" observedRunningTime="2025-12-03 22:15:59.154194049 +0000 UTC m=+324.373966056" watchObservedRunningTime="2025-12-03 22:15:59.159720821 +0000 UTC m=+324.379492838" Dec 03 22:15:59.390243 master-0 kubenswrapper[36504]: I1203 22:15:59.390044 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-68cfd8bfc5-t7rsr"] Dec 03 22:15:59.407367 master-0 kubenswrapper[36504]: W1203 22:15:59.407318 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ccfce7e_cbef_4cba_b6e0_a9e7a5a34ddb.slice/crio-7059798709a138229509b27012f6f364e2150c1e64ed7aa2061841e897649d9b WatchSource:0}: Error finding container 7059798709a138229509b27012f6f364e2150c1e64ed7aa2061841e897649d9b: Status 404 returned error can't find the container with id 7059798709a138229509b27012f6f364e2150c1e64ed7aa2061841e897649d9b Dec 03 22:15:59.798028 master-0 kubenswrapper[36504]: I1203 22:15:59.797827 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:15:59.798028 master-0 kubenswrapper[36504]: I1203 22:15:59.797945 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:16:00.141548 master-0 kubenswrapper[36504]: I1203 22:16:00.140138 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" event={"ID":"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb","Type":"ContainerStarted","Data":"7059798709a138229509b27012f6f364e2150c1e64ed7aa2061841e897649d9b"} Dec 03 22:16:00.887023 master-0 kubenswrapper[36504]: I1203 22:16:00.886950 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zwtdd" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" containerName="registry-server" probeResult="failure" output=< Dec 03 22:16:00.887023 master-0 kubenswrapper[36504]: timeout: failed to connect service ":50051" within 1s Dec 03 22:16:00.887023 master-0 kubenswrapper[36504]: > Dec 03 22:16:03.810149 master-0 kubenswrapper[36504]: I1203 22:16:03.810022 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:16:03.810149 master-0 kubenswrapper[36504]: I1203 22:16:03.810101 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:16:03.874992 master-0 kubenswrapper[36504]: I1203 22:16:03.874910 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:16:04.245518 master-0 kubenswrapper[36504]: I1203 22:16:04.244999 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:16:06.061948 master-0 kubenswrapper[36504]: I1203 22:16:06.061888 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r4d2d"] Dec 03 22:16:06.203285 master-0 kubenswrapper[36504]: I1203 22:16:06.203191 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" event={"ID":"8ccfce7e-cbef-4cba-b6e0-a9e7a5a34ddb","Type":"ContainerStarted","Data":"d564f77c7eb101340e6b0d49221f07a7d532fdc246d01f3157033c14dbda155d"} Dec 03 22:16:06.203639 master-0 kubenswrapper[36504]: I1203 22:16:06.203371 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r4d2d" podUID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerName="registry-server" containerID="cri-o://fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e" gracePeriod=2 Dec 03 22:16:06.236932 master-0 kubenswrapper[36504]: I1203 22:16:06.236817 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" podStartSLOduration=2.163601798 podStartE2EDuration="8.236770673s" podCreationTimestamp="2025-12-03 22:15:58 +0000 UTC" firstStartedPulling="2025-12-03 22:15:59.410311869 +0000 UTC m=+324.630083876" lastFinishedPulling="2025-12-03 22:16:05.483480724 +0000 UTC m=+330.703252751" observedRunningTime="2025-12-03 22:16:06.230881729 +0000 UTC m=+331.450653746" watchObservedRunningTime="2025-12-03 22:16:06.236770673 +0000 UTC m=+331.456542680" Dec 03 22:16:06.676183 master-0 kubenswrapper[36504]: I1203 22:16:06.676048 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:16:06.785820 master-0 kubenswrapper[36504]: I1203 22:16:06.785726 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gplt2\" (UniqueName: \"kubernetes.io/projected/dd6c47ce-dd4e-415b-a497-cf019e16a778-kube-api-access-gplt2\") pod \"dd6c47ce-dd4e-415b-a497-cf019e16a778\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " Dec 03 22:16:06.786418 master-0 kubenswrapper[36504]: I1203 22:16:06.786343 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-utilities\") pod \"dd6c47ce-dd4e-415b-a497-cf019e16a778\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " Dec 03 22:16:06.786672 master-0 kubenswrapper[36504]: I1203 22:16:06.786654 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-catalog-content\") pod \"dd6c47ce-dd4e-415b-a497-cf019e16a778\" (UID: \"dd6c47ce-dd4e-415b-a497-cf019e16a778\") " Dec 03 22:16:06.787549 master-0 kubenswrapper[36504]: I1203 22:16:06.787509 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-utilities" (OuterVolumeSpecName: "utilities") pod "dd6c47ce-dd4e-415b-a497-cf019e16a778" (UID: "dd6c47ce-dd4e-415b-a497-cf019e16a778"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:06.791280 master-0 kubenswrapper[36504]: I1203 22:16:06.791190 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6c47ce-dd4e-415b-a497-cf019e16a778-kube-api-access-gplt2" (OuterVolumeSpecName: "kube-api-access-gplt2") pod "dd6c47ce-dd4e-415b-a497-cf019e16a778" (UID: "dd6c47ce-dd4e-415b-a497-cf019e16a778"). InnerVolumeSpecName "kube-api-access-gplt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:16:06.889852 master-0 kubenswrapper[36504]: I1203 22:16:06.889738 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gplt2\" (UniqueName: \"kubernetes.io/projected/dd6c47ce-dd4e-415b-a497-cf019e16a778-kube-api-access-gplt2\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:06.889852 master-0 kubenswrapper[36504]: I1203 22:16:06.889838 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:06.911257 master-0 kubenswrapper[36504]: I1203 22:16:06.911163 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd6c47ce-dd4e-415b-a497-cf019e16a778" (UID: "dd6c47ce-dd4e-415b-a497-cf019e16a778"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:06.991887 master-0 kubenswrapper[36504]: I1203 22:16:06.991835 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6c47ce-dd4e-415b-a497-cf019e16a778-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:07.215810 master-0 kubenswrapper[36504]: I1203 22:16:07.215587 36504 generic.go:334] "Generic (PLEG): container finished" podID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerID="fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e" exitCode=0 Dec 03 22:16:07.215810 master-0 kubenswrapper[36504]: I1203 22:16:07.215704 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4d2d" event={"ID":"dd6c47ce-dd4e-415b-a497-cf019e16a778","Type":"ContainerDied","Data":"fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e"} Dec 03 22:16:07.215810 master-0 kubenswrapper[36504]: I1203 22:16:07.215757 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r4d2d" Dec 03 22:16:07.215810 master-0 kubenswrapper[36504]: I1203 22:16:07.215817 36504 scope.go:117] "RemoveContainer" containerID="fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e" Dec 03 22:16:07.216854 master-0 kubenswrapper[36504]: I1203 22:16:07.215797 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4d2d" event={"ID":"dd6c47ce-dd4e-415b-a497-cf019e16a778","Type":"ContainerDied","Data":"683451a2d091bcf6fa1dc9eaf0f2c83b429231cd88223269a3702c28f2bc7efe"} Dec 03 22:16:07.216854 master-0 kubenswrapper[36504]: I1203 22:16:07.216345 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:16:07.221321 master-0 kubenswrapper[36504]: I1203 22:16:07.221282 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-68cfd8bfc5-t7rsr" Dec 03 22:16:07.244727 master-0 kubenswrapper[36504]: I1203 22:16:07.244674 36504 scope.go:117] "RemoveContainer" containerID="dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10" Dec 03 22:16:07.265283 master-0 kubenswrapper[36504]: I1203 22:16:07.265203 36504 scope.go:117] "RemoveContainer" containerID="fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336" Dec 03 22:16:07.292673 master-0 kubenswrapper[36504]: I1203 22:16:07.292588 36504 scope.go:117] "RemoveContainer" containerID="fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e" Dec 03 22:16:07.293303 master-0 kubenswrapper[36504]: E1203 22:16:07.293240 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e\": container with ID starting with fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e not found: ID does not exist" containerID="fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e" Dec 03 22:16:07.293362 master-0 kubenswrapper[36504]: I1203 22:16:07.293307 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e"} err="failed to get container status \"fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e\": rpc error: code = NotFound desc = could not find container \"fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e\": container with ID starting with fd06b1103df9aa03233274fe3f0fa3cec38f19c790d61c3a88149bd5b27be43e not found: ID does not exist" Dec 03 22:16:07.293362 master-0 kubenswrapper[36504]: I1203 22:16:07.293354 36504 scope.go:117] "RemoveContainer" containerID="dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10" Dec 03 22:16:07.293757 master-0 kubenswrapper[36504]: E1203 22:16:07.293701 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10\": container with ID starting with dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10 not found: ID does not exist" containerID="dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10" Dec 03 22:16:07.293852 master-0 kubenswrapper[36504]: I1203 22:16:07.293763 36504 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10"} err="failed to get container status \"dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10\": rpc error: code = NotFound desc = could not find container \"dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10\": container with ID starting with dd4a2995229cb87273e1821721595f3b32a64e7ee89ee04ce389174bf6440c10 not found: ID does not exist" Dec 03 22:16:07.293852 master-0 kubenswrapper[36504]: I1203 22:16:07.293808 36504 scope.go:117] "RemoveContainer" containerID="fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336" Dec 03 22:16:07.294636 master-0 kubenswrapper[36504]: E1203 22:16:07.294314 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336\": container with ID starting with fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336 not found: ID does not exist" containerID="fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336" Dec 03 22:16:07.294636 master-0 kubenswrapper[36504]: I1203 22:16:07.294356 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336"} err="failed to get container status \"fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336\": rpc error: code = NotFound desc = could not find container \"fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336\": container with ID starting with fa293b7dd917b3550d32414697c7f84200b92bd9d55ae2b6bcf482fa20bba336 not found: ID does not exist" Dec 03 22:16:07.414252 master-0 kubenswrapper[36504]: I1203 22:16:07.414153 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r4d2d"] Dec 03 22:16:07.420821 master-0 kubenswrapper[36504]: I1203 22:16:07.420423 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r4d2d"] Dec 03 22:16:08.095397 master-0 kubenswrapper[36504]: I1203 22:16:08.095329 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:16:09.104274 master-0 kubenswrapper[36504]: I1203 22:16:09.104167 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6c47ce-dd4e-415b-a497-cf019e16a778" path="/var/lib/kubelet/pods/dd6c47ce-dd4e-415b-a497-cf019e16a778/volumes" Dec 03 22:16:09.842626 master-0 kubenswrapper[36504]: I1203 22:16:09.842537 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:16:09.887715 master-0 kubenswrapper[36504]: I1203 22:16:09.887630 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:16:11.117930 master-0 kubenswrapper[36504]: I1203 22:16:11.117832 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj"] Dec 03 22:16:11.118601 master-0 kubenswrapper[36504]: E1203 22:16:11.118282 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerName="registry-server" Dec 03 22:16:11.118601 master-0 kubenswrapper[36504]: I1203 22:16:11.118307 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerName="registry-server" Dec 03 22:16:11.118601 master-0 kubenswrapper[36504]: E1203 22:16:11.118336 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerName="extract-content" Dec 03 22:16:11.118601 master-0 kubenswrapper[36504]: I1203 22:16:11.118348 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerName="extract-content" Dec 03 22:16:11.118601 master-0 kubenswrapper[36504]: E1203 22:16:11.118376 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerName="extract-utilities" Dec 03 22:16:11.118601 master-0 kubenswrapper[36504]: I1203 22:16:11.118389 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerName="extract-utilities" Dec 03 22:16:11.118961 master-0 kubenswrapper[36504]: I1203 22:16:11.118627 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6c47ce-dd4e-415b-a497-cf019e16a778" containerName="registry-server" Dec 03 22:16:11.119871 master-0 kubenswrapper[36504]: I1203 22:16:11.119840 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.131138 master-0 kubenswrapper[36504]: I1203 22:16:11.131048 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj"] Dec 03 22:16:11.282078 master-0 kubenswrapper[36504]: I1203 22:16:11.282013 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.282907 master-0 kubenswrapper[36504]: I1203 22:16:11.282885 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.283087 master-0 kubenswrapper[36504]: I1203 22:16:11.283065 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4n9s\" (UniqueName: \"kubernetes.io/projected/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-kube-api-access-z4n9s\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.294843 master-0 kubenswrapper[36504]: I1203 22:16:11.294728 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh"] Dec 03 22:16:11.296747 master-0 kubenswrapper[36504]: I1203 22:16:11.296722 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.310649 master-0 kubenswrapper[36504]: I1203 22:16:11.310599 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh"] Dec 03 22:16:11.385083 master-0 kubenswrapper[36504]: I1203 22:16:11.384952 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.385083 master-0 kubenswrapper[36504]: I1203 22:16:11.385056 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.385593 master-0 kubenswrapper[36504]: I1203 22:16:11.385098 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4n9s\" (UniqueName: \"kubernetes.io/projected/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-kube-api-access-z4n9s\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.385709 master-0 kubenswrapper[36504]: I1203 22:16:11.385671 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.387576 master-0 kubenswrapper[36504]: I1203 22:16:11.387504 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.407712 master-0 kubenswrapper[36504]: I1203 22:16:11.407641 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4n9s\" (UniqueName: \"kubernetes.io/projected/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-kube-api-access-z4n9s\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.486893 master-0 kubenswrapper[36504]: I1203 22:16:11.486807 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh\" (UID: 
\"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.486893 master-0 kubenswrapper[36504]: I1203 22:16:11.486886 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.487684 master-0 kubenswrapper[36504]: I1203 22:16:11.487624 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxwd\" (UniqueName: \"kubernetes.io/projected/2a9d89a2-2e6e-4618-ab19-9df20a663e85-kube-api-access-9sxwd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.493897 master-0 kubenswrapper[36504]: I1203 22:16:11.493848 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:11.589044 master-0 kubenswrapper[36504]: I1203 22:16:11.588963 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.589044 master-0 kubenswrapper[36504]: I1203 22:16:11.589044 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.589362 master-0 kubenswrapper[36504]: I1203 22:16:11.589147 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxwd\" (UniqueName: \"kubernetes.io/projected/2a9d89a2-2e6e-4618-ab19-9df20a663e85-kube-api-access-9sxwd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.589600 master-0 kubenswrapper[36504]: I1203 22:16:11.589519 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.589797 master-0 kubenswrapper[36504]: I1203 22:16:11.589735 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-util\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.653576 master-0 kubenswrapper[36504]: I1203 22:16:11.653523 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxwd\" (UniqueName: \"kubernetes.io/projected/2a9d89a2-2e6e-4618-ab19-9df20a663e85-kube-api-access-9sxwd\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:11.916089 master-0 kubenswrapper[36504]: I1203 22:16:11.915891 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:14.103054 master-0 kubenswrapper[36504]: W1203 22:16:14.102965 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a9d89a2_2e6e_4618_ab19_9df20a663e85.slice/crio-480c4a217fb32a729479e699481ccd4f7136dea4b80852fbd45caf770ebbc8c2 WatchSource:0}: Error finding container 480c4a217fb32a729479e699481ccd4f7136dea4b80852fbd45caf770ebbc8c2: Status 404 returned error can't find the container with id 480c4a217fb32a729479e699481ccd4f7136dea4b80852fbd45caf770ebbc8c2 Dec 03 22:16:14.103619 master-0 kubenswrapper[36504]: W1203 22:16:14.103373 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b8ff4a3_aab3_43df_ad20_e29d2c68650f.slice/crio-08ef8e2938522e4002dcd941b0c39879b74aeddd093188b6ff4954698d3fc8c9 WatchSource:0}: Error finding container 08ef8e2938522e4002dcd941b0c39879b74aeddd093188b6ff4954698d3fc8c9: Status 404 returned error can't find the container with id 08ef8e2938522e4002dcd941b0c39879b74aeddd093188b6ff4954698d3fc8c9 Dec 03 22:16:14.104966 master-0 kubenswrapper[36504]: I1203 22:16:14.104923 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj"] Dec 03 22:16:14.115338 master-0 kubenswrapper[36504]: I1203 22:16:14.115278 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh"] Dec 03 22:16:14.286823 master-0 kubenswrapper[36504]: I1203 22:16:14.286721 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" event={"ID":"2a9d89a2-2e6e-4618-ab19-9df20a663e85","Type":"ContainerStarted","Data":"480c4a217fb32a729479e699481ccd4f7136dea4b80852fbd45caf770ebbc8c2"} Dec 03 22:16:14.287996 master-0 kubenswrapper[36504]: I1203 22:16:14.287952 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" event={"ID":"7b8ff4a3-aab3-43df-ad20-e29d2c68650f","Type":"ContainerStarted","Data":"08ef8e2938522e4002dcd941b0c39879b74aeddd093188b6ff4954698d3fc8c9"} Dec 03 22:16:14.570629 master-0 kubenswrapper[36504]: I1203 22:16:14.570543 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx"] Dec 03 22:16:14.572619 master-0 kubenswrapper[36504]: I1203 22:16:14.572561 
36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.684597 master-0 kubenswrapper[36504]: I1203 22:16:14.684499 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx"] Dec 03 22:16:14.693058 master-0 kubenswrapper[36504]: I1203 22:16:14.692969 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwtdd"] Dec 03 22:16:14.693534 master-0 kubenswrapper[36504]: I1203 22:16:14.693444 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zwtdd" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" containerName="registry-server" containerID="cri-o://85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc" gracePeriod=2 Dec 03 22:16:14.751213 master-0 kubenswrapper[36504]: I1203 22:16:14.751183 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.751372 master-0 kubenswrapper[36504]: I1203 22:16:14.751350 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.751465 master-0 kubenswrapper[36504]: I1203 22:16:14.751452 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szwrq\" (UniqueName: \"kubernetes.io/projected/cb7a96cf-9f18-4c1d-a677-134c9d677997-kube-api-access-szwrq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.857808 master-0 kubenswrapper[36504]: I1203 22:16:14.854130 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.857808 master-0 kubenswrapper[36504]: I1203 22:16:14.854228 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szwrq\" (UniqueName: \"kubernetes.io/projected/cb7a96cf-9f18-4c1d-a677-134c9d677997-kube-api-access-szwrq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.857808 master-0 kubenswrapper[36504]: I1203 22:16:14.854333 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.857808 master-0 kubenswrapper[36504]: I1203 22:16:14.854912 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.857808 master-0 kubenswrapper[36504]: I1203 22:16:14.854993 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.879800 master-0 kubenswrapper[36504]: I1203 22:16:14.878119 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szwrq\" (UniqueName: \"kubernetes.io/projected/cb7a96cf-9f18-4c1d-a677-134c9d677997-kube-api-access-szwrq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:14.905296 master-0 kubenswrapper[36504]: I1203 22:16:14.905230 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:15.179885 master-0 kubenswrapper[36504]: I1203 22:16:15.179807 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:16:15.298330 master-0 kubenswrapper[36504]: I1203 22:16:15.298175 36504 generic.go:334] "Generic (PLEG): container finished" podID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerID="d68d1e9c3273464063c959749196d3b7f0bca1d070db47f5a7d78d7bbaeabd7c" exitCode=0 Dec 03 22:16:15.298330 master-0 kubenswrapper[36504]: I1203 22:16:15.298249 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" event={"ID":"2a9d89a2-2e6e-4618-ab19-9df20a663e85","Type":"ContainerDied","Data":"d68d1e9c3273464063c959749196d3b7f0bca1d070db47f5a7d78d7bbaeabd7c"} Dec 03 22:16:15.301900 master-0 kubenswrapper[36504]: I1203 22:16:15.301794 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zwtdd" Dec 03 22:16:15.301900 master-0 kubenswrapper[36504]: I1203 22:16:15.301826 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwtdd" event={"ID":"206689a9-2233-44a1-8070-abdf8adf2f86","Type":"ContainerDied","Data":"85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc"} Dec 03 22:16:15.301986 master-0 kubenswrapper[36504]: I1203 22:16:15.301923 36504 scope.go:117] "RemoveContainer" containerID="85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc" Dec 03 22:16:15.302261 master-0 kubenswrapper[36504]: I1203 22:16:15.301677 36504 generic.go:334] "Generic (PLEG): container finished" podID="206689a9-2233-44a1-8070-abdf8adf2f86" containerID="85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc" exitCode=0 Dec 03 22:16:15.302783 master-0 kubenswrapper[36504]: I1203 22:16:15.302331 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zwtdd" event={"ID":"206689a9-2233-44a1-8070-abdf8adf2f86","Type":"ContainerDied","Data":"4261f1ae4efebfbff026a012f83019226ebba84bcc9bff9f856f012a2d183516"} Dec 03 22:16:15.304647 master-0 kubenswrapper[36504]: I1203 22:16:15.304586 36504 generic.go:334] "Generic (PLEG): container finished" podID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerID="7743ff0ec4a48b3100de33f1bdd82351af82be7d072e1a3a2e5fdd3ca2f03693" exitCode=0 Dec 03 22:16:15.304716 master-0 kubenswrapper[36504]: I1203 22:16:15.304665 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" event={"ID":"7b8ff4a3-aab3-43df-ad20-e29d2c68650f","Type":"ContainerDied","Data":"7743ff0ec4a48b3100de33f1bdd82351af82be7d072e1a3a2e5fdd3ca2f03693"} Dec 03 22:16:15.320753 master-0 kubenswrapper[36504]: I1203 22:16:15.320664 36504 scope.go:117] "RemoveContainer" containerID="bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941" Dec 03 22:16:15.353898 master-0 kubenswrapper[36504]: I1203 22:16:15.353853 36504 scope.go:117] "RemoveContainer" containerID="9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc" Dec 03 22:16:15.363191 master-0 kubenswrapper[36504]: I1203 22:16:15.363134 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-utilities\") pod \"206689a9-2233-44a1-8070-abdf8adf2f86\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " Dec 03 22:16:15.363277 master-0 kubenswrapper[36504]: I1203 22:16:15.363211 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6dtk\" (UniqueName: \"kubernetes.io/projected/206689a9-2233-44a1-8070-abdf8adf2f86-kube-api-access-x6dtk\") pod \"206689a9-2233-44a1-8070-abdf8adf2f86\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " Dec 03 22:16:15.363417 master-0 kubenswrapper[36504]: I1203 22:16:15.363391 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-catalog-content\") pod \"206689a9-2233-44a1-8070-abdf8adf2f86\" (UID: \"206689a9-2233-44a1-8070-abdf8adf2f86\") " Dec 03 22:16:15.365386 master-0 kubenswrapper[36504]: I1203 22:16:15.365328 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-utilities" (OuterVolumeSpecName: "utilities") pod "206689a9-2233-44a1-8070-abdf8adf2f86" (UID: "206689a9-2233-44a1-8070-abdf8adf2f86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:15.369150 master-0 kubenswrapper[36504]: I1203 22:16:15.369075 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/206689a9-2233-44a1-8070-abdf8adf2f86-kube-api-access-x6dtk" (OuterVolumeSpecName: "kube-api-access-x6dtk") pod "206689a9-2233-44a1-8070-abdf8adf2f86" (UID: "206689a9-2233-44a1-8070-abdf8adf2f86"). InnerVolumeSpecName "kube-api-access-x6dtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:16:15.376281 master-0 kubenswrapper[36504]: I1203 22:16:15.376218 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx"] Dec 03 22:16:15.381657 master-0 kubenswrapper[36504]: I1203 22:16:15.381363 36504 scope.go:117] "RemoveContainer" containerID="85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc" Dec 03 22:16:15.382974 master-0 kubenswrapper[36504]: E1203 22:16:15.382904 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc\": container with ID starting with 85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc not found: ID does not exist" containerID="85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc" Dec 03 22:16:15.383047 master-0 kubenswrapper[36504]: I1203 22:16:15.383003 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc"} err="failed to get container status \"85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc\": rpc error: code = NotFound desc = could not find container \"85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc\": container with ID starting with 85955a5068a17309f04df84a4a9c40581ed88afd8777f7344648e184704715bc not found: ID does not exist" Dec 03 22:16:15.383097 master-0 kubenswrapper[36504]: I1203 22:16:15.383058 36504 scope.go:117] "RemoveContainer" containerID="bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941" Dec 03 22:16:15.383737 master-0 kubenswrapper[36504]: E1203 22:16:15.383665 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941\": container with ID starting with bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941 not found: ID does not exist" containerID="bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941" Dec 03 22:16:15.383913 master-0 kubenswrapper[36504]: I1203 22:16:15.383868 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941"} err="failed to get container status \"bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941\": rpc error: code = NotFound desc = could not find container \"bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941\": container with ID starting with bd50357c9d713a0299869daeace8ec07fd8b34a2a5485f75b585950ff1f36941 not found: ID does not exist" Dec 03 
22:16:15.383913 master-0 kubenswrapper[36504]: I1203 22:16:15.383908 36504 scope.go:117] "RemoveContainer" containerID="9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc" Dec 03 22:16:15.384270 master-0 kubenswrapper[36504]: E1203 22:16:15.384223 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc\": container with ID starting with 9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc not found: ID does not exist" containerID="9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc" Dec 03 22:16:15.384323 master-0 kubenswrapper[36504]: I1203 22:16:15.384266 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc"} err="failed to get container status \"9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc\": rpc error: code = NotFound desc = could not find container \"9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc\": container with ID starting with 9cb4a16de24a7640f78ed52e0cd76a068b1aa5047ee90fc2d57f5bb59918c1cc not found: ID does not exist" Dec 03 22:16:15.386681 master-0 kubenswrapper[36504]: W1203 22:16:15.386625 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb7a96cf_9f18_4c1d_a677_134c9d677997.slice/crio-c2da4f2a1424c2e57a47453fdb99495e36da0b36c7b9095bc4c2526a95378bfb WatchSource:0}: Error finding container c2da4f2a1424c2e57a47453fdb99495e36da0b36c7b9095bc4c2526a95378bfb: Status 404 returned error can't find the container with id c2da4f2a1424c2e57a47453fdb99495e36da0b36c7b9095bc4c2526a95378bfb Dec 03 22:16:15.465995 master-0 kubenswrapper[36504]: I1203 22:16:15.465952 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:15.465995 master-0 kubenswrapper[36504]: I1203 22:16:15.465990 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6dtk\" (UniqueName: \"kubernetes.io/projected/206689a9-2233-44a1-8070-abdf8adf2f86-kube-api-access-x6dtk\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:15.472405 master-0 kubenswrapper[36504]: I1203 22:16:15.472311 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "206689a9-2233-44a1-8070-abdf8adf2f86" (UID: "206689a9-2233-44a1-8070-abdf8adf2f86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:15.567523 master-0 kubenswrapper[36504]: I1203 22:16:15.567322 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/206689a9-2233-44a1-8070-abdf8adf2f86-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:15.719231 master-0 kubenswrapper[36504]: I1203 22:16:15.719128 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zwtdd"] Dec 03 22:16:15.728634 master-0 kubenswrapper[36504]: I1203 22:16:15.728249 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zwtdd"] Dec 03 22:16:16.318414 master-0 kubenswrapper[36504]: I1203 22:16:16.318341 36504 generic.go:334] "Generic (PLEG): container finished" podID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerID="9560abf387b0a829170d9e948752f4a27936db4e2f8bc44e9bb12c06809c8894" exitCode=0 Dec 03 22:16:16.318414 master-0 kubenswrapper[36504]: I1203 22:16:16.318419 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" event={"ID":"cb7a96cf-9f18-4c1d-a677-134c9d677997","Type":"ContainerDied","Data":"9560abf387b0a829170d9e948752f4a27936db4e2f8bc44e9bb12c06809c8894"} Dec 03 22:16:16.319058 master-0 kubenswrapper[36504]: I1203 22:16:16.318461 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" event={"ID":"cb7a96cf-9f18-4c1d-a677-134c9d677997","Type":"ContainerStarted","Data":"c2da4f2a1424c2e57a47453fdb99495e36da0b36c7b9095bc4c2526a95378bfb"} Dec 03 22:16:17.106314 master-0 kubenswrapper[36504]: I1203 22:16:17.106242 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" path="/var/lib/kubelet/pods/206689a9-2233-44a1-8070-abdf8adf2f86/volumes" Dec 03 22:16:17.329594 master-0 kubenswrapper[36504]: I1203 22:16:17.329537 36504 generic.go:334] "Generic (PLEG): container finished" podID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerID="360b64b182f1d3355a8a96b8e78ea32a55bf2be34ea3e68d2c0af470cbf38b23" exitCode=0 Dec 03 22:16:17.329594 master-0 kubenswrapper[36504]: I1203 22:16:17.329599 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" event={"ID":"2a9d89a2-2e6e-4618-ab19-9df20a663e85","Type":"ContainerDied","Data":"360b64b182f1d3355a8a96b8e78ea32a55bf2be34ea3e68d2c0af470cbf38b23"} Dec 03 22:16:18.341128 master-0 kubenswrapper[36504]: I1203 22:16:18.341068 36504 generic.go:334] "Generic (PLEG): container finished" podID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerID="533189442eb2544a40c25e4f9a3802e5d9a562e43751a924e300c1229177dc7a" exitCode=0 Dec 03 22:16:18.341128 master-0 kubenswrapper[36504]: I1203 22:16:18.341141 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" event={"ID":"2a9d89a2-2e6e-4618-ab19-9df20a663e85","Type":"ContainerDied","Data":"533189442eb2544a40c25e4f9a3802e5d9a562e43751a924e300c1229177dc7a"} Dec 03 22:16:18.343521 master-0 kubenswrapper[36504]: I1203 22:16:18.343455 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" 
event={"ID":"7b8ff4a3-aab3-43df-ad20-e29d2c68650f","Type":"ContainerStarted","Data":"49a9e0b8f491f799419efe9ddf4529588c7bc9fa6cd7dcf1bad9da0931e94963"} Dec 03 22:16:19.354807 master-0 kubenswrapper[36504]: I1203 22:16:19.354726 36504 generic.go:334] "Generic (PLEG): container finished" podID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerID="b604e684cbe50b4ac29626360de4a01c5916c164b94df2822c705bae4b29d449" exitCode=0 Dec 03 22:16:19.355584 master-0 kubenswrapper[36504]: I1203 22:16:19.354870 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" event={"ID":"cb7a96cf-9f18-4c1d-a677-134c9d677997","Type":"ContainerDied","Data":"b604e684cbe50b4ac29626360de4a01c5916c164b94df2822c705bae4b29d449"} Dec 03 22:16:19.357739 master-0 kubenswrapper[36504]: I1203 22:16:19.357683 36504 generic.go:334] "Generic (PLEG): container finished" podID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerID="49a9e0b8f491f799419efe9ddf4529588c7bc9fa6cd7dcf1bad9da0931e94963" exitCode=0 Dec 03 22:16:19.357959 master-0 kubenswrapper[36504]: I1203 22:16:19.357856 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" event={"ID":"7b8ff4a3-aab3-43df-ad20-e29d2c68650f","Type":"ContainerDied","Data":"49a9e0b8f491f799419efe9ddf4529588c7bc9fa6cd7dcf1bad9da0931e94963"} Dec 03 22:16:19.733156 master-0 kubenswrapper[36504]: I1203 22:16:19.733040 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:19.846013 master-0 kubenswrapper[36504]: I1203 22:16:19.845914 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-bundle\") pod \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " Dec 03 22:16:19.846355 master-0 kubenswrapper[36504]: I1203 22:16:19.846095 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sxwd\" (UniqueName: \"kubernetes.io/projected/2a9d89a2-2e6e-4618-ab19-9df20a663e85-kube-api-access-9sxwd\") pod \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " Dec 03 22:16:19.846355 master-0 kubenswrapper[36504]: I1203 22:16:19.846169 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-util\") pod \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\" (UID: \"2a9d89a2-2e6e-4618-ab19-9df20a663e85\") " Dec 03 22:16:19.847503 master-0 kubenswrapper[36504]: I1203 22:16:19.847454 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-bundle" (OuterVolumeSpecName: "bundle") pod "2a9d89a2-2e6e-4618-ab19-9df20a663e85" (UID: "2a9d89a2-2e6e-4618-ab19-9df20a663e85"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:19.849607 master-0 kubenswrapper[36504]: I1203 22:16:19.849533 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a9d89a2-2e6e-4618-ab19-9df20a663e85-kube-api-access-9sxwd" (OuterVolumeSpecName: "kube-api-access-9sxwd") pod "2a9d89a2-2e6e-4618-ab19-9df20a663e85" (UID: "2a9d89a2-2e6e-4618-ab19-9df20a663e85"). InnerVolumeSpecName "kube-api-access-9sxwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:16:19.950924 master-0 kubenswrapper[36504]: I1203 22:16:19.949694 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sxwd\" (UniqueName: \"kubernetes.io/projected/2a9d89a2-2e6e-4618-ab19-9df20a663e85-kube-api-access-9sxwd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:19.950924 master-0 kubenswrapper[36504]: I1203 22:16:19.949820 36504 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:20.028195 master-0 kubenswrapper[36504]: I1203 22:16:20.028087 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-util" (OuterVolumeSpecName: "util") pod "2a9d89a2-2e6e-4618-ab19-9df20a663e85" (UID: "2a9d89a2-2e6e-4618-ab19-9df20a663e85"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:20.052017 master-0 kubenswrapper[36504]: I1203 22:16:20.051931 36504 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a9d89a2-2e6e-4618-ab19-9df20a663e85-util\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:20.370008 master-0 kubenswrapper[36504]: I1203 22:16:20.369876 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" event={"ID":"cb7a96cf-9f18-4c1d-a677-134c9d677997","Type":"ContainerStarted","Data":"d45e519b1c86f85cd7317d74a3b5534653191f5130daa0736989208ba8d13204"} Dec 03 22:16:20.373444 master-0 kubenswrapper[36504]: I1203 22:16:20.373359 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" event={"ID":"7b8ff4a3-aab3-43df-ad20-e29d2c68650f","Type":"ContainerStarted","Data":"920c08a119997032ab3ea3aafe4cea8bde7a65e3c1d833f4c95dda6a3d3497c7"} Dec 03 22:16:20.376194 master-0 kubenswrapper[36504]: I1203 22:16:20.376148 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" event={"ID":"2a9d89a2-2e6e-4618-ab19-9df20a663e85","Type":"ContainerDied","Data":"480c4a217fb32a729479e699481ccd4f7136dea4b80852fbd45caf770ebbc8c2"} Dec 03 22:16:20.376194 master-0 kubenswrapper[36504]: I1203 22:16:20.376179 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="480c4a217fb32a729479e699481ccd4f7136dea4b80852fbd45caf770ebbc8c2" Dec 03 22:16:20.376398 master-0 kubenswrapper[36504]: I1203 22:16:20.376261 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83cj7gh" Dec 03 22:16:21.392943 master-0 kubenswrapper[36504]: I1203 22:16:21.392831 36504 generic.go:334] "Generic (PLEG): container finished" podID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerID="d45e519b1c86f85cd7317d74a3b5534653191f5130daa0736989208ba8d13204" exitCode=0 Dec 03 22:16:21.393889 master-0 kubenswrapper[36504]: I1203 22:16:21.392928 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" event={"ID":"cb7a96cf-9f18-4c1d-a677-134c9d677997","Type":"ContainerDied","Data":"d45e519b1c86f85cd7317d74a3b5534653191f5130daa0736989208ba8d13204"} Dec 03 22:16:21.398951 master-0 kubenswrapper[36504]: I1203 22:16:21.398893 36504 generic.go:334] "Generic (PLEG): container finished" podID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerID="920c08a119997032ab3ea3aafe4cea8bde7a65e3c1d833f4c95dda6a3d3497c7" exitCode=0 Dec 03 22:16:21.398951 master-0 kubenswrapper[36504]: I1203 22:16:21.398946 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" event={"ID":"7b8ff4a3-aab3-43df-ad20-e29d2c68650f","Type":"ContainerDied","Data":"920c08a119997032ab3ea3aafe4cea8bde7a65e3c1d833f4c95dda6a3d3497c7"} Dec 03 22:16:22.804856 master-0 kubenswrapper[36504]: I1203 22:16:22.804495 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:22.813029 master-0 kubenswrapper[36504]: I1203 22:16:22.812981 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:22.896977 master-0 kubenswrapper[36504]: I1203 22:16:22.896899 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-util\") pod \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " Dec 03 22:16:22.896977 master-0 kubenswrapper[36504]: I1203 22:16:22.896963 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szwrq\" (UniqueName: \"kubernetes.io/projected/cb7a96cf-9f18-4c1d-a677-134c9d677997-kube-api-access-szwrq\") pod \"cb7a96cf-9f18-4c1d-a677-134c9d677997\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " Dec 03 22:16:22.896977 master-0 kubenswrapper[36504]: I1203 22:16:22.896991 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4n9s\" (UniqueName: \"kubernetes.io/projected/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-kube-api-access-z4n9s\") pod \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " Dec 03 22:16:22.897321 master-0 kubenswrapper[36504]: I1203 22:16:22.897036 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-bundle\") pod \"cb7a96cf-9f18-4c1d-a677-134c9d677997\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " Dec 03 22:16:22.897321 master-0 kubenswrapper[36504]: I1203 22:16:22.897066 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-util\") pod \"cb7a96cf-9f18-4c1d-a677-134c9d677997\" (UID: \"cb7a96cf-9f18-4c1d-a677-134c9d677997\") " Dec 03 22:16:22.897321 master-0 kubenswrapper[36504]: I1203 22:16:22.897091 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-bundle\") pod \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\" (UID: \"7b8ff4a3-aab3-43df-ad20-e29d2c68650f\") " Dec 03 22:16:22.898271 master-0 kubenswrapper[36504]: I1203 22:16:22.898238 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-bundle" (OuterVolumeSpecName: "bundle") pod "7b8ff4a3-aab3-43df-ad20-e29d2c68650f" (UID: "7b8ff4a3-aab3-43df-ad20-e29d2c68650f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:22.898892 master-0 kubenswrapper[36504]: I1203 22:16:22.898865 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-bundle" (OuterVolumeSpecName: "bundle") pod "cb7a96cf-9f18-4c1d-a677-134c9d677997" (UID: "cb7a96cf-9f18-4c1d-a677-134c9d677997"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:22.904023 master-0 kubenswrapper[36504]: I1203 22:16:22.903957 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-kube-api-access-z4n9s" (OuterVolumeSpecName: "kube-api-access-z4n9s") pod "7b8ff4a3-aab3-43df-ad20-e29d2c68650f" (UID: "7b8ff4a3-aab3-43df-ad20-e29d2c68650f"). 
InnerVolumeSpecName "kube-api-access-z4n9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:16:22.906107 master-0 kubenswrapper[36504]: I1203 22:16:22.906065 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb7a96cf-9f18-4c1d-a677-134c9d677997-kube-api-access-szwrq" (OuterVolumeSpecName: "kube-api-access-szwrq") pod "cb7a96cf-9f18-4c1d-a677-134c9d677997" (UID: "cb7a96cf-9f18-4c1d-a677-134c9d677997"). InnerVolumeSpecName "kube-api-access-szwrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:16:22.911962 master-0 kubenswrapper[36504]: I1203 22:16:22.911803 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-util" (OuterVolumeSpecName: "util") pod "cb7a96cf-9f18-4c1d-a677-134c9d677997" (UID: "cb7a96cf-9f18-4c1d-a677-134c9d677997"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:22.915429 master-0 kubenswrapper[36504]: I1203 22:16:22.915262 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-util" (OuterVolumeSpecName: "util") pod "7b8ff4a3-aab3-43df-ad20-e29d2c68650f" (UID: "7b8ff4a3-aab3-43df-ad20-e29d2c68650f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:22.998911 master-0 kubenswrapper[36504]: I1203 22:16:22.998843 36504 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-util\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:22.998911 master-0 kubenswrapper[36504]: I1203 22:16:22.998899 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szwrq\" (UniqueName: \"kubernetes.io/projected/cb7a96cf-9f18-4c1d-a677-134c9d677997-kube-api-access-szwrq\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:22.998911 master-0 kubenswrapper[36504]: I1203 22:16:22.998915 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4n9s\" (UniqueName: \"kubernetes.io/projected/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-kube-api-access-z4n9s\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:22.998911 master-0 kubenswrapper[36504]: I1203 22:16:22.998930 36504 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:22.999340 master-0 kubenswrapper[36504]: I1203 22:16:22.998946 36504 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cb7a96cf-9f18-4c1d-a677-134c9d677997-util\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:22.999340 master-0 kubenswrapper[36504]: I1203 22:16:22.998958 36504 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b8ff4a3-aab3-43df-ad20-e29d2c68650f-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:23.418321 master-0 kubenswrapper[36504]: I1203 22:16:23.417705 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" event={"ID":"7b8ff4a3-aab3-43df-ad20-e29d2c68650f","Type":"ContainerDied","Data":"08ef8e2938522e4002dcd941b0c39879b74aeddd093188b6ff4954698d3fc8c9"} Dec 03 22:16:23.418321 master-0 kubenswrapper[36504]: I1203 
22:16:23.417789 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ef8e2938522e4002dcd941b0c39879b74aeddd093188b6ff4954698d3fc8c9" Dec 03 22:16:23.418321 master-0 kubenswrapper[36504]: I1203 22:16:23.417805 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931atlgtj" Dec 03 22:16:23.420664 master-0 kubenswrapper[36504]: I1203 22:16:23.420637 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" event={"ID":"cb7a96cf-9f18-4c1d-a677-134c9d677997","Type":"ContainerDied","Data":"c2da4f2a1424c2e57a47453fdb99495e36da0b36c7b9095bc4c2526a95378bfb"} Dec 03 22:16:23.420664 master-0 kubenswrapper[36504]: I1203 22:16:23.420660 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2da4f2a1424c2e57a47453fdb99495e36da0b36c7b9095bc4c2526a95378bfb" Dec 03 22:16:23.421055 master-0 kubenswrapper[36504]: I1203 22:16:23.420696 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fxjrkx" Dec 03 22:16:25.320015 master-0 kubenswrapper[36504]: I1203 22:16:25.319916 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6"] Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: E1203 22:16:25.320578 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" containerName="registry-server" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: I1203 22:16:25.320603 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" containerName="registry-server" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: E1203 22:16:25.320624 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerName="extract" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: I1203 22:16:25.320638 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerName="extract" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: E1203 22:16:25.320659 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerName="extract" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: I1203 22:16:25.320673 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerName="extract" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: E1203 22:16:25.320708 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerName="pull" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: I1203 22:16:25.320720 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerName="pull" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: E1203 22:16:25.320743 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerName="util" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: I1203 22:16:25.320754 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerName="util" Dec 
03 22:16:25.320916 master-0 kubenswrapper[36504]: E1203 22:16:25.320809 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerName="util" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: I1203 22:16:25.320822 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerName="util" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: E1203 22:16:25.320838 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerName="extract" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: I1203 22:16:25.320851 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerName="extract" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: E1203 22:16:25.320872 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" containerName="extract-content" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: I1203 22:16:25.320889 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" containerName="extract-content" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: E1203 22:16:25.320905 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerName="pull" Dec 03 22:16:25.320916 master-0 kubenswrapper[36504]: I1203 22:16:25.320923 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerName="pull" Dec 03 22:16:25.321619 master-0 kubenswrapper[36504]: E1203 22:16:25.320957 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" containerName="extract-utilities" Dec 03 22:16:25.321619 master-0 kubenswrapper[36504]: I1203 22:16:25.320975 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" containerName="extract-utilities" Dec 03 22:16:25.321619 master-0 kubenswrapper[36504]: E1203 22:16:25.321011 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerName="util" Dec 03 22:16:25.321619 master-0 kubenswrapper[36504]: I1203 22:16:25.321028 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerName="util" Dec 03 22:16:25.321619 master-0 kubenswrapper[36504]: E1203 22:16:25.321053 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerName="pull" Dec 03 22:16:25.321619 master-0 kubenswrapper[36504]: I1203 22:16:25.321066 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerName="pull" Dec 03 22:16:25.321619 master-0 kubenswrapper[36504]: I1203 22:16:25.321351 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b8ff4a3-aab3-43df-ad20-e29d2c68650f" containerName="extract" Dec 03 22:16:25.321619 master-0 kubenswrapper[36504]: I1203 22:16:25.321385 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7a96cf-9f18-4c1d-a677-134c9d677997" containerName="extract" Dec 03 22:16:25.321619 master-0 kubenswrapper[36504]: I1203 22:16:25.321428 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="206689a9-2233-44a1-8070-abdf8adf2f86" containerName="registry-server" Dec 03 22:16:25.321619 master-0 
kubenswrapper[36504]: I1203 22:16:25.321448 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a9d89a2-2e6e-4618-ab19-9df20a663e85" containerName="extract" Dec 03 22:16:25.323536 master-0 kubenswrapper[36504]: I1203 22:16:25.323491 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.346884 master-0 kubenswrapper[36504]: I1203 22:16:25.346807 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6"] Dec 03 22:16:25.437158 master-0 kubenswrapper[36504]: I1203 22:16:25.437092 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.437424 master-0 kubenswrapper[36504]: I1203 22:16:25.437205 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87sq8\" (UniqueName: \"kubernetes.io/projected/1519d792-d4bc-4ab4-b7ec-76ad913673d4-kube-api-access-87sq8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.437424 master-0 kubenswrapper[36504]: I1203 22:16:25.437360 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.538701 master-0 kubenswrapper[36504]: I1203 22:16:25.538643 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.538997 master-0 kubenswrapper[36504]: I1203 22:16:25.538760 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87sq8\" (UniqueName: \"kubernetes.io/projected/1519d792-d4bc-4ab4-b7ec-76ad913673d4-kube-api-access-87sq8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.538997 master-0 kubenswrapper[36504]: I1203 22:16:25.538917 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.539290 master-0 kubenswrapper[36504]: I1203 22:16:25.539265 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.539528 master-0 kubenswrapper[36504]: I1203 22:16:25.539480 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.557999 master-0 kubenswrapper[36504]: I1203 22:16:25.557943 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87sq8\" (UniqueName: \"kubernetes.io/projected/1519d792-d4bc-4ab4-b7ec-76ad913673d4-kube-api-access-87sq8\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:25.649422 master-0 kubenswrapper[36504]: I1203 22:16:25.649350 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:26.089196 master-0 kubenswrapper[36504]: I1203 22:16:26.088511 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6"] Dec 03 22:16:26.094030 master-0 kubenswrapper[36504]: W1203 22:16:26.093970 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1519d792_d4bc_4ab4_b7ec_76ad913673d4.slice/crio-46bd9fa0178614f3f853e7880422cc8b2b610f3f544c0496a7a672f13281177c WatchSource:0}: Error finding container 46bd9fa0178614f3f853e7880422cc8b2b610f3f544c0496a7a672f13281177c: Status 404 returned error can't find the container with id 46bd9fa0178614f3f853e7880422cc8b2b610f3f544c0496a7a672f13281177c Dec 03 22:16:26.447179 master-0 kubenswrapper[36504]: I1203 22:16:26.447115 36504 generic.go:334] "Generic (PLEG): container finished" podID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerID="2c726dc6271078c388aa790d72dd0b3344e7a62f81c625791bf686f12c76c1f7" exitCode=0 Dec 03 22:16:26.447960 master-0 kubenswrapper[36504]: I1203 22:16:26.447182 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" event={"ID":"1519d792-d4bc-4ab4-b7ec-76ad913673d4","Type":"ContainerDied","Data":"2c726dc6271078c388aa790d72dd0b3344e7a62f81c625791bf686f12c76c1f7"} Dec 03 22:16:26.447960 master-0 kubenswrapper[36504]: I1203 22:16:26.447496 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" event={"ID":"1519d792-d4bc-4ab4-b7ec-76ad913673d4","Type":"ContainerStarted","Data":"46bd9fa0178614f3f853e7880422cc8b2b610f3f544c0496a7a672f13281177c"} Dec 03 
22:16:29.493495 master-0 kubenswrapper[36504]: I1203 22:16:29.493408 36504 generic.go:334] "Generic (PLEG): container finished" podID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerID="32d8984592ef02c9e18c2627f890734ae2628db217937c4ff8a61c9cd8bc3bc2" exitCode=0 Dec 03 22:16:29.493495 master-0 kubenswrapper[36504]: I1203 22:16:29.493487 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" event={"ID":"1519d792-d4bc-4ab4-b7ec-76ad913673d4","Type":"ContainerDied","Data":"32d8984592ef02c9e18c2627f890734ae2628db217937c4ff8a61c9cd8bc3bc2"} Dec 03 22:16:29.661795 master-0 kubenswrapper[36504]: I1203 22:16:29.658261 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f"] Dec 03 22:16:29.661795 master-0 kubenswrapper[36504]: I1203 22:16:29.659457 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:29.664328 master-0 kubenswrapper[36504]: I1203 22:16:29.663435 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 22:16:29.664328 master-0 kubenswrapper[36504]: I1203 22:16:29.663579 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 22:16:29.664328 master-0 kubenswrapper[36504]: I1203 22:16:29.663724 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 22:16:29.664328 master-0 kubenswrapper[36504]: I1203 22:16:29.663918 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 22:16:29.681466 master-0 kubenswrapper[36504]: I1203 22:16:29.681344 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f"] Dec 03 22:16:29.748653 master-0 kubenswrapper[36504]: I1203 22:16:29.746016 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 03 22:16:29.762037 master-0 kubenswrapper[36504]: I1203 22:16:29.761962 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:29.770063 master-0 kubenswrapper[36504]: I1203 22:16:29.769331 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-tdkqf" Dec 03 22:16:29.770063 master-0 kubenswrapper[36504]: I1203 22:16:29.769683 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 22:16:29.808803 master-0 kubenswrapper[36504]: I1203 22:16:29.800636 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 03 22:16:29.832401 master-0 kubenswrapper[36504]: I1203 22:16:29.831411 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4q2m\" (UniqueName: \"kubernetes.io/projected/d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d-kube-api-access-d4q2m\") pod \"metallb-operator-controller-manager-78597876b7-mbz6f\" (UID: \"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d\") " pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:29.832401 master-0 kubenswrapper[36504]: I1203 22:16:29.831528 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d-webhook-cert\") pod \"metallb-operator-controller-manager-78597876b7-mbz6f\" (UID: \"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d\") " pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:29.832401 master-0 kubenswrapper[36504]: I1203 22:16:29.831632 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d-apiservice-cert\") pod \"metallb-operator-controller-manager-78597876b7-mbz6f\" (UID: \"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d\") " pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:29.932935 master-0 kubenswrapper[36504]: I1203 22:16:29.932787 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4q2m\" (UniqueName: \"kubernetes.io/projected/d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d-kube-api-access-d4q2m\") pod \"metallb-operator-controller-manager-78597876b7-mbz6f\" (UID: \"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d\") " pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:29.932935 master-0 kubenswrapper[36504]: I1203 22:16:29.932881 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:29.932935 master-0 kubenswrapper[36504]: I1203 22:16:29.932924 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d-webhook-cert\") pod \"metallb-operator-controller-manager-78597876b7-mbz6f\" (UID: \"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d\") " pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:29.932935 master-0 kubenswrapper[36504]: I1203 22:16:29.932956 36504 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b839ef46-9f53-4843-a419-dcff6de87eaa-kube-api-access\") pod \"installer-4-master-0\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:29.933533 master-0 kubenswrapper[36504]: I1203 22:16:29.933004 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-var-lock\") pod \"installer-4-master-0\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:29.933533 master-0 kubenswrapper[36504]: I1203 22:16:29.933055 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d-apiservice-cert\") pod \"metallb-operator-controller-manager-78597876b7-mbz6f\" (UID: \"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d\") " pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:29.940796 master-0 kubenswrapper[36504]: I1203 22:16:29.940679 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d-apiservice-cert\") pod \"metallb-operator-controller-manager-78597876b7-mbz6f\" (UID: \"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d\") " pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:29.959798 master-0 kubenswrapper[36504]: I1203 22:16:29.954592 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4q2m\" (UniqueName: \"kubernetes.io/projected/d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d-kube-api-access-d4q2m\") pod \"metallb-operator-controller-manager-78597876b7-mbz6f\" (UID: \"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d\") " pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:29.962140 master-0 kubenswrapper[36504]: I1203 22:16:29.960802 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d-webhook-cert\") pod \"metallb-operator-controller-manager-78597876b7-mbz6f\" (UID: \"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d\") " pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:30.036891 master-0 kubenswrapper[36504]: I1203 22:16:30.036359 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:30.040794 master-0 kubenswrapper[36504]: I1203 22:16:30.039314 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:30.040794 master-0 kubenswrapper[36504]: I1203 22:16:30.039438 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b839ef46-9f53-4843-a419-dcff6de87eaa-kube-api-access\") pod \"installer-4-master-0\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:30.040794 master-0 kubenswrapper[36504]: I1203 22:16:30.039501 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-var-lock\") pod \"installer-4-master-0\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:30.040794 master-0 kubenswrapper[36504]: I1203 22:16:30.039748 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-var-lock\") pod \"installer-4-master-0\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:30.040794 master-0 kubenswrapper[36504]: I1203 22:16:30.039835 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:30.082796 master-0 kubenswrapper[36504]: I1203 22:16:30.075290 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw"] Dec 03 22:16:30.084596 master-0 kubenswrapper[36504]: I1203 22:16:30.084555 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.098808 master-0 kubenswrapper[36504]: I1203 22:16:30.091414 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 22:16:30.098808 master-0 kubenswrapper[36504]: I1203 22:16:30.091888 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 22:16:30.098808 master-0 kubenswrapper[36504]: I1203 22:16:30.098755 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b839ef46-9f53-4843-a419-dcff6de87eaa-kube-api-access\") pod \"installer-4-master-0\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:30.157301 master-0 kubenswrapper[36504]: I1203 22:16:30.155820 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw"] Dec 03 22:16:30.244033 master-0 kubenswrapper[36504]: I1203 22:16:30.243228 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8676b196-ec39-41e6-938c-ad3f66a9b83e-apiservice-cert\") pod \"metallb-operator-webhook-server-5664d6fd-phlzw\" (UID: \"8676b196-ec39-41e6-938c-ad3f66a9b83e\") " pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.244033 master-0 kubenswrapper[36504]: I1203 22:16:30.243321 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8676b196-ec39-41e6-938c-ad3f66a9b83e-webhook-cert\") pod \"metallb-operator-webhook-server-5664d6fd-phlzw\" (UID: \"8676b196-ec39-41e6-938c-ad3f66a9b83e\") " pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.244033 master-0 kubenswrapper[36504]: I1203 22:16:30.243428 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfpr8\" (UniqueName: \"kubernetes.io/projected/8676b196-ec39-41e6-938c-ad3f66a9b83e-kube-api-access-sfpr8\") pod \"metallb-operator-webhook-server-5664d6fd-phlzw\" (UID: \"8676b196-ec39-41e6-938c-ad3f66a9b83e\") " pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.345097 master-0 kubenswrapper[36504]: I1203 22:16:30.345020 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfpr8\" (UniqueName: \"kubernetes.io/projected/8676b196-ec39-41e6-938c-ad3f66a9b83e-kube-api-access-sfpr8\") pod \"metallb-operator-webhook-server-5664d6fd-phlzw\" (UID: \"8676b196-ec39-41e6-938c-ad3f66a9b83e\") " pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.345330 master-0 kubenswrapper[36504]: I1203 22:16:30.345156 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8676b196-ec39-41e6-938c-ad3f66a9b83e-apiservice-cert\") pod \"metallb-operator-webhook-server-5664d6fd-phlzw\" (UID: \"8676b196-ec39-41e6-938c-ad3f66a9b83e\") " pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.345330 master-0 kubenswrapper[36504]: I1203 22:16:30.345183 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8676b196-ec39-41e6-938c-ad3f66a9b83e-webhook-cert\") pod \"metallb-operator-webhook-server-5664d6fd-phlzw\" (UID: \"8676b196-ec39-41e6-938c-ad3f66a9b83e\") " pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.349924 master-0 kubenswrapper[36504]: I1203 22:16:30.349865 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8676b196-ec39-41e6-938c-ad3f66a9b83e-apiservice-cert\") pod \"metallb-operator-webhook-server-5664d6fd-phlzw\" (UID: \"8676b196-ec39-41e6-938c-ad3f66a9b83e\") " pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.350397 master-0 kubenswrapper[36504]: I1203 22:16:30.350357 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8676b196-ec39-41e6-938c-ad3f66a9b83e-webhook-cert\") pod \"metallb-operator-webhook-server-5664d6fd-phlzw\" (UID: \"8676b196-ec39-41e6-938c-ad3f66a9b83e\") " pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.364100 master-0 kubenswrapper[36504]: I1203 22:16:30.364036 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfpr8\" (UniqueName: \"kubernetes.io/projected/8676b196-ec39-41e6-938c-ad3f66a9b83e-kube-api-access-sfpr8\") pod \"metallb-operator-webhook-server-5664d6fd-phlzw\" (UID: \"8676b196-ec39-41e6-938c-ad3f66a9b83e\") " pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.401525 master-0 kubenswrapper[36504]: I1203 22:16:30.401227 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:16:30.505807 master-0 kubenswrapper[36504]: I1203 22:16:30.505352 36504 generic.go:334] "Generic (PLEG): container finished" podID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerID="e3071638b70e920c7da2cbc081784ccae2f2f891bc11ead0387c10109a0e6aa7" exitCode=0 Dec 03 22:16:30.505807 master-0 kubenswrapper[36504]: I1203 22:16:30.505397 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" event={"ID":"1519d792-d4bc-4ab4-b7ec-76ad913673d4","Type":"ContainerDied","Data":"e3071638b70e920c7da2cbc081784ccae2f2f891bc11ead0387c10109a0e6aa7"} Dec 03 22:16:30.549179 master-0 kubenswrapper[36504]: I1203 22:16:30.538695 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:30.713394 master-0 kubenswrapper[36504]: I1203 22:16:30.711310 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f"] Dec 03 22:16:30.899919 master-0 kubenswrapper[36504]: I1203 22:16:30.899854 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Dec 03 22:16:31.082610 master-0 kubenswrapper[36504]: I1203 22:16:31.082516 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw"] Dec 03 22:16:31.096354 master-0 kubenswrapper[36504]: W1203 22:16:31.096270 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8676b196_ec39_41e6_938c_ad3f66a9b83e.slice/crio-7a2169575fb4b8328ac2d3b14552f63e674b73fc798ccd9e4fab9fffd6b37cf6 WatchSource:0}: Error finding container 7a2169575fb4b8328ac2d3b14552f63e674b73fc798ccd9e4fab9fffd6b37cf6: Status 404 returned error can't find the container with id 7a2169575fb4b8328ac2d3b14552f63e674b73fc798ccd9e4fab9fffd6b37cf6 Dec 03 22:16:31.517097 master-0 kubenswrapper[36504]: I1203 22:16:31.516896 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" event={"ID":"8676b196-ec39-41e6-938c-ad3f66a9b83e","Type":"ContainerStarted","Data":"7a2169575fb4b8328ac2d3b14552f63e674b73fc798ccd9e4fab9fffd6b37cf6"} Dec 03 22:16:31.519308 master-0 kubenswrapper[36504]: I1203 22:16:31.519269 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"b839ef46-9f53-4843-a419-dcff6de87eaa","Type":"ContainerStarted","Data":"14acef51e8f6aeabc9e46d31b029f7c6cb1114df1176e1b2da5a6187fe6b7b3f"} Dec 03 22:16:31.519308 master-0 kubenswrapper[36504]: I1203 22:16:31.519304 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"b839ef46-9f53-4843-a419-dcff6de87eaa","Type":"ContainerStarted","Data":"e388c12c51162e4e2e8d81c9ede79a9c4d62e67e22a76fae63f896978d794e72"} Dec 03 22:16:31.521506 master-0 kubenswrapper[36504]: I1203 22:16:31.521451 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" event={"ID":"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d","Type":"ContainerStarted","Data":"8efe6736bfe8f950f43d63414cd292c5dbcb1c0a44b41dcf3cc96b558f4a80fe"} Dec 03 22:16:31.731589 master-0 kubenswrapper[36504]: I1203 22:16:31.729267 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.729241205 podStartE2EDuration="2.729241205s" podCreationTimestamp="2025-12-03 22:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:16:31.720589755 +0000 UTC m=+356.940361762" watchObservedRunningTime="2025-12-03 22:16:31.729241205 +0000 UTC m=+356.949013212" Dec 03 22:16:31.930378 master-0 kubenswrapper[36504]: I1203 22:16:31.930300 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:32.032401 master-0 kubenswrapper[36504]: I1203 22:16:32.032227 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x"] Dec 03 22:16:32.035793 master-0 kubenswrapper[36504]: E1203 22:16:32.032636 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerName="pull" Dec 03 22:16:32.035793 master-0 kubenswrapper[36504]: I1203 22:16:32.032662 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerName="pull" Dec 03 22:16:32.035793 master-0 kubenswrapper[36504]: E1203 22:16:32.032694 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerName="util" Dec 03 22:16:32.035793 master-0 kubenswrapper[36504]: I1203 22:16:32.032703 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerName="util" Dec 03 22:16:32.035793 master-0 kubenswrapper[36504]: E1203 22:16:32.032755 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerName="extract" Dec 03 22:16:32.035793 master-0 kubenswrapper[36504]: I1203 22:16:32.032781 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerName="extract" Dec 03 22:16:32.035793 master-0 kubenswrapper[36504]: I1203 22:16:32.033382 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="1519d792-d4bc-4ab4-b7ec-76ad913673d4" containerName="extract" Dec 03 22:16:32.035793 master-0 kubenswrapper[36504]: I1203 22:16:32.035498 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x" Dec 03 22:16:32.038420 master-0 kubenswrapper[36504]: I1203 22:16:32.038187 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 22:16:32.050071 master-0 kubenswrapper[36504]: I1203 22:16:32.049986 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x"] Dec 03 22:16:32.058707 master-0 kubenswrapper[36504]: I1203 22:16:32.058582 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 22:16:32.085175 master-0 kubenswrapper[36504]: I1203 22:16:32.085097 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-util\") pod \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " Dec 03 22:16:32.085509 master-0 kubenswrapper[36504]: I1203 22:16:32.085221 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87sq8\" (UniqueName: \"kubernetes.io/projected/1519d792-d4bc-4ab4-b7ec-76ad913673d4-kube-api-access-87sq8\") pod \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " Dec 03 22:16:32.085587 master-0 kubenswrapper[36504]: I1203 22:16:32.085556 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-bundle\") pod \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\" (UID: \"1519d792-d4bc-4ab4-b7ec-76ad913673d4\") " Dec 03 22:16:32.088173 master-0 kubenswrapper[36504]: I1203 22:16:32.088144 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1519d792-d4bc-4ab4-b7ec-76ad913673d4-kube-api-access-87sq8" (OuterVolumeSpecName: "kube-api-access-87sq8") pod "1519d792-d4bc-4ab4-b7ec-76ad913673d4" (UID: "1519d792-d4bc-4ab4-b7ec-76ad913673d4"). InnerVolumeSpecName "kube-api-access-87sq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:16:32.093626 master-0 kubenswrapper[36504]: I1203 22:16:32.093470 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-bundle" (OuterVolumeSpecName: "bundle") pod "1519d792-d4bc-4ab4-b7ec-76ad913673d4" (UID: "1519d792-d4bc-4ab4-b7ec-76ad913673d4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:32.096462 master-0 kubenswrapper[36504]: I1203 22:16:32.096407 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-util" (OuterVolumeSpecName: "util") pod "1519d792-d4bc-4ab4-b7ec-76ad913673d4" (UID: "1519d792-d4bc-4ab4-b7ec-76ad913673d4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:32.188005 master-0 kubenswrapper[36504]: I1203 22:16:32.187919 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqm27\" (UniqueName: \"kubernetes.io/projected/16ea3a17-93f8-4f62-9776-301fd54be174-kube-api-access-mqm27\") pod \"nmstate-operator-5b5b58f5c8-db99x\" (UID: \"16ea3a17-93f8-4f62-9776-301fd54be174\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x" Dec 03 22:16:32.188272 master-0 kubenswrapper[36504]: I1203 22:16:32.188064 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87sq8\" (UniqueName: \"kubernetes.io/projected/1519d792-d4bc-4ab4-b7ec-76ad913673d4-kube-api-access-87sq8\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:32.188272 master-0 kubenswrapper[36504]: I1203 22:16:32.188082 36504 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:32.188272 master-0 kubenswrapper[36504]: I1203 22:16:32.188094 36504 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1519d792-d4bc-4ab4-b7ec-76ad913673d4-util\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:32.290109 master-0 kubenswrapper[36504]: I1203 22:16:32.290019 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqm27\" (UniqueName: \"kubernetes.io/projected/16ea3a17-93f8-4f62-9776-301fd54be174-kube-api-access-mqm27\") pod \"nmstate-operator-5b5b58f5c8-db99x\" (UID: \"16ea3a17-93f8-4f62-9776-301fd54be174\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x" Dec 03 22:16:32.314960 master-0 kubenswrapper[36504]: I1203 22:16:32.314465 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqm27\" (UniqueName: \"kubernetes.io/projected/16ea3a17-93f8-4f62-9776-301fd54be174-kube-api-access-mqm27\") pod \"nmstate-operator-5b5b58f5c8-db99x\" (UID: \"16ea3a17-93f8-4f62-9776-301fd54be174\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x" Dec 03 22:16:32.357829 master-0 kubenswrapper[36504]: I1203 22:16:32.357194 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x" Dec 03 22:16:32.535539 master-0 kubenswrapper[36504]: I1203 22:16:32.535472 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" event={"ID":"1519d792-d4bc-4ab4-b7ec-76ad913673d4","Type":"ContainerDied","Data":"46bd9fa0178614f3f853e7880422cc8b2b610f3f544c0496a7a672f13281177c"} Dec 03 22:16:32.535539 master-0 kubenswrapper[36504]: I1203 22:16:32.535540 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46bd9fa0178614f3f853e7880422cc8b2b610f3f544c0496a7a672f13281177c" Dec 03 22:16:32.536250 master-0 kubenswrapper[36504]: I1203 22:16:32.535579 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210t2zf6" Dec 03 22:16:32.801904 master-0 kubenswrapper[36504]: I1203 22:16:32.801834 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x"] Dec 03 22:16:32.811130 master-0 kubenswrapper[36504]: W1203 22:16:32.810887 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16ea3a17_93f8_4f62_9776_301fd54be174.slice/crio-d16339dd78bb63373aa5cbe34a18c885467673130142358863fcb2ef81e4f5cd WatchSource:0}: Error finding container d16339dd78bb63373aa5cbe34a18c885467673130142358863fcb2ef81e4f5cd: Status 404 returned error can't find the container with id d16339dd78bb63373aa5cbe34a18c885467673130142358863fcb2ef81e4f5cd Dec 03 22:16:33.545100 master-0 kubenswrapper[36504]: I1203 22:16:33.545025 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x" event={"ID":"16ea3a17-93f8-4f62-9776-301fd54be174","Type":"ContainerStarted","Data":"d16339dd78bb63373aa5cbe34a18c885467673130142358863fcb2ef81e4f5cd"} Dec 03 22:16:34.614906 master-0 kubenswrapper[36504]: I1203 22:16:34.614848 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-794866c946-5q8lb"] Dec 03 22:16:34.617665 master-0 kubenswrapper[36504]: I1203 22:16:34.616000 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.624667 master-0 kubenswrapper[36504]: I1203 22:16:34.624608 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-794866c946-5q8lb"] Dec 03 22:16:34.745058 master-0 kubenswrapper[36504]: I1203 22:16:34.744921 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-trusted-ca-bundle\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.745330 master-0 kubenswrapper[36504]: I1203 22:16:34.745109 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-oauth-config\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.745330 master-0 kubenswrapper[36504]: I1203 22:16:34.745174 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-serving-cert\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.745330 master-0 kubenswrapper[36504]: I1203 22:16:34.745217 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5nks\" (UniqueName: \"kubernetes.io/projected/1832a08a-47a4-4449-a9db-d11f44f4c154-kube-api-access-t5nks\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.745330 master-0 kubenswrapper[36504]: I1203 
22:16:34.745249 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-console-config\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.745556 master-0 kubenswrapper[36504]: I1203 22:16:34.745370 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-service-ca\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.745556 master-0 kubenswrapper[36504]: I1203 22:16:34.745401 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-oauth-serving-cert\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.847093 master-0 kubenswrapper[36504]: I1203 22:16:34.847032 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-trusted-ca-bundle\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.847381 master-0 kubenswrapper[36504]: I1203 22:16:34.847130 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-oauth-config\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.847381 master-0 kubenswrapper[36504]: I1203 22:16:34.847159 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-serving-cert\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.847381 master-0 kubenswrapper[36504]: I1203 22:16:34.847177 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5nks\" (UniqueName: \"kubernetes.io/projected/1832a08a-47a4-4449-a9db-d11f44f4c154-kube-api-access-t5nks\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.847381 master-0 kubenswrapper[36504]: I1203 22:16:34.847199 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-console-config\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.847381 master-0 kubenswrapper[36504]: I1203 22:16:34.847252 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-service-ca\") pod \"console-794866c946-5q8lb\" 
(UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.847381 master-0 kubenswrapper[36504]: I1203 22:16:34.847270 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-oauth-serving-cert\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.848211 master-0 kubenswrapper[36504]: I1203 22:16:34.848179 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-oauth-serving-cert\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.853977 master-0 kubenswrapper[36504]: I1203 22:16:34.848911 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-console-config\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.853977 master-0 kubenswrapper[36504]: I1203 22:16:34.849456 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-trusted-ca-bundle\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.853977 master-0 kubenswrapper[36504]: I1203 22:16:34.849562 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-service-ca\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.854933 master-0 kubenswrapper[36504]: I1203 22:16:34.854911 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-serving-cert\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.863081 master-0 kubenswrapper[36504]: I1203 22:16:34.862714 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-oauth-config\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.874642 master-0 kubenswrapper[36504]: I1203 22:16:34.872088 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5nks\" (UniqueName: \"kubernetes.io/projected/1832a08a-47a4-4449-a9db-d11f44f4c154-kube-api-access-t5nks\") pod \"console-794866c946-5q8lb\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:34.948901 master-0 kubenswrapper[36504]: I1203 22:16:34.948839 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:35.702803 master-0 kubenswrapper[36504]: E1203 22:16:35.702717 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:16:36.596677 master-0 kubenswrapper[36504]: I1203 22:16:36.596602 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" event={"ID":"8676b196-ec39-41e6-938c-ad3f66a9b83e","Type":"ContainerStarted","Data":"5eb4f5678d98453aa8bcc908f551b7a7cde940097702732604f5fae4b494e697"} Dec 03 22:16:36.598003 master-0 kubenswrapper[36504]: I1203 22:16:36.597966 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:36.600348 master-0 kubenswrapper[36504]: I1203 22:16:36.600319 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" event={"ID":"d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d","Type":"ContainerStarted","Data":"35915fe36c09d44727e3f3c9766d943a5a24eaad3c57ad36a8504043f9618279"} Dec 03 22:16:36.601057 master-0 kubenswrapper[36504]: I1203 22:16:36.601034 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:16:36.635617 master-0 kubenswrapper[36504]: I1203 22:16:36.630394 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" podStartSLOduration=1.396013821 podStartE2EDuration="6.630370989s" podCreationTimestamp="2025-12-03 22:16:30 +0000 UTC" firstStartedPulling="2025-12-03 22:16:31.101325788 +0000 UTC m=+356.321097795" lastFinishedPulling="2025-12-03 22:16:36.335682926 +0000 UTC m=+361.555454963" observedRunningTime="2025-12-03 22:16:36.625382223 +0000 UTC m=+361.845154230" watchObservedRunningTime="2025-12-03 22:16:36.630370989 +0000 UTC m=+361.850143016" Dec 03 22:16:36.662232 master-0 kubenswrapper[36504]: I1203 22:16:36.661545 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" podStartSLOduration=2.057164879 podStartE2EDuration="7.66152303s" podCreationTimestamp="2025-12-03 22:16:29 +0000 UTC" firstStartedPulling="2025-12-03 22:16:30.729723625 +0000 UTC m=+355.949495632" lastFinishedPulling="2025-12-03 22:16:36.334081776 +0000 UTC m=+361.553853783" observedRunningTime="2025-12-03 22:16:36.657501095 +0000 UTC m=+361.877273102" watchObservedRunningTime="2025-12-03 22:16:36.66152303 +0000 UTC m=+361.881295037" Dec 03 22:16:36.819449 master-0 kubenswrapper[36504]: W1203 22:16:36.819377 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1832a08a_47a4_4449_a9db_d11f44f4c154.slice/crio-d170b552383730edbee7142866fdba388c31fa78fb0786d3030834fe13e18e6e WatchSource:0}: Error finding container d170b552383730edbee7142866fdba388c31fa78fb0786d3030834fe13e18e6e: Status 404 returned error can't find the container with id 
d170b552383730edbee7142866fdba388c31fa78fb0786d3030834fe13e18e6e Dec 03 22:16:36.823138 master-0 kubenswrapper[36504]: I1203 22:16:36.823089 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-794866c946-5q8lb"] Dec 03 22:16:37.673500 master-0 kubenswrapper[36504]: I1203 22:16:37.673304 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x" event={"ID":"16ea3a17-93f8-4f62-9776-301fd54be174","Type":"ContainerStarted","Data":"0d19b1d010b0bdb3f6091d86ae702edec72d7592f5e03daa5d44e39fc15b5c50"} Dec 03 22:16:37.691197 master-0 kubenswrapper[36504]: I1203 22:16:37.691083 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-794866c946-5q8lb" event={"ID":"1832a08a-47a4-4449-a9db-d11f44f4c154","Type":"ContainerStarted","Data":"cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644"} Dec 03 22:16:37.691197 master-0 kubenswrapper[36504]: I1203 22:16:37.691210 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-794866c946-5q8lb" event={"ID":"1832a08a-47a4-4449-a9db-d11f44f4c154","Type":"ContainerStarted","Data":"d170b552383730edbee7142866fdba388c31fa78fb0786d3030834fe13e18e6e"} Dec 03 22:16:38.412447 master-0 kubenswrapper[36504]: I1203 22:16:38.412300 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-db99x" podStartSLOduration=2.8319516609999997 podStartE2EDuration="7.412266046s" podCreationTimestamp="2025-12-03 22:16:31 +0000 UTC" firstStartedPulling="2025-12-03 22:16:32.814073918 +0000 UTC m=+358.033845935" lastFinishedPulling="2025-12-03 22:16:37.394388313 +0000 UTC m=+362.614160320" observedRunningTime="2025-12-03 22:16:38.396962949 +0000 UTC m=+363.616735026" watchObservedRunningTime="2025-12-03 22:16:38.412266046 +0000 UTC m=+363.632038093" Dec 03 22:16:38.538426 master-0 kubenswrapper[36504]: I1203 22:16:38.538355 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-65qkf"] Dec 03 22:16:38.539918 master-0 kubenswrapper[36504]: I1203 22:16:38.539878 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.578566 master-0 kubenswrapper[36504]: I1203 22:16:38.575276 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65qkf"] Dec 03 22:16:38.612681 master-0 kubenswrapper[36504]: I1203 22:16:38.612594 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-794866c946-5q8lb" podStartSLOduration=4.612576946 podStartE2EDuration="4.612576946s" podCreationTimestamp="2025-12-03 22:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:16:38.606505496 +0000 UTC m=+363.826277523" watchObservedRunningTime="2025-12-03 22:16:38.612576946 +0000 UTC m=+363.832348953" Dec 03 22:16:38.675232 master-0 kubenswrapper[36504]: I1203 22:16:38.675082 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-catalog-content\") pod \"redhat-marketplace-65qkf\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.675544 master-0 kubenswrapper[36504]: I1203 22:16:38.675526 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-utilities\") pod \"redhat-marketplace-65qkf\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.675695 master-0 kubenswrapper[36504]: I1203 22:16:38.675682 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-299fd\" (UniqueName: \"kubernetes.io/projected/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-kube-api-access-299fd\") pod \"redhat-marketplace-65qkf\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.777873 master-0 kubenswrapper[36504]: I1203 22:16:38.777755 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-299fd\" (UniqueName: \"kubernetes.io/projected/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-kube-api-access-299fd\") pod \"redhat-marketplace-65qkf\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.778146 master-0 kubenswrapper[36504]: I1203 22:16:38.777897 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-catalog-content\") pod \"redhat-marketplace-65qkf\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.778146 master-0 kubenswrapper[36504]: I1203 22:16:38.777958 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-utilities\") pod \"redhat-marketplace-65qkf\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.778659 master-0 kubenswrapper[36504]: I1203 22:16:38.778630 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-utilities\") pod \"redhat-marketplace-65qkf\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.778970 master-0 kubenswrapper[36504]: I1203 22:16:38.778912 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-catalog-content\") pod \"redhat-marketplace-65qkf\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.838580 master-0 kubenswrapper[36504]: I1203 22:16:38.838529 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-299fd\" (UniqueName: \"kubernetes.io/projected/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-kube-api-access-299fd\") pod \"redhat-marketplace-65qkf\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:38.874865 master-0 kubenswrapper[36504]: I1203 22:16:38.874805 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:40.112372 master-0 kubenswrapper[36504]: I1203 22:16:40.112312 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-65qkf"] Dec 03 22:16:40.113713 master-0 kubenswrapper[36504]: W1203 22:16:40.113657 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b0e12e_c194_4f31_a4b4_8a88f29fbfd7.slice/crio-5da65573f30130b70d8db5a4e676496dff491b2403c569e62b6c9aaaa192f09c WatchSource:0}: Error finding container 5da65573f30130b70d8db5a4e676496dff491b2403c569e62b6c9aaaa192f09c: Status 404 returned error can't find the container with id 5da65573f30130b70d8db5a4e676496dff491b2403c569e62b6c9aaaa192f09c Dec 03 22:16:40.726899 master-0 kubenswrapper[36504]: I1203 22:16:40.726832 36504 generic.go:334] "Generic (PLEG): container finished" podID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerID="558f842262911e5e10c05948160f9ec4bb14297e890588e24ffae2517b3449ed" exitCode=0 Dec 03 22:16:40.727167 master-0 kubenswrapper[36504]: I1203 22:16:40.726908 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65qkf" event={"ID":"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7","Type":"ContainerDied","Data":"558f842262911e5e10c05948160f9ec4bb14297e890588e24ffae2517b3449ed"} Dec 03 22:16:40.727167 master-0 kubenswrapper[36504]: I1203 22:16:40.726946 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65qkf" event={"ID":"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7","Type":"ContainerStarted","Data":"5da65573f30130b70d8db5a4e676496dff491b2403c569e62b6c9aaaa192f09c"} Dec 03 22:16:42.138359 master-0 kubenswrapper[36504]: I1203 22:16:42.138276 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b"] Dec 03 22:16:42.139958 master-0 kubenswrapper[36504]: I1203 22:16:42.139582 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" Dec 03 22:16:42.141735 master-0 kubenswrapper[36504]: I1203 22:16:42.141656 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 03 22:16:42.145128 master-0 kubenswrapper[36504]: I1203 22:16:42.145074 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 03 22:16:42.150978 master-0 kubenswrapper[36504]: I1203 22:16:42.149686 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b"] Dec 03 22:16:42.260028 master-0 kubenswrapper[36504]: I1203 22:16:42.259969 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddceaa52-cf30-4df1-90ce-e353e62ab8b1-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l495b\" (UID: \"ddceaa52-cf30-4df1-90ce-e353e62ab8b1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" Dec 03 22:16:42.260428 master-0 kubenswrapper[36504]: I1203 22:16:42.260395 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98xzr\" (UniqueName: \"kubernetes.io/projected/ddceaa52-cf30-4df1-90ce-e353e62ab8b1-kube-api-access-98xzr\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l495b\" (UID: \"ddceaa52-cf30-4df1-90ce-e353e62ab8b1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" Dec 03 22:16:42.362708 master-0 kubenswrapper[36504]: I1203 22:16:42.362525 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98xzr\" (UniqueName: \"kubernetes.io/projected/ddceaa52-cf30-4df1-90ce-e353e62ab8b1-kube-api-access-98xzr\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l495b\" (UID: \"ddceaa52-cf30-4df1-90ce-e353e62ab8b1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" Dec 03 22:16:42.362708 master-0 kubenswrapper[36504]: I1203 22:16:42.362702 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddceaa52-cf30-4df1-90ce-e353e62ab8b1-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l495b\" (UID: \"ddceaa52-cf30-4df1-90ce-e353e62ab8b1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" Dec 03 22:16:42.363501 master-0 kubenswrapper[36504]: I1203 22:16:42.363446 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddceaa52-cf30-4df1-90ce-e353e62ab8b1-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l495b\" (UID: \"ddceaa52-cf30-4df1-90ce-e353e62ab8b1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" Dec 03 22:16:42.563970 master-0 kubenswrapper[36504]: I1203 22:16:42.563912 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98xzr\" (UniqueName: \"kubernetes.io/projected/ddceaa52-cf30-4df1-90ce-e353e62ab8b1-kube-api-access-98xzr\") pod \"cert-manager-operator-controller-manager-64cf6dff88-l495b\" (UID: \"ddceaa52-cf30-4df1-90ce-e353e62ab8b1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" 
Dec 03 22:16:42.747663 master-0 kubenswrapper[36504]: I1203 22:16:42.747597 36504 generic.go:334] "Generic (PLEG): container finished" podID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerID="76aa3390ccdd12f1b87a3428d1fdfded5bbd1c1580db57fb35229438b4e4c458" exitCode=0 Dec 03 22:16:42.747663 master-0 kubenswrapper[36504]: I1203 22:16:42.747652 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65qkf" event={"ID":"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7","Type":"ContainerDied","Data":"76aa3390ccdd12f1b87a3428d1fdfded5bbd1c1580db57fb35229438b4e4c458"} Dec 03 22:16:42.760488 master-0 kubenswrapper[36504]: I1203 22:16:42.760402 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" Dec 03 22:16:43.277371 master-0 kubenswrapper[36504]: I1203 22:16:43.274747 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b"] Dec 03 22:16:43.310212 master-0 kubenswrapper[36504]: W1203 22:16:43.309760 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddceaa52_cf30_4df1_90ce_e353e62ab8b1.slice/crio-399b545a4aab2f221b93a7658937ca26c63670827f58a2a83de2af179116dff9 WatchSource:0}: Error finding container 399b545a4aab2f221b93a7658937ca26c63670827f58a2a83de2af179116dff9: Status 404 returned error can't find the container with id 399b545a4aab2f221b93a7658937ca26c63670827f58a2a83de2af179116dff9 Dec 03 22:16:43.757124 master-0 kubenswrapper[36504]: I1203 22:16:43.757041 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" event={"ID":"ddceaa52-cf30-4df1-90ce-e353e62ab8b1","Type":"ContainerStarted","Data":"399b545a4aab2f221b93a7658937ca26c63670827f58a2a83de2af179116dff9"} Dec 03 22:16:43.761011 master-0 kubenswrapper[36504]: I1203 22:16:43.760951 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65qkf" event={"ID":"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7","Type":"ContainerStarted","Data":"e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960"} Dec 03 22:16:43.796436 master-0 kubenswrapper[36504]: I1203 22:16:43.796179 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-65qkf" podStartSLOduration=4.157739854 podStartE2EDuration="6.7961438s" podCreationTimestamp="2025-12-03 22:16:37 +0000 UTC" firstStartedPulling="2025-12-03 22:16:40.72871447 +0000 UTC m=+365.948486477" lastFinishedPulling="2025-12-03 22:16:43.367118426 +0000 UTC m=+368.586890423" observedRunningTime="2025-12-03 22:16:43.787062506 +0000 UTC m=+369.006834523" watchObservedRunningTime="2025-12-03 22:16:43.7961438 +0000 UTC m=+369.015915807" Dec 03 22:16:44.952677 master-0 kubenswrapper[36504]: I1203 22:16:44.952234 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:44.952677 master-0 kubenswrapper[36504]: I1203 22:16:44.952278 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:44.979602 master-0 kubenswrapper[36504]: I1203 22:16:44.979537 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:45.852735 master-0 kubenswrapper[36504]: I1203 22:16:45.852672 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:16:46.347179 master-0 kubenswrapper[36504]: I1203 22:16:46.347032 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66896f8657-9276x"] Dec 03 22:16:48.876425 master-0 kubenswrapper[36504]: I1203 22:16:48.876255 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:48.876425 master-0 kubenswrapper[36504]: I1203 22:16:48.876373 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:49.117966 master-0 kubenswrapper[36504]: I1203 22:16:49.109234 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:50.117332 master-0 kubenswrapper[36504]: I1203 22:16:50.117216 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:50.590824 master-0 kubenswrapper[36504]: I1203 22:16:50.590512 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5664d6fd-phlzw" Dec 03 22:16:53.478649 master-0 kubenswrapper[36504]: I1203 22:16:53.478350 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-65qkf"] Dec 03 22:16:53.479487 master-0 kubenswrapper[36504]: I1203 22:16:53.479062 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-65qkf" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerName="registry-server" containerID="cri-o://e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960" gracePeriod=2 Dec 03 22:16:53.957080 master-0 kubenswrapper[36504]: I1203 22:16:53.956992 36504 generic.go:334] "Generic (PLEG): container finished" podID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerID="e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960" exitCode=0 Dec 03 22:16:53.957080 master-0 kubenswrapper[36504]: I1203 22:16:53.957074 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65qkf" event={"ID":"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7","Type":"ContainerDied","Data":"e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960"} Dec 03 22:16:56.311735 master-0 kubenswrapper[36504]: I1203 22:16:56.311633 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb"] Dec 03 22:16:56.316523 master-0 kubenswrapper[36504]: I1203 22:16:56.316462 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb" Dec 03 22:16:56.319327 master-0 kubenswrapper[36504]: I1203 22:16:56.319257 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 03 22:16:56.319897 master-0 kubenswrapper[36504]: I1203 22:16:56.319590 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 03 22:16:56.341158 master-0 kubenswrapper[36504]: I1203 22:16:56.338645 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb"] Dec 03 22:16:56.393967 master-0 kubenswrapper[36504]: I1203 22:16:56.393869 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r245g\" (UniqueName: \"kubernetes.io/projected/951ea7b4-82ce-43f2-94a5-3890e0c71877-kube-api-access-r245g\") pod \"obo-prometheus-operator-668cf9dfbb-h2tcb\" (UID: \"951ea7b4-82ce-43f2-94a5-3890e0c71877\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb" Dec 03 22:16:56.496218 master-0 kubenswrapper[36504]: I1203 22:16:56.496139 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r245g\" (UniqueName: \"kubernetes.io/projected/951ea7b4-82ce-43f2-94a5-3890e0c71877-kube-api-access-r245g\") pod \"obo-prometheus-operator-668cf9dfbb-h2tcb\" (UID: \"951ea7b4-82ce-43f2-94a5-3890e0c71877\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb" Dec 03 22:16:56.536291 master-0 kubenswrapper[36504]: I1203 22:16:56.536212 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r245g\" (UniqueName: \"kubernetes.io/projected/951ea7b4-82ce-43f2-94a5-3890e0c71877-kube-api-access-r245g\") pod \"obo-prometheus-operator-668cf9dfbb-h2tcb\" (UID: \"951ea7b4-82ce-43f2-94a5-3890e0c71877\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb" Dec 03 22:16:56.621983 master-0 kubenswrapper[36504]: I1203 22:16:56.565253 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2"] Dec 03 22:16:56.621983 master-0 kubenswrapper[36504]: I1203 22:16:56.566562 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" Dec 03 22:16:56.621983 master-0 kubenswrapper[36504]: I1203 22:16:56.571331 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 03 22:16:56.621983 master-0 kubenswrapper[36504]: I1203 22:16:56.588317 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm"] Dec 03 22:16:56.630107 master-0 kubenswrapper[36504]: I1203 22:16:56.630050 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" Dec 03 22:16:56.677356 master-0 kubenswrapper[36504]: I1203 22:16:56.677280 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb" Dec 03 22:16:56.686749 master-0 kubenswrapper[36504]: I1203 22:16:56.686704 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2"] Dec 03 22:16:56.698060 master-0 kubenswrapper[36504]: I1203 22:16:56.698026 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm"] Dec 03 22:16:56.729097 master-0 kubenswrapper[36504]: I1203 22:16:56.729062 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1222f5f-afc5-4a2f-938b-f2f26567cab5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2\" (UID: \"f1222f5f-afc5-4a2f-938b-f2f26567cab5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" Dec 03 22:16:56.729306 master-0 kubenswrapper[36504]: I1203 22:16:56.729284 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1222f5f-afc5-4a2f-938b-f2f26567cab5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2\" (UID: \"f1222f5f-afc5-4a2f-938b-f2f26567cab5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" Dec 03 22:16:56.831240 master-0 kubenswrapper[36504]: I1203 22:16:56.831191 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/33be1f02-0ef4-497b-b25d-90a30e49f70a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm\" (UID: \"33be1f02-0ef4-497b-b25d-90a30e49f70a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" Dec 03 22:16:56.831660 master-0 kubenswrapper[36504]: I1203 22:16:56.831542 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1222f5f-afc5-4a2f-938b-f2f26567cab5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2\" (UID: \"f1222f5f-afc5-4a2f-938b-f2f26567cab5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" Dec 03 22:16:56.833944 master-0 kubenswrapper[36504]: I1203 22:16:56.831889 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1222f5f-afc5-4a2f-938b-f2f26567cab5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2\" (UID: \"f1222f5f-afc5-4a2f-938b-f2f26567cab5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" Dec 03 22:16:56.833944 master-0 kubenswrapper[36504]: I1203 22:16:56.832733 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/33be1f02-0ef4-497b-b25d-90a30e49f70a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm\" (UID: \"33be1f02-0ef4-497b-b25d-90a30e49f70a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" Dec 03 22:16:56.834836 master-0 kubenswrapper[36504]: I1203 22:16:56.834809 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f1222f5f-afc5-4a2f-938b-f2f26567cab5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2\" (UID: \"f1222f5f-afc5-4a2f-938b-f2f26567cab5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" Dec 03 22:16:56.835608 master-0 kubenswrapper[36504]: I1203 22:16:56.835536 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f1222f5f-afc5-4a2f-938b-f2f26567cab5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2\" (UID: \"f1222f5f-afc5-4a2f-938b-f2f26567cab5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" Dec 03 22:16:56.934072 master-0 kubenswrapper[36504]: I1203 22:16:56.933906 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/33be1f02-0ef4-497b-b25d-90a30e49f70a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm\" (UID: \"33be1f02-0ef4-497b-b25d-90a30e49f70a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" Dec 03 22:16:56.934072 master-0 kubenswrapper[36504]: I1203 22:16:56.933979 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/33be1f02-0ef4-497b-b25d-90a30e49f70a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm\" (UID: \"33be1f02-0ef4-497b-b25d-90a30e49f70a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" Dec 03 22:16:56.937703 master-0 kubenswrapper[36504]: I1203 22:16:56.937640 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/33be1f02-0ef4-497b-b25d-90a30e49f70a-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm\" (UID: \"33be1f02-0ef4-497b-b25d-90a30e49f70a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" Dec 03 22:16:56.938440 master-0 kubenswrapper[36504]: I1203 22:16:56.938406 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/33be1f02-0ef4-497b-b25d-90a30e49f70a-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm\" (UID: \"33be1f02-0ef4-497b-b25d-90a30e49f70a\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" Dec 03 22:16:56.966176 master-0 kubenswrapper[36504]: I1203 22:16:56.966117 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" Dec 03 22:16:56.994090 master-0 kubenswrapper[36504]: I1203 22:16:56.994032 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" Dec 03 22:16:58.132896 master-0 kubenswrapper[36504]: I1203 22:16:58.132668 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-q7tnk"] Dec 03 22:16:58.133889 master-0 kubenswrapper[36504]: I1203 22:16:58.133736 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:16:58.136530 master-0 kubenswrapper[36504]: I1203 22:16:58.136502 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 03 22:16:58.140931 master-0 kubenswrapper[36504]: I1203 22:16:58.140316 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-q7tnk"] Dec 03 22:16:58.165190 master-0 kubenswrapper[36504]: I1203 22:16:58.165135 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq8rq\" (UniqueName: \"kubernetes.io/projected/09b4c1d6-bc07-43aa-bdf5-22698a6d1075-kube-api-access-gq8rq\") pod \"observability-operator-d8bb48f5d-q7tnk\" (UID: \"09b4c1d6-bc07-43aa-bdf5-22698a6d1075\") " pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:16:58.165364 master-0 kubenswrapper[36504]: I1203 22:16:58.165297 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/09b4c1d6-bc07-43aa-bdf5-22698a6d1075-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-q7tnk\" (UID: \"09b4c1d6-bc07-43aa-bdf5-22698a6d1075\") " pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:16:58.266091 master-0 kubenswrapper[36504]: I1203 22:16:58.266000 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/09b4c1d6-bc07-43aa-bdf5-22698a6d1075-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-q7tnk\" (UID: \"09b4c1d6-bc07-43aa-bdf5-22698a6d1075\") " pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:16:58.266252 master-0 kubenswrapper[36504]: I1203 22:16:58.266099 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq8rq\" (UniqueName: \"kubernetes.io/projected/09b4c1d6-bc07-43aa-bdf5-22698a6d1075-kube-api-access-gq8rq\") pod \"observability-operator-d8bb48f5d-q7tnk\" (UID: \"09b4c1d6-bc07-43aa-bdf5-22698a6d1075\") " pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:16:58.270493 master-0 kubenswrapper[36504]: I1203 22:16:58.270472 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/09b4c1d6-bc07-43aa-bdf5-22698a6d1075-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-q7tnk\" (UID: \"09b4c1d6-bc07-43aa-bdf5-22698a6d1075\") " pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:16:58.419077 master-0 kubenswrapper[36504]: I1203 22:16:58.418651 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq8rq\" (UniqueName: \"kubernetes.io/projected/09b4c1d6-bc07-43aa-bdf5-22698a6d1075-kube-api-access-gq8rq\") pod \"observability-operator-d8bb48f5d-q7tnk\" (UID: \"09b4c1d6-bc07-43aa-bdf5-22698a6d1075\") " pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:16:58.486710 master-0 kubenswrapper[36504]: I1203 22:16:58.480014 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:16:58.610972 master-0 kubenswrapper[36504]: I1203 22:16:58.608612 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nqqrb"] Dec 03 22:16:58.616048 master-0 kubenswrapper[36504]: I1203 22:16:58.613962 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:16:58.681809 master-0 kubenswrapper[36504]: I1203 22:16:58.678366 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nqqrb"] Dec 03 22:16:58.693858 master-0 kubenswrapper[36504]: I1203 22:16:58.690634 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp7zv\" (UniqueName: \"kubernetes.io/projected/19acbdf7-1001-4193-a1f9-a1accf771fe9-kube-api-access-xp7zv\") pod \"perses-operator-5446b9c989-nqqrb\" (UID: \"19acbdf7-1001-4193-a1f9-a1accf771fe9\") " pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:16:58.693858 master-0 kubenswrapper[36504]: I1203 22:16:58.690874 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/19acbdf7-1001-4193-a1f9-a1accf771fe9-openshift-service-ca\") pod \"perses-operator-5446b9c989-nqqrb\" (UID: \"19acbdf7-1001-4193-a1f9-a1accf771fe9\") " pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:16:58.795504 master-0 kubenswrapper[36504]: I1203 22:16:58.792218 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/19acbdf7-1001-4193-a1f9-a1accf771fe9-openshift-service-ca\") pod \"perses-operator-5446b9c989-nqqrb\" (UID: \"19acbdf7-1001-4193-a1f9-a1accf771fe9\") " pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:16:58.795504 master-0 kubenswrapper[36504]: I1203 22:16:58.792319 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp7zv\" (UniqueName: \"kubernetes.io/projected/19acbdf7-1001-4193-a1f9-a1accf771fe9-kube-api-access-xp7zv\") pod \"perses-operator-5446b9c989-nqqrb\" (UID: \"19acbdf7-1001-4193-a1f9-a1accf771fe9\") " pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:16:58.795504 master-0 kubenswrapper[36504]: I1203 22:16:58.794203 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/19acbdf7-1001-4193-a1f9-a1accf771fe9-openshift-service-ca\") pod \"perses-operator-5446b9c989-nqqrb\" (UID: \"19acbdf7-1001-4193-a1f9-a1accf771fe9\") " pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:16:58.838526 master-0 kubenswrapper[36504]: I1203 22:16:58.838389 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp7zv\" (UniqueName: \"kubernetes.io/projected/19acbdf7-1001-4193-a1f9-a1accf771fe9-kube-api-access-xp7zv\") pod \"perses-operator-5446b9c989-nqqrb\" (UID: \"19acbdf7-1001-4193-a1f9-a1accf771fe9\") " pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:16:58.887553 master-0 kubenswrapper[36504]: E1203 22:16:58.876970 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960 is running failed: container process not found" containerID="e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 22:16:58.887553 master-0 kubenswrapper[36504]: E1203 22:16:58.877561 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960 is running failed: container process not found" containerID="e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 22:16:58.887553 master-0 kubenswrapper[36504]: E1203 22:16:58.877830 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960 is running failed: container process not found" containerID="e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 22:16:58.887553 master-0 kubenswrapper[36504]: E1203 22:16:58.877859 36504 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-65qkf" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerName="registry-server" Dec 03 22:16:58.898420 master-0 kubenswrapper[36504]: I1203 22:16:58.898372 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:58.997658 master-0 kubenswrapper[36504]: I1203 22:16:58.996073 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-299fd\" (UniqueName: \"kubernetes.io/projected/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-kube-api-access-299fd\") pod \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " Dec 03 22:16:58.997658 master-0 kubenswrapper[36504]: I1203 22:16:58.996174 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-utilities\") pod \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " Dec 03 22:16:58.997658 master-0 kubenswrapper[36504]: I1203 22:16:58.996226 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-catalog-content\") pod \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\" (UID: \"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7\") " Dec 03 22:16:59.000284 master-0 kubenswrapper[36504]: I1203 22:16:58.998562 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-utilities" (OuterVolumeSpecName: "utilities") pod "52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" (UID: "52b0e12e-c194-4f31-a4b4-8a88f29fbfd7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:59.001065 master-0 kubenswrapper[36504]: I1203 22:16:59.000993 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-kube-api-access-299fd" (OuterVolumeSpecName: "kube-api-access-299fd") pod "52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" (UID: "52b0e12e-c194-4f31-a4b4-8a88f29fbfd7"). InnerVolumeSpecName "kube-api-access-299fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:16:59.008053 master-0 kubenswrapper[36504]: I1203 22:16:59.008010 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-65qkf" event={"ID":"52b0e12e-c194-4f31-a4b4-8a88f29fbfd7","Type":"ContainerDied","Data":"5da65573f30130b70d8db5a4e676496dff491b2403c569e62b6c9aaaa192f09c"} Dec 03 22:16:59.008232 master-0 kubenswrapper[36504]: I1203 22:16:59.008076 36504 scope.go:117] "RemoveContainer" containerID="e5ed74ee8336146b619986ebecb2df6f8a32cb5fbe1be3c7c72e2f17642a2960" Dec 03 22:16:59.008232 master-0 kubenswrapper[36504]: I1203 22:16:59.008087 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-65qkf" Dec 03 22:16:59.009791 master-0 kubenswrapper[36504]: I1203 22:16:59.009754 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" event={"ID":"ddceaa52-cf30-4df1-90ce-e353e62ab8b1","Type":"ContainerStarted","Data":"059eca3759ed91fb1a1dfc2a0ae500bd265b5a63ca46f34a8e744e522d765688"} Dec 03 22:16:59.019109 master-0 kubenswrapper[36504]: I1203 22:16:59.019065 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" (UID: "52b0e12e-c194-4f31-a4b4-8a88f29fbfd7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:16:59.056941 master-0 kubenswrapper[36504]: I1203 22:16:59.056229 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:16:59.081873 master-0 kubenswrapper[36504]: I1203 22:16:59.081809 36504 scope.go:117] "RemoveContainer" containerID="76aa3390ccdd12f1b87a3428d1fdfded5bbd1c1580db57fb35229438b4e4c458" Dec 03 22:16:59.092639 master-0 kubenswrapper[36504]: I1203 22:16:59.091439 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb"] Dec 03 22:16:59.110518 master-0 kubenswrapper[36504]: I1203 22:16:59.110363 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-299fd\" (UniqueName: \"kubernetes.io/projected/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-kube-api-access-299fd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:59.110518 master-0 kubenswrapper[36504]: I1203 22:16:59.110403 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:59.110518 master-0 kubenswrapper[36504]: I1203 22:16:59.110416 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:16:59.126954 master-0 kubenswrapper[36504]: I1203 22:16:59.126916 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm"] Dec 03 22:16:59.128052 master-0 kubenswrapper[36504]: I1203 22:16:59.126968 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2"] Dec 03 22:16:59.133814 master-0 kubenswrapper[36504]: I1203 22:16:59.133532 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-l495b" podStartSLOduration=2.267304498 podStartE2EDuration="17.133508238s" podCreationTimestamp="2025-12-03 22:16:42 +0000 UTC" firstStartedPulling="2025-12-03 22:16:43.316092714 +0000 UTC m=+368.535864721" lastFinishedPulling="2025-12-03 22:16:58.182296454 +0000 UTC m=+383.402068461" observedRunningTime="2025-12-03 22:16:59.106952189 +0000 UTC m=+384.326724206" watchObservedRunningTime="2025-12-03 22:16:59.133508238 +0000 UTC m=+384.353280245" Dec 03 22:16:59.146952 master-0 kubenswrapper[36504]: I1203 22:16:59.146909 36504 scope.go:117] "RemoveContainer" containerID="558f842262911e5e10c05948160f9ec4bb14297e890588e24ffae2517b3449ed" Dec 03 22:16:59.167010 master-0 kubenswrapper[36504]: I1203 22:16:59.166573 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-q7tnk"] Dec 03 22:16:59.350150 master-0 kubenswrapper[36504]: I1203 22:16:59.350089 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-65qkf"] Dec 03 22:16:59.371560 master-0 kubenswrapper[36504]: I1203 22:16:59.365526 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-65qkf"] Dec 03 22:16:59.633278 master-0 kubenswrapper[36504]: I1203 22:16:59.632644 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-nqqrb"] Dec 03 22:16:59.662301 master-0 kubenswrapper[36504]: W1203 22:16:59.656936 36504 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19acbdf7_1001_4193_a1f9_a1accf771fe9.slice/crio-9b06054c819f0eed5572d1de9671d0f2062b337a89e0581093f2c06600d231fe WatchSource:0}: Error finding container 9b06054c819f0eed5572d1de9671d0f2062b337a89e0581093f2c06600d231fe: Status 404 returned error can't find the container with id 9b06054c819f0eed5572d1de9671d0f2062b337a89e0581093f2c06600d231fe Dec 03 22:17:00.018413 master-0 kubenswrapper[36504]: I1203 22:17:00.018243 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" event={"ID":"09b4c1d6-bc07-43aa-bdf5-22698a6d1075","Type":"ContainerStarted","Data":"0df3a6bf203413343809b70914967487c1c80656e4b7282b224f83e1186e64b5"} Dec 03 22:17:00.022591 master-0 kubenswrapper[36504]: I1203 22:17:00.022546 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" event={"ID":"33be1f02-0ef4-497b-b25d-90a30e49f70a","Type":"ContainerStarted","Data":"6a70d2fe01cbf78ab9f1b301560b92fba7479154ce7384a5597188d5cf0112bd"} Dec 03 22:17:00.023552 master-0 kubenswrapper[36504]: I1203 22:17:00.023513 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" event={"ID":"f1222f5f-afc5-4a2f-938b-f2f26567cab5","Type":"ContainerStarted","Data":"94a4d1fb08658600ed0e1d17c3e8f4388e1fe976dc3fe9bd891bccd5bc480c88"} Dec 03 22:17:00.024589 master-0 kubenswrapper[36504]: I1203 22:17:00.024547 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb" event={"ID":"951ea7b4-82ce-43f2-94a5-3890e0c71877","Type":"ContainerStarted","Data":"0ed09f48181b1a9c50153432b4fd5cc1716df32ed720261d8d5e2927579d5095"} Dec 03 22:17:00.026526 master-0 kubenswrapper[36504]: I1203 22:17:00.025561 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-nqqrb" event={"ID":"19acbdf7-1001-4193-a1f9-a1accf771fe9","Type":"ContainerStarted","Data":"9b06054c819f0eed5572d1de9671d0f2062b337a89e0581093f2c06600d231fe"} Dec 03 22:17:01.109196 master-0 kubenswrapper[36504]: I1203 22:17:01.109137 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" path="/var/lib/kubelet/pods/52b0e12e-c194-4f31-a4b4-8a88f29fbfd7/volumes" Dec 03 22:17:04.095812 master-0 kubenswrapper[36504]: I1203 22:17:04.095705 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:17:05.379684 master-0 kubenswrapper[36504]: I1203 22:17:05.378766 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-vnp9x"] Dec 03 22:17:05.379684 master-0 kubenswrapper[36504]: E1203 22:17:05.379594 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerName="extract-content" Dec 03 22:17:05.379684 master-0 kubenswrapper[36504]: I1203 22:17:05.379617 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerName="extract-content" Dec 03 22:17:05.379684 master-0 kubenswrapper[36504]: E1203 22:17:05.379637 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerName="registry-server" Dec 03 22:17:05.379684 master-0 kubenswrapper[36504]: I1203 22:17:05.379650 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerName="registry-server" Dec 03 22:17:05.379684 master-0 kubenswrapper[36504]: E1203 22:17:05.379675 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerName="extract-utilities" Dec 03 22:17:05.379684 master-0 kubenswrapper[36504]: I1203 22:17:05.379688 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerName="extract-utilities" Dec 03 22:17:05.380740 master-0 kubenswrapper[36504]: I1203 22:17:05.380126 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="52b0e12e-c194-4f31-a4b4-8a88f29fbfd7" containerName="registry-server" Dec 03 22:17:05.385156 master-0 kubenswrapper[36504]: I1203 22:17:05.384397 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:05.389153 master-0 kubenswrapper[36504]: I1203 22:17:05.389100 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-vnp9x"] Dec 03 22:17:05.408813 master-0 kubenswrapper[36504]: I1203 22:17:05.395805 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 22:17:05.408813 master-0 kubenswrapper[36504]: I1203 22:17:05.396144 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 22:17:05.469965 master-0 kubenswrapper[36504]: I1203 22:17:05.469578 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56nzf\" (UniqueName: \"kubernetes.io/projected/387eac4b-3ef7-4290-822f-fcf198ad50b9-kube-api-access-56nzf\") pod \"cert-manager-webhook-f4fb5df64-vnp9x\" (UID: \"387eac4b-3ef7-4290-822f-fcf198ad50b9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:05.469965 master-0 kubenswrapper[36504]: I1203 22:17:05.469685 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/387eac4b-3ef7-4290-822f-fcf198ad50b9-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-vnp9x\" (UID: \"387eac4b-3ef7-4290-822f-fcf198ad50b9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:05.553726 master-0 kubenswrapper[36504]: I1203 22:17:05.553653 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf"] Dec 03 22:17:05.554983 master-0 kubenswrapper[36504]: I1203 22:17:05.554937 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" Dec 03 22:17:05.571626 master-0 kubenswrapper[36504]: I1203 22:17:05.571558 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/387eac4b-3ef7-4290-822f-fcf198ad50b9-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-vnp9x\" (UID: \"387eac4b-3ef7-4290-822f-fcf198ad50b9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:05.571927 master-0 kubenswrapper[36504]: I1203 22:17:05.571796 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56nzf\" (UniqueName: \"kubernetes.io/projected/387eac4b-3ef7-4290-822f-fcf198ad50b9-kube-api-access-56nzf\") pod \"cert-manager-webhook-f4fb5df64-vnp9x\" (UID: \"387eac4b-3ef7-4290-822f-fcf198ad50b9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:05.579357 master-0 kubenswrapper[36504]: I1203 22:17:05.579314 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf"] Dec 03 22:17:05.609854 master-0 kubenswrapper[36504]: I1203 22:17:05.603217 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56nzf\" (UniqueName: \"kubernetes.io/projected/387eac4b-3ef7-4290-822f-fcf198ad50b9-kube-api-access-56nzf\") pod \"cert-manager-webhook-f4fb5df64-vnp9x\" (UID: \"387eac4b-3ef7-4290-822f-fcf198ad50b9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:05.625484 master-0 kubenswrapper[36504]: I1203 22:17:05.625413 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/387eac4b-3ef7-4290-822f-fcf198ad50b9-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-vnp9x\" (UID: \"387eac4b-3ef7-4290-822f-fcf198ad50b9\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:05.681437 master-0 kubenswrapper[36504]: I1203 22:17:05.675977 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2sxq\" (UniqueName: \"kubernetes.io/projected/05c0c9d6-6747-4c9c-946c-7e3eb71388a2-kube-api-access-k2sxq\") pod \"cert-manager-cainjector-855d9ccff4-vrzdf\" (UID: \"05c0c9d6-6747-4c9c-946c-7e3eb71388a2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" Dec 03 22:17:05.681437 master-0 kubenswrapper[36504]: I1203 22:17:05.676158 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05c0c9d6-6747-4c9c-946c-7e3eb71388a2-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vrzdf\" (UID: \"05c0c9d6-6747-4c9c-946c-7e3eb71388a2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" Dec 03 22:17:05.748325 master-0 kubenswrapper[36504]: I1203 22:17:05.748249 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:05.779685 master-0 kubenswrapper[36504]: I1203 22:17:05.778128 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05c0c9d6-6747-4c9c-946c-7e3eb71388a2-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vrzdf\" (UID: \"05c0c9d6-6747-4c9c-946c-7e3eb71388a2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" Dec 03 22:17:05.779685 master-0 kubenswrapper[36504]: I1203 22:17:05.778218 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2sxq\" (UniqueName: \"kubernetes.io/projected/05c0c9d6-6747-4c9c-946c-7e3eb71388a2-kube-api-access-k2sxq\") pod \"cert-manager-cainjector-855d9ccff4-vrzdf\" (UID: \"05c0c9d6-6747-4c9c-946c-7e3eb71388a2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" Dec 03 22:17:05.797588 master-0 kubenswrapper[36504]: I1203 22:17:05.797527 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05c0c9d6-6747-4c9c-946c-7e3eb71388a2-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vrzdf\" (UID: \"05c0c9d6-6747-4c9c-946c-7e3eb71388a2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" Dec 03 22:17:05.800245 master-0 kubenswrapper[36504]: I1203 22:17:05.800202 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2sxq\" (UniqueName: \"kubernetes.io/projected/05c0c9d6-6747-4c9c-946c-7e3eb71388a2-kube-api-access-k2sxq\") pod \"cert-manager-cainjector-855d9ccff4-vrzdf\" (UID: \"05c0c9d6-6747-4c9c-946c-7e3eb71388a2\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" Dec 03 22:17:05.899331 master-0 kubenswrapper[36504]: I1203 22:17:05.899278 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" Dec 03 22:17:06.589618 master-0 kubenswrapper[36504]: I1203 22:17:06.589555 36504 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:17:06.590288 master-0 kubenswrapper[36504]: I1203 22:17:06.589943 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="cluster-policy-controller" containerID="cri-o://5ecce271c239c91c72a501424ca7e835ff72e1bf3e5847efbd0d8ee1120b7b78" gracePeriod=30 Dec 03 22:17:06.590288 master-0 kubenswrapper[36504]: I1203 22:17:06.590145 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager" containerID="cri-o://43a0ef9d17ea7612b3d56ad1047e202234ab9843605fa56bd910e99667c96ddf" gracePeriod=30 Dec 03 22:17:06.590288 master-0 kubenswrapper[36504]: I1203 22:17:06.590200 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://f632f4d3a35c98012f1cece56605d69139c5283e86fa145d2f6236cf3af716de" gracePeriod=30 Dec 03 22:17:06.590288 master-0 kubenswrapper[36504]: I1203 22:17:06.590243 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://affb4bc279c4e26b0213bf26fa803d2a6b54fe054c87700ae68e278a97fca108" gracePeriod=30 Dec 03 22:17:06.591511 master-0 kubenswrapper[36504]: I1203 22:17:06.591478 36504 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:17:06.591901 master-0 kubenswrapper[36504]: E1203 22:17:06.591872 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager-recovery-controller" Dec 03 22:17:06.591901 master-0 kubenswrapper[36504]: I1203 22:17:06.591897 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager-recovery-controller" Dec 03 22:17:06.592023 master-0 kubenswrapper[36504]: E1203 22:17:06.591931 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager" Dec 03 22:17:06.592023 master-0 kubenswrapper[36504]: I1203 22:17:06.591942 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager" Dec 03 22:17:06.592023 master-0 kubenswrapper[36504]: E1203 22:17:06.591951 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="cluster-policy-controller" Dec 03 22:17:06.592023 master-0 kubenswrapper[36504]: I1203 22:17:06.591958 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="cluster-policy-controller" Dec 03 22:17:06.592023 master-0 
kubenswrapper[36504]: E1203 22:17:06.591983 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager-cert-syncer" Dec 03 22:17:06.592023 master-0 kubenswrapper[36504]: I1203 22:17:06.591992 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager-cert-syncer" Dec 03 22:17:06.592680 master-0 kubenswrapper[36504]: I1203 22:17:06.592651 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager-cert-syncer" Dec 03 22:17:06.592750 master-0 kubenswrapper[36504]: I1203 22:17:06.592682 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="cluster-policy-controller" Dec 03 22:17:06.592750 master-0 kubenswrapper[36504]: I1203 22:17:06.592704 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager" Dec 03 22:17:06.592750 master-0 kubenswrapper[36504]: I1203 22:17:06.592714 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager" Dec 03 22:17:06.592750 master-0 kubenswrapper[36504]: I1203 22:17:06.592735 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager-recovery-controller" Dec 03 22:17:06.593158 master-0 kubenswrapper[36504]: E1203 22:17:06.593134 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager" Dec 03 22:17:06.593158 master-0 kubenswrapper[36504]: I1203 22:17:06.593154 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="287a70e59cc5430b23b208b9a03b5ac7" containerName="kube-controller-manager" Dec 03 22:17:06.692806 master-0 kubenswrapper[36504]: I1203 22:17:06.692724 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0c5324c227fe84467262d69e571a132a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0c5324c227fe84467262d69e571a132a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:06.693150 master-0 kubenswrapper[36504]: I1203 22:17:06.692860 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0c5324c227fe84467262d69e571a132a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0c5324c227fe84467262d69e571a132a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:06.794856 master-0 kubenswrapper[36504]: I1203 22:17:06.794502 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0c5324c227fe84467262d69e571a132a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0c5324c227fe84467262d69e571a132a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:06.794856 master-0 kubenswrapper[36504]: I1203 22:17:06.794570 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0c5324c227fe84467262d69e571a132a-cert-dir\") pod 
\"kube-controller-manager-master-0\" (UID: \"0c5324c227fe84467262d69e571a132a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:06.794856 master-0 kubenswrapper[36504]: I1203 22:17:06.794699 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0c5324c227fe84467262d69e571a132a-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0c5324c227fe84467262d69e571a132a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:06.794856 master-0 kubenswrapper[36504]: I1203 22:17:06.794745 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0c5324c227fe84467262d69e571a132a-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0c5324c227fe84467262d69e571a132a\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:07.142386 master-0 kubenswrapper[36504]: I1203 22:17:07.142320 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_287a70e59cc5430b23b208b9a03b5ac7/kube-controller-manager-cert-syncer/0.log" Dec 03 22:17:07.146993 master-0 kubenswrapper[36504]: I1203 22:17:07.146797 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_287a70e59cc5430b23b208b9a03b5ac7/kube-controller-manager/0.log" Dec 03 22:17:07.146993 master-0 kubenswrapper[36504]: I1203 22:17:07.146866 36504 generic.go:334] "Generic (PLEG): container finished" podID="287a70e59cc5430b23b208b9a03b5ac7" containerID="43a0ef9d17ea7612b3d56ad1047e202234ab9843605fa56bd910e99667c96ddf" exitCode=0 Dec 03 22:17:07.146993 master-0 kubenswrapper[36504]: I1203 22:17:07.146892 36504 generic.go:334] "Generic (PLEG): container finished" podID="287a70e59cc5430b23b208b9a03b5ac7" containerID="f632f4d3a35c98012f1cece56605d69139c5283e86fa145d2f6236cf3af716de" exitCode=0 Dec 03 22:17:07.146993 master-0 kubenswrapper[36504]: I1203 22:17:07.146901 36504 generic.go:334] "Generic (PLEG): container finished" podID="287a70e59cc5430b23b208b9a03b5ac7" containerID="affb4bc279c4e26b0213bf26fa803d2a6b54fe054c87700ae68e278a97fca108" exitCode=2 Dec 03 22:17:07.146993 master-0 kubenswrapper[36504]: I1203 22:17:07.146910 36504 generic.go:334] "Generic (PLEG): container finished" podID="287a70e59cc5430b23b208b9a03b5ac7" containerID="5ecce271c239c91c72a501424ca7e835ff72e1bf3e5847efbd0d8ee1120b7b78" exitCode=0 Dec 03 22:17:07.147272 master-0 kubenswrapper[36504]: I1203 22:17:07.146999 36504 scope.go:117] "RemoveContainer" containerID="7f76635edb60bf491a762b76e0a7b8207d158079fd706ba1977dae6cce9c45ed" Dec 03 22:17:07.153700 master-0 kubenswrapper[36504]: I1203 22:17:07.150171 36504 generic.go:334] "Generic (PLEG): container finished" podID="b839ef46-9f53-4843-a419-dcff6de87eaa" containerID="14acef51e8f6aeabc9e46d31b029f7c6cb1114df1176e1b2da5a6187fe6b7b3f" exitCode=0 Dec 03 22:17:07.153700 master-0 kubenswrapper[36504]: I1203 22:17:07.150256 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"b839ef46-9f53-4843-a419-dcff6de87eaa","Type":"ContainerDied","Data":"14acef51e8f6aeabc9e46d31b029f7c6cb1114df1176e1b2da5a6187fe6b7b3f"} Dec 03 22:17:08.777603 master-0 kubenswrapper[36504]: I1203 22:17:08.777538 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:17:08.934170 master-0 kubenswrapper[36504]: I1203 22:17:08.934099 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-kubelet-dir\") pod \"b839ef46-9f53-4843-a419-dcff6de87eaa\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " Dec 03 22:17:08.934460 master-0 kubenswrapper[36504]: I1203 22:17:08.934195 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b839ef46-9f53-4843-a419-dcff6de87eaa-kube-api-access\") pod \"b839ef46-9f53-4843-a419-dcff6de87eaa\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " Dec 03 22:17:08.934460 master-0 kubenswrapper[36504]: I1203 22:17:08.934224 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b839ef46-9f53-4843-a419-dcff6de87eaa" (UID: "b839ef46-9f53-4843-a419-dcff6de87eaa"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:08.934460 master-0 kubenswrapper[36504]: I1203 22:17:08.934307 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-var-lock\") pod \"b839ef46-9f53-4843-a419-dcff6de87eaa\" (UID: \"b839ef46-9f53-4843-a419-dcff6de87eaa\") " Dec 03 22:17:08.934651 master-0 kubenswrapper[36504]: I1203 22:17:08.934617 36504 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:08.934830 master-0 kubenswrapper[36504]: I1203 22:17:08.934705 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-var-lock" (OuterVolumeSpecName: "var-lock") pod "b839ef46-9f53-4843-a419-dcff6de87eaa" (UID: "b839ef46-9f53-4843-a419-dcff6de87eaa"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:08.937526 master-0 kubenswrapper[36504]: I1203 22:17:08.937489 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b839ef46-9f53-4843-a419-dcff6de87eaa-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b839ef46-9f53-4843-a419-dcff6de87eaa" (UID: "b839ef46-9f53-4843-a419-dcff6de87eaa"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:17:09.036474 master-0 kubenswrapper[36504]: I1203 22:17:09.036371 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b839ef46-9f53-4843-a419-dcff6de87eaa-kube-api-access\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:09.036474 master-0 kubenswrapper[36504]: I1203 22:17:09.036433 36504 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b839ef46-9f53-4843-a419-dcff6de87eaa-var-lock\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:09.199043 master-0 kubenswrapper[36504]: I1203 22:17:09.198979 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_287a70e59cc5430b23b208b9a03b5ac7/kube-controller-manager-cert-syncer/0.log" Dec 03 22:17:09.219897 master-0 kubenswrapper[36504]: I1203 22:17:09.218954 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"b839ef46-9f53-4843-a419-dcff6de87eaa","Type":"ContainerDied","Data":"e388c12c51162e4e2e8d81c9ede79a9c4d62e67e22a76fae63f896978d794e72"} Dec 03 22:17:09.219897 master-0 kubenswrapper[36504]: I1203 22:17:09.219019 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e388c12c51162e4e2e8d81c9ede79a9c4d62e67e22a76fae63f896978d794e72" Dec 03 22:17:09.219897 master-0 kubenswrapper[36504]: I1203 22:17:09.219040 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Dec 03 22:17:09.673268 master-0 kubenswrapper[36504]: I1203 22:17:09.673230 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_287a70e59cc5430b23b208b9a03b5ac7/kube-controller-manager-cert-syncer/0.log" Dec 03 22:17:09.674132 master-0 kubenswrapper[36504]: I1203 22:17:09.674036 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:09.677196 master-0 kubenswrapper[36504]: I1203 22:17:09.677150 36504 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="287a70e59cc5430b23b208b9a03b5ac7" podUID="0c5324c227fe84467262d69e571a132a" Dec 03 22:17:09.857103 master-0 kubenswrapper[36504]: I1203 22:17:09.856515 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-cert-dir\") pod \"287a70e59cc5430b23b208b9a03b5ac7\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " Dec 03 22:17:09.857103 master-0 kubenswrapper[36504]: I1203 22:17:09.856750 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-resource-dir\") pod \"287a70e59cc5430b23b208b9a03b5ac7\" (UID: \"287a70e59cc5430b23b208b9a03b5ac7\") " Dec 03 22:17:09.857103 master-0 kubenswrapper[36504]: I1203 22:17:09.856747 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "287a70e59cc5430b23b208b9a03b5ac7" (UID: "287a70e59cc5430b23b208b9a03b5ac7"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:09.857103 master-0 kubenswrapper[36504]: I1203 22:17:09.856819 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "287a70e59cc5430b23b208b9a03b5ac7" (UID: "287a70e59cc5430b23b208b9a03b5ac7"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:17:09.859964 master-0 kubenswrapper[36504]: I1203 22:17:09.859912 36504 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-resource-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:09.859964 master-0 kubenswrapper[36504]: I1203 22:17:09.859955 36504 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/287a70e59cc5430b23b208b9a03b5ac7-cert-dir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:10.058041 master-0 kubenswrapper[36504]: I1203 22:17:10.041803 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-78597876b7-mbz6f" Dec 03 22:17:10.186786 master-0 kubenswrapper[36504]: I1203 22:17:10.186707 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf"] Dec 03 22:17:10.228433 master-0 kubenswrapper[36504]: I1203 22:17:10.228371 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" event={"ID":"33be1f02-0ef4-497b-b25d-90a30e49f70a","Type":"ContainerStarted","Data":"2c13cd55cd0d46535b86c026ded38b33bdf99648ecfca455aa8279da70c81252"} Dec 03 22:17:10.230093 master-0 kubenswrapper[36504]: I1203 22:17:10.230030 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" event={"ID":"f1222f5f-afc5-4a2f-938b-f2f26567cab5","Type":"ContainerStarted","Data":"2dd8064b263e2dfcfbaa218b14dcc528e4938b990d2bd7a4eb4153d8c71d3651"} Dec 03 22:17:10.232031 master-0 kubenswrapper[36504]: I1203 22:17:10.231966 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb" event={"ID":"951ea7b4-82ce-43f2-94a5-3890e0c71877","Type":"ContainerStarted","Data":"a5a1ff29eda14bd34b96a200ef3d72009c1e1ca3b2e15a828fa8f222cd799211"} Dec 03 22:17:10.238385 master-0 kubenswrapper[36504]: I1203 22:17:10.238327 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-nqqrb" event={"ID":"19acbdf7-1001-4193-a1f9-a1accf771fe9","Type":"ContainerStarted","Data":"6ffe13bbf765d1b2789a54c3c52a2c2c9eceeca1040543c1817f0979ae9cbb62"} Dec 03 22:17:10.238938 master-0 kubenswrapper[36504]: I1203 22:17:10.238882 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:17:10.241636 master-0 kubenswrapper[36504]: I1203 22:17:10.241540 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" event={"ID":"09b4c1d6-bc07-43aa-bdf5-22698a6d1075","Type":"ContainerStarted","Data":"964b7b7d718ad9462601825ee4ccf62cbb752fe7a903c9a08a03695904903108"} Dec 03 22:17:10.242083 master-0 kubenswrapper[36504]: I1203 22:17:10.241991 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:17:10.244649 master-0 kubenswrapper[36504]: I1203 22:17:10.244605 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" Dec 03 22:17:10.246436 master-0 kubenswrapper[36504]: I1203 22:17:10.246331 36504 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" event={"ID":"05c0c9d6-6747-4c9c-946c-7e3eb71388a2","Type":"ContainerStarted","Data":"33cc4639b1f9c47c9339d141bb96532fdb418809b09064f1fbba8b76e7c7a43f"} Dec 03 22:17:10.264451 master-0 kubenswrapper[36504]: I1203 22:17:10.264351 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-mfxmm" podStartSLOduration=3.793436325 podStartE2EDuration="14.2643294s" podCreationTimestamp="2025-12-03 22:16:56 +0000 UTC" firstStartedPulling="2025-12-03 22:16:59.151474698 +0000 UTC m=+384.371246695" lastFinishedPulling="2025-12-03 22:17:09.622367763 +0000 UTC m=+394.842139770" observedRunningTime="2025-12-03 22:17:10.260134189 +0000 UTC m=+395.479906206" watchObservedRunningTime="2025-12-03 22:17:10.2643294 +0000 UTC m=+395.484101407" Dec 03 22:17:10.264902 master-0 kubenswrapper[36504]: I1203 22:17:10.264845 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_287a70e59cc5430b23b208b9a03b5ac7/kube-controller-manager-cert-syncer/0.log" Dec 03 22:17:10.269868 master-0 kubenswrapper[36504]: I1203 22:17:10.269824 36504 scope.go:117] "RemoveContainer" containerID="43a0ef9d17ea7612b3d56ad1047e202234ab9843605fa56bd910e99667c96ddf" Dec 03 22:17:10.270111 master-0 kubenswrapper[36504]: I1203 22:17:10.269994 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:10.292110 master-0 kubenswrapper[36504]: I1203 22:17:10.289952 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-vnp9x"] Dec 03 22:17:10.292110 master-0 kubenswrapper[36504]: I1203 22:17:10.290028 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-nqqrb" podStartSLOduration=2.337666542 podStartE2EDuration="12.290015311s" podCreationTimestamp="2025-12-03 22:16:58 +0000 UTC" firstStartedPulling="2025-12-03 22:16:59.668861408 +0000 UTC m=+384.888633425" lastFinishedPulling="2025-12-03 22:17:09.621210187 +0000 UTC m=+394.840982194" observedRunningTime="2025-12-03 22:17:10.281214447 +0000 UTC m=+395.500986464" watchObservedRunningTime="2025-12-03 22:17:10.290015311 +0000 UTC m=+395.509787318" Dec 03 22:17:10.298887 master-0 kubenswrapper[36504]: I1203 22:17:10.298300 36504 scope.go:117] "RemoveContainer" containerID="f632f4d3a35c98012f1cece56605d69139c5283e86fa145d2f6236cf3af716de" Dec 03 22:17:10.318795 master-0 kubenswrapper[36504]: I1203 22:17:10.318024 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-b5bfd8f5-45bq2" podStartSLOduration=3.9208237280000002 podStartE2EDuration="14.318001824s" podCreationTimestamp="2025-12-03 22:16:56 +0000 UTC" firstStartedPulling="2025-12-03 22:16:59.146947727 +0000 UTC m=+384.366719744" lastFinishedPulling="2025-12-03 22:17:09.544125833 +0000 UTC m=+394.763897840" observedRunningTime="2025-12-03 22:17:10.310914843 +0000 UTC m=+395.530686850" watchObservedRunningTime="2025-12-03 22:17:10.318001824 +0000 UTC m=+395.537773831" Dec 03 22:17:10.340298 master-0 kubenswrapper[36504]: I1203 22:17:10.340216 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-q7tnk" 
podStartSLOduration=2.9073986 podStartE2EDuration="13.340192967s" podCreationTimestamp="2025-12-03 22:16:57 +0000 UTC" firstStartedPulling="2025-12-03 22:16:59.190350701 +0000 UTC m=+384.410122708" lastFinishedPulling="2025-12-03 22:17:09.623145068 +0000 UTC m=+394.842917075" observedRunningTime="2025-12-03 22:17:10.331763424 +0000 UTC m=+395.551535431" watchObservedRunningTime="2025-12-03 22:17:10.340192967 +0000 UTC m=+395.559964974" Dec 03 22:17:10.357202 master-0 kubenswrapper[36504]: I1203 22:17:10.356285 36504 scope.go:117] "RemoveContainer" containerID="affb4bc279c4e26b0213bf26fa803d2a6b54fe054c87700ae68e278a97fca108" Dec 03 22:17:10.366737 master-0 kubenswrapper[36504]: I1203 22:17:10.366660 36504 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="287a70e59cc5430b23b208b9a03b5ac7" podUID="0c5324c227fe84467262d69e571a132a" Dec 03 22:17:10.368027 master-0 kubenswrapper[36504]: I1203 22:17:10.367966 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h2tcb" podStartSLOduration=3.869799677 podStartE2EDuration="14.367952623s" podCreationTimestamp="2025-12-03 22:16:56 +0000 UTC" firstStartedPulling="2025-12-03 22:16:59.124096354 +0000 UTC m=+384.343868361" lastFinishedPulling="2025-12-03 22:17:09.6222493 +0000 UTC m=+394.842021307" observedRunningTime="2025-12-03 22:17:10.361268534 +0000 UTC m=+395.581040551" watchObservedRunningTime="2025-12-03 22:17:10.367952623 +0000 UTC m=+395.587724630" Dec 03 22:17:10.386994 master-0 kubenswrapper[36504]: I1203 22:17:10.386913 36504 scope.go:117] "RemoveContainer" containerID="5ecce271c239c91c72a501424ca7e835ff72e1bf3e5847efbd0d8ee1120b7b78" Dec 03 22:17:11.107029 master-0 kubenswrapper[36504]: I1203 22:17:11.106953 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="287a70e59cc5430b23b208b9a03b5ac7" path="/var/lib/kubelet/pods/287a70e59cc5430b23b208b9a03b5ac7/volumes" Dec 03 22:17:11.285108 master-0 kubenswrapper[36504]: I1203 22:17:11.285045 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" event={"ID":"387eac4b-3ef7-4290-822f-fcf198ad50b9","Type":"ContainerStarted","Data":"ccb54582afe7cdf1562e41cc1c54e8d6f0abf1c7714cbff5250ba80544ee212f"} Dec 03 22:17:11.480635 master-0 kubenswrapper[36504]: I1203 22:17:11.469908 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-66896f8657-9276x" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerName="console" containerID="cri-o://2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc" gracePeriod=15 Dec 03 22:17:11.969580 master-0 kubenswrapper[36504]: I1203 22:17:11.969529 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66896f8657-9276x_311227bb-eeea-40dd-b57d-ca50551be7d3/console/0.log" Dec 03 22:17:11.969883 master-0 kubenswrapper[36504]: I1203 22:17:11.969612 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66896f8657-9276x" Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.114630 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-serving-cert\") pod \"311227bb-eeea-40dd-b57d-ca50551be7d3\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.114718 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-console-config\") pod \"311227bb-eeea-40dd-b57d-ca50551be7d3\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.114758 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-oauth-serving-cert\") pod \"311227bb-eeea-40dd-b57d-ca50551be7d3\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.114907 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-oauth-config\") pod \"311227bb-eeea-40dd-b57d-ca50551be7d3\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.114933 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-trusted-ca-bundle\") pod \"311227bb-eeea-40dd-b57d-ca50551be7d3\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.114989 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkqvg\" (UniqueName: \"kubernetes.io/projected/311227bb-eeea-40dd-b57d-ca50551be7d3-kube-api-access-rkqvg\") pod \"311227bb-eeea-40dd-b57d-ca50551be7d3\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.115097 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-service-ca\") pod \"311227bb-eeea-40dd-b57d-ca50551be7d3\" (UID: \"311227bb-eeea-40dd-b57d-ca50551be7d3\") " Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.115370 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-console-config" (OuterVolumeSpecName: "console-config") pod "311227bb-eeea-40dd-b57d-ca50551be7d3" (UID: "311227bb-eeea-40dd-b57d-ca50551be7d3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.115669 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "311227bb-eeea-40dd-b57d-ca50551be7d3" (UID: "311227bb-eeea-40dd-b57d-ca50551be7d3"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.115728 36504 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:12.116495 master-0 kubenswrapper[36504]: I1203 22:17:12.115882 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-service-ca" (OuterVolumeSpecName: "service-ca") pod "311227bb-eeea-40dd-b57d-ca50551be7d3" (UID: "311227bb-eeea-40dd-b57d-ca50551be7d3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:17:12.117827 master-0 kubenswrapper[36504]: I1203 22:17:12.117753 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "311227bb-eeea-40dd-b57d-ca50551be7d3" (UID: "311227bb-eeea-40dd-b57d-ca50551be7d3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:17:12.119762 master-0 kubenswrapper[36504]: I1203 22:17:12.119592 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311227bb-eeea-40dd-b57d-ca50551be7d3-kube-api-access-rkqvg" (OuterVolumeSpecName: "kube-api-access-rkqvg") pod "311227bb-eeea-40dd-b57d-ca50551be7d3" (UID: "311227bb-eeea-40dd-b57d-ca50551be7d3"). InnerVolumeSpecName "kube-api-access-rkqvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:17:12.119762 master-0 kubenswrapper[36504]: I1203 22:17:12.119702 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "311227bb-eeea-40dd-b57d-ca50551be7d3" (UID: "311227bb-eeea-40dd-b57d-ca50551be7d3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:17:12.120659 master-0 kubenswrapper[36504]: I1203 22:17:12.120619 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "311227bb-eeea-40dd-b57d-ca50551be7d3" (UID: "311227bb-eeea-40dd-b57d-ca50551be7d3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:17:12.217750 master-0 kubenswrapper[36504]: I1203 22:17:12.217579 36504 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:12.217750 master-0 kubenswrapper[36504]: I1203 22:17:12.217626 36504 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:12.217750 master-0 kubenswrapper[36504]: I1203 22:17:12.217636 36504 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:12.217750 master-0 kubenswrapper[36504]: I1203 22:17:12.217647 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkqvg\" (UniqueName: \"kubernetes.io/projected/311227bb-eeea-40dd-b57d-ca50551be7d3-kube-api-access-rkqvg\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:12.217750 master-0 kubenswrapper[36504]: I1203 22:17:12.217659 36504 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/311227bb-eeea-40dd-b57d-ca50551be7d3-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:12.217750 master-0 kubenswrapper[36504]: I1203 22:17:12.217669 36504 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/311227bb-eeea-40dd-b57d-ca50551be7d3-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:17:12.298886 master-0 kubenswrapper[36504]: I1203 22:17:12.298374 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66896f8657-9276x_311227bb-eeea-40dd-b57d-ca50551be7d3/console/0.log" Dec 03 22:17:12.298886 master-0 kubenswrapper[36504]: I1203 22:17:12.298459 36504 generic.go:334] "Generic (PLEG): container finished" podID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerID="2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc" exitCode=2 Dec 03 22:17:12.299394 master-0 kubenswrapper[36504]: I1203 22:17:12.299308 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66896f8657-9276x" event={"ID":"311227bb-eeea-40dd-b57d-ca50551be7d3","Type":"ContainerDied","Data":"2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc"} Dec 03 22:17:12.299466 master-0 kubenswrapper[36504]: I1203 22:17:12.299419 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66896f8657-9276x" event={"ID":"311227bb-eeea-40dd-b57d-ca50551be7d3","Type":"ContainerDied","Data":"cd3c4e79c48822b910d0a1710fddeecb42e1ffcb815d6472b9e2605a23c48de7"} Dec 03 22:17:12.299466 master-0 kubenswrapper[36504]: I1203 22:17:12.299448 36504 scope.go:117] "RemoveContainer" containerID="2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc" Dec 03 22:17:12.299541 master-0 kubenswrapper[36504]: I1203 22:17:12.299346 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66896f8657-9276x" Dec 03 22:17:12.336667 master-0 kubenswrapper[36504]: I1203 22:17:12.336622 36504 scope.go:117] "RemoveContainer" containerID="2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc" Dec 03 22:17:12.341212 master-0 kubenswrapper[36504]: E1203 22:17:12.337766 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc\": container with ID starting with 2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc not found: ID does not exist" containerID="2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc" Dec 03 22:17:12.341212 master-0 kubenswrapper[36504]: I1203 22:17:12.337819 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc"} err="failed to get container status \"2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc\": rpc error: code = NotFound desc = could not find container \"2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc\": container with ID starting with 2afc4e7ed943ce1e838106d5623de8aaba76afe754924fe4c9ad8ade02f318dc not found: ID does not exist" Dec 03 22:17:12.342401 master-0 kubenswrapper[36504]: I1203 22:17:12.341902 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66896f8657-9276x"] Dec 03 22:17:12.349573 master-0 kubenswrapper[36504]: I1203 22:17:12.349500 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66896f8657-9276x"] Dec 03 22:17:13.107146 master-0 kubenswrapper[36504]: I1203 22:17:13.106826 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" path="/var/lib/kubelet/pods/311227bb-eeea-40dd-b57d-ca50551be7d3/volumes" Dec 03 22:17:19.062696 master-0 kubenswrapper[36504]: I1203 22:17:19.062600 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-nqqrb" Dec 03 22:17:20.095401 master-0 kubenswrapper[36504]: I1203 22:17:20.095238 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:17:20.402831 master-0 kubenswrapper[36504]: I1203 22:17:20.402738 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" event={"ID":"387eac4b-3ef7-4290-822f-fcf198ad50b9","Type":"ContainerStarted","Data":"98e9ec01dd8ea216a5fbe41164ea18f94735b05ca24999d911f484762981b052"} Dec 03 22:17:20.403224 master-0 kubenswrapper[36504]: I1203 22:17:20.402964 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:20.408302 master-0 kubenswrapper[36504]: I1203 22:17:20.408244 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" event={"ID":"05c0c9d6-6747-4c9c-946c-7e3eb71388a2","Type":"ContainerStarted","Data":"d8a17333e758d299f754cf6b818446de9f7f3e780f33568f3a276b6149b0d6e6"} Dec 03 22:17:20.526194 master-0 kubenswrapper[36504]: I1203 22:17:20.526059 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" podStartSLOduration=6.13145347 podStartE2EDuration="15.52603045s" podCreationTimestamp="2025-12-03 22:17:05 +0000 UTC" firstStartedPulling="2025-12-03 22:17:10.312651697 +0000 UTC m=+395.532423714" lastFinishedPulling="2025-12-03 22:17:19.707228647 +0000 UTC m=+404.927000694" observedRunningTime="2025-12-03 22:17:20.523130239 +0000 UTC m=+405.742902246" watchObservedRunningTime="2025-12-03 22:17:20.52603045 +0000 UTC m=+405.745802457" Dec 03 22:17:20.581803 master-0 kubenswrapper[36504]: I1203 22:17:20.581689 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vrzdf" podStartSLOduration=6.085323721 podStartE2EDuration="15.581664965s" podCreationTimestamp="2025-12-03 22:17:05 +0000 UTC" firstStartedPulling="2025-12-03 22:17:10.186581084 +0000 UTC m=+395.406353091" lastFinishedPulling="2025-12-03 22:17:19.682922288 +0000 UTC m=+404.902694335" observedRunningTime="2025-12-03 22:17:20.572167919 +0000 UTC m=+405.791939936" watchObservedRunningTime="2025-12-03 22:17:20.581664965 +0000 UTC m=+405.801436972" Dec 03 22:17:22.095031 master-0 kubenswrapper[36504]: I1203 22:17:22.094946 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:22.111618 master-0 kubenswrapper[36504]: I1203 22:17:22.111574 36504 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="72a191df-f26d-482a-93b7-60c871485fda" Dec 03 22:17:22.111737 master-0 kubenswrapper[36504]: I1203 22:17:22.111619 36504 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="72a191df-f26d-482a-93b7-60c871485fda" Dec 03 22:17:22.135878 master-0 kubenswrapper[36504]: I1203 22:17:22.129917 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:17:22.149941 master-0 kubenswrapper[36504]: I1203 22:17:22.149872 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:17:22.155706 master-0 kubenswrapper[36504]: I1203 22:17:22.155590 36504 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:22.171233 master-0 kubenswrapper[36504]: I1203 22:17:22.171176 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:22.178891 master-0 kubenswrapper[36504]: I1203 22:17:22.178822 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Dec 03 22:17:22.194389 master-0 kubenswrapper[36504]: W1203 22:17:22.194316 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5324c227fe84467262d69e571a132a.slice/crio-da406f75db0c84371312fdde78b70f3a199e8cbb4c6757e7e2015fe3199168f2 WatchSource:0}: Error finding container da406f75db0c84371312fdde78b70f3a199e8cbb4c6757e7e2015fe3199168f2: Status 404 returned error can't find the container with id da406f75db0c84371312fdde78b70f3a199e8cbb4c6757e7e2015fe3199168f2 Dec 03 22:17:22.436558 master-0 kubenswrapper[36504]: I1203 22:17:22.436493 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0c5324c227fe84467262d69e571a132a","Type":"ContainerStarted","Data":"da406f75db0c84371312fdde78b70f3a199e8cbb4c6757e7e2015fe3199168f2"} Dec 03 22:17:23.448710 master-0 kubenswrapper[36504]: I1203 22:17:23.448647 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0c5324c227fe84467262d69e571a132a","Type":"ContainerStarted","Data":"188f525fb57d85345bc9c30cf3661726bbe199a9b97bb8695ea33e8b0c871cab"} Dec 03 22:17:23.448710 master-0 kubenswrapper[36504]: I1203 22:17:23.448703 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0c5324c227fe84467262d69e571a132a","Type":"ContainerStarted","Data":"72f94ddebf033c9c9044dfa82cc2bcf500c3602974a317e3ac8ebda1a1682c20"} Dec 03 22:17:23.448710 master-0 kubenswrapper[36504]: I1203 22:17:23.448714 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"0c5324c227fe84467262d69e571a132a","Type":"ContainerStarted","Data":"88b99e10dfdf7c3d62fe3864054686eabb3a23561c72a7a5e36b190cf209f041"} Dec 03 22:17:23.448710 master-0 kubenswrapper[36504]: I1203 22:17:23.448723 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0c5324c227fe84467262d69e571a132a","Type":"ContainerStarted","Data":"e586f00857a97f6e668fe2240edacb381de136744e2794b0a07868c36530edcb"} Dec 03 22:17:23.477963 master-0 kubenswrapper[36504]: I1203 22:17:23.477810 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=1.477762621 podStartE2EDuration="1.477762621s" podCreationTimestamp="2025-12-03 22:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:17:23.469756031 +0000 UTC m=+408.689528048" watchObservedRunningTime="2025-12-03 22:17:23.477762621 +0000 UTC m=+408.697534628" Dec 03 22:17:25.753675 master-0 kubenswrapper[36504]: I1203 22:17:25.753593 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-vnp9x" Dec 03 22:17:32.171635 master-0 kubenswrapper[36504]: I1203 22:17:32.171562 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:32.172996 master-0 kubenswrapper[36504]: I1203 22:17:32.171735 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:32.172996 master-0 kubenswrapper[36504]: I1203 22:17:32.171832 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:32.172996 master-0 kubenswrapper[36504]: I1203 22:17:32.171847 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:32.176220 master-0 kubenswrapper[36504]: I1203 22:17:32.176159 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:32.177044 master-0 kubenswrapper[36504]: I1203 22:17:32.176988 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:32.536195 master-0 kubenswrapper[36504]: I1203 22:17:32.535989 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:32.537562 master-0 kubenswrapper[36504]: I1203 22:17:32.537486 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Dec 03 22:17:35.692001 master-0 kubenswrapper[36504]: E1203 22:17:35.691914 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the 
container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:17:43.293096 master-0 kubenswrapper[36504]: I1203 22:17:43.292461 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-pmtjz"] Dec 03 22:17:43.293096 master-0 kubenswrapper[36504]: E1203 22:17:43.292847 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerName="console" Dec 03 22:17:43.293096 master-0 kubenswrapper[36504]: I1203 22:17:43.292863 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerName="console" Dec 03 22:17:43.293096 master-0 kubenswrapper[36504]: E1203 22:17:43.292884 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b839ef46-9f53-4843-a419-dcff6de87eaa" containerName="installer" Dec 03 22:17:43.293096 master-0 kubenswrapper[36504]: I1203 22:17:43.292892 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b839ef46-9f53-4843-a419-dcff6de87eaa" containerName="installer" Dec 03 22:17:43.293096 master-0 kubenswrapper[36504]: I1203 22:17:43.293087 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="b839ef46-9f53-4843-a419-dcff6de87eaa" containerName="installer" Dec 03 22:17:43.293096 master-0 kubenswrapper[36504]: I1203 22:17:43.293109 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="311227bb-eeea-40dd-b57d-ca50551be7d3" containerName="console" Dec 03 22:17:43.294276 master-0 kubenswrapper[36504]: I1203 22:17:43.293602 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-pmtjz" Dec 03 22:17:43.338632 master-0 kubenswrapper[36504]: I1203 22:17:43.336432 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-pmtjz"] Dec 03 22:17:43.429585 master-0 kubenswrapper[36504]: I1203 22:17:43.427639 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggmx5\" (UniqueName: \"kubernetes.io/projected/7f06b54a-85b9-4d0a-b544-0d591bec73cf-kube-api-access-ggmx5\") pod \"cert-manager-86cb77c54b-pmtjz\" (UID: \"7f06b54a-85b9-4d0a-b544-0d591bec73cf\") " pod="cert-manager/cert-manager-86cb77c54b-pmtjz" Dec 03 22:17:43.429585 master-0 kubenswrapper[36504]: I1203 22:17:43.428045 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f06b54a-85b9-4d0a-b544-0d591bec73cf-bound-sa-token\") pod \"cert-manager-86cb77c54b-pmtjz\" (UID: \"7f06b54a-85b9-4d0a-b544-0d591bec73cf\") " pod="cert-manager/cert-manager-86cb77c54b-pmtjz" Dec 03 22:17:43.530152 master-0 kubenswrapper[36504]: I1203 22:17:43.530086 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f06b54a-85b9-4d0a-b544-0d591bec73cf-bound-sa-token\") pod \"cert-manager-86cb77c54b-pmtjz\" (UID: \"7f06b54a-85b9-4d0a-b544-0d591bec73cf\") " pod="cert-manager/cert-manager-86cb77c54b-pmtjz" Dec 03 22:17:43.530419 master-0 kubenswrapper[36504]: I1203 22:17:43.530180 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggmx5\" (UniqueName: \"kubernetes.io/projected/7f06b54a-85b9-4d0a-b544-0d591bec73cf-kube-api-access-ggmx5\") pod \"cert-manager-86cb77c54b-pmtjz\" (UID: \"7f06b54a-85b9-4d0a-b544-0d591bec73cf\") " 
pod="cert-manager/cert-manager-86cb77c54b-pmtjz" Dec 03 22:17:43.549995 master-0 kubenswrapper[36504]: I1203 22:17:43.549892 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7f06b54a-85b9-4d0a-b544-0d591bec73cf-bound-sa-token\") pod \"cert-manager-86cb77c54b-pmtjz\" (UID: \"7f06b54a-85b9-4d0a-b544-0d591bec73cf\") " pod="cert-manager/cert-manager-86cb77c54b-pmtjz" Dec 03 22:17:43.556551 master-0 kubenswrapper[36504]: I1203 22:17:43.556506 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggmx5\" (UniqueName: \"kubernetes.io/projected/7f06b54a-85b9-4d0a-b544-0d591bec73cf-kube-api-access-ggmx5\") pod \"cert-manager-86cb77c54b-pmtjz\" (UID: \"7f06b54a-85b9-4d0a-b544-0d591bec73cf\") " pod="cert-manager/cert-manager-86cb77c54b-pmtjz" Dec 03 22:17:43.659355 master-0 kubenswrapper[36504]: I1203 22:17:43.659282 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-pmtjz" Dec 03 22:17:44.191105 master-0 kubenswrapper[36504]: I1203 22:17:44.190873 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-pmtjz"] Dec 03 22:17:44.208359 master-0 kubenswrapper[36504]: W1203 22:17:44.208145 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f06b54a_85b9_4d0a_b544_0d591bec73cf.slice/crio-8f6725532499dd41341b5f9273de851cc260698484aec7523b15271f99a3b79a WatchSource:0}: Error finding container 8f6725532499dd41341b5f9273de851cc260698484aec7523b15271f99a3b79a: Status 404 returned error can't find the container with id 8f6725532499dd41341b5f9273de851cc260698484aec7523b15271f99a3b79a Dec 03 22:17:44.634933 master-0 kubenswrapper[36504]: I1203 22:17:44.634841 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-pmtjz" event={"ID":"7f06b54a-85b9-4d0a-b544-0d591bec73cf","Type":"ContainerStarted","Data":"6f1599bf640f2c2b7369e040bf08d3c5e7b9304b678e7d3b61cd2f6865a4d874"} Dec 03 22:17:44.634933 master-0 kubenswrapper[36504]: I1203 22:17:44.634930 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-pmtjz" event={"ID":"7f06b54a-85b9-4d0a-b544-0d591bec73cf","Type":"ContainerStarted","Data":"8f6725532499dd41341b5f9273de851cc260698484aec7523b15271f99a3b79a"} Dec 03 22:17:44.870350 master-0 kubenswrapper[36504]: I1203 22:17:44.870227 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-pmtjz" podStartSLOduration=1.8701992170000001 podStartE2EDuration="1.870199217s" podCreationTimestamp="2025-12-03 22:17:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:17:44.863847679 +0000 UTC m=+430.083619686" watchObservedRunningTime="2025-12-03 22:17:44.870199217 +0000 UTC m=+430.089971214" Dec 03 22:17:49.148915 master-0 kubenswrapper[36504]: I1203 22:17:49.148846 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949"] Dec 03 22:17:49.159247 master-0 kubenswrapper[36504]: I1203 22:17:49.155574 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:17:49.170795 master-0 kubenswrapper[36504]: I1203 22:17:49.166402 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 22:17:49.178024 master-0 kubenswrapper[36504]: I1203 22:17:49.177931 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-qb4bc"] Dec 03 22:17:49.186783 master-0 kubenswrapper[36504]: I1203 22:17:49.186702 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.190116 master-0 kubenswrapper[36504]: I1203 22:17:49.189222 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949"] Dec 03 22:17:49.192862 master-0 kubenswrapper[36504]: I1203 22:17:49.192741 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 22:17:49.193113 master-0 kubenswrapper[36504]: I1203 22:17:49.192735 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 22:17:49.264181 master-0 kubenswrapper[36504]: I1203 22:17:49.258420 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlgks\" (UniqueName: \"kubernetes.io/projected/25f6f30f-28f4-4f42-861c-b413ab0691df-kube-api-access-qlgks\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.264181 master-0 kubenswrapper[36504]: I1203 22:17:49.258641 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-frr-sockets\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.264181 master-0 kubenswrapper[36504]: I1203 22:17:49.258841 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-metrics\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.264181 master-0 kubenswrapper[36504]: I1203 22:17:49.258985 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-reloader\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.264181 master-0 kubenswrapper[36504]: I1203 22:17:49.259025 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbkd\" (UniqueName: \"kubernetes.io/projected/bb20fc19-0922-413a-b811-36351794d540-kube-api-access-5qbkd\") pod \"frr-k8s-webhook-server-7fcb986d4-jh949\" (UID: \"bb20fc19-0922-413a-b811-36351794d540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:17:49.264181 master-0 kubenswrapper[36504]: I1203 22:17:49.259056 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/25f6f30f-28f4-4f42-861c-b413ab0691df-frr-startup\") pod \"frr-k8s-qb4bc\" (UID: 
\"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.264181 master-0 kubenswrapper[36504]: I1203 22:17:49.259156 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb20fc19-0922-413a-b811-36351794d540-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-jh949\" (UID: \"bb20fc19-0922-413a-b811-36351794d540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:17:49.264181 master-0 kubenswrapper[36504]: I1203 22:17:49.259212 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25f6f30f-28f4-4f42-861c-b413ab0691df-metrics-certs\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.264181 master-0 kubenswrapper[36504]: I1203 22:17:49.259239 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-frr-conf\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.267625 master-0 kubenswrapper[36504]: I1203 22:17:49.267560 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-fnjz4"] Dec 03 22:17:49.269800 master-0 kubenswrapper[36504]: I1203 22:17:49.269368 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.275019 master-0 kubenswrapper[36504]: I1203 22:17:49.274164 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 22:17:49.276383 master-0 kubenswrapper[36504]: I1203 22:17:49.276204 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 22:17:49.276586 master-0 kubenswrapper[36504]: I1203 22:17:49.276566 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 22:17:49.361064 master-0 kubenswrapper[36504]: I1203 22:17:49.360976 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-metrics\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361087 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-reloader\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361123 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbkd\" (UniqueName: \"kubernetes.io/projected/bb20fc19-0922-413a-b811-36351794d540-kube-api-access-5qbkd\") pod \"frr-k8s-webhook-server-7fcb986d4-jh949\" (UID: \"bb20fc19-0922-413a-b811-36351794d540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361155 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f5jqz\" (UniqueName: \"kubernetes.io/projected/24153b70-6280-405c-b1d0-8bdd936bd646-kube-api-access-f5jqz\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361178 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/25f6f30f-28f4-4f42-861c-b413ab0691df-frr-startup\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361221 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb20fc19-0922-413a-b811-36351794d540-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-jh949\" (UID: \"bb20fc19-0922-413a-b811-36351794d540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361249 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25f6f30f-28f4-4f42-861c-b413ab0691df-metrics-certs\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361269 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-frr-conf\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361291 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24153b70-6280-405c-b1d0-8bdd936bd646-metallb-excludel2\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361314 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.361329 master-0 kubenswrapper[36504]: I1203 22:17:49.361341 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlgks\" (UniqueName: \"kubernetes.io/projected/25f6f30f-28f4-4f42-861c-b413ab0691df-kube-api-access-qlgks\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.361839 master-0 kubenswrapper[36504]: I1203 22:17:49.361358 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-metrics-certs\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.361839 master-0 kubenswrapper[36504]: I1203 22:17:49.361385 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-frr-sockets\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.361839 master-0 kubenswrapper[36504]: E1203 22:17:49.361409 36504 secret.go:189] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 03 22:17:49.361839 master-0 kubenswrapper[36504]: E1203 22:17:49.361513 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25f6f30f-28f4-4f42-861c-b413ab0691df-metrics-certs podName:25f6f30f-28f4-4f42-861c-b413ab0691df nodeName:}" failed. No retries permitted until 2025-12-03 22:17:49.861487965 +0000 UTC m=+435.081259992 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25f6f30f-28f4-4f42-861c-b413ab0691df-metrics-certs") pod "frr-k8s-qb4bc" (UID: "25f6f30f-28f4-4f42-861c-b413ab0691df") : secret "frr-k8s-certs-secret" not found Dec 03 22:17:49.361839 master-0 kubenswrapper[36504]: E1203 22:17:49.361651 36504 secret.go:189] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 03 22:17:49.361839 master-0 kubenswrapper[36504]: E1203 22:17:49.361699 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb20fc19-0922-413a-b811-36351794d540-cert podName:bb20fc19-0922-413a-b811-36351794d540 nodeName:}" failed. No retries permitted until 2025-12-03 22:17:49.861682311 +0000 UTC m=+435.081454328 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bb20fc19-0922-413a-b811-36351794d540-cert") pod "frr-k8s-webhook-server-7fcb986d4-jh949" (UID: "bb20fc19-0922-413a-b811-36351794d540") : secret "frr-k8s-webhook-server-cert" not found Dec 03 22:17:49.362120 master-0 kubenswrapper[36504]: I1203 22:17:49.361933 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-frr-sockets\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.362174 master-0 kubenswrapper[36504]: I1203 22:17:49.362160 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-frr-conf\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.362318 master-0 kubenswrapper[36504]: I1203 22:17:49.362253 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/25f6f30f-28f4-4f42-861c-b413ab0691df-frr-startup\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.362489 master-0 kubenswrapper[36504]: I1203 22:17:49.362446 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-reloader\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.362744 master-0 kubenswrapper[36504]: I1203 22:17:49.362710 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/25f6f30f-28f4-4f42-861c-b413ab0691df-metrics\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.463702 master-0 kubenswrapper[36504]: I1203 22:17:49.463447 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.464035 master-0 kubenswrapper[36504]: I1203 22:17:49.463724 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-metrics-certs\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.464035 master-0 kubenswrapper[36504]: E1203 22:17:49.463614 36504 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 22:17:49.464152 master-0 kubenswrapper[36504]: E1203 22:17:49.463799 36504 secret.go:189] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 03 22:17:49.464207 master-0 kubenswrapper[36504]: E1203 22:17:49.464171 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist podName:24153b70-6280-405c-b1d0-8bdd936bd646 nodeName:}" failed. No retries permitted until 2025-12-03 22:17:49.964133578 +0000 UTC m=+435.183905595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist") pod "speaker-fnjz4" (UID: "24153b70-6280-405c-b1d0-8bdd936bd646") : secret "metallb-memberlist" not found Dec 03 22:17:49.464261 master-0 kubenswrapper[36504]: E1203 22:17:49.464214 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-metrics-certs podName:24153b70-6280-405c-b1d0-8bdd936bd646 nodeName:}" failed. No retries permitted until 2025-12-03 22:17:49.96419982 +0000 UTC m=+435.183971837 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-metrics-certs") pod "speaker-fnjz4" (UID: "24153b70-6280-405c-b1d0-8bdd936bd646") : secret "speaker-certs-secret" not found Dec 03 22:17:49.464584 master-0 kubenswrapper[36504]: I1203 22:17:49.464488 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5jqz\" (UniqueName: \"kubernetes.io/projected/24153b70-6280-405c-b1d0-8bdd936bd646-kube-api-access-f5jqz\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.464749 master-0 kubenswrapper[36504]: I1203 22:17:49.464728 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24153b70-6280-405c-b1d0-8bdd936bd646-metallb-excludel2\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.465825 master-0 kubenswrapper[36504]: I1203 22:17:49.465790 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24153b70-6280-405c-b1d0-8bdd936bd646-metallb-excludel2\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.512048 master-0 kubenswrapper[36504]: I1203 22:17:49.511944 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-7lk9d"] Dec 03 22:17:49.513620 master-0 kubenswrapper[36504]: I1203 22:17:49.513577 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:49.515322 master-0 kubenswrapper[36504]: I1203 22:17:49.515249 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 22:17:49.567850 master-0 kubenswrapper[36504]: I1203 22:17:49.567785 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bf325e2-1bc1-4f4d-bccc-99757f834915-cert\") pod \"controller-f8648f98b-7lk9d\" (UID: \"4bf325e2-1bc1-4f4d-bccc-99757f834915\") " pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:49.568389 master-0 kubenswrapper[36504]: I1203 22:17:49.568301 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bf325e2-1bc1-4f4d-bccc-99757f834915-metrics-certs\") pod \"controller-f8648f98b-7lk9d\" (UID: \"4bf325e2-1bc1-4f4d-bccc-99757f834915\") " pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:49.568459 master-0 kubenswrapper[36504]: I1203 22:17:49.568397 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpkjj\" (UniqueName: \"kubernetes.io/projected/4bf325e2-1bc1-4f4d-bccc-99757f834915-kube-api-access-vpkjj\") pod \"controller-f8648f98b-7lk9d\" (UID: \"4bf325e2-1bc1-4f4d-bccc-99757f834915\") " pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:49.602847 master-0 kubenswrapper[36504]: I1203 22:17:49.602736 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-7lk9d"] Dec 03 22:17:49.670485 master-0 kubenswrapper[36504]: I1203 22:17:49.670380 36504 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bf325e2-1bc1-4f4d-bccc-99757f834915-cert\") pod \"controller-f8648f98b-7lk9d\" (UID: \"4bf325e2-1bc1-4f4d-bccc-99757f834915\") " pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:49.670485 master-0 kubenswrapper[36504]: I1203 22:17:49.670519 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bf325e2-1bc1-4f4d-bccc-99757f834915-metrics-certs\") pod \"controller-f8648f98b-7lk9d\" (UID: \"4bf325e2-1bc1-4f4d-bccc-99757f834915\") " pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:49.670926 master-0 kubenswrapper[36504]: I1203 22:17:49.670551 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpkjj\" (UniqueName: \"kubernetes.io/projected/4bf325e2-1bc1-4f4d-bccc-99757f834915-kube-api-access-vpkjj\") pod \"controller-f8648f98b-7lk9d\" (UID: \"4bf325e2-1bc1-4f4d-bccc-99757f834915\") " pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:49.673356 master-0 kubenswrapper[36504]: I1203 22:17:49.673291 36504 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 22:17:49.675671 master-0 kubenswrapper[36504]: I1203 22:17:49.675617 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4bf325e2-1bc1-4f4d-bccc-99757f834915-metrics-certs\") pod \"controller-f8648f98b-7lk9d\" (UID: \"4bf325e2-1bc1-4f4d-bccc-99757f834915\") " pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:49.686956 master-0 kubenswrapper[36504]: I1203 22:17:49.686902 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4bf325e2-1bc1-4f4d-bccc-99757f834915-cert\") pod \"controller-f8648f98b-7lk9d\" (UID: \"4bf325e2-1bc1-4f4d-bccc-99757f834915\") " pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:49.875078 master-0 kubenswrapper[36504]: I1203 22:17:49.874974 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb20fc19-0922-413a-b811-36351794d540-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-jh949\" (UID: \"bb20fc19-0922-413a-b811-36351794d540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:17:49.875078 master-0 kubenswrapper[36504]: I1203 22:17:49.875087 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25f6f30f-28f4-4f42-861c-b413ab0691df-metrics-certs\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:49.879328 master-0 kubenswrapper[36504]: I1203 22:17:49.879269 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb20fc19-0922-413a-b811-36351794d540-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-jh949\" (UID: \"bb20fc19-0922-413a-b811-36351794d540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:17:49.879917 master-0 kubenswrapper[36504]: I1203 22:17:49.879863 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25f6f30f-28f4-4f42-861c-b413ab0691df-metrics-certs\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 
22:17:49.977360 master-0 kubenswrapper[36504]: I1203 22:17:49.977266 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.977631 master-0 kubenswrapper[36504]: I1203 22:17:49.977388 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-metrics-certs\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:49.977797 master-0 kubenswrapper[36504]: E1203 22:17:49.977693 36504 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 22:17:49.977927 master-0 kubenswrapper[36504]: E1203 22:17:49.977896 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist podName:24153b70-6280-405c-b1d0-8bdd936bd646 nodeName:}" failed. No retries permitted until 2025-12-03 22:17:50.977861903 +0000 UTC m=+436.197633910 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist") pod "speaker-fnjz4" (UID: "24153b70-6280-405c-b1d0-8bdd936bd646") : secret "metallb-memberlist" not found Dec 03 22:17:49.980728 master-0 kubenswrapper[36504]: I1203 22:17:49.980692 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-metrics-certs\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:50.256902 master-0 kubenswrapper[36504]: I1203 22:17:50.235871 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5jqz\" (UniqueName: \"kubernetes.io/projected/24153b70-6280-405c-b1d0-8bdd936bd646-kube-api-access-f5jqz\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:50.256902 master-0 kubenswrapper[36504]: I1203 22:17:50.242784 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpkjj\" (UniqueName: \"kubernetes.io/projected/4bf325e2-1bc1-4f4d-bccc-99757f834915-kube-api-access-vpkjj\") pod \"controller-f8648f98b-7lk9d\" (UID: \"4bf325e2-1bc1-4f4d-bccc-99757f834915\") " pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:50.256902 master-0 kubenswrapper[36504]: I1203 22:17:50.253675 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlgks\" (UniqueName: \"kubernetes.io/projected/25f6f30f-28f4-4f42-861c-b413ab0691df-kube-api-access-qlgks\") pod \"frr-k8s-qb4bc\" (UID: \"25f6f30f-28f4-4f42-861c-b413ab0691df\") " pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:50.256902 master-0 kubenswrapper[36504]: I1203 22:17:50.255261 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbkd\" (UniqueName: \"kubernetes.io/projected/bb20fc19-0922-413a-b811-36351794d540-kube-api-access-5qbkd\") pod \"frr-k8s-webhook-server-7fcb986d4-jh949\" (UID: \"bb20fc19-0922-413a-b811-36351794d540\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:17:50.430482 
master-0 kubenswrapper[36504]: I1203 22:17:50.430379 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:17:50.513177 master-0 kubenswrapper[36504]: I1203 22:17:50.506111 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:17:50.530802 master-0 kubenswrapper[36504]: I1203 22:17:50.525041 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:17:51.074797 master-0 kubenswrapper[36504]: I1203 22:17:51.071851 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:51.096884 master-0 kubenswrapper[36504]: E1203 22:17:51.078045 36504 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 22:17:51.096884 master-0 kubenswrapper[36504]: E1203 22:17:51.078215 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist podName:24153b70-6280-405c-b1d0-8bdd936bd646 nodeName:}" failed. No retries permitted until 2025-12-03 22:17:53.078175208 +0000 UTC m=+438.297947215 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist") pod "speaker-fnjz4" (UID: "24153b70-6280-405c-b1d0-8bdd936bd646") : secret "metallb-memberlist" not found Dec 03 22:17:51.281826 master-0 kubenswrapper[36504]: I1203 22:17:51.277131 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-7lk9d"] Dec 03 22:17:51.478987 master-0 kubenswrapper[36504]: I1203 22:17:51.478795 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949"] Dec 03 22:17:51.860473 master-0 kubenswrapper[36504]: I1203 22:17:51.860398 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-7lk9d" event={"ID":"4bf325e2-1bc1-4f4d-bccc-99757f834915","Type":"ContainerStarted","Data":"1c42b64479635fec51d929a0bcb5bbbfee14211056fa48ecff4fb3193748a798"} Dec 03 22:17:51.860473 master-0 kubenswrapper[36504]: I1203 22:17:51.860459 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-7lk9d" event={"ID":"4bf325e2-1bc1-4f4d-bccc-99757f834915","Type":"ContainerStarted","Data":"49a9f0e516ab4385a8e471aface43c6675d64f2cabc0d80a3d6ee0c43b718c08"} Dec 03 22:17:51.861972 master-0 kubenswrapper[36504]: I1203 22:17:51.861908 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" event={"ID":"bb20fc19-0922-413a-b811-36351794d540","Type":"ContainerStarted","Data":"f34ca0734e7cc9c7c3c4957de817e8b4385c6f12fe8b20f2deb79ec0b51ee2b0"} Dec 03 22:17:51.863020 master-0 kubenswrapper[36504]: I1203 22:17:51.862980 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerStarted","Data":"ba0638aee411b9791ad4cd2ee5450a976e4fecb2544a1cc14a8c2a8157fd03bb"} Dec 03 22:17:52.040320 master-0 kubenswrapper[36504]: I1203 22:17:52.040185 36504 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g"] Dec 03 22:17:52.041946 master-0 kubenswrapper[36504]: I1203 22:17:52.041911 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g" Dec 03 22:17:52.056295 master-0 kubenswrapper[36504]: I1203 22:17:52.055730 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g"] Dec 03 22:17:52.087801 master-0 kubenswrapper[36504]: I1203 22:17:52.084573 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5"] Dec 03 22:17:52.087801 master-0 kubenswrapper[36504]: I1203 22:17:52.086390 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:17:52.090373 master-0 kubenswrapper[36504]: I1203 22:17:52.090050 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 22:17:52.129225 master-0 kubenswrapper[36504]: I1203 22:17:52.129166 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5"] Dec 03 22:17:52.152278 master-0 kubenswrapper[36504]: I1203 22:17:52.152178 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-q66gg"] Dec 03 22:17:52.154063 master-0 kubenswrapper[36504]: I1203 22:17:52.154003 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.228436 master-0 kubenswrapper[36504]: I1203 22:17:52.228212 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/451399c6-0304-4409-a5a7-26a242268011-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ph6w5\" (UID: \"451399c6-0304-4409-a5a7-26a242268011\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:17:52.228436 master-0 kubenswrapper[36504]: I1203 22:17:52.228263 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6z7v\" (UniqueName: \"kubernetes.io/projected/451399c6-0304-4409-a5a7-26a242268011-kube-api-access-s6z7v\") pod \"nmstate-webhook-5f6d4c5ccb-ph6w5\" (UID: \"451399c6-0304-4409-a5a7-26a242268011\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:17:52.229019 master-0 kubenswrapper[36504]: I1203 22:17:52.228659 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5q5\" (UniqueName: \"kubernetes.io/projected/eed33862-ae60-48df-b4ef-8406a4928db2-kube-api-access-7g5q5\") pod \"nmstate-metrics-7f946cbc9-mct9g\" (UID: \"eed33862-ae60-48df-b4ef-8406a4928db2\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g" Dec 03 22:17:52.340907 master-0 kubenswrapper[36504]: I1203 22:17:52.332750 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-dbus-socket\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.340907 master-0 kubenswrapper[36504]: I1203 22:17:52.332969 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7g5q5\" (UniqueName: \"kubernetes.io/projected/eed33862-ae60-48df-b4ef-8406a4928db2-kube-api-access-7g5q5\") pod \"nmstate-metrics-7f946cbc9-mct9g\" (UID: \"eed33862-ae60-48df-b4ef-8406a4928db2\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g" Dec 03 22:17:52.340907 master-0 kubenswrapper[36504]: I1203 22:17:52.333086 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-nmstate-lock\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.340907 master-0 kubenswrapper[36504]: I1203 22:17:52.333235 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnd86\" (UniqueName: \"kubernetes.io/projected/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-kube-api-access-bnd86\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.340907 master-0 kubenswrapper[36504]: I1203 22:17:52.333296 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/451399c6-0304-4409-a5a7-26a242268011-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ph6w5\" (UID: \"451399c6-0304-4409-a5a7-26a242268011\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:17:52.340907 master-0 kubenswrapper[36504]: I1203 22:17:52.333325 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6z7v\" (UniqueName: \"kubernetes.io/projected/451399c6-0304-4409-a5a7-26a242268011-kube-api-access-s6z7v\") pod \"nmstate-webhook-5f6d4c5ccb-ph6w5\" (UID: \"451399c6-0304-4409-a5a7-26a242268011\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:17:52.340907 master-0 kubenswrapper[36504]: I1203 22:17:52.333381 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-ovs-socket\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.340907 master-0 kubenswrapper[36504]: I1203 22:17:52.339578 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/451399c6-0304-4409-a5a7-26a242268011-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-ph6w5\" (UID: \"451399c6-0304-4409-a5a7-26a242268011\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:17:52.367982 master-0 kubenswrapper[36504]: I1203 22:17:52.365733 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5q5\" (UniqueName: \"kubernetes.io/projected/eed33862-ae60-48df-b4ef-8406a4928db2-kube-api-access-7g5q5\") pod \"nmstate-metrics-7f946cbc9-mct9g\" (UID: \"eed33862-ae60-48df-b4ef-8406a4928db2\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g" Dec 03 22:17:52.376536 master-0 kubenswrapper[36504]: I1203 22:17:52.370617 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6z7v\" (UniqueName: \"kubernetes.io/projected/451399c6-0304-4409-a5a7-26a242268011-kube-api-access-s6z7v\") pod \"nmstate-webhook-5f6d4c5ccb-ph6w5\" (UID: 
\"451399c6-0304-4409-a5a7-26a242268011\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:17:52.428919 master-0 kubenswrapper[36504]: I1203 22:17:52.421167 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g" Dec 03 22:17:52.441180 master-0 kubenswrapper[36504]: I1203 22:17:52.441120 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-dbus-socket\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.447490 master-0 kubenswrapper[36504]: I1203 22:17:52.447439 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-nmstate-lock\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.447877 master-0 kubenswrapper[36504]: I1203 22:17:52.447860 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnd86\" (UniqueName: \"kubernetes.io/projected/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-kube-api-access-bnd86\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.449906 master-0 kubenswrapper[36504]: I1203 22:17:52.449876 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-ovs-socket\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.450293 master-0 kubenswrapper[36504]: I1203 22:17:52.450213 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-ovs-socket\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.450410 master-0 kubenswrapper[36504]: I1203 22:17:52.441970 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-dbus-socket\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.450506 master-0 kubenswrapper[36504]: I1203 22:17:52.450492 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-nmstate-lock\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.456760 master-0 kubenswrapper[36504]: I1203 22:17:52.456476 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg"] Dec 03 22:17:52.515370 master-0 kubenswrapper[36504]: I1203 22:17:52.504629 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:17:52.515370 master-0 kubenswrapper[36504]: I1203 22:17:52.506262 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:52.515370 master-0 kubenswrapper[36504]: I1203 22:17:52.513597 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnd86\" (UniqueName: \"kubernetes.io/projected/f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003-kube-api-access-bnd86\") pod \"nmstate-handler-q66gg\" (UID: \"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003\") " pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.515370 master-0 kubenswrapper[36504]: I1203 22:17:52.515125 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 22:17:52.515370 master-0 kubenswrapper[36504]: I1203 22:17:52.515350 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 22:17:52.518711 master-0 kubenswrapper[36504]: I1203 22:17:52.518610 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg"] Dec 03 22:17:52.564806 master-0 kubenswrapper[36504]: I1203 22:17:52.563136 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:17:52.677784 master-0 kubenswrapper[36504]: I1203 22:17:52.675381 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-qjfzg\" (UID: \"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:52.677784 master-0 kubenswrapper[36504]: I1203 22:17:52.675496 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g76n\" (UniqueName: \"kubernetes.io/projected/ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3-kube-api-access-7g76n\") pod \"nmstate-console-plugin-7fbb5f6569-qjfzg\" (UID: \"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:52.677784 master-0 kubenswrapper[36504]: I1203 22:17:52.675614 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-qjfzg\" (UID: \"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:52.789798 master-0 kubenswrapper[36504]: I1203 22:17:52.784807 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-qjfzg\" (UID: \"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:52.789798 master-0 kubenswrapper[36504]: I1203 22:17:52.784882 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g76n\" (UniqueName: 
\"kubernetes.io/projected/ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3-kube-api-access-7g76n\") pod \"nmstate-console-plugin-7fbb5f6569-qjfzg\" (UID: \"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:52.789798 master-0 kubenswrapper[36504]: I1203 22:17:52.784964 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-qjfzg\" (UID: \"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:52.789798 master-0 kubenswrapper[36504]: I1203 22:17:52.785857 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-qjfzg\" (UID: \"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:52.790254 master-0 kubenswrapper[36504]: I1203 22:17:52.790107 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-qjfzg\" (UID: \"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:52.920150 master-0 kubenswrapper[36504]: I1203 22:17:52.919498 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q66gg" event={"ID":"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003","Type":"ContainerStarted","Data":"e45a412a2d17b8cee00a202341b2ec550221ee5170812949b046cd8f4921a685"} Dec 03 22:17:52.934544 master-0 kubenswrapper[36504]: I1203 22:17:52.932431 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g76n\" (UniqueName: \"kubernetes.io/projected/ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3-kube-api-access-7g76n\") pod \"nmstate-console-plugin-7fbb5f6569-qjfzg\" (UID: \"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:53.053455 master-0 kubenswrapper[36504]: I1203 22:17:53.053388 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c6b7b7b5c-jwfc2"] Dec 03 22:17:53.055504 master-0 kubenswrapper[36504]: I1203 22:17:53.055464 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.096447 master-0 kubenswrapper[36504]: I1203 22:17:53.096356 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:53.097329 master-0 kubenswrapper[36504]: I1203 22:17:53.097112 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6b7b7b5c-jwfc2"] Dec 03 22:17:53.137545 master-0 kubenswrapper[36504]: I1203 22:17:53.137291 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24153b70-6280-405c-b1d0-8bdd936bd646-memberlist\") pod \"speaker-fnjz4\" (UID: \"24153b70-6280-405c-b1d0-8bdd936bd646\") " pod="metallb-system/speaker-fnjz4" Dec 03 22:17:53.201116 master-0 kubenswrapper[36504]: I1203 22:17:53.200994 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-console-oauth-config\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.201116 master-0 kubenswrapper[36504]: I1203 22:17:53.201095 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-console-serving-cert\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.201347 master-0 kubenswrapper[36504]: I1203 22:17:53.201129 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-console-config\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.201347 master-0 kubenswrapper[36504]: I1203 22:17:53.201182 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-oauth-serving-cert\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.201347 master-0 kubenswrapper[36504]: I1203 22:17:53.201226 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-service-ca\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.201347 master-0 kubenswrapper[36504]: I1203 22:17:53.201268 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-trusted-ca-bundle\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 
22:17:53.201347 master-0 kubenswrapper[36504]: I1203 22:17:53.201295 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9wkl\" (UniqueName: \"kubernetes.io/projected/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-kube-api-access-f9wkl\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.215094 master-0 kubenswrapper[36504]: I1203 22:17:53.215029 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" Dec 03 22:17:53.243291 master-0 kubenswrapper[36504]: I1203 22:17:53.243207 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-fnjz4" Dec 03 22:17:53.304757 master-0 kubenswrapper[36504]: I1203 22:17:53.304658 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-console-oauth-config\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.304757 master-0 kubenswrapper[36504]: I1203 22:17:53.304756 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-console-serving-cert\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.305172 master-0 kubenswrapper[36504]: I1203 22:17:53.304814 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-console-config\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.305172 master-0 kubenswrapper[36504]: I1203 22:17:53.304881 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-oauth-serving-cert\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.305172 master-0 kubenswrapper[36504]: I1203 22:17:53.304941 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-service-ca\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.305172 master-0 kubenswrapper[36504]: I1203 22:17:53.304983 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-trusted-ca-bundle\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.305172 master-0 kubenswrapper[36504]: I1203 22:17:53.305021 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9wkl\" (UniqueName: 
\"kubernetes.io/projected/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-kube-api-access-f9wkl\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.316860 master-0 kubenswrapper[36504]: I1203 22:17:53.311536 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-oauth-serving-cert\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.316860 master-0 kubenswrapper[36504]: I1203 22:17:53.313577 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-console-oauth-config\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.316860 master-0 kubenswrapper[36504]: I1203 22:17:53.315032 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-service-ca\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.316860 master-0 kubenswrapper[36504]: W1203 22:17:53.316278 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24153b70_6280_405c_b1d0_8bdd936bd646.slice/crio-9df1d01fbc1370371e825bac838b78a22e9a221f62143bfc84e7c4049fb6bcd0 WatchSource:0}: Error finding container 9df1d01fbc1370371e825bac838b78a22e9a221f62143bfc84e7c4049fb6bcd0: Status 404 returned error can't find the container with id 9df1d01fbc1370371e825bac838b78a22e9a221f62143bfc84e7c4049fb6bcd0 Dec 03 22:17:53.333840 master-0 kubenswrapper[36504]: I1203 22:17:53.318127 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-trusted-ca-bundle\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.333840 master-0 kubenswrapper[36504]: I1203 22:17:53.318275 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-console-config\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.335837 master-0 kubenswrapper[36504]: I1203 22:17:53.335795 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-console-serving-cert\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.362608 master-0 kubenswrapper[36504]: I1203 22:17:53.353443 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9wkl\" (UniqueName: \"kubernetes.io/projected/8cb7ccb1-c763-4f80-8d15-f26be3cf795b-kube-api-access-f9wkl\") pod \"console-5c6b7b7b5c-jwfc2\" (UID: \"8cb7ccb1-c763-4f80-8d15-f26be3cf795b\") " 
pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.453555 master-0 kubenswrapper[36504]: I1203 22:17:53.453447 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g"] Dec 03 22:17:53.484344 master-0 kubenswrapper[36504]: I1203 22:17:53.484237 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:17:53.680950 master-0 kubenswrapper[36504]: I1203 22:17:53.680760 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5"] Dec 03 22:17:53.920836 master-0 kubenswrapper[36504]: I1203 22:17:53.920739 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg"] Dec 03 22:17:53.935406 master-0 kubenswrapper[36504]: I1203 22:17:53.935331 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fnjz4" event={"ID":"24153b70-6280-405c-b1d0-8bdd936bd646","Type":"ContainerStarted","Data":"b038d5ada8a85d21a2681bff5ce8eb6299492d0172e1d26fefc49863a67b0196"} Dec 03 22:17:53.935406 master-0 kubenswrapper[36504]: I1203 22:17:53.935402 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fnjz4" event={"ID":"24153b70-6280-405c-b1d0-8bdd936bd646","Type":"ContainerStarted","Data":"9df1d01fbc1370371e825bac838b78a22e9a221f62143bfc84e7c4049fb6bcd0"} Dec 03 22:17:53.936889 master-0 kubenswrapper[36504]: I1203 22:17:53.936828 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g" event={"ID":"eed33862-ae60-48df-b4ef-8406a4928db2","Type":"ContainerStarted","Data":"ce82d214d3fa5d713702480815f1902f5d964c68df2e7821b236aad53df16b52"} Dec 03 22:17:53.938245 master-0 kubenswrapper[36504]: I1203 22:17:53.938220 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" event={"ID":"451399c6-0304-4409-a5a7-26a242268011","Type":"ContainerStarted","Data":"a14a699f74cc1b2e2415c77127207eb80a75148873a7abf9a682e71b15c6a4bb"} Dec 03 22:17:54.170115 master-0 kubenswrapper[36504]: I1203 22:17:54.170049 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6b7b7b5c-jwfc2"] Dec 03 22:17:54.186893 master-0 kubenswrapper[36504]: W1203 22:17:54.186845 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cb7ccb1_c763_4f80_8d15_f26be3cf795b.slice/crio-bc317b1128cf221749fe02a1c4db4bd10d1787e345925a541d38299d276c886b WatchSource:0}: Error finding container bc317b1128cf221749fe02a1c4db4bd10d1787e345925a541d38299d276c886b: Status 404 returned error can't find the container with id bc317b1128cf221749fe02a1c4db4bd10d1787e345925a541d38299d276c886b Dec 03 22:17:54.953451 master-0 kubenswrapper[36504]: I1203 22:17:54.953385 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6b7b7b5c-jwfc2" event={"ID":"8cb7ccb1-c763-4f80-8d15-f26be3cf795b","Type":"ContainerStarted","Data":"6ecb3558f14623d05ad3464f410cd054c9e419c58d964fed7ff470004ed49d6d"} Dec 03 22:17:54.953451 master-0 kubenswrapper[36504]: I1203 22:17:54.953438 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6b7b7b5c-jwfc2" event={"ID":"8cb7ccb1-c763-4f80-8d15-f26be3cf795b","Type":"ContainerStarted","Data":"bc317b1128cf221749fe02a1c4db4bd10d1787e345925a541d38299d276c886b"} Dec 03 
22:17:54.956267 master-0 kubenswrapper[36504]: I1203 22:17:54.956234 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" event={"ID":"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3","Type":"ContainerStarted","Data":"a5f2b07954bcac1d8ac45f6c328b7c84367fddaeb10688505a57ba6ac41de7a5"} Dec 03 22:17:55.347756 master-0 kubenswrapper[36504]: I1203 22:17:55.346181 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c6b7b7b5c-jwfc2" podStartSLOduration=3.34610709 podStartE2EDuration="3.34610709s" podCreationTimestamp="2025-12-03 22:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:17:55.293922471 +0000 UTC m=+440.513694488" watchObservedRunningTime="2025-12-03 22:17:55.34610709 +0000 UTC m=+440.565879097" Dec 03 22:18:03.485424 master-0 kubenswrapper[36504]: I1203 22:18:03.485381 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:18:03.485850 master-0 kubenswrapper[36504]: I1203 22:18:03.485462 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:18:03.492025 master-0 kubenswrapper[36504]: I1203 22:18:03.491986 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:18:04.252371 master-0 kubenswrapper[36504]: I1203 22:18:04.252301 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-fnjz4" event={"ID":"24153b70-6280-405c-b1d0-8bdd936bd646","Type":"ContainerStarted","Data":"73bf811d52d11b9026073f31360b5f1bcef09beac6f2ee2fdb7c6f4dee310f42"} Dec 03 22:18:04.253179 master-0 kubenswrapper[36504]: I1203 22:18:04.253098 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-fnjz4" Dec 03 22:18:04.255347 master-0 kubenswrapper[36504]: I1203 22:18:04.255306 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-fnjz4" Dec 03 22:18:04.256290 master-0 kubenswrapper[36504]: I1203 22:18:04.256186 36504 generic.go:334] "Generic (PLEG): container finished" podID="25f6f30f-28f4-4f42-861c-b413ab0691df" containerID="d7c2ba2b9639ec3032aee032b446c7a16e5d2d7d58015b5bc50914ebebf62e4a" exitCode=0 Dec 03 22:18:04.256356 master-0 kubenswrapper[36504]: I1203 22:18:04.256287 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerDied","Data":"d7c2ba2b9639ec3032aee032b446c7a16e5d2d7d58015b5bc50914ebebf62e4a"} Dec 03 22:18:04.260043 master-0 kubenswrapper[36504]: I1203 22:18:04.259519 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g" event={"ID":"eed33862-ae60-48df-b4ef-8406a4928db2","Type":"ContainerStarted","Data":"b290f5ee8e1a86ac10f9adc4db5f39198abc487095a0da42ea3f247fd31b37d2"} Dec 03 22:18:04.260043 master-0 kubenswrapper[36504]: I1203 22:18:04.259569 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g" event={"ID":"eed33862-ae60-48df-b4ef-8406a4928db2","Type":"ContainerStarted","Data":"175f050794bda60c7bf19c4cb3cf96babf929e4916b17460b9ebfaeb1c932811"} Dec 03 22:18:04.263704 master-0 kubenswrapper[36504]: I1203 22:18:04.263664 36504 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" event={"ID":"ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3","Type":"ContainerStarted","Data":"f16ee1d6f04063151b55fb1c1fb63e0a89665bdc887d32faa9d5f60dbc72dc88"} Dec 03 22:18:04.266830 master-0 kubenswrapper[36504]: I1203 22:18:04.266798 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-7lk9d" event={"ID":"4bf325e2-1bc1-4f4d-bccc-99757f834915","Type":"ContainerStarted","Data":"1cd05b1941335ce84fa93d8d249e0903b0bcafaaf2c319cb85e58e630884bf59"} Dec 03 22:18:04.268943 master-0 kubenswrapper[36504]: I1203 22:18:04.267134 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:18:04.276199 master-0 kubenswrapper[36504]: I1203 22:18:04.276164 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-7lk9d" Dec 03 22:18:04.276886 master-0 kubenswrapper[36504]: I1203 22:18:04.276797 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" event={"ID":"bb20fc19-0922-413a-b811-36351794d540","Type":"ContainerStarted","Data":"9f5b8214e232ead89932e562719a4e7df28d15c213990a9dc35be88b0c3cb52a"} Dec 03 22:18:04.277519 master-0 kubenswrapper[36504]: I1203 22:18:04.277488 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:18:04.279147 master-0 kubenswrapper[36504]: I1203 22:18:04.279068 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-fnjz4" podStartSLOduration=5.711588915 podStartE2EDuration="15.279049627s" podCreationTimestamp="2025-12-03 22:17:49 +0000 UTC" firstStartedPulling="2025-12-03 22:17:53.81450349 +0000 UTC m=+439.034275497" lastFinishedPulling="2025-12-03 22:18:03.381964192 +0000 UTC m=+448.601736209" observedRunningTime="2025-12-03 22:18:04.272819963 +0000 UTC m=+449.492592000" watchObservedRunningTime="2025-12-03 22:18:04.279049627 +0000 UTC m=+449.498821644" Dec 03 22:18:04.279907 master-0 kubenswrapper[36504]: I1203 22:18:04.279876 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" event={"ID":"451399c6-0304-4409-a5a7-26a242268011","Type":"ContainerStarted","Data":"d554c886e49f1e40912309b08a9abf25200ff4a102bb0cc5faf94a9b0fbbe58f"} Dec 03 22:18:04.280042 master-0 kubenswrapper[36504]: I1203 22:18:04.280017 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:18:04.285385 master-0 kubenswrapper[36504]: I1203 22:18:04.285215 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q66gg" event={"ID":"f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003","Type":"ContainerStarted","Data":"bdfbc914a19ae119033ad37caa2e5e6f0b0b39007de1a7c0b6e598931a6618aa"} Dec 03 22:18:04.285534 master-0 kubenswrapper[36504]: I1203 22:18:04.285408 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:18:04.292943 master-0 kubenswrapper[36504]: I1203 22:18:04.292894 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c6b7b7b5c-jwfc2" Dec 03 22:18:04.297456 master-0 kubenswrapper[36504]: I1203 22:18:04.297393 36504 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-mct9g" podStartSLOduration=2.353330449 podStartE2EDuration="12.297378799s" podCreationTimestamp="2025-12-03 22:17:52 +0000 UTC" firstStartedPulling="2025-12-03 22:17:53.459542936 +0000 UTC m=+438.679314943" lastFinishedPulling="2025-12-03 22:18:03.403591286 +0000 UTC m=+448.623363293" observedRunningTime="2025-12-03 22:18:04.291474154 +0000 UTC m=+449.511246181" watchObservedRunningTime="2025-12-03 22:18:04.297378799 +0000 UTC m=+449.517150806" Dec 03 22:18:04.367104 master-0 kubenswrapper[36504]: I1203 22:18:04.367008 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-qjfzg" podStartSLOduration=2.89961205 podStartE2EDuration="12.36697768s" podCreationTimestamp="2025-12-03 22:17:52 +0000 UTC" firstStartedPulling="2025-12-03 22:17:53.987368882 +0000 UTC m=+439.207140889" lastFinishedPulling="2025-12-03 22:18:03.454734502 +0000 UTC m=+448.674506519" observedRunningTime="2025-12-03 22:18:04.362633065 +0000 UTC m=+449.582405222" watchObservedRunningTime="2025-12-03 22:18:04.36697768 +0000 UTC m=+449.586749697" Dec 03 22:18:04.390574 master-0 kubenswrapper[36504]: I1203 22:18:04.390462 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-7lk9d" podStartSLOduration=3.754115781 podStartE2EDuration="15.390435582s" podCreationTimestamp="2025-12-03 22:17:49 +0000 UTC" firstStartedPulling="2025-12-03 22:17:51.745591769 +0000 UTC m=+436.965363766" lastFinishedPulling="2025-12-03 22:18:03.38191156 +0000 UTC m=+448.601683567" observedRunningTime="2025-12-03 22:18:04.387906073 +0000 UTC m=+449.607678090" watchObservedRunningTime="2025-12-03 22:18:04.390435582 +0000 UTC m=+449.610207589" Dec 03 22:18:04.419097 master-0 kubenswrapper[36504]: I1203 22:18:04.418793 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" podStartSLOduration=3.576952633 podStartE2EDuration="15.418753815s" podCreationTimestamp="2025-12-03 22:17:49 +0000 UTC" firstStartedPulling="2025-12-03 22:17:51.539713116 +0000 UTC m=+436.759485133" lastFinishedPulling="2025-12-03 22:18:03.381514308 +0000 UTC m=+448.601286315" observedRunningTime="2025-12-03 22:18:04.409279939 +0000 UTC m=+449.629051966" watchObservedRunningTime="2025-12-03 22:18:04.418753815 +0000 UTC m=+449.638525822" Dec 03 22:18:04.432626 master-0 kubenswrapper[36504]: I1203 22:18:04.432557 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-q66gg" podStartSLOduration=1.769789745 podStartE2EDuration="12.432535775s" podCreationTimestamp="2025-12-03 22:17:52 +0000 UTC" firstStartedPulling="2025-12-03 22:17:52.719326425 +0000 UTC m=+437.939098432" lastFinishedPulling="2025-12-03 22:18:03.382072405 +0000 UTC m=+448.601844462" observedRunningTime="2025-12-03 22:18:04.431757121 +0000 UTC m=+449.651529138" watchObservedRunningTime="2025-12-03 22:18:04.432535775 +0000 UTC m=+449.652307792" Dec 03 22:18:04.660688 master-0 kubenswrapper[36504]: I1203 22:18:04.660470 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" podStartSLOduration=2.924450105 podStartE2EDuration="12.660227869s" podCreationTimestamp="2025-12-03 22:17:52 +0000 UTC" firstStartedPulling="2025-12-03 22:17:53.670217918 +0000 UTC m=+438.889989925" lastFinishedPulling="2025-12-03 
22:18:03.405995642 +0000 UTC m=+448.625767689" observedRunningTime="2025-12-03 22:18:04.653070655 +0000 UTC m=+449.872842672" watchObservedRunningTime="2025-12-03 22:18:04.660227869 +0000 UTC m=+449.879999876" Dec 03 22:18:04.727954 master-0 kubenswrapper[36504]: I1203 22:18:04.727201 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-794866c946-5q8lb"] Dec 03 22:18:05.302799 master-0 kubenswrapper[36504]: I1203 22:18:05.302101 36504 generic.go:334] "Generic (PLEG): container finished" podID="25f6f30f-28f4-4f42-861c-b413ab0691df" containerID="45c510b7af565c1212adb24d98f558080ee22e754323dee36b17e8154f53dc8c" exitCode=0 Dec 03 22:18:05.302799 master-0 kubenswrapper[36504]: I1203 22:18:05.302334 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerDied","Data":"45c510b7af565c1212adb24d98f558080ee22e754323dee36b17e8154f53dc8c"} Dec 03 22:18:06.323944 master-0 kubenswrapper[36504]: I1203 22:18:06.323872 36504 generic.go:334] "Generic (PLEG): container finished" podID="25f6f30f-28f4-4f42-861c-b413ab0691df" containerID="65d179ad2eaa838c2150028134ff7ca8cd8fe944ca929c3239c4bdb5633d6491" exitCode=0 Dec 03 22:18:06.324686 master-0 kubenswrapper[36504]: I1203 22:18:06.324026 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerDied","Data":"65d179ad2eaa838c2150028134ff7ca8cd8fe944ca929c3239c4bdb5633d6491"} Dec 03 22:18:07.336496 master-0 kubenswrapper[36504]: I1203 22:18:07.336445 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerStarted","Data":"4933d3b75b24cca3e1829d8c663f326742965e4858e0e9a95ba2bcb0a5e2a6a3"} Dec 03 22:18:07.336496 master-0 kubenswrapper[36504]: I1203 22:18:07.336495 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerStarted","Data":"ba8f558dad11debbccfa72c5c21df835a0d2d8f3a88252e674015ebd343433ba"} Dec 03 22:18:09.369800 master-0 kubenswrapper[36504]: I1203 22:18:09.369726 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerStarted","Data":"00f6619cf2ec356b4cfb0df5528145f962b53fcb0be4505c06136b25e092549e"} Dec 03 22:18:09.372158 master-0 kubenswrapper[36504]: I1203 22:18:09.369838 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerStarted","Data":"68c4349cfe5ba644c2b2e787c825374bff096a29d8147200f2281d3a021c1bdb"} Dec 03 22:18:09.372158 master-0 kubenswrapper[36504]: I1203 22:18:09.369850 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerStarted","Data":"dc91808d588b1e297af564cbf6af1165b799f0028f6085eb95952224272b5979"} Dec 03 22:18:09.372158 master-0 kubenswrapper[36504]: I1203 22:18:09.369859 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qb4bc" event={"ID":"25f6f30f-28f4-4f42-861c-b413ab0691df","Type":"ContainerStarted","Data":"2f6f4c9c6c77950adfdc0961614357641f64cd8015df6851e8e38c77e4f5561d"} Dec 03 22:18:09.372158 master-0 kubenswrapper[36504]: I1203 22:18:09.369989 
36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:18:09.408427 master-0 kubenswrapper[36504]: I1203 22:18:09.408322 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-qb4bc" podStartSLOduration=7.793205372 podStartE2EDuration="20.408299087s" podCreationTimestamp="2025-12-03 22:17:49 +0000 UTC" firstStartedPulling="2025-12-03 22:17:50.840634198 +0000 UTC m=+436.060406205" lastFinishedPulling="2025-12-03 22:18:03.455727873 +0000 UTC m=+448.675499920" observedRunningTime="2025-12-03 22:18:09.403235658 +0000 UTC m=+454.623007675" watchObservedRunningTime="2025-12-03 22:18:09.408299087 +0000 UTC m=+454.628071094" Dec 03 22:18:10.526536 master-0 kubenswrapper[36504]: I1203 22:18:10.526416 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:18:10.565893 master-0 kubenswrapper[36504]: I1203 22:18:10.565489 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:18:12.597109 master-0 kubenswrapper[36504]: I1203 22:18:12.597063 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-q66gg" Dec 03 22:18:20.515332 master-0 kubenswrapper[36504]: I1203 22:18:20.515144 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-jh949" Dec 03 22:18:20.529730 master-0 kubenswrapper[36504]: I1203 22:18:20.529666 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-qb4bc" Dec 03 22:18:22.514884 master-0 kubenswrapper[36504]: I1203 22:18:22.514792 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-ph6w5" Dec 03 22:18:28.727082 master-0 kubenswrapper[36504]: I1203 22:18:28.726997 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-4zr4m"] Dec 03 22:18:28.728217 master-0 kubenswrapper[36504]: I1203 22:18:28.728186 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.734845 master-0 kubenswrapper[36504]: I1203 22:18:28.734817 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Dec 03 22:18:28.742675 master-0 kubenswrapper[36504]: I1203 22:18:28.742582 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-4zr4m"] Dec 03 22:18:28.864296 master-0 kubenswrapper[36504]: I1203 22:18:28.864212 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-node-plugin-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864296 master-0 kubenswrapper[36504]: I1203 22:18:28.864299 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9a29f02-f095-4963-80f0-45503a0294e1-metrics-cert\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864683 master-0 kubenswrapper[36504]: I1203 22:18:28.864386 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-sys\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864683 master-0 kubenswrapper[36504]: I1203 22:18:28.864438 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-file-lock-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864683 master-0 kubenswrapper[36504]: I1203 22:18:28.864471 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-csi-plugin-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864683 master-0 kubenswrapper[36504]: I1203 22:18:28.864589 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-pod-volumes-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864683 master-0 kubenswrapper[36504]: I1203 22:18:28.864666 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q76ln\" (UniqueName: \"kubernetes.io/projected/e9a29f02-f095-4963-80f0-45503a0294e1-kube-api-access-q76ln\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864951 master-0 kubenswrapper[36504]: I1203 22:18:28.864716 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-lvmd-config\") pod 
\"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864951 master-0 kubenswrapper[36504]: I1203 22:18:28.864762 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-run-udev\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864951 master-0 kubenswrapper[36504]: I1203 22:18:28.864853 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-registration-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.864951 master-0 kubenswrapper[36504]: I1203 22:18:28.864906 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-device-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.966846 master-0 kubenswrapper[36504]: I1203 22:18:28.966740 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-pod-volumes-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967158 master-0 kubenswrapper[36504]: I1203 22:18:28.966899 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q76ln\" (UniqueName: \"kubernetes.io/projected/e9a29f02-f095-4963-80f0-45503a0294e1-kube-api-access-q76ln\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967158 master-0 kubenswrapper[36504]: I1203 22:18:28.966955 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-lvmd-config\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967158 master-0 kubenswrapper[36504]: I1203 22:18:28.967001 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-run-udev\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967158 master-0 kubenswrapper[36504]: I1203 22:18:28.967045 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-registration-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967158 master-0 kubenswrapper[36504]: I1203 22:18:28.967095 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-device-dir\") pod \"vg-manager-4zr4m\" (UID: 
\"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967324 master-0 kubenswrapper[36504]: I1203 22:18:28.967240 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-node-plugin-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967371 master-0 kubenswrapper[36504]: I1203 22:18:28.967312 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9a29f02-f095-4963-80f0-45503a0294e1-metrics-cert\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967408 master-0 kubenswrapper[36504]: I1203 22:18:28.967366 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-sys\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967441 master-0 kubenswrapper[36504]: I1203 22:18:28.967407 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-file-lock-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.967474 master-0 kubenswrapper[36504]: I1203 22:18:28.967437 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-csi-plugin-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.968014 master-0 kubenswrapper[36504]: I1203 22:18:28.967942 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-csi-plugin-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.968076 master-0 kubenswrapper[36504]: I1203 22:18:28.968027 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-pod-volumes-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.968076 master-0 kubenswrapper[36504]: I1203 22:18:28.967987 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-registration-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.968141 master-0 kubenswrapper[36504]: I1203 22:18:28.968081 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-device-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.968318 master-0 
kubenswrapper[36504]: I1203 22:18:28.968291 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-node-plugin-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.968509 master-0 kubenswrapper[36504]: I1203 22:18:28.968485 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-run-udev\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.968730 master-0 kubenswrapper[36504]: I1203 22:18:28.968646 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-sys\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.968856 master-0 kubenswrapper[36504]: I1203 22:18:28.968831 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-file-lock-dir\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.969042 master-0 kubenswrapper[36504]: I1203 22:18:28.969019 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/e9a29f02-f095-4963-80f0-45503a0294e1-lvmd-config\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:28.973700 master-0 kubenswrapper[36504]: I1203 22:18:28.973652 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e9a29f02-f095-4963-80f0-45503a0294e1-metrics-cert\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:29.089970 master-0 kubenswrapper[36504]: I1203 22:18:29.089742 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q76ln\" (UniqueName: \"kubernetes.io/projected/e9a29f02-f095-4963-80f0-45503a0294e1-kube-api-access-q76ln\") pod \"vg-manager-4zr4m\" (UID: \"e9a29f02-f095-4963-80f0-45503a0294e1\") " pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:29.358598 master-0 kubenswrapper[36504]: I1203 22:18:29.358440 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:29.779209 master-0 kubenswrapper[36504]: I1203 22:18:29.779078 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-794866c946-5q8lb" podUID="1832a08a-47a4-4449-a9db-d11f44f4c154" containerName="console" containerID="cri-o://cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644" gracePeriod=15 Dec 03 22:18:30.057395 master-0 kubenswrapper[36504]: I1203 22:18:30.057321 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-4zr4m"] Dec 03 22:18:30.065196 master-0 kubenswrapper[36504]: W1203 22:18:30.065135 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9a29f02_f095_4963_80f0_45503a0294e1.slice/crio-3f458cc07e371ba2fac6c697df1863169241ff8c2db8e32741560aabdbccc903 WatchSource:0}: Error finding container 3f458cc07e371ba2fac6c697df1863169241ff8c2db8e32741560aabdbccc903: Status 404 returned error can't find the container with id 3f458cc07e371ba2fac6c697df1863169241ff8c2db8e32741560aabdbccc903 Dec 03 22:18:30.098255 master-0 kubenswrapper[36504]: I1203 22:18:30.095857 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:18:30.267072 master-0 kubenswrapper[36504]: I1203 22:18:30.267025 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-794866c946-5q8lb_1832a08a-47a4-4449-a9db-d11f44f4c154/console/0.log" Dec 03 22:18:30.267241 master-0 kubenswrapper[36504]: I1203 22:18:30.267123 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:18:30.400738 master-0 kubenswrapper[36504]: I1203 22:18:30.400667 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-serving-cert\") pod \"1832a08a-47a4-4449-a9db-d11f44f4c154\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " Dec 03 22:18:30.401081 master-0 kubenswrapper[36504]: I1203 22:18:30.400797 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-trusted-ca-bundle\") pod \"1832a08a-47a4-4449-a9db-d11f44f4c154\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " Dec 03 22:18:30.401081 master-0 kubenswrapper[36504]: I1203 22:18:30.400833 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-service-ca\") pod \"1832a08a-47a4-4449-a9db-d11f44f4c154\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " Dec 03 22:18:30.401081 master-0 kubenswrapper[36504]: I1203 22:18:30.400885 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-console-config\") pod \"1832a08a-47a4-4449-a9db-d11f44f4c154\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " Dec 03 22:18:30.401081 master-0 kubenswrapper[36504]: I1203 22:18:30.400920 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-oauth-config\") pod \"1832a08a-47a4-4449-a9db-d11f44f4c154\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " Dec 03 22:18:30.401081 master-0 kubenswrapper[36504]: I1203 22:18:30.400941 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-oauth-serving-cert\") pod \"1832a08a-47a4-4449-a9db-d11f44f4c154\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " Dec 03 22:18:30.401081 master-0 kubenswrapper[36504]: I1203 22:18:30.401023 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5nks\" (UniqueName: \"kubernetes.io/projected/1832a08a-47a4-4449-a9db-d11f44f4c154-kube-api-access-t5nks\") pod \"1832a08a-47a4-4449-a9db-d11f44f4c154\" (UID: \"1832a08a-47a4-4449-a9db-d11f44f4c154\") " Dec 03 22:18:30.401816 master-0 kubenswrapper[36504]: I1203 22:18:30.401531 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1832a08a-47a4-4449-a9db-d11f44f4c154" (UID: "1832a08a-47a4-4449-a9db-d11f44f4c154"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:18:30.401892 master-0 kubenswrapper[36504]: I1203 22:18:30.401747 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-console-config" (OuterVolumeSpecName: "console-config") pod "1832a08a-47a4-4449-a9db-d11f44f4c154" (UID: "1832a08a-47a4-4449-a9db-d11f44f4c154"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:18:30.401892 master-0 kubenswrapper[36504]: I1203 22:18:30.401818 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1832a08a-47a4-4449-a9db-d11f44f4c154" (UID: "1832a08a-47a4-4449-a9db-d11f44f4c154"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:18:30.402714 master-0 kubenswrapper[36504]: I1203 22:18:30.402661 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-service-ca" (OuterVolumeSpecName: "service-ca") pod "1832a08a-47a4-4449-a9db-d11f44f4c154" (UID: "1832a08a-47a4-4449-a9db-d11f44f4c154"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:18:30.403962 master-0 kubenswrapper[36504]: I1203 22:18:30.403922 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1832a08a-47a4-4449-a9db-d11f44f4c154" (UID: "1832a08a-47a4-4449-a9db-d11f44f4c154"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:18:30.404481 master-0 kubenswrapper[36504]: I1203 22:18:30.404446 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1832a08a-47a4-4449-a9db-d11f44f4c154-kube-api-access-t5nks" (OuterVolumeSpecName: "kube-api-access-t5nks") pod "1832a08a-47a4-4449-a9db-d11f44f4c154" (UID: "1832a08a-47a4-4449-a9db-d11f44f4c154"). InnerVolumeSpecName "kube-api-access-t5nks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:18:30.404936 master-0 kubenswrapper[36504]: I1203 22:18:30.404903 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1832a08a-47a4-4449-a9db-d11f44f4c154" (UID: "1832a08a-47a4-4449-a9db-d11f44f4c154"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:18:30.503263 master-0 kubenswrapper[36504]: I1203 22:18:30.503202 36504 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:18:30.503560 master-0 kubenswrapper[36504]: I1203 22:18:30.503549 36504 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:18:30.503628 master-0 kubenswrapper[36504]: I1203 22:18:30.503617 36504 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-service-ca\") on node \"master-0\" DevicePath \"\"" Dec 03 22:18:30.503704 master-0 kubenswrapper[36504]: I1203 22:18:30.503693 36504 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-console-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:18:30.503795 master-0 kubenswrapper[36504]: I1203 22:18:30.503762 36504 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1832a08a-47a4-4449-a9db-d11f44f4c154-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:18:30.503868 master-0 kubenswrapper[36504]: I1203 22:18:30.503855 36504 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1832a08a-47a4-4449-a9db-d11f44f4c154-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Dec 03 22:18:30.503943 master-0 kubenswrapper[36504]: I1203 22:18:30.503931 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5nks\" (UniqueName: \"kubernetes.io/projected/1832a08a-47a4-4449-a9db-d11f44f4c154-kube-api-access-t5nks\") on node \"master-0\" DevicePath \"\"" Dec 03 22:18:30.621303 master-0 kubenswrapper[36504]: I1203 22:18:30.621206 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-794866c946-5q8lb_1832a08a-47a4-4449-a9db-d11f44f4c154/console/0.log" Dec 03 22:18:30.621303 master-0 kubenswrapper[36504]: I1203 22:18:30.621257 36504 generic.go:334] "Generic (PLEG): container finished" podID="1832a08a-47a4-4449-a9db-d11f44f4c154" containerID="cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644" exitCode=2 Dec 03 22:18:30.621757 master-0 kubenswrapper[36504]: I1203 22:18:30.621338 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-794866c946-5q8lb" Dec 03 22:18:30.629189 master-0 kubenswrapper[36504]: I1203 22:18:30.629066 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-794866c946-5q8lb" event={"ID":"1832a08a-47a4-4449-a9db-d11f44f4c154","Type":"ContainerDied","Data":"cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644"} Dec 03 22:18:30.629189 master-0 kubenswrapper[36504]: I1203 22:18:30.629145 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-794866c946-5q8lb" event={"ID":"1832a08a-47a4-4449-a9db-d11f44f4c154","Type":"ContainerDied","Data":"d170b552383730edbee7142866fdba388c31fa78fb0786d3030834fe13e18e6e"} Dec 03 22:18:30.629189 master-0 kubenswrapper[36504]: I1203 22:18:30.629188 36504 scope.go:117] "RemoveContainer" containerID="cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644" Dec 03 22:18:30.633726 master-0 kubenswrapper[36504]: I1203 22:18:30.633658 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-4zr4m" event={"ID":"e9a29f02-f095-4963-80f0-45503a0294e1","Type":"ContainerStarted","Data":"51ad670b5ec4afbc38a04e410961deb4b9460427e259bc1f69753485e3ca9717"} Dec 03 22:18:30.633895 master-0 kubenswrapper[36504]: I1203 22:18:30.633805 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-4zr4m" event={"ID":"e9a29f02-f095-4963-80f0-45503a0294e1","Type":"ContainerStarted","Data":"3f458cc07e371ba2fac6c697df1863169241ff8c2db8e32741560aabdbccc903"} Dec 03 22:18:30.658719 master-0 kubenswrapper[36504]: I1203 22:18:30.658656 36504 scope.go:117] "RemoveContainer" containerID="cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644" Dec 03 22:18:30.659422 master-0 kubenswrapper[36504]: E1203 22:18:30.659384 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644\": container with ID starting with cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644 not found: ID does not exist" containerID="cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644" Dec 03 22:18:30.659499 master-0 kubenswrapper[36504]: I1203 22:18:30.659416 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644"} err="failed to get container status \"cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644\": rpc error: code = NotFound desc = could not find container \"cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644\": container with ID starting with cc27b5008251d125713c9ae3a6030895d9d758436db151b94816740a008cb644 not found: ID does not exist" Dec 03 22:18:30.667119 master-0 kubenswrapper[36504]: I1203 22:18:30.667035 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-4zr4m" podStartSLOduration=2.667001391 podStartE2EDuration="2.667001391s" podCreationTimestamp="2025-12-03 22:18:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:18:30.662229362 +0000 UTC m=+475.882001379" watchObservedRunningTime="2025-12-03 22:18:30.667001391 +0000 UTC m=+475.886773388" Dec 03 22:18:30.692576 master-0 kubenswrapper[36504]: I1203 22:18:30.692489 36504 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-console/console-794866c946-5q8lb"] Dec 03 22:18:30.699102 master-0 kubenswrapper[36504]: I1203 22:18:30.699034 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-794866c946-5q8lb"] Dec 03 22:18:31.109596 master-0 kubenswrapper[36504]: I1203 22:18:31.109371 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1832a08a-47a4-4449-a9db-d11f44f4c154" path="/var/lib/kubelet/pods/1832a08a-47a4-4449-a9db-d11f44f4c154/volumes" Dec 03 22:18:32.663827 master-0 kubenswrapper[36504]: I1203 22:18:32.663652 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-4zr4m_e9a29f02-f095-4963-80f0-45503a0294e1/vg-manager/0.log" Dec 03 22:18:32.663827 master-0 kubenswrapper[36504]: I1203 22:18:32.663755 36504 generic.go:334] "Generic (PLEG): container finished" podID="e9a29f02-f095-4963-80f0-45503a0294e1" containerID="51ad670b5ec4afbc38a04e410961deb4b9460427e259bc1f69753485e3ca9717" exitCode=1 Dec 03 22:18:32.663827 master-0 kubenswrapper[36504]: I1203 22:18:32.663830 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-4zr4m" event={"ID":"e9a29f02-f095-4963-80f0-45503a0294e1","Type":"ContainerDied","Data":"51ad670b5ec4afbc38a04e410961deb4b9460427e259bc1f69753485e3ca9717"} Dec 03 22:18:32.683982 master-0 kubenswrapper[36504]: I1203 22:18:32.683911 36504 scope.go:117] "RemoveContainer" containerID="51ad670b5ec4afbc38a04e410961deb4b9460427e259bc1f69753485e3ca9717" Dec 03 22:18:33.027494 master-0 kubenswrapper[36504]: I1203 22:18:33.027327 36504 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Dec 03 22:18:33.101283 master-0 kubenswrapper[36504]: I1203 22:18:33.101103 36504 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2025-12-03T22:18:33.027366774Z","Handler":null,"Name":""} Dec 03 22:18:33.104886 master-0 kubenswrapper[36504]: I1203 22:18:33.104852 36504 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Dec 03 22:18:33.105000 master-0 kubenswrapper[36504]: I1203 22:18:33.104924 36504 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Dec 03 22:18:33.678638 master-0 kubenswrapper[36504]: I1203 22:18:33.678585 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-4zr4m_e9a29f02-f095-4963-80f0-45503a0294e1/vg-manager/0.log" Dec 03 22:18:33.679276 master-0 kubenswrapper[36504]: I1203 22:18:33.678656 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-4zr4m" event={"ID":"e9a29f02-f095-4963-80f0-45503a0294e1","Type":"ContainerStarted","Data":"ed2048b09468ddddc869ac01f20cd4c1ce0c5c717ad37d37bb5273d8764775b0"} Dec 03 22:18:35.687903 master-0 kubenswrapper[36504]: E1203 22:18:35.687833 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the 
container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:18:35.890153 master-0 kubenswrapper[36504]: I1203 22:18:35.888387 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jpwp2"] Dec 03 22:18:35.890153 master-0 kubenswrapper[36504]: E1203 22:18:35.888798 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1832a08a-47a4-4449-a9db-d11f44f4c154" containerName="console" Dec 03 22:18:35.890153 master-0 kubenswrapper[36504]: I1203 22:18:35.888812 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1832a08a-47a4-4449-a9db-d11f44f4c154" containerName="console" Dec 03 22:18:35.890153 master-0 kubenswrapper[36504]: I1203 22:18:35.888985 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="1832a08a-47a4-4449-a9db-d11f44f4c154" containerName="console" Dec 03 22:18:35.890153 master-0 kubenswrapper[36504]: I1203 22:18:35.889522 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jpwp2" Dec 03 22:18:35.897805 master-0 kubenswrapper[36504]: I1203 22:18:35.895048 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 22:18:35.897805 master-0 kubenswrapper[36504]: I1203 22:18:35.895269 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 22:18:35.960889 master-0 kubenswrapper[36504]: I1203 22:18:35.960366 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jpwp2"] Dec 03 22:18:36.065802 master-0 kubenswrapper[36504]: I1203 22:18:36.061792 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwfxp\" (UniqueName: \"kubernetes.io/projected/ee98a796-0dc0-47a8-aa7e-f1724786d3a9-kube-api-access-pwfxp\") pod \"openstack-operator-index-jpwp2\" (UID: \"ee98a796-0dc0-47a8-aa7e-f1724786d3a9\") " pod="openstack-operators/openstack-operator-index-jpwp2" Dec 03 22:18:36.162983 master-0 kubenswrapper[36504]: I1203 22:18:36.162886 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwfxp\" (UniqueName: \"kubernetes.io/projected/ee98a796-0dc0-47a8-aa7e-f1724786d3a9-kube-api-access-pwfxp\") pod \"openstack-operator-index-jpwp2\" (UID: \"ee98a796-0dc0-47a8-aa7e-f1724786d3a9\") " pod="openstack-operators/openstack-operator-index-jpwp2" Dec 03 22:18:36.184831 master-0 kubenswrapper[36504]: I1203 22:18:36.184233 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwfxp\" (UniqueName: \"kubernetes.io/projected/ee98a796-0dc0-47a8-aa7e-f1724786d3a9-kube-api-access-pwfxp\") pod \"openstack-operator-index-jpwp2\" (UID: \"ee98a796-0dc0-47a8-aa7e-f1724786d3a9\") " pod="openstack-operators/openstack-operator-index-jpwp2" Dec 03 22:18:36.391809 master-0 kubenswrapper[36504]: I1203 22:18:36.391416 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jpwp2" Dec 03 22:18:36.920746 master-0 kubenswrapper[36504]: I1203 22:18:36.920693 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jpwp2"] Dec 03 22:18:36.922251 master-0 kubenswrapper[36504]: W1203 22:18:36.922197 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee98a796_0dc0_47a8_aa7e_f1724786d3a9.slice/crio-2a11147db3cbe2c5f3a71c947c31b1d57b5587621d1db7245a2af522516d850e WatchSource:0}: Error finding container 2a11147db3cbe2c5f3a71c947c31b1d57b5587621d1db7245a2af522516d850e: Status 404 returned error can't find the container with id 2a11147db3cbe2c5f3a71c947c31b1d57b5587621d1db7245a2af522516d850e Dec 03 22:18:37.733661 master-0 kubenswrapper[36504]: I1203 22:18:37.731845 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jpwp2" event={"ID":"ee98a796-0dc0-47a8-aa7e-f1724786d3a9","Type":"ContainerStarted","Data":"2a11147db3cbe2c5f3a71c947c31b1d57b5587621d1db7245a2af522516d850e"} Dec 03 22:18:38.793409 master-0 kubenswrapper[36504]: I1203 22:18:38.793272 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jpwp2" event={"ID":"ee98a796-0dc0-47a8-aa7e-f1724786d3a9","Type":"ContainerStarted","Data":"000352a7d1d4e587a8eb1650ac5b81e35a7fc7ea07ddfaf1db7bcf99e50801f4"} Dec 03 22:18:38.840206 master-0 kubenswrapper[36504]: I1203 22:18:38.840096 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jpwp2" podStartSLOduration=2.900304929 podStartE2EDuration="3.840067945s" podCreationTimestamp="2025-12-03 22:18:35 +0000 UTC" firstStartedPulling="2025-12-03 22:18:36.925074956 +0000 UTC m=+482.144846963" lastFinishedPulling="2025-12-03 22:18:37.864837972 +0000 UTC m=+483.084609979" observedRunningTime="2025-12-03 22:18:38.82934181 +0000 UTC m=+484.049113827" watchObservedRunningTime="2025-12-03 22:18:38.840067945 +0000 UTC m=+484.059839972" Dec 03 22:18:39.360372 master-0 kubenswrapper[36504]: I1203 22:18:39.360210 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:39.373981 master-0 kubenswrapper[36504]: I1203 22:18:39.373922 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:39.806624 master-0 kubenswrapper[36504]: I1203 22:18:39.805905 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:39.806624 master-0 kubenswrapper[36504]: I1203 22:18:39.806316 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-4zr4m" Dec 03 22:18:42.094998 master-0 kubenswrapper[36504]: I1203 22:18:42.094919 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:18:46.391757 master-0 kubenswrapper[36504]: I1203 22:18:46.391647 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-jpwp2" Dec 03 22:18:46.392406 master-0 kubenswrapper[36504]: I1203 22:18:46.391866 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-jpwp2" Dec 03 22:18:46.431070 master-0 kubenswrapper[36504]: I1203 22:18:46.431015 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-jpwp2" Dec 03 22:18:46.930845 master-0 kubenswrapper[36504]: I1203 22:18:46.930749 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-jpwp2" Dec 03 22:18:53.954094 master-0 kubenswrapper[36504]: I1203 22:18:53.953966 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2"] Dec 03 22:18:53.956839 master-0 kubenswrapper[36504]: I1203 22:18:53.956760 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:53.976134 master-0 kubenswrapper[36504]: I1203 22:18:53.976059 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2"] Dec 03 22:18:54.042263 master-0 kubenswrapper[36504]: I1203 22:18:54.042167 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zf2t\" (UniqueName: \"kubernetes.io/projected/37b23a6f-c0bc-4cad-be79-98fe61330018-kube-api-access-9zf2t\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.042263 master-0 kubenswrapper[36504]: I1203 22:18:54.042269 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.042700 master-0 kubenswrapper[36504]: I1203 22:18:54.042369 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.144859 master-0 kubenswrapper[36504]: I1203 22:18:54.144230 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zf2t\" (UniqueName: \"kubernetes.io/projected/37b23a6f-c0bc-4cad-be79-98fe61330018-kube-api-access-9zf2t\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " 
pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.144859 master-0 kubenswrapper[36504]: I1203 22:18:54.144318 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.144859 master-0 kubenswrapper[36504]: I1203 22:18:54.144395 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.145227 master-0 kubenswrapper[36504]: I1203 22:18:54.145035 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.147050 master-0 kubenswrapper[36504]: I1203 22:18:54.147014 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.166645 master-0 kubenswrapper[36504]: I1203 22:18:54.166581 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zf2t\" (UniqueName: \"kubernetes.io/projected/37b23a6f-c0bc-4cad-be79-98fe61330018-kube-api-access-9zf2t\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.279381 master-0 kubenswrapper[36504]: I1203 22:18:54.279207 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:18:54.986875 master-0 kubenswrapper[36504]: I1203 22:18:54.976937 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2"] Dec 03 22:18:55.000856 master-0 kubenswrapper[36504]: W1203 22:18:55.000374 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37b23a6f_c0bc_4cad_be79_98fe61330018.slice/crio-507cb9e67bf49b5240cf023db65dee297dc1c877d39de975900fb7bebc94511b WatchSource:0}: Error finding container 507cb9e67bf49b5240cf023db65dee297dc1c877d39de975900fb7bebc94511b: Status 404 returned error can't find the container with id 507cb9e67bf49b5240cf023db65dee297dc1c877d39de975900fb7bebc94511b Dec 03 22:18:55.114943 master-0 kubenswrapper[36504]: I1203 22:18:55.112504 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" event={"ID":"37b23a6f-c0bc-4cad-be79-98fe61330018","Type":"ContainerStarted","Data":"507cb9e67bf49b5240cf023db65dee297dc1c877d39de975900fb7bebc94511b"} Dec 03 22:18:56.122456 master-0 kubenswrapper[36504]: I1203 22:18:56.122370 36504 generic.go:334] "Generic (PLEG): container finished" podID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerID="cae734ef98f0c5f128ba472f95d09455d6d48f0be5c6a97fb594b9b61d035d59" exitCode=0 Dec 03 22:18:56.122456 master-0 kubenswrapper[36504]: I1203 22:18:56.122436 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" event={"ID":"37b23a6f-c0bc-4cad-be79-98fe61330018","Type":"ContainerDied","Data":"cae734ef98f0c5f128ba472f95d09455d6d48f0be5c6a97fb594b9b61d035d59"} Dec 03 22:18:58.147725 master-0 kubenswrapper[36504]: I1203 22:18:58.147625 36504 generic.go:334] "Generic (PLEG): container finished" podID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerID="d0f8d81128c2b0567125075705dfae96c1be014e155e82a65310e888c6b4e46d" exitCode=0 Dec 03 22:18:58.147725 master-0 kubenswrapper[36504]: I1203 22:18:58.147704 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" event={"ID":"37b23a6f-c0bc-4cad-be79-98fe61330018","Type":"ContainerDied","Data":"d0f8d81128c2b0567125075705dfae96c1be014e155e82a65310e888c6b4e46d"} Dec 03 22:18:59.163955 master-0 kubenswrapper[36504]: I1203 22:18:59.163875 36504 generic.go:334] "Generic (PLEG): container finished" podID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerID="95c6fa56b252b50a86c992b437d4784ae940e0a46570ff59b6a952120b4088df" exitCode=0 Dec 03 22:18:59.165032 master-0 kubenswrapper[36504]: I1203 22:18:59.164630 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" event={"ID":"37b23a6f-c0bc-4cad-be79-98fe61330018","Type":"ContainerDied","Data":"95c6fa56b252b50a86c992b437d4784ae940e0a46570ff59b6a952120b4088df"} Dec 03 22:19:00.553441 master-0 kubenswrapper[36504]: I1203 22:19:00.553376 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:19:00.592015 master-0 kubenswrapper[36504]: I1203 22:19:00.591932 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-util\") pod \"37b23a6f-c0bc-4cad-be79-98fe61330018\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " Dec 03 22:19:00.592015 master-0 kubenswrapper[36504]: I1203 22:19:00.592028 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zf2t\" (UniqueName: \"kubernetes.io/projected/37b23a6f-c0bc-4cad-be79-98fe61330018-kube-api-access-9zf2t\") pod \"37b23a6f-c0bc-4cad-be79-98fe61330018\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " Dec 03 22:19:00.592564 master-0 kubenswrapper[36504]: I1203 22:19:00.592110 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-bundle\") pod \"37b23a6f-c0bc-4cad-be79-98fe61330018\" (UID: \"37b23a6f-c0bc-4cad-be79-98fe61330018\") " Dec 03 22:19:00.593440 master-0 kubenswrapper[36504]: I1203 22:19:00.593379 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-bundle" (OuterVolumeSpecName: "bundle") pod "37b23a6f-c0bc-4cad-be79-98fe61330018" (UID: "37b23a6f-c0bc-4cad-be79-98fe61330018"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:00.595876 master-0 kubenswrapper[36504]: I1203 22:19:00.595801 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b23a6f-c0bc-4cad-be79-98fe61330018-kube-api-access-9zf2t" (OuterVolumeSpecName: "kube-api-access-9zf2t") pod "37b23a6f-c0bc-4cad-be79-98fe61330018" (UID: "37b23a6f-c0bc-4cad-be79-98fe61330018"). InnerVolumeSpecName "kube-api-access-9zf2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:19:00.664377 master-0 kubenswrapper[36504]: I1203 22:19:00.664212 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-util" (OuterVolumeSpecName: "util") pod "37b23a6f-c0bc-4cad-be79-98fe61330018" (UID: "37b23a6f-c0bc-4cad-be79-98fe61330018"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:19:00.694582 master-0 kubenswrapper[36504]: I1203 22:19:00.694468 36504 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:19:00.694582 master-0 kubenswrapper[36504]: I1203 22:19:00.694556 36504 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/37b23a6f-c0bc-4cad-be79-98fe61330018-util\") on node \"master-0\" DevicePath \"\"" Dec 03 22:19:00.694582 master-0 kubenswrapper[36504]: I1203 22:19:00.694594 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zf2t\" (UniqueName: \"kubernetes.io/projected/37b23a6f-c0bc-4cad-be79-98fe61330018-kube-api-access-9zf2t\") on node \"master-0\" DevicePath \"\"" Dec 03 22:19:01.188588 master-0 kubenswrapper[36504]: I1203 22:19:01.188521 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" event={"ID":"37b23a6f-c0bc-4cad-be79-98fe61330018","Type":"ContainerDied","Data":"507cb9e67bf49b5240cf023db65dee297dc1c877d39de975900fb7bebc94511b"} Dec 03 22:19:01.188588 master-0 kubenswrapper[36504]: I1203 22:19:01.188573 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507cb9e67bf49b5240cf023db65dee297dc1c877d39de975900fb7bebc94511b" Dec 03 22:19:01.188941 master-0 kubenswrapper[36504]: I1203 22:19:01.188663 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864z45v2" Dec 03 22:19:06.071927 master-0 kubenswrapper[36504]: I1203 22:19:06.071843 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv"] Dec 03 22:19:06.072732 master-0 kubenswrapper[36504]: E1203 22:19:06.072255 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerName="pull" Dec 03 22:19:06.072732 master-0 kubenswrapper[36504]: I1203 22:19:06.072271 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerName="pull" Dec 03 22:19:06.072732 master-0 kubenswrapper[36504]: E1203 22:19:06.072286 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerName="extract" Dec 03 22:19:06.072732 master-0 kubenswrapper[36504]: I1203 22:19:06.072293 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerName="extract" Dec 03 22:19:06.072732 master-0 kubenswrapper[36504]: E1203 22:19:06.072339 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerName="util" Dec 03 22:19:06.072732 master-0 kubenswrapper[36504]: I1203 22:19:06.072353 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerName="util" Dec 03 22:19:06.072732 master-0 kubenswrapper[36504]: I1203 22:19:06.072587 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b23a6f-c0bc-4cad-be79-98fe61330018" containerName="extract" Dec 03 22:19:06.073253 master-0 kubenswrapper[36504]: I1203 22:19:06.073228 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" Dec 03 22:19:06.115684 master-0 kubenswrapper[36504]: I1203 22:19:06.115597 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv"] Dec 03 22:19:06.197685 master-0 kubenswrapper[36504]: I1203 22:19:06.197612 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfwgj\" (UniqueName: \"kubernetes.io/projected/90692872-8469-4cd5-995d-e68ef985bcc7-kube-api-access-sfwgj\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-v22wv\" (UID: \"90692872-8469-4cd5-995d-e68ef985bcc7\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" Dec 03 22:19:06.300691 master-0 kubenswrapper[36504]: I1203 22:19:06.300603 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfwgj\" (UniqueName: \"kubernetes.io/projected/90692872-8469-4cd5-995d-e68ef985bcc7-kube-api-access-sfwgj\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-v22wv\" (UID: \"90692872-8469-4cd5-995d-e68ef985bcc7\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" Dec 03 22:19:06.331233 master-0 kubenswrapper[36504]: I1203 22:19:06.331096 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfwgj\" (UniqueName: \"kubernetes.io/projected/90692872-8469-4cd5-995d-e68ef985bcc7-kube-api-access-sfwgj\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-v22wv\" (UID: \"90692872-8469-4cd5-995d-e68ef985bcc7\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" Dec 03 22:19:06.392049 master-0 kubenswrapper[36504]: I1203 22:19:06.391954 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" Dec 03 22:19:07.057094 master-0 kubenswrapper[36504]: W1203 22:19:07.057006 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90692872_8469_4cd5_995d_e68ef985bcc7.slice/crio-fd7d980199c5e1333dade8d6eb3e2171693afcbe90a1464482d7a92cae54c9cf WatchSource:0}: Error finding container fd7d980199c5e1333dade8d6eb3e2171693afcbe90a1464482d7a92cae54c9cf: Status 404 returned error can't find the container with id fd7d980199c5e1333dade8d6eb3e2171693afcbe90a1464482d7a92cae54c9cf Dec 03 22:19:07.057423 master-0 kubenswrapper[36504]: I1203 22:19:07.057349 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv"] Dec 03 22:19:07.255742 master-0 kubenswrapper[36504]: I1203 22:19:07.255648 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" event={"ID":"90692872-8469-4cd5-995d-e68ef985bcc7","Type":"ContainerStarted","Data":"fd7d980199c5e1333dade8d6eb3e2171693afcbe90a1464482d7a92cae54c9cf"} Dec 03 22:19:14.576319 master-0 kubenswrapper[36504]: I1203 22:19:14.576232 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" event={"ID":"90692872-8469-4cd5-995d-e68ef985bcc7","Type":"ContainerStarted","Data":"90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619"} Dec 03 22:19:14.577097 master-0 kubenswrapper[36504]: I1203 22:19:14.576448 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" Dec 03 22:19:14.629945 master-0 kubenswrapper[36504]: I1203 22:19:14.629825 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" podStartSLOduration=1.6195979230000002 podStartE2EDuration="8.62978902s" podCreationTimestamp="2025-12-03 22:19:06 +0000 UTC" firstStartedPulling="2025-12-03 22:19:07.062366032 +0000 UTC m=+512.282138049" lastFinishedPulling="2025-12-03 22:19:14.072557149 +0000 UTC m=+519.292329146" observedRunningTime="2025-12-03 22:19:14.619825953 +0000 UTC m=+519.839597980" watchObservedRunningTime="2025-12-03 22:19:14.62978902 +0000 UTC m=+519.849561027" Dec 03 22:19:26.395628 master-0 kubenswrapper[36504]: I1203 22:19:26.395561 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" Dec 03 22:19:29.663649 master-0 kubenswrapper[36504]: I1203 22:19:29.663573 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs"] Dec 03 22:19:29.666056 master-0 kubenswrapper[36504]: I1203 22:19:29.666029 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" Dec 03 22:19:29.674607 master-0 kubenswrapper[36504]: I1203 22:19:29.674534 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpx6j\" (UniqueName: \"kubernetes.io/projected/17b1ce67-e410-492c-a96b-5b1fbba5e67c-kube-api-access-kpx6j\") pod \"openstack-operator-controller-operator-7b84d49558-zfbjs\" (UID: \"17b1ce67-e410-492c-a96b-5b1fbba5e67c\") " pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" Dec 03 22:19:29.706672 master-0 kubenswrapper[36504]: I1203 22:19:29.706602 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs"] Dec 03 22:19:29.822051 master-0 kubenswrapper[36504]: I1203 22:19:29.821949 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpx6j\" (UniqueName: \"kubernetes.io/projected/17b1ce67-e410-492c-a96b-5b1fbba5e67c-kube-api-access-kpx6j\") pod \"openstack-operator-controller-operator-7b84d49558-zfbjs\" (UID: \"17b1ce67-e410-492c-a96b-5b1fbba5e67c\") " pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" Dec 03 22:19:29.877554 master-0 kubenswrapper[36504]: I1203 22:19:29.877469 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpx6j\" (UniqueName: \"kubernetes.io/projected/17b1ce67-e410-492c-a96b-5b1fbba5e67c-kube-api-access-kpx6j\") pod \"openstack-operator-controller-operator-7b84d49558-zfbjs\" (UID: \"17b1ce67-e410-492c-a96b-5b1fbba5e67c\") " pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" Dec 03 22:19:29.983504 master-0 kubenswrapper[36504]: I1203 22:19:29.983336 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" Dec 03 22:19:30.701114 master-0 kubenswrapper[36504]: I1203 22:19:30.701050 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs"] Dec 03 22:19:30.702800 master-0 kubenswrapper[36504]: W1203 22:19:30.702285 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17b1ce67_e410_492c_a96b_5b1fbba5e67c.slice/crio-e5ef564f2063001624362d7a006a0b832e25b8e82a0e872bf023a63930e5ded5 WatchSource:0}: Error finding container e5ef564f2063001624362d7a006a0b832e25b8e82a0e872bf023a63930e5ded5: Status 404 returned error can't find the container with id e5ef564f2063001624362d7a006a0b832e25b8e82a0e872bf023a63930e5ded5 Dec 03 22:19:30.786954 master-0 kubenswrapper[36504]: I1203 22:19:30.786892 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" event={"ID":"17b1ce67-e410-492c-a96b-5b1fbba5e67c","Type":"ContainerStarted","Data":"e5ef564f2063001624362d7a006a0b832e25b8e82a0e872bf023a63930e5ded5"} Dec 03 22:19:31.797422 master-0 kubenswrapper[36504]: I1203 22:19:31.797339 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" event={"ID":"17b1ce67-e410-492c-a96b-5b1fbba5e67c","Type":"ContainerStarted","Data":"b7528aafd9dd4e60fe92cfc22100828cdba02bb9b32264249f4f66668ec22d06"} Dec 03 22:19:31.798582 master-0 kubenswrapper[36504]: I1203 22:19:31.798555 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" Dec 03 22:19:31.838049 master-0 kubenswrapper[36504]: I1203 22:19:31.837966 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" podStartSLOduration=2.837941843 podStartE2EDuration="2.837941843s" podCreationTimestamp="2025-12-03 22:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:19:31.835906528 +0000 UTC m=+537.055678565" watchObservedRunningTime="2025-12-03 22:19:31.837941843 +0000 UTC m=+537.057713860" Dec 03 22:19:35.702413 master-0 kubenswrapper[36504]: E1203 22:19:35.702301 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:19:39.987994 master-0 kubenswrapper[36504]: I1203 22:19:39.987913 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7b84d49558-zfbjs" Dec 03 22:19:40.249793 master-0 kubenswrapper[36504]: I1203 22:19:40.249507 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv"] Dec 03 22:19:40.250166 master-0 kubenswrapper[36504]: I1203 22:19:40.249875 36504 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" podUID="90692872-8469-4cd5-995d-e68ef985bcc7" containerName="operator" containerID="cri-o://90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619" gracePeriod=10 Dec 03 22:19:40.749763 master-0 kubenswrapper[36504]: I1203 22:19:40.749681 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" Dec 03 22:19:40.866805 master-0 kubenswrapper[36504]: I1203 22:19:40.861252 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfwgj\" (UniqueName: \"kubernetes.io/projected/90692872-8469-4cd5-995d-e68ef985bcc7-kube-api-access-sfwgj\") pod \"90692872-8469-4cd5-995d-e68ef985bcc7\" (UID: \"90692872-8469-4cd5-995d-e68ef985bcc7\") " Dec 03 22:19:40.866805 master-0 kubenswrapper[36504]: I1203 22:19:40.865851 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90692872-8469-4cd5-995d-e68ef985bcc7-kube-api-access-sfwgj" (OuterVolumeSpecName: "kube-api-access-sfwgj") pod "90692872-8469-4cd5-995d-e68ef985bcc7" (UID: "90692872-8469-4cd5-995d-e68ef985bcc7"). InnerVolumeSpecName "kube-api-access-sfwgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:19:40.909749 master-0 kubenswrapper[36504]: I1203 22:19:40.907625 36504 generic.go:334] "Generic (PLEG): container finished" podID="90692872-8469-4cd5-995d-e68ef985bcc7" containerID="90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619" exitCode=0 Dec 03 22:19:40.909749 master-0 kubenswrapper[36504]: I1203 22:19:40.907686 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" event={"ID":"90692872-8469-4cd5-995d-e68ef985bcc7","Type":"ContainerDied","Data":"90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619"} Dec 03 22:19:40.909749 master-0 kubenswrapper[36504]: I1203 22:19:40.907719 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" event={"ID":"90692872-8469-4cd5-995d-e68ef985bcc7","Type":"ContainerDied","Data":"fd7d980199c5e1333dade8d6eb3e2171693afcbe90a1464482d7a92cae54c9cf"} Dec 03 22:19:40.909749 master-0 kubenswrapper[36504]: I1203 22:19:40.907737 36504 scope.go:117] "RemoveContainer" containerID="90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619" Dec 03 22:19:40.909749 master-0 kubenswrapper[36504]: I1203 22:19:40.907944 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv" Dec 03 22:19:40.960175 master-0 kubenswrapper[36504]: I1203 22:19:40.960123 36504 scope.go:117] "RemoveContainer" containerID="90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619" Dec 03 22:19:40.960730 master-0 kubenswrapper[36504]: E1203 22:19:40.960667 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619\": container with ID starting with 90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619 not found: ID does not exist" containerID="90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619" Dec 03 22:19:40.960826 master-0 kubenswrapper[36504]: I1203 22:19:40.960739 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619"} err="failed to get container status \"90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619\": rpc error: code = NotFound desc = could not find container \"90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619\": container with ID starting with 90f07cadae5f9a9930b7053ea03fd6fb87da0e1a9139977d37eb329226598619 not found: ID does not exist" Dec 03 22:19:40.965491 master-0 kubenswrapper[36504]: I1203 22:19:40.965440 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfwgj\" (UniqueName: \"kubernetes.io/projected/90692872-8469-4cd5-995d-e68ef985bcc7-kube-api-access-sfwgj\") on node \"master-0\" DevicePath \"\"" Dec 03 22:19:40.982426 master-0 kubenswrapper[36504]: I1203 22:19:40.982093 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv"] Dec 03 22:19:40.991792 master-0 kubenswrapper[36504]: I1203 22:19:40.990098 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-v22wv"] Dec 03 22:19:41.107059 master-0 kubenswrapper[36504]: I1203 22:19:41.106920 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90692872-8469-4cd5-995d-e68ef985bcc7" path="/var/lib/kubelet/pods/90692872-8469-4cd5-995d-e68ef985bcc7/volumes" Dec 03 22:19:51.095898 master-0 kubenswrapper[36504]: I1203 22:19:51.095808 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:20:05.108187 master-0 kubenswrapper[36504]: I1203 22:20:05.108116 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:20:35.711325 master-0 kubenswrapper[36504]: E1203 22:20:35.711246 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:20:46.370382 master-0 kubenswrapper[36504]: I1203 22:20:46.370319 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8"] Dec 03 22:20:46.371185 master-0 kubenswrapper[36504]: E1203 22:20:46.370749 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90692872-8469-4cd5-995d-e68ef985bcc7" containerName="operator" Dec 03 22:20:46.371185 master-0 kubenswrapper[36504]: I1203 22:20:46.370782 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="90692872-8469-4cd5-995d-e68ef985bcc7" containerName="operator" Dec 03 22:20:46.371185 master-0 kubenswrapper[36504]: I1203 22:20:46.371038 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="90692872-8469-4cd5-995d-e68ef985bcc7" containerName="operator" Dec 03 22:20:46.374914 master-0 kubenswrapper[36504]: I1203 22:20:46.374888 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" Dec 03 22:20:46.390285 master-0 kubenswrapper[36504]: I1203 22:20:46.390213 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs"] Dec 03 22:20:46.393026 master-0 kubenswrapper[36504]: I1203 22:20:46.392368 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" Dec 03 22:20:46.403586 master-0 kubenswrapper[36504]: I1203 22:20:46.403494 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8"] Dec 03 22:20:46.419633 master-0 kubenswrapper[36504]: I1203 22:20:46.419589 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q"] Dec 03 22:20:46.420591 master-0 kubenswrapper[36504]: I1203 22:20:46.420537 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss2vf\" (UniqueName: \"kubernetes.io/projected/d5090b03-b50e-4470-92ff-cdba513adcae-kube-api-access-ss2vf\") pod \"barbican-operator-controller-manager-5cd89994b5-v5rh8\" (UID: \"d5090b03-b50e-4470-92ff-cdba513adcae\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" Dec 03 22:20:46.420665 master-0 kubenswrapper[36504]: I1203 22:20:46.420655 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhz9j\" (UniqueName: \"kubernetes.io/projected/294bc967-f361-45a7-85f6-e846ec563425-kube-api-access-nhz9j\") pod \"cinder-operator-controller-manager-f8856dd79-qghrs\" (UID: \"294bc967-f361-45a7-85f6-e846ec563425\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" Dec 03 22:20:46.421299 master-0 kubenswrapper[36504]: I1203 22:20:46.421271 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" Dec 03 22:20:46.433084 master-0 kubenswrapper[36504]: I1203 22:20:46.432569 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs"] Dec 03 22:20:46.445627 master-0 kubenswrapper[36504]: I1203 22:20:46.444662 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q"] Dec 03 22:20:46.472639 master-0 kubenswrapper[36504]: I1203 22:20:46.467074 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp"] Dec 03 22:20:46.472639 master-0 kubenswrapper[36504]: I1203 22:20:46.469237 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" Dec 03 22:20:46.479082 master-0 kubenswrapper[36504]: I1203 22:20:46.478979 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp"] Dec 03 22:20:46.523872 master-0 kubenswrapper[36504]: I1203 22:20:46.523268 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhz9j\" (UniqueName: \"kubernetes.io/projected/294bc967-f361-45a7-85f6-e846ec563425-kube-api-access-nhz9j\") pod \"cinder-operator-controller-manager-f8856dd79-qghrs\" (UID: \"294bc967-f361-45a7-85f6-e846ec563425\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" Dec 03 22:20:46.523872 master-0 kubenswrapper[36504]: I1203 22:20:46.523333 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t87hw\" (UniqueName: \"kubernetes.io/projected/0ee5f717-c104-47a4-b47a-f56856f80286-kube-api-access-t87hw\") pod \"glance-operator-controller-manager-78cd4f7769-l6vkp\" (UID: \"0ee5f717-c104-47a4-b47a-f56856f80286\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" Dec 03 22:20:46.523872 master-0 kubenswrapper[36504]: I1203 22:20:46.523400 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss2vf\" (UniqueName: \"kubernetes.io/projected/d5090b03-b50e-4470-92ff-cdba513adcae-kube-api-access-ss2vf\") pod \"barbican-operator-controller-manager-5cd89994b5-v5rh8\" (UID: \"d5090b03-b50e-4470-92ff-cdba513adcae\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" Dec 03 22:20:46.523872 master-0 kubenswrapper[36504]: I1203 22:20:46.523448 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqz9n\" (UniqueName: \"kubernetes.io/projected/1f86e923-1f9c-43ca-84ac-1cff0660b6a9-kube-api-access-hqz9n\") pod \"designate-operator-controller-manager-84bc9f68f5-dgq4q\" (UID: \"1f86e923-1f9c-43ca-84ac-1cff0660b6a9\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" Dec 03 22:20:46.536420 master-0 kubenswrapper[36504]: I1203 22:20:46.527882 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5"] Dec 03 22:20:46.575481 master-0 kubenswrapper[36504]: I1203 22:20:46.575383 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp"] Dec 03 22:20:46.577288 master-0 kubenswrapper[36504]: I1203 22:20:46.577257 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5"] Dec 03 22:20:46.577288 master-0 kubenswrapper[36504]: I1203 22:20:46.577289 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp"] Dec 03 22:20:46.577546 master-0 kubenswrapper[36504]: I1203 22:20:46.577416 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" Dec 03 22:20:46.582396 master-0 kubenswrapper[36504]: I1203 22:20:46.578990 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" Dec 03 22:20:46.587998 master-0 kubenswrapper[36504]: I1203 22:20:46.587598 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhz9j\" (UniqueName: \"kubernetes.io/projected/294bc967-f361-45a7-85f6-e846ec563425-kube-api-access-nhz9j\") pod \"cinder-operator-controller-manager-f8856dd79-qghrs\" (UID: \"294bc967-f361-45a7-85f6-e846ec563425\") " pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" Dec 03 22:20:46.602184 master-0 kubenswrapper[36504]: I1203 22:20:46.602122 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm"] Dec 03 22:20:46.603516 master-0 kubenswrapper[36504]: I1203 22:20:46.603487 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:46.606105 master-0 kubenswrapper[36504]: I1203 22:20:46.606059 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 22:20:46.610577 master-0 kubenswrapper[36504]: I1203 22:20:46.606756 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss2vf\" (UniqueName: \"kubernetes.io/projected/d5090b03-b50e-4470-92ff-cdba513adcae-kube-api-access-ss2vf\") pod \"barbican-operator-controller-manager-5cd89994b5-v5rh8\" (UID: \"d5090b03-b50e-4470-92ff-cdba513adcae\") " pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" Dec 03 22:20:46.622540 master-0 kubenswrapper[36504]: I1203 22:20:46.622319 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8"] Dec 03 22:20:46.624390 master-0 kubenswrapper[36504]: I1203 22:20:46.624356 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" Dec 03 22:20:46.624503 master-0 kubenswrapper[36504]: I1203 22:20:46.624363 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t87hw\" (UniqueName: \"kubernetes.io/projected/0ee5f717-c104-47a4-b47a-f56856f80286-kube-api-access-t87hw\") pod \"glance-operator-controller-manager-78cd4f7769-l6vkp\" (UID: \"0ee5f717-c104-47a4-b47a-f56856f80286\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" Dec 03 22:20:46.624503 master-0 kubenswrapper[36504]: I1203 22:20:46.624458 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:46.624625 master-0 kubenswrapper[36504]: I1203 22:20:46.624524 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49b2m\" (UniqueName: \"kubernetes.io/projected/66ded0bb-1de9-4b08-a2a2-c679aedbbded-kube-api-access-49b2m\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:46.624625 master-0 kubenswrapper[36504]: I1203 22:20:46.624577 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqz9n\" (UniqueName: \"kubernetes.io/projected/1f86e923-1f9c-43ca-84ac-1cff0660b6a9-kube-api-access-hqz9n\") pod \"designate-operator-controller-manager-84bc9f68f5-dgq4q\" (UID: \"1f86e923-1f9c-43ca-84ac-1cff0660b6a9\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" Dec 03 22:20:46.653997 master-0 kubenswrapper[36504]: I1203 22:20:46.653856 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm"] Dec 03 22:20:46.669829 master-0 kubenswrapper[36504]: I1203 22:20:46.669465 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t87hw\" (UniqueName: \"kubernetes.io/projected/0ee5f717-c104-47a4-b47a-f56856f80286-kube-api-access-t87hw\") pod \"glance-operator-controller-manager-78cd4f7769-l6vkp\" (UID: \"0ee5f717-c104-47a4-b47a-f56856f80286\") " pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" Dec 03 22:20:46.680749 master-0 kubenswrapper[36504]: I1203 22:20:46.671257 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqz9n\" (UniqueName: \"kubernetes.io/projected/1f86e923-1f9c-43ca-84ac-1cff0660b6a9-kube-api-access-hqz9n\") pod \"designate-operator-controller-manager-84bc9f68f5-dgq4q\" (UID: \"1f86e923-1f9c-43ca-84ac-1cff0660b6a9\") " pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" Dec 03 22:20:46.680749 master-0 kubenswrapper[36504]: I1203 22:20:46.680156 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd"] Dec 03 22:20:46.685329 master-0 kubenswrapper[36504]: I1203 22:20:46.685014 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" Dec 03 22:20:46.713707 master-0 kubenswrapper[36504]: I1203 22:20:46.713551 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" Dec 03 22:20:46.720026 master-0 kubenswrapper[36504]: I1203 22:20:46.719963 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8"] Dec 03 22:20:46.730345 master-0 kubenswrapper[36504]: I1203 22:20:46.730280 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwz22\" (UniqueName: \"kubernetes.io/projected/4d4ce37c-6c7b-40f9-a14e-4c956089d552-kube-api-access-kwz22\") pod \"ironic-operator-controller-manager-7c9bfd6967-9r2t8\" (UID: \"4d4ce37c-6c7b-40f9-a14e-4c956089d552\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" Dec 03 22:20:46.730546 master-0 kubenswrapper[36504]: I1203 22:20:46.730469 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mwpv\" (UniqueName: \"kubernetes.io/projected/066459a1-f230-40e7-b00c-64edad996634-kube-api-access-8mwpv\") pod \"keystone-operator-controller-manager-58b8dcc5fb-dzpzd\" (UID: \"066459a1-f230-40e7-b00c-64edad996634\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" Dec 03 22:20:46.730675 master-0 kubenswrapper[36504]: I1203 22:20:46.730557 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tz2d\" (UniqueName: \"kubernetes.io/projected/7bc69a6b-4e45-44d8-9263-d04ae58ba614-kube-api-access-7tz2d\") pod \"heat-operator-controller-manager-7fd96594c7-59fg5\" (UID: \"7bc69a6b-4e45-44d8-9263-d04ae58ba614\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" Dec 03 22:20:46.730675 master-0 kubenswrapper[36504]: I1203 22:20:46.730620 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwkfh\" (UniqueName: \"kubernetes.io/projected/591effbb-eb20-4094-8637-d3fee9582c37-kube-api-access-jwkfh\") pod \"horizon-operator-controller-manager-f6cc97788-tz5zp\" (UID: \"591effbb-eb20-4094-8637-d3fee9582c37\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" Dec 03 22:20:46.730675 master-0 kubenswrapper[36504]: I1203 22:20:46.730669 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:46.730877 master-0 kubenswrapper[36504]: I1203 22:20:46.730741 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49b2m\" (UniqueName: \"kubernetes.io/projected/66ded0bb-1de9-4b08-a2a2-c679aedbbded-kube-api-access-49b2m\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:46.731303 master-0 kubenswrapper[36504]: E1203 22:20:46.731268 36504 secret.go:189] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:46.731428 master-0 kubenswrapper[36504]: E1203 22:20:46.731332 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert podName:66ded0bb-1de9-4b08-a2a2-c679aedbbded nodeName:}" failed. No retries permitted until 2025-12-03 22:20:47.231311186 +0000 UTC m=+612.451083193 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-lcnlm" (UID: "66ded0bb-1de9-4b08-a2a2-c679aedbbded") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:46.731637 master-0 kubenswrapper[36504]: I1203 22:20:46.731603 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" Dec 03 22:20:46.733795 master-0 kubenswrapper[36504]: I1203 22:20:46.733736 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q"] Dec 03 22:20:46.738481 master-0 kubenswrapper[36504]: I1203 22:20:46.738438 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" Dec 03 22:20:46.753262 master-0 kubenswrapper[36504]: I1203 22:20:46.753200 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd"] Dec 03 22:20:46.764124 master-0 kubenswrapper[36504]: I1203 22:20:46.764057 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49b2m\" (UniqueName: \"kubernetes.io/projected/66ded0bb-1de9-4b08-a2a2-c679aedbbded-kube-api-access-49b2m\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:46.783036 master-0 kubenswrapper[36504]: I1203 22:20:46.782941 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" Dec 03 22:20:46.799411 master-0 kubenswrapper[36504]: I1203 22:20:46.798406 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q"] Dec 03 22:20:46.817676 master-0 kubenswrapper[36504]: I1203 22:20:46.817621 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l"] Dec 03 22:20:46.819910 master-0 kubenswrapper[36504]: I1203 22:20:46.819863 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" Dec 03 22:20:46.834816 master-0 kubenswrapper[36504]: I1203 22:20:46.834220 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvlg4\" (UniqueName: \"kubernetes.io/projected/93356ecd-7681-4dbd-a992-03fef31de73f-kube-api-access-kvlg4\") pod \"manila-operator-controller-manager-56f9fbf74b-6cj4q\" (UID: \"93356ecd-7681-4dbd-a992-03fef31de73f\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" Dec 03 22:20:46.834816 master-0 kubenswrapper[36504]: I1203 22:20:46.834394 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6rl8\" (UniqueName: \"kubernetes.io/projected/6b3459e2-0fbe-405b-a8b7-56aeaedc0b6e-kube-api-access-z6rl8\") pod \"mariadb-operator-controller-manager-647d75769b-ljq2l\" (UID: \"6b3459e2-0fbe-405b-a8b7-56aeaedc0b6e\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" Dec 03 22:20:46.834816 master-0 kubenswrapper[36504]: I1203 22:20:46.834433 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwz22\" (UniqueName: \"kubernetes.io/projected/4d4ce37c-6c7b-40f9-a14e-4c956089d552-kube-api-access-kwz22\") pod \"ironic-operator-controller-manager-7c9bfd6967-9r2t8\" (UID: \"4d4ce37c-6c7b-40f9-a14e-4c956089d552\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" Dec 03 22:20:46.834816 master-0 kubenswrapper[36504]: I1203 22:20:46.834517 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mwpv\" (UniqueName: \"kubernetes.io/projected/066459a1-f230-40e7-b00c-64edad996634-kube-api-access-8mwpv\") pod \"keystone-operator-controller-manager-58b8dcc5fb-dzpzd\" (UID: \"066459a1-f230-40e7-b00c-64edad996634\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" Dec 03 22:20:46.834816 master-0 kubenswrapper[36504]: I1203 22:20:46.834569 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tz2d\" (UniqueName: \"kubernetes.io/projected/7bc69a6b-4e45-44d8-9263-d04ae58ba614-kube-api-access-7tz2d\") pod \"heat-operator-controller-manager-7fd96594c7-59fg5\" (UID: \"7bc69a6b-4e45-44d8-9263-d04ae58ba614\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" Dec 03 22:20:46.834816 master-0 kubenswrapper[36504]: I1203 22:20:46.834623 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwkfh\" (UniqueName: \"kubernetes.io/projected/591effbb-eb20-4094-8637-d3fee9582c37-kube-api-access-jwkfh\") pod \"horizon-operator-controller-manager-f6cc97788-tz5zp\" (UID: \"591effbb-eb20-4094-8637-d3fee9582c37\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" Dec 03 22:20:46.843803 master-0 kubenswrapper[36504]: I1203 22:20:46.837785 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" Dec 03 22:20:46.884503 master-0 kubenswrapper[36504]: I1203 22:20:46.882535 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l"] Dec 03 22:20:46.888118 master-0 kubenswrapper[36504]: I1203 22:20:46.885462 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tz2d\" (UniqueName: \"kubernetes.io/projected/7bc69a6b-4e45-44d8-9263-d04ae58ba614-kube-api-access-7tz2d\") pod \"heat-operator-controller-manager-7fd96594c7-59fg5\" (UID: \"7bc69a6b-4e45-44d8-9263-d04ae58ba614\") " pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" Dec 03 22:20:46.888118 master-0 kubenswrapper[36504]: I1203 22:20:46.887415 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwz22\" (UniqueName: \"kubernetes.io/projected/4d4ce37c-6c7b-40f9-a14e-4c956089d552-kube-api-access-kwz22\") pod \"ironic-operator-controller-manager-7c9bfd6967-9r2t8\" (UID: \"4d4ce37c-6c7b-40f9-a14e-4c956089d552\") " pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" Dec 03 22:20:46.899096 master-0 kubenswrapper[36504]: I1203 22:20:46.898187 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwkfh\" (UniqueName: \"kubernetes.io/projected/591effbb-eb20-4094-8637-d3fee9582c37-kube-api-access-jwkfh\") pod \"horizon-operator-controller-manager-f6cc97788-tz5zp\" (UID: \"591effbb-eb20-4094-8637-d3fee9582c37\") " pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" Dec 03 22:20:46.909029 master-0 kubenswrapper[36504]: I1203 22:20:46.908958 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mwpv\" (UniqueName: \"kubernetes.io/projected/066459a1-f230-40e7-b00c-64edad996634-kube-api-access-8mwpv\") pod \"keystone-operator-controller-manager-58b8dcc5fb-dzpzd\" (UID: \"066459a1-f230-40e7-b00c-64edad996634\") " pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" Dec 03 22:20:46.918088 master-0 kubenswrapper[36504]: I1203 22:20:46.916206 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr"] Dec 03 22:20:46.919899 master-0 kubenswrapper[36504]: I1203 22:20:46.919831 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" Dec 03 22:20:46.938219 master-0 kubenswrapper[36504]: I1203 22:20:46.937643 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr"] Dec 03 22:20:46.939800 master-0 kubenswrapper[36504]: I1203 22:20:46.939250 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvlg4\" (UniqueName: \"kubernetes.io/projected/93356ecd-7681-4dbd-a992-03fef31de73f-kube-api-access-kvlg4\") pod \"manila-operator-controller-manager-56f9fbf74b-6cj4q\" (UID: \"93356ecd-7681-4dbd-a992-03fef31de73f\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" Dec 03 22:20:46.939800 master-0 kubenswrapper[36504]: I1203 22:20:46.939348 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6rl8\" (UniqueName: \"kubernetes.io/projected/6b3459e2-0fbe-405b-a8b7-56aeaedc0b6e-kube-api-access-z6rl8\") pod \"mariadb-operator-controller-manager-647d75769b-ljq2l\" (UID: \"6b3459e2-0fbe-405b-a8b7-56aeaedc0b6e\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" Dec 03 22:20:46.973881 master-0 kubenswrapper[36504]: I1203 22:20:46.971920 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k"] Dec 03 22:20:46.974238 master-0 kubenswrapper[36504]: I1203 22:20:46.974164 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" Dec 03 22:20:46.995257 master-0 kubenswrapper[36504]: I1203 22:20:46.995188 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6rl8\" (UniqueName: \"kubernetes.io/projected/6b3459e2-0fbe-405b-a8b7-56aeaedc0b6e-kube-api-access-z6rl8\") pod \"mariadb-operator-controller-manager-647d75769b-ljq2l\" (UID: \"6b3459e2-0fbe-405b-a8b7-56aeaedc0b6e\") " pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" Dec 03 22:20:46.996335 master-0 kubenswrapper[36504]: I1203 22:20:46.995888 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvlg4\" (UniqueName: \"kubernetes.io/projected/93356ecd-7681-4dbd-a992-03fef31de73f-kube-api-access-kvlg4\") pod \"manila-operator-controller-manager-56f9fbf74b-6cj4q\" (UID: \"93356ecd-7681-4dbd-a992-03fef31de73f\") " pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" Dec 03 22:20:47.030201 master-0 kubenswrapper[36504]: I1203 22:20:47.029592 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k"] Dec 03 22:20:47.065846 master-0 kubenswrapper[36504]: I1203 22:20:47.063595 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" Dec 03 22:20:47.072129 master-0 kubenswrapper[36504]: I1203 22:20:47.069642 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7rl\" (UniqueName: \"kubernetes.io/projected/45bff768-9e56-4111-aecc-6b4b4bea9202-kube-api-access-kn7rl\") pod \"neutron-operator-controller-manager-7cdd6b54fb-wzvfr\" (UID: \"45bff768-9e56-4111-aecc-6b4b4bea9202\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" Dec 03 22:20:47.072129 master-0 kubenswrapper[36504]: I1203 22:20:47.069998 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d8h6\" (UniqueName: \"kubernetes.io/projected/70f7d60f-1ece-4a67-a82a-d10933d8f28f-kube-api-access-5d8h6\") pod \"nova-operator-controller-manager-865fc86d5b-hmp8k\" (UID: \"70f7d60f-1ece-4a67-a82a-d10933d8f28f\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" Dec 03 22:20:47.109122 master-0 kubenswrapper[36504]: I1203 22:20:47.105616 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" Dec 03 22:20:47.133660 master-0 kubenswrapper[36504]: I1203 22:20:47.133577 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx"] Dec 03 22:20:47.136887 master-0 kubenswrapper[36504]: I1203 22:20:47.136798 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" Dec 03 22:20:47.186149 master-0 kubenswrapper[36504]: I1203 22:20:47.169831 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx"] Dec 03 22:20:47.186149 master-0 kubenswrapper[36504]: I1203 22:20:47.169937 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h"] Dec 03 22:20:47.186149 master-0 kubenswrapper[36504]: I1203 22:20:47.172683 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:47.186149 master-0 kubenswrapper[36504]: I1203 22:20:47.175102 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 22:20:47.186149 master-0 kubenswrapper[36504]: I1203 22:20:47.178912 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d8h6\" (UniqueName: \"kubernetes.io/projected/70f7d60f-1ece-4a67-a82a-d10933d8f28f-kube-api-access-5d8h6\") pod \"nova-operator-controller-manager-865fc86d5b-hmp8k\" (UID: \"70f7d60f-1ece-4a67-a82a-d10933d8f28f\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" Dec 03 22:20:47.186149 master-0 kubenswrapper[36504]: I1203 22:20:47.178974 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7rl\" (UniqueName: \"kubernetes.io/projected/45bff768-9e56-4111-aecc-6b4b4bea9202-kube-api-access-kn7rl\") pod \"neutron-operator-controller-manager-7cdd6b54fb-wzvfr\" (UID: \"45bff768-9e56-4111-aecc-6b4b4bea9202\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" Dec 03 22:20:47.186730 master-0 kubenswrapper[36504]: I1203 22:20:47.186652 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" Dec 03 22:20:47.202824 master-0 kubenswrapper[36504]: I1203 22:20:47.201022 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h"] Dec 03 22:20:47.205909 master-0 kubenswrapper[36504]: I1203 22:20:47.204863 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d8h6\" (UniqueName: \"kubernetes.io/projected/70f7d60f-1ece-4a67-a82a-d10933d8f28f-kube-api-access-5d8h6\") pod \"nova-operator-controller-manager-865fc86d5b-hmp8k\" (UID: \"70f7d60f-1ece-4a67-a82a-d10933d8f28f\") " pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" Dec 03 22:20:47.209349 master-0 kubenswrapper[36504]: I1203 22:20:47.207708 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" Dec 03 22:20:47.209349 master-0 kubenswrapper[36504]: I1203 22:20:47.208045 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7rl\" (UniqueName: \"kubernetes.io/projected/45bff768-9e56-4111-aecc-6b4b4bea9202-kube-api-access-kn7rl\") pod \"neutron-operator-controller-manager-7cdd6b54fb-wzvfr\" (UID: \"45bff768-9e56-4111-aecc-6b4b4bea9202\") " pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" Dec 03 22:20:47.214977 master-0 kubenswrapper[36504]: I1203 22:20:47.210941 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v"] Dec 03 22:20:47.214977 master-0 kubenswrapper[36504]: I1203 22:20:47.213628 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" Dec 03 22:20:47.224748 master-0 kubenswrapper[36504]: I1203 22:20:47.224671 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh"] Dec 03 22:20:47.225332 master-0 kubenswrapper[36504]: I1203 22:20:47.225274 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" Dec 03 22:20:47.228689 master-0 kubenswrapper[36504]: I1203 22:20:47.228633 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" Dec 03 22:20:47.232971 master-0 kubenswrapper[36504]: I1203 22:20:47.232892 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v"] Dec 03 22:20:47.240949 master-0 kubenswrapper[36504]: I1203 22:20:47.240669 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh"] Dec 03 22:20:47.249546 master-0 kubenswrapper[36504]: I1203 22:20:47.249473 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-tpmdc"] Dec 03 22:20:47.284524 master-0 kubenswrapper[36504]: I1203 22:20:47.284413 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:47.295059 master-0 kubenswrapper[36504]: I1203 22:20:47.294116 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" Dec 03 22:20:47.301580 master-0 kubenswrapper[36504]: I1203 22:20:47.296609 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cr8k\" (UniqueName: \"kubernetes.io/projected/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-kube-api-access-7cr8k\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:47.301580 master-0 kubenswrapper[36504]: I1203 22:20:47.296725 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq2zc\" (UniqueName: \"kubernetes.io/projected/1365d101-c3ad-483c-8ed5-2616e633d9ae-kube-api-access-fq2zc\") pod \"octavia-operator-controller-manager-845b79dc4f-m94lx\" (UID: \"1365d101-c3ad-483c-8ed5-2616e633d9ae\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" Dec 03 22:20:47.301580 master-0 kubenswrapper[36504]: I1203 22:20:47.296886 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:47.301580 master-0 kubenswrapper[36504]: E1203 22:20:47.297123 36504 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:47.301580 master-0 kubenswrapper[36504]: E1203 22:20:47.297270 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert podName:66ded0bb-1de9-4b08-a2a2-c679aedbbded nodeName:}" failed. No retries permitted until 2025-12-03 22:20:48.297234153 +0000 UTC m=+613.517006150 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-lcnlm" (UID: "66ded0bb-1de9-4b08-a2a2-c679aedbbded") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:47.301580 master-0 kubenswrapper[36504]: I1203 22:20:47.299085 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" Dec 03 22:20:47.308012 master-0 kubenswrapper[36504]: I1203 22:20:47.307948 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-tpmdc"] Dec 03 22:20:47.396324 master-0 kubenswrapper[36504]: I1203 22:20:47.392227 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" Dec 03 22:20:47.403300 master-0 kubenswrapper[36504]: I1203 22:20:47.401990 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" Dec 03 22:20:47.405314 master-0 kubenswrapper[36504]: I1203 22:20:47.404644 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:47.405314 master-0 kubenswrapper[36504]: I1203 22:20:47.404896 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bb99\" (UniqueName: \"kubernetes.io/projected/5e840a7b-22a7-4605-816a-dc26e299d247-kube-api-access-2bb99\") pod \"ovn-operator-controller-manager-647f96877-tjm2v\" (UID: \"5e840a7b-22a7-4605-816a-dc26e299d247\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" Dec 03 22:20:47.405314 master-0 kubenswrapper[36504]: I1203 22:20:47.405019 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshsp\" (UniqueName: \"kubernetes.io/projected/20e58c00-5ade-44ab-825e-3d24c0644fe9-kube-api-access-wshsp\") pod \"placement-operator-controller-manager-6b64f6f645-7mmbh\" (UID: \"20e58c00-5ade-44ab-825e-3d24c0644fe9\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" Dec 03 22:20:47.405314 master-0 kubenswrapper[36504]: I1203 22:20:47.405081 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cr8k\" (UniqueName: \"kubernetes.io/projected/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-kube-api-access-7cr8k\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:47.405314 master-0 kubenswrapper[36504]: I1203 22:20:47.405239 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq2zc\" (UniqueName: \"kubernetes.io/projected/1365d101-c3ad-483c-8ed5-2616e633d9ae-kube-api-access-fq2zc\") pod \"octavia-operator-controller-manager-845b79dc4f-m94lx\" (UID: \"1365d101-c3ad-483c-8ed5-2616e633d9ae\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" Dec 03 22:20:47.405314 master-0 kubenswrapper[36504]: I1203 22:20:47.405308 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktms\" (UniqueName: \"kubernetes.io/projected/bc796bc4-f06b-49c3-96b7-331444eebfcf-kube-api-access-bktms\") pod \"swift-operator-controller-manager-696b999796-tpmdc\" (UID: \"bc796bc4-f06b-49c3-96b7-331444eebfcf\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" Dec 03 22:20:47.406381 master-0 kubenswrapper[36504]: E1203 22:20:47.405887 36504 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:47.406381 master-0 kubenswrapper[36504]: E1203 22:20:47.405969 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert podName:0f5cbd39-5e9c-4682-902c-e2d0b26b4e69 nodeName:}" failed. 
No retries permitted until 2025-12-03 22:20:47.905940102 +0000 UTC m=+613.125712109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" (UID: "0f5cbd39-5e9c-4682-902c-e2d0b26b4e69") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:47.416725 master-0 kubenswrapper[36504]: I1203 22:20:47.416307 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775"] Dec 03 22:20:47.467978 master-0 kubenswrapper[36504]: I1203 22:20:47.467899 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775"] Dec 03 22:20:47.470940 master-0 kubenswrapper[36504]: I1203 22:20:47.468624 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" Dec 03 22:20:47.516828 master-0 kubenswrapper[36504]: I1203 22:20:47.508796 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bb99\" (UniqueName: \"kubernetes.io/projected/5e840a7b-22a7-4605-816a-dc26e299d247-kube-api-access-2bb99\") pod \"ovn-operator-controller-manager-647f96877-tjm2v\" (UID: \"5e840a7b-22a7-4605-816a-dc26e299d247\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" Dec 03 22:20:47.516828 master-0 kubenswrapper[36504]: I1203 22:20:47.508887 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wshsp\" (UniqueName: \"kubernetes.io/projected/20e58c00-5ade-44ab-825e-3d24c0644fe9-kube-api-access-wshsp\") pod \"placement-operator-controller-manager-6b64f6f645-7mmbh\" (UID: \"20e58c00-5ade-44ab-825e-3d24c0644fe9\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" Dec 03 22:20:47.516828 master-0 kubenswrapper[36504]: I1203 22:20:47.508972 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktms\" (UniqueName: \"kubernetes.io/projected/bc796bc4-f06b-49c3-96b7-331444eebfcf-kube-api-access-bktms\") pod \"swift-operator-controller-manager-696b999796-tpmdc\" (UID: \"bc796bc4-f06b-49c3-96b7-331444eebfcf\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" Dec 03 22:20:47.520799 master-0 kubenswrapper[36504]: I1203 22:20:47.518145 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq2zc\" (UniqueName: \"kubernetes.io/projected/1365d101-c3ad-483c-8ed5-2616e633d9ae-kube-api-access-fq2zc\") pod \"octavia-operator-controller-manager-845b79dc4f-m94lx\" (UID: \"1365d101-c3ad-483c-8ed5-2616e633d9ae\") " pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" Dec 03 22:20:47.520799 master-0 kubenswrapper[36504]: I1203 22:20:47.518259 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cr8k\" (UniqueName: \"kubernetes.io/projected/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-kube-api-access-7cr8k\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:47.522821 master-0 kubenswrapper[36504]: I1203 22:20:47.522391 36504 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" Dec 03 22:20:47.547237 master-0 kubenswrapper[36504]: I1203 22:20:47.538916 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8"] Dec 03 22:20:47.547237 master-0 kubenswrapper[36504]: I1203 22:20:47.541707 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" Dec 03 22:20:47.568024 master-0 kubenswrapper[36504]: I1203 22:20:47.562145 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8"] Dec 03 22:20:47.574281 master-0 kubenswrapper[36504]: I1203 22:20:47.574103 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w"] Dec 03 22:20:47.583650 master-0 kubenswrapper[36504]: I1203 22:20:47.577066 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bb99\" (UniqueName: \"kubernetes.io/projected/5e840a7b-22a7-4605-816a-dc26e299d247-kube-api-access-2bb99\") pod \"ovn-operator-controller-manager-647f96877-tjm2v\" (UID: \"5e840a7b-22a7-4605-816a-dc26e299d247\") " pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" Dec 03 22:20:47.583650 master-0 kubenswrapper[36504]: I1203 22:20:47.578172 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktms\" (UniqueName: \"kubernetes.io/projected/bc796bc4-f06b-49c3-96b7-331444eebfcf-kube-api-access-bktms\") pod \"swift-operator-controller-manager-696b999796-tpmdc\" (UID: \"bc796bc4-f06b-49c3-96b7-331444eebfcf\") " pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" Dec 03 22:20:47.583650 master-0 kubenswrapper[36504]: I1203 22:20:47.578375 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" Dec 03 22:20:47.589210 master-0 kubenswrapper[36504]: I1203 22:20:47.589161 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshsp\" (UniqueName: \"kubernetes.io/projected/20e58c00-5ade-44ab-825e-3d24c0644fe9-kube-api-access-wshsp\") pod \"placement-operator-controller-manager-6b64f6f645-7mmbh\" (UID: \"20e58c00-5ade-44ab-825e-3d24c0644fe9\") " pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" Dec 03 22:20:47.592392 master-0 kubenswrapper[36504]: I1203 22:20:47.592271 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w"] Dec 03 22:20:47.615884 master-0 kubenswrapper[36504]: I1203 22:20:47.614174 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fxnf\" (UniqueName: \"kubernetes.io/projected/bb05eb5d-bc4c-4725-86ce-4e0efd653397-kube-api-access-6fxnf\") pod \"test-operator-controller-manager-57dfcdd5b8-984d8\" (UID: \"bb05eb5d-bc4c-4725-86ce-4e0efd653397\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" Dec 03 22:20:47.615884 master-0 kubenswrapper[36504]: I1203 22:20:47.614300 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppx8\" (UniqueName: \"kubernetes.io/projected/5f39d487-89d3-44a4-97e3-cf9c051ecaa5-kube-api-access-hppx8\") pod \"telemetry-operator-controller-manager-7b5867bfc7-ts775\" (UID: \"5f39d487-89d3-44a4-97e3-cf9c051ecaa5\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" Dec 03 22:20:47.615884 master-0 kubenswrapper[36504]: I1203 22:20:47.614404 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x8w7\" (UniqueName: \"kubernetes.io/projected/af647618-8824-4366-982b-d62778dff1b6-kube-api-access-7x8w7\") pod \"watcher-operator-controller-manager-6b9b669fdb-vnr5w\" (UID: \"af647618-8824-4366-982b-d62778dff1b6\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" Dec 03 22:20:47.655987 master-0 kubenswrapper[36504]: I1203 22:20:47.646909 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p"] Dec 03 22:20:47.655987 master-0 kubenswrapper[36504]: I1203 22:20:47.653642 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:47.662298 master-0 kubenswrapper[36504]: I1203 22:20:47.662012 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 22:20:47.662298 master-0 kubenswrapper[36504]: I1203 22:20:47.662098 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 22:20:47.717669 master-0 kubenswrapper[36504]: I1203 22:20:47.666743 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p"] Dec 03 22:20:47.717669 master-0 kubenswrapper[36504]: I1203 22:20:47.697016 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz"] Dec 03 22:20:47.717669 master-0 kubenswrapper[36504]: I1203 22:20:47.698995 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" Dec 03 22:20:47.717669 master-0 kubenswrapper[36504]: I1203 22:20:47.715523 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz"] Dec 03 22:20:47.719000 master-0 kubenswrapper[36504]: I1203 22:20:47.717714 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x8w7\" (UniqueName: \"kubernetes.io/projected/af647618-8824-4366-982b-d62778dff1b6-kube-api-access-7x8w7\") pod \"watcher-operator-controller-manager-6b9b669fdb-vnr5w\" (UID: \"af647618-8824-4366-982b-d62778dff1b6\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" Dec 03 22:20:47.719000 master-0 kubenswrapper[36504]: I1203 22:20:47.717856 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fxnf\" (UniqueName: \"kubernetes.io/projected/bb05eb5d-bc4c-4725-86ce-4e0efd653397-kube-api-access-6fxnf\") pod \"test-operator-controller-manager-57dfcdd5b8-984d8\" (UID: \"bb05eb5d-bc4c-4725-86ce-4e0efd653397\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" Dec 03 22:20:47.719000 master-0 kubenswrapper[36504]: I1203 22:20:47.717933 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppx8\" (UniqueName: \"kubernetes.io/projected/5f39d487-89d3-44a4-97e3-cf9c051ecaa5-kube-api-access-hppx8\") pod \"telemetry-operator-controller-manager-7b5867bfc7-ts775\" (UID: \"5f39d487-89d3-44a4-97e3-cf9c051ecaa5\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" Dec 03 22:20:47.764280 master-0 kubenswrapper[36504]: I1203 22:20:47.750418 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppx8\" (UniqueName: \"kubernetes.io/projected/5f39d487-89d3-44a4-97e3-cf9c051ecaa5-kube-api-access-hppx8\") pod \"telemetry-operator-controller-manager-7b5867bfc7-ts775\" (UID: \"5f39d487-89d3-44a4-97e3-cf9c051ecaa5\") " pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" Dec 03 22:20:47.764280 master-0 kubenswrapper[36504]: I1203 22:20:47.756900 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" Dec 03 22:20:47.764280 master-0 kubenswrapper[36504]: I1203 22:20:47.762425 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" Dec 03 22:20:47.772718 master-0 kubenswrapper[36504]: I1203 22:20:47.772412 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x8w7\" (UniqueName: \"kubernetes.io/projected/af647618-8824-4366-982b-d62778dff1b6-kube-api-access-7x8w7\") pod \"watcher-operator-controller-manager-6b9b669fdb-vnr5w\" (UID: \"af647618-8824-4366-982b-d62778dff1b6\") " pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" Dec 03 22:20:47.773333 master-0 kubenswrapper[36504]: I1203 22:20:47.773280 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fxnf\" (UniqueName: \"kubernetes.io/projected/bb05eb5d-bc4c-4725-86ce-4e0efd653397-kube-api-access-6fxnf\") pod \"test-operator-controller-manager-57dfcdd5b8-984d8\" (UID: \"bb05eb5d-bc4c-4725-86ce-4e0efd653397\") " pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" Dec 03 22:20:47.775250 master-0 kubenswrapper[36504]: I1203 22:20:47.775210 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" Dec 03 22:20:47.821524 master-0 kubenswrapper[36504]: I1203 22:20:47.821260 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" Dec 03 22:20:47.825225 master-0 kubenswrapper[36504]: I1203 22:20:47.825176 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:47.825307 master-0 kubenswrapper[36504]: I1203 22:20:47.825249 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxmwq\" (UniqueName: \"kubernetes.io/projected/89698816-4551-483b-a4e1-0698671933fe-kube-api-access-bxmwq\") pod \"rabbitmq-cluster-operator-manager-78955d896f-knrlz\" (UID: \"89698816-4551-483b-a4e1-0698671933fe\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" Dec 03 22:20:47.825723 master-0 kubenswrapper[36504]: I1203 22:20:47.825674 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:47.825873 master-0 kubenswrapper[36504]: I1203 22:20:47.825846 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvh7r\" (UniqueName: \"kubernetes.io/projected/2c039764-ec5b-4e34-9a3a-db6f4ad05849-kube-api-access-dvh7r\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: 
\"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:47.831707 master-0 kubenswrapper[36504]: I1203 22:20:47.831634 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" Dec 03 22:20:47.839915 master-0 kubenswrapper[36504]: I1203 22:20:47.839852 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: I1203 22:20:47.928987 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: I1203 22:20:47.929054 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: I1203 22:20:47.929091 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvh7r\" (UniqueName: \"kubernetes.io/projected/2c039764-ec5b-4e34-9a3a-db6f4ad05849-kube-api-access-dvh7r\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: I1203 22:20:47.929144 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: I1203 22:20:47.929174 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxmwq\" (UniqueName: \"kubernetes.io/projected/89698816-4551-483b-a4e1-0698671933fe-kube-api-access-bxmwq\") pod \"rabbitmq-cluster-operator-manager-78955d896f-knrlz\" (UID: \"89698816-4551-483b-a4e1-0698671933fe\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: E1203 22:20:47.931032 36504 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: E1203 22:20:47.931115 36504 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: E1203 22:20:47.931277 36504 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" 
not found Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: E1203 22:20:47.931359 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:48.431075206 +0000 UTC m=+613.650847223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "webhook-server-cert" not found Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: E1203 22:20:47.931407 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert podName:0f5cbd39-5e9c-4682-902c-e2d0b26b4e69 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:48.931384946 +0000 UTC m=+614.151156953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" (UID: "0f5cbd39-5e9c-4682-902c-e2d0b26b4e69") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:47.934577 master-0 kubenswrapper[36504]: E1203 22:20:47.931440 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:48.431426076 +0000 UTC m=+613.651198083 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "metrics-server-cert" not found Dec 03 22:20:48.107799 master-0 kubenswrapper[36504]: I1203 22:20:48.083096 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvh7r\" (UniqueName: \"kubernetes.io/projected/2c039764-ec5b-4e34-9a3a-db6f4ad05849-kube-api-access-dvh7r\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:48.107799 master-0 kubenswrapper[36504]: I1203 22:20:48.103559 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxmwq\" (UniqueName: \"kubernetes.io/projected/89698816-4551-483b-a4e1-0698671933fe-kube-api-access-bxmwq\") pod \"rabbitmq-cluster-operator-manager-78955d896f-knrlz\" (UID: \"89698816-4551-483b-a4e1-0698671933fe\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" Dec 03 22:20:48.218100 master-0 kubenswrapper[36504]: I1203 22:20:48.208855 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8"] Dec 03 22:20:48.260798 master-0 kubenswrapper[36504]: I1203 22:20:48.255439 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs"] Dec 03 22:20:48.295095 master-0 kubenswrapper[36504]: W1203 22:20:48.289489 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod294bc967_f361_45a7_85f6_e846ec563425.slice/crio-c275424db8205f92e6c8b08e76d95f4752ff65c14ab98d12756e245c8caadc74 WatchSource:0}: Error finding container c275424db8205f92e6c8b08e76d95f4752ff65c14ab98d12756e245c8caadc74: Status 404 returned error can't find the container with id c275424db8205f92e6c8b08e76d95f4752ff65c14ab98d12756e245c8caadc74 Dec 03 22:20:48.347796 master-0 kubenswrapper[36504]: I1203 22:20:48.341755 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" Dec 03 22:20:48.383790 master-0 kubenswrapper[36504]: I1203 22:20:48.377138 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:48.383790 master-0 kubenswrapper[36504]: E1203 22:20:48.379599 36504 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:48.383790 master-0 kubenswrapper[36504]: E1203 22:20:48.379660 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert podName:66ded0bb-1de9-4b08-a2a2-c679aedbbded nodeName:}" failed. No retries permitted until 2025-12-03 22:20:50.379638988 +0000 UTC m=+615.599411005 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-lcnlm" (UID: "66ded0bb-1de9-4b08-a2a2-c679aedbbded") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:48.432402 master-0 kubenswrapper[36504]: I1203 22:20:48.429572 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp"] Dec 03 22:20:48.465797 master-0 kubenswrapper[36504]: I1203 22:20:48.464679 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp"] Dec 03 22:20:48.485239 master-0 kubenswrapper[36504]: I1203 22:20:48.485007 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:48.485541 master-0 kubenswrapper[36504]: I1203 22:20:48.485480 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:48.486853 master-0 kubenswrapper[36504]: E1203 22:20:48.485918 36504 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:20:48.486853 master-0 kubenswrapper[36504]: E1203 22:20:48.486021 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:49.485982052 +0000 UTC m=+614.705754059 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "metrics-server-cert" not found Dec 03 22:20:48.486853 master-0 kubenswrapper[36504]: E1203 22:20:48.486106 36504 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:20:48.486853 master-0 kubenswrapper[36504]: E1203 22:20:48.486144 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:49.486134097 +0000 UTC m=+614.705906104 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "webhook-server-cert" not found Dec 03 22:20:48.530501 master-0 kubenswrapper[36504]: I1203 22:20:48.530316 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q"] Dec 03 22:20:48.544811 master-0 kubenswrapper[36504]: I1203 22:20:48.544719 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5"] Dec 03 22:20:48.574915 master-0 kubenswrapper[36504]: W1203 22:20:48.574849 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod591effbb_eb20_4094_8637_d3fee9582c37.slice/crio-05ce0c1937c86f91a09b3513c823ef64c658e0343aa5abc8846389135c5f61e9 WatchSource:0}: Error finding container 05ce0c1937c86f91a09b3513c823ef64c658e0343aa5abc8846389135c5f61e9: Status 404 returned error can't find the container with id 05ce0c1937c86f91a09b3513c823ef64c658e0343aa5abc8846389135c5f61e9 Dec 03 22:20:48.852969 master-0 kubenswrapper[36504]: I1203 22:20:48.852908 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" event={"ID":"0ee5f717-c104-47a4-b47a-f56856f80286","Type":"ContainerStarted","Data":"3629f7ceb130d105abe048dc30d9fa30b3731d6596f06ba29631d05cb19edb26"} Dec 03 22:20:48.854836 master-0 kubenswrapper[36504]: I1203 22:20:48.854798 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" event={"ID":"591effbb-eb20-4094-8637-d3fee9582c37","Type":"ContainerStarted","Data":"05ce0c1937c86f91a09b3513c823ef64c658e0343aa5abc8846389135c5f61e9"} Dec 03 22:20:48.861920 master-0 kubenswrapper[36504]: I1203 22:20:48.861849 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" event={"ID":"294bc967-f361-45a7-85f6-e846ec563425","Type":"ContainerStarted","Data":"c275424db8205f92e6c8b08e76d95f4752ff65c14ab98d12756e245c8caadc74"} Dec 03 22:20:48.865072 master-0 kubenswrapper[36504]: I1203 22:20:48.865011 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" event={"ID":"1f86e923-1f9c-43ca-84ac-1cff0660b6a9","Type":"ContainerStarted","Data":"24748b8626ed290e500677989ee23bc524e00e90da19480e40814e18c92cf960"} Dec 03 22:20:48.867347 master-0 kubenswrapper[36504]: I1203 22:20:48.866844 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" event={"ID":"7bc69a6b-4e45-44d8-9263-d04ae58ba614","Type":"ContainerStarted","Data":"bfa4e701f59658387357ea61300fa609ab77dc89b23cd0597e671f8d39138630"} Dec 03 22:20:48.873571 master-0 kubenswrapper[36504]: I1203 22:20:48.873503 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" event={"ID":"d5090b03-b50e-4470-92ff-cdba513adcae","Type":"ContainerStarted","Data":"cde63e676457355eb05d39d24249407bddc69fa75ccf2674a695ac9d20e7e26c"} Dec 03 22:20:49.002115 master-0 kubenswrapper[36504]: I1203 22:20:49.002036 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:49.002424 master-0 kubenswrapper[36504]: E1203 22:20:49.002390 36504 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:49.002609 master-0 kubenswrapper[36504]: E1203 22:20:49.002561 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert podName:0f5cbd39-5e9c-4682-902c-e2d0b26b4e69 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:51.002460001 +0000 UTC m=+616.222232008 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" (UID: "0f5cbd39-5e9c-4682-902c-e2d0b26b4e69") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:49.004719 master-0 kubenswrapper[36504]: W1203 22:20:49.004669 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d4ce37c_6c7b_40f9_a14e_4c956089d552.slice/crio-baafd8c7ae1b0e511af4471b939e8729779f8a3497ca262adf9faad87d9e1ff8 WatchSource:0}: Error finding container baafd8c7ae1b0e511af4471b939e8729779f8a3497ca262adf9faad87d9e1ff8: Status 404 returned error can't find the container with id baafd8c7ae1b0e511af4471b939e8729779f8a3497ca262adf9faad87d9e1ff8 Dec 03 22:20:49.016639 master-0 kubenswrapper[36504]: I1203 22:20:49.015648 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8"] Dec 03 22:20:49.136418 master-0 kubenswrapper[36504]: I1203 22:20:49.134018 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr"] Dec 03 22:20:49.159847 master-0 kubenswrapper[36504]: I1203 22:20:49.159753 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k"] Dec 03 22:20:49.181070 master-0 kubenswrapper[36504]: I1203 22:20:49.180988 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd"] Dec 03 22:20:49.276323 master-0 kubenswrapper[36504]: I1203 22:20:49.275661 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l"] Dec 03 22:20:49.523514 master-0 kubenswrapper[36504]: I1203 22:20:49.523438 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:49.524216 master-0 kubenswrapper[36504]: I1203 22:20:49.523544 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:49.524216 master-0 kubenswrapper[36504]: E1203 22:20:49.523979 36504 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:20:49.524216 master-0 kubenswrapper[36504]: E1203 22:20:49.524054 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:51.524036821 +0000 UTC m=+616.743808828 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "metrics-server-cert" not found Dec 03 22:20:49.524350 master-0 kubenswrapper[36504]: E1203 22:20:49.524300 36504 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:20:49.524350 master-0 kubenswrapper[36504]: E1203 22:20:49.524340 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:51.52432982 +0000 UTC m=+616.744101827 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "webhook-server-cert" not found Dec 03 22:20:49.845418 master-0 kubenswrapper[36504]: I1203 22:20:49.845351 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v"] Dec 03 22:20:49.869159 master-0 kubenswrapper[36504]: I1203 22:20:49.868919 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8"] Dec 03 22:20:49.879590 master-0 kubenswrapper[36504]: I1203 22:20:49.879327 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx"] Dec 03 22:20:49.884246 master-0 kubenswrapper[36504]: W1203 22:20:49.884168 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93356ecd_7681_4dbd_a992_03fef31de73f.slice/crio-f3325a93cf814ff2142093470a735aacaaf5d95240ee9d793b85d8a557c9a382 WatchSource:0}: Error finding container f3325a93cf814ff2142093470a735aacaaf5d95240ee9d793b85d8a557c9a382: Status 404 returned error can't find the container with id f3325a93cf814ff2142093470a735aacaaf5d95240ee9d793b85d8a557c9a382 Dec 03 22:20:49.885699 master-0 kubenswrapper[36504]: W1203 22:20:49.885641 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f39d487_89d3_44a4_97e3_cf9c051ecaa5.slice/crio-1866cceca8e6ec1ee3d9974529213c2d83cb92cfed1df4f418c75f446dbee340 
WatchSource:0}: Error finding container 1866cceca8e6ec1ee3d9974529213c2d83cb92cfed1df4f418c75f446dbee340: Status 404 returned error can't find the container with id 1866cceca8e6ec1ee3d9974529213c2d83cb92cfed1df4f418c75f446dbee340 Dec 03 22:20:49.886399 master-0 kubenswrapper[36504]: W1203 22:20:49.886371 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc796bc4_f06b_49c3_96b7_331444eebfcf.slice/crio-9d6631200147e4b6596a5a2a9086d69a9ed4467e1a277d52018c8fcf1665ec95 WatchSource:0}: Error finding container 9d6631200147e4b6596a5a2a9086d69a9ed4467e1a277d52018c8fcf1665ec95: Status 404 returned error can't find the container with id 9d6631200147e4b6596a5a2a9086d69a9ed4467e1a277d52018c8fcf1665ec95 Dec 03 22:20:49.894069 master-0 kubenswrapper[36504]: W1203 22:20:49.894018 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e58c00_5ade_44ab_825e_3d24c0644fe9.slice/crio-556dbeb7d44c03698de94cad7011d74a2bd34584c99cbb23c73cf499938b6697 WatchSource:0}: Error finding container 556dbeb7d44c03698de94cad7011d74a2bd34584c99cbb23c73cf499938b6697: Status 404 returned error can't find the container with id 556dbeb7d44c03698de94cad7011d74a2bd34584c99cbb23c73cf499938b6697 Dec 03 22:20:49.899149 master-0 kubenswrapper[36504]: I1203 22:20:49.898818 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" event={"ID":"066459a1-f230-40e7-b00c-64edad996634","Type":"ContainerStarted","Data":"26c178f570aceca09786bc0cefa4c72e5925a90114184cc1d58ad30dd0ac60d8"} Dec 03 22:20:49.900319 master-0 kubenswrapper[36504]: I1203 22:20:49.900248 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q"] Dec 03 22:20:49.903655 master-0 kubenswrapper[36504]: I1203 22:20:49.903424 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" event={"ID":"4d4ce37c-6c7b-40f9-a14e-4c956089d552","Type":"ContainerStarted","Data":"baafd8c7ae1b0e511af4471b939e8729779f8a3497ca262adf9faad87d9e1ff8"} Dec 03 22:20:49.907286 master-0 kubenswrapper[36504]: W1203 22:20:49.907036 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e840a7b_22a7_4605_816a_dc26e299d247.slice/crio-656f25454b94cfe8cc7c0d8695842991fa3736ac3b8cde39f28f3daf86a44b31 WatchSource:0}: Error finding container 656f25454b94cfe8cc7c0d8695842991fa3736ac3b8cde39f28f3daf86a44b31: Status 404 returned error can't find the container with id 656f25454b94cfe8cc7c0d8695842991fa3736ac3b8cde39f28f3daf86a44b31 Dec 03 22:20:49.910117 master-0 kubenswrapper[36504]: W1203 22:20:49.909917 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1365d101_c3ad_483c_8ed5_2616e633d9ae.slice/crio-243b8da44d1c265caaf590aaa093a19dc3adc9b9292db89cce925e4de1e2a767 WatchSource:0}: Error finding container 243b8da44d1c265caaf590aaa093a19dc3adc9b9292db89cce925e4de1e2a767: Status 404 returned error can't find the container with id 243b8da44d1c265caaf590aaa093a19dc3adc9b9292db89cce925e4de1e2a767 Dec 03 22:20:49.912864 master-0 kubenswrapper[36504]: W1203 22:20:49.912683 36504 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89698816_4551_483b_a4e1_0698671933fe.slice/crio-90006b7ec905d44d4295ae8a715d32268eeabbc794187938e2dd33adfe112baa WatchSource:0}: Error finding container 90006b7ec905d44d4295ae8a715d32268eeabbc794187938e2dd33adfe112baa: Status 404 returned error can't find the container with id 90006b7ec905d44d4295ae8a715d32268eeabbc794187938e2dd33adfe112baa Dec 03 22:20:49.913085 master-0 kubenswrapper[36504]: E1203 22:20:49.912937 36504 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:50,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:30,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:10,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2bb99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-647f96877-tjm2v_openstack-operators(5e840a7b-22a7-4605-816a-dc26e299d247): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:20:49.913206 master-0 kubenswrapper[36504]: I1203 22:20:49.912824 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" event={"ID":"70f7d60f-1ece-4a67-a82a-d10933d8f28f","Type":"ContainerStarted","Data":"b3b240a956bbfeb6e17c36259cf0a5cdc43a8cedab2be17f7d0120166f84311a"} Dec 03 22:20:49.920215 master-0 kubenswrapper[36504]: E1203 22:20:49.919502 36504 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2bb99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-647f96877-tjm2v_openstack-operators(5e840a7b-22a7-4605-816a-dc26e299d247): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:20:49.920215 master-0 kubenswrapper[36504]: I1203 22:20:49.919643 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh"] Dec 03 22:20:49.920215 master-0 kubenswrapper[36504]: I1203 22:20:49.919764 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" event={"ID":"45bff768-9e56-4111-aecc-6b4b4bea9202","Type":"ContainerStarted","Data":"c7e42deb02446280616946d34472e79987214444086028e21ade435097f1ec88"} Dec 03 22:20:49.920215 master-0 kubenswrapper[36504]: E1203 22:20:49.919700 36504 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:50,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:30,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:10,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bxmwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000800000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-78955d896f-knrlz_openstack-operators(89698816-4551-483b-a4e1-0698671933fe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:20:49.920846 master-0 kubenswrapper[36504]: E1203 22:20:49.920789 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" podUID="5e840a7b-22a7-4605-816a-dc26e299d247" Dec 03 22:20:49.920956 master-0 kubenswrapper[36504]: E1203 22:20:49.920856 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" podUID="89698816-4551-483b-a4e1-0698671933fe" Dec 03 22:20:49.921934 master-0 kubenswrapper[36504]: E1203 22:20:49.921801 36504 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:50,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:30,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:10,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fq2zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-845b79dc4f-m94lx_openstack-operators(1365d101-c3ad-483c-8ed5-2616e633d9ae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:20:49.922699 master-0 kubenswrapper[36504]: I1203 22:20:49.922606 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" event={"ID":"6b3459e2-0fbe-405b-a8b7-56aeaedc0b6e","Type":"ContainerStarted","Data":"9cfacbdec576e69e872269b31e5ebfda2bfdde8ae08d9834b917dd5bb487bd32"} Dec 03 22:20:49.926017 master-0 kubenswrapper[36504]: E1203 22:20:49.925958 36504 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fq2zc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-845b79dc4f-m94lx_openstack-operators(1365d101-c3ad-483c-8ed5-2616e633d9ae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:20:49.927197 master-0 kubenswrapper[36504]: E1203 22:20:49.927120 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" podUID="1365d101-c3ad-483c-8ed5-2616e633d9ae" Dec 03 22:20:49.928623 master-0 
kubenswrapper[36504]: I1203 22:20:49.928548 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" event={"ID":"bb05eb5d-bc4c-4725-86ce-4e0efd653397","Type":"ContainerStarted","Data":"2a8d5f3aa8fd8e958938092441b1ad66f198eaad1a48955ab5af3e8e8def95f5"} Dec 03 22:20:49.932621 master-0 kubenswrapper[36504]: I1203 22:20:49.931868 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775"] Dec 03 22:20:49.943080 master-0 kubenswrapper[36504]: I1203 22:20:49.942981 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-696b999796-tpmdc"] Dec 03 22:20:49.951942 master-0 kubenswrapper[36504]: I1203 22:20:49.951795 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w"] Dec 03 22:20:49.959401 master-0 kubenswrapper[36504]: I1203 22:20:49.959323 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz"] Dec 03 22:20:50.449211 master-0 kubenswrapper[36504]: I1203 22:20:50.449109 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:50.450929 master-0 kubenswrapper[36504]: E1203 22:20:50.450182 36504 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:50.450929 master-0 kubenswrapper[36504]: E1203 22:20:50.450318 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert podName:66ded0bb-1de9-4b08-a2a2-c679aedbbded nodeName:}" failed. No retries permitted until 2025-12-03 22:20:54.450290531 +0000 UTC m=+619.670062538 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-lcnlm" (UID: "66ded0bb-1de9-4b08-a2a2-c679aedbbded") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:50.945801 master-0 kubenswrapper[36504]: I1203 22:20:50.945716 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" event={"ID":"20e58c00-5ade-44ab-825e-3d24c0644fe9","Type":"ContainerStarted","Data":"556dbeb7d44c03698de94cad7011d74a2bd34584c99cbb23c73cf499938b6697"} Dec 03 22:20:50.949894 master-0 kubenswrapper[36504]: I1203 22:20:50.949835 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" event={"ID":"bc796bc4-f06b-49c3-96b7-331444eebfcf","Type":"ContainerStarted","Data":"9d6631200147e4b6596a5a2a9086d69a9ed4467e1a277d52018c8fcf1665ec95"} Dec 03 22:20:50.968298 master-0 kubenswrapper[36504]: I1203 22:20:50.968235 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" event={"ID":"93356ecd-7681-4dbd-a992-03fef31de73f","Type":"ContainerStarted","Data":"f3325a93cf814ff2142093470a735aacaaf5d95240ee9d793b85d8a557c9a382"} Dec 03 22:20:50.973039 master-0 kubenswrapper[36504]: I1203 22:20:50.972944 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" event={"ID":"5f39d487-89d3-44a4-97e3-cf9c051ecaa5","Type":"ContainerStarted","Data":"1866cceca8e6ec1ee3d9974529213c2d83cb92cfed1df4f418c75f446dbee340"} Dec 03 22:20:50.982031 master-0 kubenswrapper[36504]: I1203 22:20:50.981957 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" event={"ID":"89698816-4551-483b-a4e1-0698671933fe","Type":"ContainerStarted","Data":"90006b7ec905d44d4295ae8a715d32268eeabbc794187938e2dd33adfe112baa"} Dec 03 22:20:50.984415 master-0 kubenswrapper[36504]: E1203 22:20:50.984361 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" podUID="89698816-4551-483b-a4e1-0698671933fe" Dec 03 22:20:50.989126 master-0 kubenswrapper[36504]: I1203 22:20:50.989088 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" event={"ID":"1365d101-c3ad-483c-8ed5-2616e633d9ae","Type":"ContainerStarted","Data":"243b8da44d1c265caaf590aaa093a19dc3adc9b9292db89cce925e4de1e2a767"} Dec 03 22:20:51.029522 master-0 kubenswrapper[36504]: E1203 22:20:50.991834 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" podUID="1365d101-c3ad-483c-8ed5-2616e633d9ae" Dec 03 22:20:51.029522 master-0 kubenswrapper[36504]: I1203 22:20:50.992277 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" event={"ID":"5e840a7b-22a7-4605-816a-dc26e299d247","Type":"ContainerStarted","Data":"656f25454b94cfe8cc7c0d8695842991fa3736ac3b8cde39f28f3daf86a44b31"} Dec 03 22:20:51.029522 master-0 kubenswrapper[36504]: E1203 22:20:50.995605 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" podUID="5e840a7b-22a7-4605-816a-dc26e299d247" Dec 03 22:20:51.029522 master-0 kubenswrapper[36504]: I1203 22:20:50.995842 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" event={"ID":"af647618-8824-4366-982b-d62778dff1b6","Type":"ContainerStarted","Data":"341b828c26ed87577aa6dde98bc9e8e72762b22f19e74e30cb9fdf10244ef5b2"} Dec 03 22:20:51.073790 master-0 kubenswrapper[36504]: I1203 22:20:51.073683 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:51.074138 master-0 kubenswrapper[36504]: E1203 22:20:51.074111 36504 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:51.074206 master-0 kubenswrapper[36504]: E1203 22:20:51.074177 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert podName:0f5cbd39-5e9c-4682-902c-e2d0b26b4e69 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:55.074158347 +0000 UTC m=+620.293930354 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" (UID: "0f5cbd39-5e9c-4682-902c-e2d0b26b4e69") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:51.614501 master-0 kubenswrapper[36504]: I1203 22:20:51.614421 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:51.614938 master-0 kubenswrapper[36504]: I1203 22:20:51.614553 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:51.614938 master-0 kubenswrapper[36504]: E1203 22:20:51.614715 36504 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:20:51.614938 master-0 kubenswrapper[36504]: E1203 22:20:51.614783 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:55.61475419 +0000 UTC m=+620.834526197 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "webhook-server-cert" not found Dec 03 22:20:51.614938 master-0 kubenswrapper[36504]: E1203 22:20:51.614841 36504 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:20:51.615106 master-0 kubenswrapper[36504]: E1203 22:20:51.615001 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:20:55.614967107 +0000 UTC m=+620.834739324 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "metrics-server-cert" not found Dec 03 22:20:52.013668 master-0 kubenswrapper[36504]: E1203 22:20:52.013178 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" podUID="89698816-4551-483b-a4e1-0698671933fe" Dec 03 22:20:52.015639 master-0 kubenswrapper[36504]: E1203 22:20:52.015566 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" podUID="5e840a7b-22a7-4605-816a-dc26e299d247" Dec 03 22:20:52.015718 master-0 kubenswrapper[36504]: E1203 22:20:52.015663 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" podUID="1365d101-c3ad-483c-8ed5-2616e633d9ae" Dec 03 22:20:54.541481 master-0 kubenswrapper[36504]: I1203 22:20:54.541415 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:20:54.542051 master-0 kubenswrapper[36504]: E1203 22:20:54.541730 36504 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:54.542051 master-0 kubenswrapper[36504]: E1203 22:20:54.541918 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert podName:66ded0bb-1de9-4b08-a2a2-c679aedbbded nodeName:}" failed. No retries permitted until 2025-12-03 22:21:02.541880979 +0000 UTC m=+627.761652986 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert") pod "infra-operator-controller-manager-7d9c9d7fd8-lcnlm" (UID: "66ded0bb-1de9-4b08-a2a2-c679aedbbded") : secret "infra-operator-webhook-server-cert" not found Dec 03 22:20:55.152325 master-0 kubenswrapper[36504]: I1203 22:20:55.152213 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:20:55.152838 master-0 kubenswrapper[36504]: E1203 22:20:55.152451 36504 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:55.153031 master-0 kubenswrapper[36504]: E1203 22:20:55.152994 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert podName:0f5cbd39-5e9c-4682-902c-e2d0b26b4e69 nodeName:}" failed. No retries permitted until 2025-12-03 22:21:03.152937658 +0000 UTC m=+628.372709665 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert") pod "openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" (UID: "0f5cbd39-5e9c-4682-902c-e2d0b26b4e69") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 22:20:55.633096 master-0 kubenswrapper[36504]: I1203 22:20:55.633004 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:55.634070 master-0 kubenswrapper[36504]: E1203 22:20:55.633220 36504 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 22:20:55.634070 master-0 kubenswrapper[36504]: I1203 22:20:55.633241 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:20:55.634070 master-0 kubenswrapper[36504]: E1203 22:20:55.633318 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:21:03.63329446 +0000 UTC m=+628.853066477 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "webhook-server-cert" not found Dec 03 22:20:55.634070 master-0 kubenswrapper[36504]: E1203 22:20:55.633589 36504 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 22:20:55.634070 master-0 kubenswrapper[36504]: E1203 22:20:55.633664 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs podName:2c039764-ec5b-4e34-9a3a-db6f4ad05849 nodeName:}" failed. No retries permitted until 2025-12-03 22:21:03.633641151 +0000 UTC m=+628.853413148 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs") pod "openstack-operator-controller-manager-57d98476c4-z957p" (UID: "2c039764-ec5b-4e34-9a3a-db6f4ad05849") : secret "metrics-server-cert" not found Dec 03 22:21:02.586892 master-0 kubenswrapper[36504]: I1203 22:21:02.586796 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:21:02.603073 master-0 kubenswrapper[36504]: I1203 22:21:02.602743 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/66ded0bb-1de9-4b08-a2a2-c679aedbbded-cert\") pod \"infra-operator-controller-manager-7d9c9d7fd8-lcnlm\" (UID: \"66ded0bb-1de9-4b08-a2a2-c679aedbbded\") " pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:21:02.722983 master-0 kubenswrapper[36504]: I1203 22:21:02.722920 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:21:03.095854 master-0 kubenswrapper[36504]: I1203 22:21:03.095726 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:21:03.199156 master-0 kubenswrapper[36504]: I1203 22:21:03.199096 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:21:03.203429 master-0 kubenswrapper[36504]: I1203 22:21:03.203353 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f5cbd39-5e9c-4682-902c-e2d0b26b4e69-cert\") pod \"openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h\" (UID: \"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:21:03.436180 master-0 kubenswrapper[36504]: I1203 22:21:03.436088 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:21:03.711874 master-0 kubenswrapper[36504]: I1203 22:21:03.711698 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:21:03.712425 master-0 kubenswrapper[36504]: I1203 22:21:03.711886 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:21:03.716993 master-0 kubenswrapper[36504]: I1203 22:21:03.716956 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-webhook-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:21:03.717928 master-0 kubenswrapper[36504]: I1203 22:21:03.717887 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c039764-ec5b-4e34-9a3a-db6f4ad05849-metrics-certs\") pod \"openstack-operator-controller-manager-57d98476c4-z957p\" (UID: \"2c039764-ec5b-4e34-9a3a-db6f4ad05849\") " pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:21:03.751875 master-0 kubenswrapper[36504]: I1203 22:21:03.751751 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:21:05.127148 master-0 kubenswrapper[36504]: I1203 22:21:05.127090 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:21:12.096414 master-0 kubenswrapper[36504]: I1203 22:21:12.096311 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:21:16.868326 master-0 kubenswrapper[36504]: I1203 22:21:16.867962 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h"] Dec 03 22:21:16.922584 master-0 kubenswrapper[36504]: W1203 22:21:16.922513 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f5cbd39_5e9c_4682_902c_e2d0b26b4e69.slice/crio-b87f1214403414902f5b4a49cd6303995140cd6f057ee8abee3e80d490939f40 WatchSource:0}: Error finding container b87f1214403414902f5b4a49cd6303995140cd6f057ee8abee3e80d490939f40: Status 404 returned error can't find the container with id b87f1214403414902f5b4a49cd6303995140cd6f057ee8abee3e80d490939f40 Dec 03 22:21:17.216546 master-0 kubenswrapper[36504]: I1203 22:21:17.215991 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p"] Dec 03 22:21:17.234893 master-0 kubenswrapper[36504]: W1203 22:21:17.234759 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c039764_ec5b_4e34_9a3a_db6f4ad05849.slice/crio-76f6ad660763be842433af768cb7bf458887788404e1d27ef62d9e09576114dc WatchSource:0}: Error finding container 76f6ad660763be842433af768cb7bf458887788404e1d27ef62d9e09576114dc: Status 404 returned error can't find the container with id 76f6ad660763be842433af768cb7bf458887788404e1d27ef62d9e09576114dc Dec 03 22:21:17.314938 master-0 kubenswrapper[36504]: I1203 22:21:17.314781 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" event={"ID":"4d4ce37c-6c7b-40f9-a14e-4c956089d552","Type":"ContainerStarted","Data":"446da9b06365d3d816f8cf104c8e45804782817a94dcab262ffbca2fd8b5aabe"} Dec 03 22:21:17.316504 master-0 kubenswrapper[36504]: I1203 22:21:17.316475 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" event={"ID":"2c039764-ec5b-4e34-9a3a-db6f4ad05849","Type":"ContainerStarted","Data":"76f6ad660763be842433af768cb7bf458887788404e1d27ef62d9e09576114dc"} Dec 03 22:21:17.317897 master-0 kubenswrapper[36504]: I1203 22:21:17.317841 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" event={"ID":"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69","Type":"ContainerStarted","Data":"b87f1214403414902f5b4a49cd6303995140cd6f057ee8abee3e80d490939f40"} Dec 03 22:21:17.320506 master-0 kubenswrapper[36504]: I1203 22:21:17.319814 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" 
event={"ID":"0ee5f717-c104-47a4-b47a-f56856f80286","Type":"ContainerStarted","Data":"212274010ab97bc27b28a96fc14ced465b76fd140612e4eb4e084a5812d63e99"} Dec 03 22:21:17.497174 master-0 kubenswrapper[36504]: I1203 22:21:17.496641 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm"] Dec 03 22:21:18.347856 master-0 kubenswrapper[36504]: E1203 22:21:18.346858 36504 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2bb99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-647f96877-tjm2v_openstack-operators(5e840a7b-22a7-4605-816a-dc26e299d247): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:21:18.348414 master-0 kubenswrapper[36504]: E1203 22:21:18.348290 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" podUID="5e840a7b-22a7-4605-816a-dc26e299d247" Dec 03 22:21:18.372444 master-0 kubenswrapper[36504]: I1203 22:21:18.371937 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" event={"ID":"70f7d60f-1ece-4a67-a82a-d10933d8f28f","Type":"ContainerStarted","Data":"59d9ea11e9e9548aba289b4efc3b5cab4e7f264e56489f37c63028261717b09b"} Dec 03 22:21:18.375871 master-0 kubenswrapper[36504]: I1203 22:21:18.375812 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" event={"ID":"bc796bc4-f06b-49c3-96b7-331444eebfcf","Type":"ContainerStarted","Data":"affd29f08ebf0a03d35cce17a95a56cb92fe3224b8668c3086deaff2f9094402"} Dec 03 22:21:18.392123 master-0 kubenswrapper[36504]: I1203 22:21:18.391593 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" event={"ID":"294bc967-f361-45a7-85f6-e846ec563425","Type":"ContainerStarted","Data":"845ce1184bdcfd2b0d77757b916ef21448a1b8dd07816841f3e64ba3d0eeec2e"} Dec 03 22:21:18.415166 
master-0 kubenswrapper[36504]: I1203 22:21:18.403806 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" event={"ID":"af647618-8824-4366-982b-d62778dff1b6","Type":"ContainerStarted","Data":"fdcabdeedd8ce05dc8a16d6cc6e311dfd474618e21a14d61330c93613bf270f2"} Dec 03 22:21:18.420304 master-0 kubenswrapper[36504]: I1203 22:21:18.420153 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" event={"ID":"7bc69a6b-4e45-44d8-9263-d04ae58ba614","Type":"ContainerStarted","Data":"7c91614cb44953c9ecce3f8eb6ce27e2b4d242c72b303238a3bd7b2017005ac2"} Dec 03 22:21:18.428064 master-0 kubenswrapper[36504]: I1203 22:21:18.425812 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" event={"ID":"5f39d487-89d3-44a4-97e3-cf9c051ecaa5","Type":"ContainerStarted","Data":"bba2f005bfa0863bf45a71ecd438c31df0e02a9e531d7ec2ee619e3176f67996"} Dec 03 22:21:18.433228 master-0 kubenswrapper[36504]: I1203 22:21:18.432686 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" event={"ID":"6b3459e2-0fbe-405b-a8b7-56aeaedc0b6e","Type":"ContainerStarted","Data":"bfeb5065f6b7e859349562bb9d1d158ab06880796e94b1ef099847a74cae81e8"} Dec 03 22:21:18.436292 master-0 kubenswrapper[36504]: I1203 22:21:18.436212 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" event={"ID":"66ded0bb-1de9-4b08-a2a2-c679aedbbded","Type":"ContainerStarted","Data":"eafd4fe0884761473da256279221c57e2f7ea1b07172a2c07db72b770f98fc6a"} Dec 03 22:21:18.440102 master-0 kubenswrapper[36504]: I1203 22:21:18.440028 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" event={"ID":"1f86e923-1f9c-43ca-84ac-1cff0660b6a9","Type":"ContainerStarted","Data":"ec035fc97244cc4fb7550af153d731c43959b5b1debd3c5d047f0cce66145257"} Dec 03 22:21:18.441544 master-0 kubenswrapper[36504]: E1203 22:21:18.441502 36504 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jwkfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-f6cc97788-tz5zp_openstack-operators(591effbb-eb20-4094-8637-d3fee9582c37): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:21:18.443261 master-0 kubenswrapper[36504]: E1203 22:21:18.443215 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" podUID="591effbb-eb20-4094-8637-d3fee9582c37" Dec 03 22:21:18.443611 master-0 kubenswrapper[36504]: I1203 22:21:18.443573 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" event={"ID":"93356ecd-7681-4dbd-a992-03fef31de73f","Type":"ContainerStarted","Data":"783f4d997258457316f6ff02d250a00704881168355c2a41a4c9fd55e338f3f2"} Dec 03 22:21:18.459224 master-0 kubenswrapper[36504]: I1203 22:21:18.459150 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" event={"ID":"d5090b03-b50e-4470-92ff-cdba513adcae","Type":"ContainerStarted","Data":"e6eeb451c71e3af8a1a58d9fa5955d65b27f0a9869dbe26b1bb6eeb88266a113"} Dec 03 22:21:18.492698 master-0 kubenswrapper[36504]: E1203 22:21:18.492561 36504 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kn7rl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7cdd6b54fb-wzvfr_openstack-operators(45bff768-9e56-4111-aecc-6b4b4bea9202): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:21:18.510839 master-0 kubenswrapper[36504]: E1203 22:21:18.508283 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" podUID="45bff768-9e56-4111-aecc-6b4b4bea9202" Dec 03 22:21:18.519120 master-0 kubenswrapper[36504]: I1203 22:21:18.519023 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" event={"ID":"bb05eb5d-bc4c-4725-86ce-4e0efd653397","Type":"ContainerStarted","Data":"8ebd86fc7247b5b3faf1b98394384d3c788e51e2388d06dca47b84314d1d9b90"} Dec 03 22:21:18.530325 master-0 kubenswrapper[36504]: I1203 22:21:18.525530 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" event={"ID":"5e840a7b-22a7-4605-816a-dc26e299d247","Type":"ContainerStarted","Data":"d9c22e41e32a0396e0aaaef6bcfaef9a03dec0bb4fc6dac02d30f98b22e098ca"} Dec 03 22:21:18.530325 master-0 kubenswrapper[36504]: I1203 22:21:18.526960 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" Dec 03 22:21:18.530325 master-0 kubenswrapper[36504]: E1203 22:21:18.528445 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" podUID="5e840a7b-22a7-4605-816a-dc26e299d247" Dec 03 22:21:18.530325 master-0 kubenswrapper[36504]: I1203 22:21:18.529684 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" event={"ID":"20e58c00-5ade-44ab-825e-3d24c0644fe9","Type":"ContainerStarted","Data":"2784a8413628d716fca8941cd6f877af19cc44d8ea22c42958bc710db0092dff"} Dec 03 22:21:18.570259 master-0 kubenswrapper[36504]: E1203 22:21:18.567048 36504 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8mwpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-58b8dcc5fb-dzpzd_openstack-operators(066459a1-f230-40e7-b00c-64edad996634): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 22:21:18.570259 master-0 kubenswrapper[36504]: E1203 22:21:18.568206 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" podUID="066459a1-f230-40e7-b00c-64edad996634" Dec 03 22:21:19.636092 master-0 kubenswrapper[36504]: I1203 22:21:19.636021 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" event={"ID":"89698816-4551-483b-a4e1-0698671933fe","Type":"ContainerStarted","Data":"7b131ef2f73e23cd593cd9087e56e0f989099c06a27f7e6831cf702564b85e7d"} Dec 03 22:21:19.665592 master-0 kubenswrapper[36504]: I1203 22:21:19.665529 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" event={"ID":"1365d101-c3ad-483c-8ed5-2616e633d9ae","Type":"ContainerStarted","Data":"7b2829cb9c25727e2b0490375b9217b82fa6db95238b105951d8df1e4bb1fca6"} Dec 03 22:21:19.676885 master-0 kubenswrapper[36504]: I1203 22:21:19.676750 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" event={"ID":"2c039764-ec5b-4e34-9a3a-db6f4ad05849","Type":"ContainerStarted","Data":"7a403100457cb4f951bde4d4687ad5ecc00a841cff24d7478cec6ea22a3cea47"} Dec 03 22:21:19.679104 master-0 kubenswrapper[36504]: I1203 22:21:19.679057 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:21:19.681548 master-0 kubenswrapper[36504]: I1203 22:21:19.681513 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" event={"ID":"45bff768-9e56-4111-aecc-6b4b4bea9202","Type":"ContainerStarted","Data":"ee811af7076713e76c9cf47c2727aa8acd3f167a40b23a2be6220566e5affdf1"} Dec 03 22:21:19.684299 master-0 kubenswrapper[36504]: I1203 22:21:19.684248 36504 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" Dec 03 22:21:19.686738 master-0 kubenswrapper[36504]: E1203 22:21:19.686698 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" podUID="45bff768-9e56-4111-aecc-6b4b4bea9202" Dec 03 22:21:19.692652 master-0 kubenswrapper[36504]: I1203 22:21:19.692606 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" event={"ID":"066459a1-f230-40e7-b00c-64edad996634","Type":"ContainerStarted","Data":"47963beeb653927d561b6e17e70caa706ea3ca244682d3f8c6b14bccdb56f8b7"} Dec 03 22:21:19.694117 master-0 kubenswrapper[36504]: I1203 22:21:19.694040 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" Dec 03 22:21:19.697128 master-0 kubenswrapper[36504]: E1203 22:21:19.697066 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" podUID="066459a1-f230-40e7-b00c-64edad996634" Dec 03 22:21:19.701979 master-0 kubenswrapper[36504]: I1203 22:21:19.701219 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" event={"ID":"591effbb-eb20-4094-8637-d3fee9582c37","Type":"ContainerStarted","Data":"0f63fcf2413d222fd006fb3b2c14b5034162211a3815ce71264411893da64a9a"} Dec 03 22:21:19.701979 master-0 kubenswrapper[36504]: I1203 22:21:19.701295 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" Dec 03 22:21:19.703981 master-0 kubenswrapper[36504]: E1203 22:21:19.703041 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" podUID="5e840a7b-22a7-4605-816a-dc26e299d247" Dec 03 22:21:19.703981 master-0 kubenswrapper[36504]: E1203 22:21:19.703160 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" podUID="591effbb-eb20-4094-8637-d3fee9582c37" Dec 03 22:21:20.037913 master-0 kubenswrapper[36504]: I1203 22:21:20.033302 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-78955d896f-knrlz" podStartSLOduration=6.129889047 podStartE2EDuration="33.033280475s" podCreationTimestamp="2025-12-03 22:20:47 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.919626923 +0000 UTC m=+615.139398930" lastFinishedPulling="2025-12-03 22:21:16.823018351 +0000 UTC m=+642.042790358" observedRunningTime="2025-12-03 
22:21:19.885692092 +0000 UTC m=+645.105464099" watchObservedRunningTime="2025-12-03 22:21:20.033280475 +0000 UTC m=+645.253052482" Dec 03 22:21:20.250415 master-0 kubenswrapper[36504]: I1203 22:21:20.246903 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" podStartSLOduration=33.246869322 podStartE2EDuration="33.246869322s" podCreationTimestamp="2025-12-03 22:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:21:20.215552949 +0000 UTC m=+645.435324976" watchObservedRunningTime="2025-12-03 22:21:20.246869322 +0000 UTC m=+645.466641329" Dec 03 22:21:20.760898 master-0 kubenswrapper[36504]: E1203 22:21:20.746099 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" podUID="066459a1-f230-40e7-b00c-64edad996634" Dec 03 22:21:20.760898 master-0 kubenswrapper[36504]: E1203 22:21:20.746592 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" podUID="45bff768-9e56-4111-aecc-6b4b4bea9202" Dec 03 22:21:20.760898 master-0 kubenswrapper[36504]: E1203 22:21:20.746648 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" podUID="591effbb-eb20-4094-8637-d3fee9582c37" Dec 03 22:21:23.774956 master-0 kubenswrapper[36504]: I1203 22:21:23.774874 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-57d98476c4-z957p" Dec 03 22:21:27.070710 master-0 kubenswrapper[36504]: I1203 22:21:27.070548 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" Dec 03 22:21:27.169370 master-0 kubenswrapper[36504]: E1203 22:21:27.169304 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" podUID="591effbb-eb20-4094-8637-d3fee9582c37" Dec 03 22:21:27.213376 master-0 kubenswrapper[36504]: I1203 22:21:27.213293 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" Dec 03 22:21:27.216938 master-0 kubenswrapper[36504]: E1203 22:21:27.216870 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" 
podUID="066459a1-f230-40e7-b00c-64edad996634" Dec 03 22:21:27.411878 master-0 kubenswrapper[36504]: I1203 22:21:27.411808 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" Dec 03 22:21:27.415780 master-0 kubenswrapper[36504]: E1203 22:21:27.415721 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" podUID="45bff768-9e56-4111-aecc-6b4b4bea9202" Dec 03 22:21:27.771534 master-0 kubenswrapper[36504]: I1203 22:21:27.771451 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" Dec 03 22:21:27.826412 master-0 kubenswrapper[36504]: E1203 22:21:27.826333 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" podUID="5e840a7b-22a7-4605-816a-dc26e299d247" Dec 03 22:21:28.966854 master-0 kubenswrapper[36504]: I1203 22:21:28.959146 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" event={"ID":"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69","Type":"ContainerStarted","Data":"7cab21faab4d2b7371a67ed66ebf0cd338d78626163bfbaaf14ac1fcf9cd5456"} Dec 03 22:21:28.966854 master-0 kubenswrapper[36504]: I1203 22:21:28.959226 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" event={"ID":"0f5cbd39-5e9c-4682-902c-e2d0b26b4e69","Type":"ContainerStarted","Data":"70851117445db6507b7f4304a14e44448803d8a5f34727b520e759a63d4b9d24"} Dec 03 22:21:28.966854 master-0 kubenswrapper[36504]: I1203 22:21:28.959392 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:21:28.966854 master-0 kubenswrapper[36504]: I1203 22:21:28.966402 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" event={"ID":"0ee5f717-c104-47a4-b47a-f56856f80286","Type":"ContainerStarted","Data":"6a09be9efb6a8de0524c52f2c3f25510c79bf0e33228211162cd88e59fafe1f2"} Dec 03 22:21:28.977802 master-0 kubenswrapper[36504]: I1203 22:21:28.968549 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" Dec 03 22:21:28.977802 master-0 kubenswrapper[36504]: I1203 22:21:28.970722 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" event={"ID":"bb05eb5d-bc4c-4725-86ce-4e0efd653397","Type":"ContainerStarted","Data":"65d3c5fd906f10fccc464bebf95f289314fda5d8b68f8897dac9a86841b40141"} Dec 03 22:21:28.977802 master-0 kubenswrapper[36504]: I1203 22:21:28.971210 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" Dec 03 22:21:28.977802 master-0 
kubenswrapper[36504]: I1203 22:21:28.972871 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" Dec 03 22:21:28.977802 master-0 kubenswrapper[36504]: I1203 22:21:28.974716 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" Dec 03 22:21:28.977802 master-0 kubenswrapper[36504]: I1203 22:21:28.975737 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" event={"ID":"4d4ce37c-6c7b-40f9-a14e-4c956089d552","Type":"ContainerStarted","Data":"c8f0274e373b0ddf416b627d9cd06f9ddc0e23f7cbf97b5afca23d844b32bbef"} Dec 03 22:21:28.977802 master-0 kubenswrapper[36504]: I1203 22:21:28.976643 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" Dec 03 22:21:28.991830 master-0 kubenswrapper[36504]: I1203 22:21:28.984110 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" Dec 03 22:21:28.991830 master-0 kubenswrapper[36504]: I1203 22:21:28.986856 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" event={"ID":"1365d101-c3ad-483c-8ed5-2616e633d9ae","Type":"ContainerStarted","Data":"9ea2684e0790f2273ae1ead019670904cae07252983df6426c085d95ff103a75"} Dec 03 22:21:28.991830 master-0 kubenswrapper[36504]: I1203 22:21:28.988267 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" Dec 03 22:21:29.004239 master-0 kubenswrapper[36504]: I1203 22:21:29.004194 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" Dec 03 22:21:29.010741 master-0 kubenswrapper[36504]: I1203 22:21:29.010678 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" event={"ID":"1f86e923-1f9c-43ca-84ac-1cff0660b6a9","Type":"ContainerStarted","Data":"4b06b1eab319b57dbac10c71390508b8c0397387310926faabe758f0054f3509"} Dec 03 22:21:29.016757 master-0 kubenswrapper[36504]: I1203 22:21:29.016679 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" Dec 03 22:21:29.018946 master-0 kubenswrapper[36504]: I1203 22:21:29.018895 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" event={"ID":"66ded0bb-1de9-4b08-a2a2-c679aedbbded","Type":"ContainerStarted","Data":"d34825c502671cd692f9d2c660f82fdc4a6c15bb427b7ad2d3402d2a8b3a9742"} Dec 03 22:21:29.026805 master-0 kubenswrapper[36504]: I1203 22:21:29.026713 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" event={"ID":"294bc967-f361-45a7-85f6-e846ec563425","Type":"ContainerStarted","Data":"c550f11233b4e376b0f96ae8b01e811218c005a2d6eb6099ae7dcc6166eed6e9"} Dec 03 22:21:29.028077 master-0 kubenswrapper[36504]: I1203 22:21:29.028011 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" Dec 03 22:21:29.030460 master-0 kubenswrapper[36504]: I1203 22:21:29.030407 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" Dec 03 22:21:29.030670 master-0 kubenswrapper[36504]: I1203 22:21:29.030492 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" Dec 03 22:21:29.058570 master-0 kubenswrapper[36504]: I1203 22:21:29.045114 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" event={"ID":"7bc69a6b-4e45-44d8-9263-d04ae58ba614","Type":"ContainerStarted","Data":"90d85f7247b4eb33958bf7f7f39acdf68256673b40c1607741144d2bc5fc2b24"} Dec 03 22:21:29.058570 master-0 kubenswrapper[36504]: I1203 22:21:29.051640 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" Dec 03 22:21:29.078832 master-0 kubenswrapper[36504]: I1203 22:21:29.065716 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" podStartSLOduration=32.809452781 podStartE2EDuration="43.065646695s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:21:16.925009858 +0000 UTC m=+642.144781865" lastFinishedPulling="2025-12-03 22:21:27.181203732 +0000 UTC m=+652.400975779" observedRunningTime="2025-12-03 22:21:29.04780355 +0000 UTC m=+654.267575577" watchObservedRunningTime="2025-12-03 22:21:29.065646695 +0000 UTC m=+654.285418702" Dec 03 22:21:29.094612 master-0 kubenswrapper[36504]: I1203 22:21:29.093061 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" Dec 03 22:21:29.100449 master-0 kubenswrapper[36504]: I1203 22:21:29.097013 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-7fd96594c7-59fg5" podStartSLOduration=3.5824144430000002 podStartE2EDuration="43.09698324s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:48.518293018 +0000 UTC m=+613.738065025" lastFinishedPulling="2025-12-03 22:21:28.032861815 +0000 UTC m=+653.252633822" observedRunningTime="2025-12-03 22:21:29.082485009 +0000 UTC m=+654.302257016" watchObservedRunningTime="2025-12-03 22:21:29.09698324 +0000 UTC m=+654.316755247" Dec 03 22:21:29.123989 master-0 kubenswrapper[36504]: I1203 22:21:29.123905 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-845b79dc4f-m94lx" podStartSLOduration=5.012862823 podStartE2EDuration="43.123882914s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.921466151 +0000 UTC m=+615.141238158" lastFinishedPulling="2025-12-03 22:21:28.032486242 +0000 UTC m=+653.252258249" observedRunningTime="2025-12-03 22:21:29.112953897 +0000 UTC m=+654.332725914" watchObservedRunningTime="2025-12-03 22:21:29.123882914 +0000 UTC m=+654.343654921" Dec 03 22:21:29.141018 master-0 kubenswrapper[36504]: I1203 22:21:29.140921 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-7c9bfd6967-9r2t8" podStartSLOduration=4.131752954 podStartE2EDuration="43.140902514s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.010143504 +0000 UTC m=+614.229915511" lastFinishedPulling="2025-12-03 22:21:28.019293054 +0000 UTC m=+653.239065071" observedRunningTime="2025-12-03 22:21:29.139713686 +0000 UTC m=+654.359485703" watchObservedRunningTime="2025-12-03 22:21:29.140902514 +0000 UTC m=+654.360674511" Dec 03 22:21:29.188208 master-0 kubenswrapper[36504]: I1203 22:21:29.184529 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84bc9f68f5-dgq4q" podStartSLOduration=3.697961699 podStartE2EDuration="43.184502077s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:48.573425767 +0000 UTC m=+613.793197774" lastFinishedPulling="2025-12-03 22:21:28.059966145 +0000 UTC m=+653.279738152" observedRunningTime="2025-12-03 22:21:29.179398795 +0000 UTC m=+654.399170802" watchObservedRunningTime="2025-12-03 22:21:29.184502077 +0000 UTC m=+654.404274084" Dec 03 22:21:29.265798 master-0 kubenswrapper[36504]: I1203 22:21:29.263711 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-f8856dd79-qghrs" podStartSLOduration=3.6026047439999997 podStartE2EDuration="43.26368557s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:48.358198508 +0000 UTC m=+613.577970515" lastFinishedPulling="2025-12-03 22:21:28.019279344 +0000 UTC m=+653.239051341" observedRunningTime="2025-12-03 22:21:29.230261309 +0000 UTC m=+654.450033316" watchObservedRunningTime="2025-12-03 22:21:29.26368557 +0000 UTC m=+654.483457577" Dec 03 22:21:29.315648 master-0 kubenswrapper[36504]: I1203 22:21:29.315461 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-78cd4f7769-l6vkp" podStartSLOduration=3.870764923 podStartE2EDuration="43.315418961s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:48.464942065 +0000 UTC m=+613.684714072" lastFinishedPulling="2025-12-03 22:21:27.909596093 +0000 UTC m=+653.129368110" observedRunningTime="2025-12-03 22:21:29.292097951 +0000 UTC m=+654.511869968" watchObservedRunningTime="2025-12-03 22:21:29.315418961 +0000 UTC m=+654.535190968" Dec 03 22:21:29.368140 master-0 kubenswrapper[36504]: I1203 22:21:29.367889 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-57dfcdd5b8-984d8" podStartSLOduration=5.185384676 podStartE2EDuration="43.367853865s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.874187201 +0000 UTC m=+615.093959208" lastFinishedPulling="2025-12-03 22:21:28.05665639 +0000 UTC m=+653.276428397" observedRunningTime="2025-12-03 22:21:29.349364988 +0000 UTC m=+654.569137025" watchObservedRunningTime="2025-12-03 22:21:29.367853865 +0000 UTC m=+654.587625872" Dec 03 22:21:30.085919 master-0 kubenswrapper[36504]: I1203 22:21:30.085393 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" 
event={"ID":"93356ecd-7681-4dbd-a992-03fef31de73f","Type":"ContainerStarted","Data":"57e9fcc285f426f4e8b8a3b81112f9361e774cf1c522536d9b3ed3ce8bafe762"} Dec 03 22:21:30.087876 master-0 kubenswrapper[36504]: I1203 22:21:30.086947 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" Dec 03 22:21:30.093788 master-0 kubenswrapper[36504]: I1203 22:21:30.093733 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" Dec 03 22:21:30.097264 master-0 kubenswrapper[36504]: I1203 22:21:30.097168 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" event={"ID":"6b3459e2-0fbe-405b-a8b7-56aeaedc0b6e","Type":"ContainerStarted","Data":"eb02ecde821b5b484bc77c59e160c8456e344d3f365e49ca9fbfdc8f3f57408e"} Dec 03 22:21:30.098580 master-0 kubenswrapper[36504]: I1203 22:21:30.098532 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" Dec 03 22:21:30.102623 master-0 kubenswrapper[36504]: I1203 22:21:30.102558 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" Dec 03 22:21:30.106097 master-0 kubenswrapper[36504]: I1203 22:21:30.105380 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" event={"ID":"66ded0bb-1de9-4b08-a2a2-c679aedbbded","Type":"ContainerStarted","Data":"154436e80c31a16521e46d170250b89afce2b09efc264e95621a7d5d4c8c9b5b"} Dec 03 22:21:30.106097 master-0 kubenswrapper[36504]: I1203 22:21:30.105653 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:21:30.110034 master-0 kubenswrapper[36504]: I1203 22:21:30.109959 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" event={"ID":"70f7d60f-1ece-4a67-a82a-d10933d8f28f","Type":"ContainerStarted","Data":"b670b5c967d7df078a69bbf7bb22e0f38dadc4ea3e88d228031155f23d336837"} Dec 03 22:21:30.110309 master-0 kubenswrapper[36504]: I1203 22:21:30.110255 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" Dec 03 22:21:30.112417 master-0 kubenswrapper[36504]: I1203 22:21:30.112354 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" event={"ID":"bc796bc4-f06b-49c3-96b7-331444eebfcf","Type":"ContainerStarted","Data":"c04540aebebe4d44e64627ae417d7286bb79f388e7fc919ab363dd2d0c25b735"} Dec 03 22:21:30.113077 master-0 kubenswrapper[36504]: I1203 22:21:30.112984 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" Dec 03 22:21:30.118801 master-0 kubenswrapper[36504]: I1203 22:21:30.116042 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" Dec 03 22:21:30.122786 master-0 kubenswrapper[36504]: I1203 22:21:30.120482 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" event={"ID":"d5090b03-b50e-4470-92ff-cdba513adcae","Type":"ContainerStarted","Data":"e7dab80f496ab29f86cca68a5526fe8e0145e2182a75ecdfe6f3b7b4ef1277ef"} Dec 03 22:21:30.126788 master-0 kubenswrapper[36504]: I1203 22:21:30.124135 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" Dec 03 22:21:30.126788 master-0 kubenswrapper[36504]: I1203 22:21:30.125828 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" Dec 03 22:21:30.129947 master-0 kubenswrapper[36504]: I1203 22:21:30.127998 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" Dec 03 22:21:30.129947 master-0 kubenswrapper[36504]: I1203 22:21:30.128740 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-56f9fbf74b-6cj4q" podStartSLOduration=5.405605704 podStartE2EDuration="44.128691067s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.894044831 +0000 UTC m=+615.113816838" lastFinishedPulling="2025-12-03 22:21:28.617130194 +0000 UTC m=+653.836902201" observedRunningTime="2025-12-03 22:21:30.114035501 +0000 UTC m=+655.333807508" watchObservedRunningTime="2025-12-03 22:21:30.128691067 +0000 UTC m=+655.348463074" Dec 03 22:21:30.131051 master-0 kubenswrapper[36504]: I1203 22:21:30.130991 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" event={"ID":"5f39d487-89d3-44a4-97e3-cf9c051ecaa5","Type":"ContainerStarted","Data":"c3ac4dc4261e14d7495e3e0406f8d2bab056a930fcef9bcbd35b272031f95608"} Dec 03 22:21:30.132884 master-0 kubenswrapper[36504]: I1203 22:21:30.132726 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" Dec 03 22:21:30.133965 master-0 kubenswrapper[36504]: I1203 22:21:30.133918 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" Dec 03 22:21:30.136605 master-0 kubenswrapper[36504]: I1203 22:21:30.136555 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" event={"ID":"af647618-8824-4366-982b-d62778dff1b6","Type":"ContainerStarted","Data":"67c8ace89bb68e54dd2650759597cb46b410838bb8e5583d14635917290d8083"} Dec 03 22:21:30.137085 master-0 kubenswrapper[36504]: I1203 22:21:30.136968 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" Dec 03 22:21:30.140582 master-0 kubenswrapper[36504]: I1203 22:21:30.140534 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" Dec 03 22:21:30.160110 master-0 kubenswrapper[36504]: I1203 22:21:30.160016 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" 
event={"ID":"20e58c00-5ade-44ab-825e-3d24c0644fe9","Type":"ContainerStarted","Data":"f9905a7e0137657bb6382dcb57fbfbaa610b3c0246aee59ff75f023eeb7735a9"} Dec 03 22:21:30.162535 master-0 kubenswrapper[36504]: I1203 22:21:30.162504 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" Dec 03 22:21:30.176251 master-0 kubenswrapper[36504]: I1203 22:21:30.175479 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" Dec 03 22:21:30.185306 master-0 kubenswrapper[36504]: I1203 22:21:30.185211 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-647d75769b-ljq2l" podStartSLOduration=4.899852275 podStartE2EDuration="44.185181969s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.278121087 +0000 UTC m=+614.497893094" lastFinishedPulling="2025-12-03 22:21:28.563450781 +0000 UTC m=+653.783222788" observedRunningTime="2025-12-03 22:21:30.174985585 +0000 UTC m=+655.394757612" watchObservedRunningTime="2025-12-03 22:21:30.185181969 +0000 UTC m=+655.404953976" Dec 03 22:21:30.231982 master-0 kubenswrapper[36504]: I1203 22:21:30.231591 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" podStartSLOduration=33.956661863 podStartE2EDuration="44.231564391s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:21:17.478322934 +0000 UTC m=+642.698094941" lastFinishedPulling="2025-12-03 22:21:27.753225462 +0000 UTC m=+652.972997469" observedRunningTime="2025-12-03 22:21:30.227531102 +0000 UTC m=+655.447303119" watchObservedRunningTime="2025-12-03 22:21:30.231564391 +0000 UTC m=+655.451336398" Dec 03 22:21:30.293820 master-0 kubenswrapper[36504]: I1203 22:21:30.293698 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-696b999796-tpmdc" podStartSLOduration=5.628920459 podStartE2EDuration="44.293669171s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.89874191 +0000 UTC m=+615.118513917" lastFinishedPulling="2025-12-03 22:21:28.563490632 +0000 UTC m=+653.783262629" observedRunningTime="2025-12-03 22:21:30.280125192 +0000 UTC m=+655.499897189" watchObservedRunningTime="2025-12-03 22:21:30.293669171 +0000 UTC m=+655.513441178" Dec 03 22:21:30.323643 master-0 kubenswrapper[36504]: I1203 22:21:30.323535 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-865fc86d5b-hmp8k" podStartSLOduration=5.413944328 podStartE2EDuration="44.323501848s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.146908624 +0000 UTC m=+614.366680631" lastFinishedPulling="2025-12-03 22:21:28.056466144 +0000 UTC m=+653.276238151" observedRunningTime="2025-12-03 22:21:30.304228636 +0000 UTC m=+655.524000643" watchObservedRunningTime="2025-12-03 22:21:30.323501848 +0000 UTC m=+655.543273855" Dec 03 22:21:30.331017 master-0 kubenswrapper[36504]: I1203 22:21:30.330905 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-6b64f6f645-7mmbh" podStartSLOduration=4.68029059 
podStartE2EDuration="44.330880903s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.898350848 +0000 UTC m=+615.118122855" lastFinishedPulling="2025-12-03 22:21:29.548941161 +0000 UTC m=+654.768713168" observedRunningTime="2025-12-03 22:21:30.326148692 +0000 UTC m=+655.545920699" watchObservedRunningTime="2025-12-03 22:21:30.330880903 +0000 UTC m=+655.550652910" Dec 03 22:21:30.432419 master-0 kubenswrapper[36504]: I1203 22:21:30.426477 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9b669fdb-vnr5w" podStartSLOduration=4.684927636 podStartE2EDuration="43.426449054s" podCreationTimestamp="2025-12-03 22:20:47 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.901180368 +0000 UTC m=+615.120952375" lastFinishedPulling="2025-12-03 22:21:28.642701786 +0000 UTC m=+653.862473793" observedRunningTime="2025-12-03 22:21:30.419813854 +0000 UTC m=+655.639585881" watchObservedRunningTime="2025-12-03 22:21:30.426449054 +0000 UTC m=+655.646221061" Dec 03 22:21:30.469675 master-0 kubenswrapper[36504]: I1203 22:21:30.469585 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5cd89994b5-v5rh8" podStartSLOduration=4.814498577 podStartE2EDuration="44.469560772s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:48.44050916 +0000 UTC m=+613.660281167" lastFinishedPulling="2025-12-03 22:21:28.095571365 +0000 UTC m=+653.315343362" observedRunningTime="2025-12-03 22:21:30.458066308 +0000 UTC m=+655.677838325" watchObservedRunningTime="2025-12-03 22:21:30.469560772 +0000 UTC m=+655.689332779" Dec 03 22:21:33.444010 master-0 kubenswrapper[36504]: I1203 22:21:33.443935 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cb6d6b947xdn7h" Dec 03 22:21:34.406506 master-0 kubenswrapper[36504]: I1203 22:21:34.406389 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7b5867bfc7-ts775" podStartSLOduration=9.741401682 podStartE2EDuration="48.40635418s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.898125481 +0000 UTC m=+615.117897488" lastFinishedPulling="2025-12-03 22:21:28.563077979 +0000 UTC m=+653.782849986" observedRunningTime="2025-12-03 22:21:30.535894547 +0000 UTC m=+655.755666574" watchObservedRunningTime="2025-12-03 22:21:34.40635418 +0000 UTC m=+659.626126197" Dec 03 22:21:35.720522 master-0 kubenswrapper[36504]: E1203 22:21:35.720403 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:21:39.273621 master-0 kubenswrapper[36504]: I1203 22:21:39.273531 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" event={"ID":"066459a1-f230-40e7-b00c-64edad996634","Type":"ContainerStarted","Data":"1771c23783fe6aa98b50f9dd242fcb72369a1c7ba77b5d876f18737cf6bc3422"} 
Dec 03 22:21:39.629102 master-0 kubenswrapper[36504]: I1203 22:21:39.628789 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-58b8dcc5fb-dzpzd" podStartSLOduration=26.302346217 podStartE2EDuration="53.628732878s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.180585942 +0000 UTC m=+614.400357949" lastFinishedPulling="2025-12-03 22:21:16.506972603 +0000 UTC m=+641.726744610" observedRunningTime="2025-12-03 22:21:39.626912251 +0000 UTC m=+664.846684258" watchObservedRunningTime="2025-12-03 22:21:39.628732878 +0000 UTC m=+664.848504895" Dec 03 22:21:42.325710 master-0 kubenswrapper[36504]: I1203 22:21:42.325604 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" event={"ID":"5e840a7b-22a7-4605-816a-dc26e299d247","Type":"ContainerStarted","Data":"ed5cae503238934eb676c6b6335cbada266dabd7f8fabb3fdbd612b2c0387306"} Dec 03 22:21:42.329692 master-0 kubenswrapper[36504]: I1203 22:21:42.329650 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" event={"ID":"45bff768-9e56-4111-aecc-6b4b4bea9202","Type":"ContainerStarted","Data":"635493ca00da1b85cac3911bb9a0c65c73bb7fccf2cd43ec430aa08cbefcff84"} Dec 03 22:21:42.334220 master-0 kubenswrapper[36504]: I1203 22:21:42.334150 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" event={"ID":"591effbb-eb20-4094-8637-d3fee9582c37","Type":"ContainerStarted","Data":"4bee90b5a76a43e78d38cd29de4ebb1bd8edf553349552a793eba225eedcee21"} Dec 03 22:21:42.354893 master-0 kubenswrapper[36504]: I1203 22:21:42.354635 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-647f96877-tjm2v" podStartSLOduration=29.695261498 podStartE2EDuration="56.354609003s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.912733134 +0000 UTC m=+615.132505141" lastFinishedPulling="2025-12-03 22:21:16.572080639 +0000 UTC m=+641.791852646" observedRunningTime="2025-12-03 22:21:42.345943968 +0000 UTC m=+667.565715995" watchObservedRunningTime="2025-12-03 22:21:42.354609003 +0000 UTC m=+667.574381010" Dec 03 22:21:42.380205 master-0 kubenswrapper[36504]: I1203 22:21:42.377708 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-f6cc97788-tz5zp" podStartSLOduration=29.080196491 podStartE2EDuration="56.377678014s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:48.577256819 +0000 UTC m=+613.797028826" lastFinishedPulling="2025-12-03 22:21:15.874738342 +0000 UTC m=+641.094510349" observedRunningTime="2025-12-03 22:21:42.371870061 +0000 UTC m=+667.591642078" watchObservedRunningTime="2025-12-03 22:21:42.377678014 +0000 UTC m=+667.597450021" Dec 03 22:21:42.746458 master-0 kubenswrapper[36504]: I1203 22:21:42.746372 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d9c9d7fd8-lcnlm" Dec 03 22:21:42.785804 master-0 kubenswrapper[36504]: I1203 22:21:42.785375 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7cdd6b54fb-wzvfr" 
podStartSLOduration=30.043870549 podStartE2EDuration="56.78534767s" podCreationTimestamp="2025-12-03 22:20:46 +0000 UTC" firstStartedPulling="2025-12-03 22:20:49.097853978 +0000 UTC m=+614.317625985" lastFinishedPulling="2025-12-03 22:21:15.839331099 +0000 UTC m=+641.059103106" observedRunningTime="2025-12-03 22:21:42.417117905 +0000 UTC m=+667.636889912" watchObservedRunningTime="2025-12-03 22:21:42.78534767 +0000 UTC m=+668.005119677" Dec 03 22:22:14.104029 master-0 kubenswrapper[36504]: I1203 22:22:14.103951 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:22:15.103105 master-0 kubenswrapper[36504]: I1203 22:22:15.103005 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:22:35.708970 master-0 kubenswrapper[36504]: E1203 22:22:35.708871 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:22:48.088478 master-0 kubenswrapper[36504]: I1203 22:22:48.085238 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-tq96q"] Dec 03 22:22:48.098113 master-0 kubenswrapper[36504]: I1203 22:22:48.094538 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:22:48.098113 master-0 kubenswrapper[36504]: I1203 22:22:48.097344 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-tq96q"] Dec 03 22:22:48.099258 master-0 kubenswrapper[36504]: I1203 22:22:48.099221 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 22:22:48.116505 master-0 kubenswrapper[36504]: I1203 22:22:48.114020 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 22:22:48.116505 master-0 kubenswrapper[36504]: I1203 22:22:48.114373 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 22:22:48.146047 master-0 kubenswrapper[36504]: I1203 22:22:48.144194 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d60333c-6dc0-4509-9201-7e296daf6f73-config\") pod \"dnsmasq-dns-5dbfd7c4bf-tq96q\" (UID: \"9d60333c-6dc0-4509-9201-7e296daf6f73\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:22:48.146047 master-0 kubenswrapper[36504]: I1203 22:22:48.144343 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrd6g\" (UniqueName: \"kubernetes.io/projected/9d60333c-6dc0-4509-9201-7e296daf6f73-kube-api-access-mrd6g\") pod \"dnsmasq-dns-5dbfd7c4bf-tq96q\" (UID: \"9d60333c-6dc0-4509-9201-7e296daf6f73\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:22:48.159343 master-0 kubenswrapper[36504]: I1203 22:22:48.159243 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-b4xzc"] Dec 03 22:22:48.189369 master-0 kubenswrapper[36504]: I1203 22:22:48.188737 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-b4xzc"] Dec 03 22:22:48.189369 master-0 kubenswrapper[36504]: I1203 22:22:48.189021 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.191533 master-0 kubenswrapper[36504]: I1203 22:22:48.191482 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 22:22:48.251230 master-0 kubenswrapper[36504]: I1203 22:22:48.251120 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-dns-svc\") pod \"dnsmasq-dns-75d7c5dbd7-b4xzc\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.251582 master-0 kubenswrapper[36504]: I1203 22:22:48.251480 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrd6g\" (UniqueName: \"kubernetes.io/projected/9d60333c-6dc0-4509-9201-7e296daf6f73-kube-api-access-mrd6g\") pod \"dnsmasq-dns-5dbfd7c4bf-tq96q\" (UID: \"9d60333c-6dc0-4509-9201-7e296daf6f73\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:22:48.251795 master-0 kubenswrapper[36504]: I1203 22:22:48.251697 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-config\") pod \"dnsmasq-dns-75d7c5dbd7-b4xzc\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.252127 master-0 kubenswrapper[36504]: I1203 22:22:48.252076 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvf9l\" (UniqueName: \"kubernetes.io/projected/b6d8084e-5235-4191-8b31-5dd067f81b91-kube-api-access-pvf9l\") pod \"dnsmasq-dns-75d7c5dbd7-b4xzc\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.252217 master-0 kubenswrapper[36504]: I1203 22:22:48.252190 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d60333c-6dc0-4509-9201-7e296daf6f73-config\") pod \"dnsmasq-dns-5dbfd7c4bf-tq96q\" (UID: \"9d60333c-6dc0-4509-9201-7e296daf6f73\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:22:48.254062 master-0 kubenswrapper[36504]: I1203 22:22:48.253982 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d60333c-6dc0-4509-9201-7e296daf6f73-config\") pod \"dnsmasq-dns-5dbfd7c4bf-tq96q\" (UID: \"9d60333c-6dc0-4509-9201-7e296daf6f73\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:22:48.278940 master-0 kubenswrapper[36504]: I1203 22:22:48.275212 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrd6g\" (UniqueName: \"kubernetes.io/projected/9d60333c-6dc0-4509-9201-7e296daf6f73-kube-api-access-mrd6g\") pod \"dnsmasq-dns-5dbfd7c4bf-tq96q\" (UID: \"9d60333c-6dc0-4509-9201-7e296daf6f73\") " pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:22:48.354879 master-0 kubenswrapper[36504]: I1203 22:22:48.354653 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvf9l\" (UniqueName: \"kubernetes.io/projected/b6d8084e-5235-4191-8b31-5dd067f81b91-kube-api-access-pvf9l\") pod \"dnsmasq-dns-75d7c5dbd7-b4xzc\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.354879 master-0 kubenswrapper[36504]: 
I1203 22:22:48.354817 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-dns-svc\") pod \"dnsmasq-dns-75d7c5dbd7-b4xzc\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.355201 master-0 kubenswrapper[36504]: I1203 22:22:48.354903 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-config\") pod \"dnsmasq-dns-75d7c5dbd7-b4xzc\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.356682 master-0 kubenswrapper[36504]: I1203 22:22:48.356644 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-dns-svc\") pod \"dnsmasq-dns-75d7c5dbd7-b4xzc\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.356864 master-0 kubenswrapper[36504]: I1203 22:22:48.356698 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-config\") pod \"dnsmasq-dns-75d7c5dbd7-b4xzc\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.374942 master-0 kubenswrapper[36504]: I1203 22:22:48.374897 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvf9l\" (UniqueName: \"kubernetes.io/projected/b6d8084e-5235-4191-8b31-5dd067f81b91-kube-api-access-pvf9l\") pod \"dnsmasq-dns-75d7c5dbd7-b4xzc\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.466411 master-0 kubenswrapper[36504]: I1203 22:22:48.466317 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:22:48.567496 master-0 kubenswrapper[36504]: I1203 22:22:48.567418 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:22:48.852839 master-0 kubenswrapper[36504]: I1203 22:22:48.852457 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-tq96q"] Dec 03 22:22:48.856583 master-0 kubenswrapper[36504]: W1203 22:22:48.856395 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d60333c_6dc0_4509_9201_7e296daf6f73.slice/crio-88e4eeeff208261b587dfdf7d52e5989dce926d1cfb0b2ccc0c4961b7607a043 WatchSource:0}: Error finding container 88e4eeeff208261b587dfdf7d52e5989dce926d1cfb0b2ccc0c4961b7607a043: Status 404 returned error can't find the container with id 88e4eeeff208261b587dfdf7d52e5989dce926d1cfb0b2ccc0c4961b7607a043 Dec 03 22:22:49.174524 master-0 kubenswrapper[36504]: I1203 22:22:49.174295 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" event={"ID":"9d60333c-6dc0-4509-9201-7e296daf6f73","Type":"ContainerStarted","Data":"88e4eeeff208261b587dfdf7d52e5989dce926d1cfb0b2ccc0c4961b7607a043"} Dec 03 22:22:49.650793 master-0 kubenswrapper[36504]: I1203 22:22:49.648869 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-b4xzc"] Dec 03 22:22:50.191792 master-0 kubenswrapper[36504]: I1203 22:22:50.188390 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" event={"ID":"b6d8084e-5235-4191-8b31-5dd067f81b91","Type":"ContainerStarted","Data":"10b6c784cdbfb8fb6dc169103d560854c3e42b32c382f7a23dc1ee820525e3be"} Dec 03 22:22:50.226060 master-0 kubenswrapper[36504]: I1203 22:22:50.225958 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-tq96q"] Dec 03 22:22:50.488428 master-0 kubenswrapper[36504]: I1203 22:22:50.488272 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f856ff86c-xkdfs"] Dec 03 22:22:50.498387 master-0 kubenswrapper[36504]: I1203 22:22:50.498314 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:50.506539 master-0 kubenswrapper[36504]: I1203 22:22:50.506476 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f856ff86c-xkdfs"] Dec 03 22:22:50.646540 master-0 kubenswrapper[36504]: I1203 22:22:50.645308 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-config\") pod \"dnsmasq-dns-6f856ff86c-xkdfs\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:50.646540 master-0 kubenswrapper[36504]: I1203 22:22:50.645411 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-dns-svc\") pod \"dnsmasq-dns-6f856ff86c-xkdfs\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:50.646540 master-0 kubenswrapper[36504]: I1203 22:22:50.645489 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brv6f\" (UniqueName: \"kubernetes.io/projected/4339d191-869f-43be-80f4-92e25c8f6a7a-kube-api-access-brv6f\") pod \"dnsmasq-dns-6f856ff86c-xkdfs\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:50.748530 master-0 kubenswrapper[36504]: I1203 22:22:50.747820 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brv6f\" (UniqueName: \"kubernetes.io/projected/4339d191-869f-43be-80f4-92e25c8f6a7a-kube-api-access-brv6f\") pod \"dnsmasq-dns-6f856ff86c-xkdfs\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:50.748530 master-0 kubenswrapper[36504]: I1203 22:22:50.747924 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-config\") pod \"dnsmasq-dns-6f856ff86c-xkdfs\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:50.748530 master-0 kubenswrapper[36504]: I1203 22:22:50.747989 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-dns-svc\") pod \"dnsmasq-dns-6f856ff86c-xkdfs\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:50.749837 master-0 kubenswrapper[36504]: I1203 22:22:50.749106 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-dns-svc\") pod \"dnsmasq-dns-6f856ff86c-xkdfs\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:50.749837 master-0 kubenswrapper[36504]: I1203 22:22:50.749610 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-config\") pod \"dnsmasq-dns-6f856ff86c-xkdfs\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:50.894714 master-0 kubenswrapper[36504]: I1203 22:22:50.894623 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brv6f\" (UniqueName: \"kubernetes.io/projected/4339d191-869f-43be-80f4-92e25c8f6a7a-kube-api-access-brv6f\") pod \"dnsmasq-dns-6f856ff86c-xkdfs\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:51.140723 master-0 kubenswrapper[36504]: I1203 22:22:51.140643 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:22:51.693234 master-0 kubenswrapper[36504]: I1203 22:22:51.693066 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-b4xzc"] Dec 03 22:22:51.868160 master-0 kubenswrapper[36504]: I1203 22:22:51.866316 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-z6l5h"] Dec 03 22:22:51.868754 master-0 kubenswrapper[36504]: I1203 22:22:51.868681 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:51.916246 master-0 kubenswrapper[36504]: I1203 22:22:51.916159 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-z6l5h"] Dec 03 22:22:52.000887 master-0 kubenswrapper[36504]: I1203 22:22:52.000410 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqswd\" (UniqueName: \"kubernetes.io/projected/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-kube-api-access-vqswd\") pod \"dnsmasq-dns-658bb5765c-z6l5h\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.000887 master-0 kubenswrapper[36504]: I1203 22:22:52.000699 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-dns-svc\") pod \"dnsmasq-dns-658bb5765c-z6l5h\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.000887 master-0 kubenswrapper[36504]: I1203 22:22:52.000745 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-config\") pod \"dnsmasq-dns-658bb5765c-z6l5h\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.024836 master-0 kubenswrapper[36504]: I1203 22:22:52.024752 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f856ff86c-xkdfs"] Dec 03 22:22:52.106095 master-0 kubenswrapper[36504]: I1203 22:22:52.103975 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-dns-svc\") pod \"dnsmasq-dns-658bb5765c-z6l5h\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.106095 master-0 kubenswrapper[36504]: I1203 22:22:52.104277 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-config\") pod \"dnsmasq-dns-658bb5765c-z6l5h\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.106095 master-0 kubenswrapper[36504]: I1203 22:22:52.104507 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqswd\" (UniqueName: \"kubernetes.io/projected/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-kube-api-access-vqswd\") pod \"dnsmasq-dns-658bb5765c-z6l5h\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.111875 master-0 kubenswrapper[36504]: I1203 22:22:52.107218 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-config\") pod \"dnsmasq-dns-658bb5765c-z6l5h\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.111875 master-0 kubenswrapper[36504]: I1203 22:22:52.107366 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-dns-svc\") pod \"dnsmasq-dns-658bb5765c-z6l5h\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.160643 master-0 kubenswrapper[36504]: I1203 22:22:52.160585 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqswd\" (UniqueName: \"kubernetes.io/projected/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-kube-api-access-vqswd\") pod \"dnsmasq-dns-658bb5765c-z6l5h\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.227798 master-0 kubenswrapper[36504]: I1203 22:22:52.227725 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:22:52.291550 master-0 kubenswrapper[36504]: I1203 22:22:52.291367 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" event={"ID":"4339d191-869f-43be-80f4-92e25c8f6a7a","Type":"ContainerStarted","Data":"59ab8c5cb00a4eddfb83f74749073c881f2e5326523c7199bf205a43053e31ed"} Dec 03 22:22:52.913233 master-0 kubenswrapper[36504]: I1203 22:22:52.913150 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-z6l5h"] Dec 03 22:22:53.317008 master-0 kubenswrapper[36504]: I1203 22:22:53.316909 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" event={"ID":"fa0d6984-46f5-4f29-ba6e-75ac1a90606b","Type":"ContainerStarted","Data":"84d4c45a49ae3fed1fea680d50ca3895ef535b76410e8604a237cb58315c914e"} Dec 03 22:22:55.281658 master-0 kubenswrapper[36504]: I1203 22:22:55.281578 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:22:55.288137 master-0 kubenswrapper[36504]: I1203 22:22:55.287961 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.342872 master-0 kubenswrapper[36504]: I1203 22:22:55.341416 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 22:22:55.342872 master-0 kubenswrapper[36504]: I1203 22:22:55.341466 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 22:22:55.342872 master-0 kubenswrapper[36504]: I1203 22:22:55.341663 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 22:22:55.342872 master-0 kubenswrapper[36504]: I1203 22:22:55.341679 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 22:22:55.342872 master-0 kubenswrapper[36504]: I1203 22:22:55.341894 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 22:22:55.342872 master-0 kubenswrapper[36504]: I1203 22:22:55.342019 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 22:22:55.443943 master-0 kubenswrapper[36504]: I1203 22:22:55.443844 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh8dh\" (UniqueName: \"kubernetes.io/projected/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-kube-api-access-hh8dh\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.444441 master-0 kubenswrapper[36504]: I1203 22:22:55.443978 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-config-data\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.444441 master-0 kubenswrapper[36504]: I1203 22:22:55.444007 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-06d82782-2362-4f98-959d-11759748f469\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7d89d6e1-db13-46ad-9b88-a39d70246160\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.444441 master-0 kubenswrapper[36504]: I1203 22:22:55.444063 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.444441 master-0 kubenswrapper[36504]: I1203 22:22:55.444088 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-pod-info\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.444441 master-0 kubenswrapper[36504]: I1203 22:22:55.444225 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " 
pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.444691 master-0 kubenswrapper[36504]: I1203 22:22:55.444598 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-server-conf\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.444819 master-0 kubenswrapper[36504]: I1203 22:22:55.444778 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.446038 master-0 kubenswrapper[36504]: I1203 22:22:55.445969 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.446038 master-0 kubenswrapper[36504]: I1203 22:22:55.446037 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.446195 master-0 kubenswrapper[36504]: I1203 22:22:55.446075 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.512663 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.548660 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.548761 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.548814 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.548858 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hh8dh\" (UniqueName: \"kubernetes.io/projected/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-kube-api-access-hh8dh\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.548907 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-config-data\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.548946 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-06d82782-2362-4f98-959d-11759748f469\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7d89d6e1-db13-46ad-9b88-a39d70246160\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.548976 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.549004 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-pod-info\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.549049 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.549104 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-server-conf\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.549169 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.549442 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.549677 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.550357 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-config-data\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.550858 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.551651 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-server-conf\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.555089 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-pod-info\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.556653 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.556692 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-06d82782-2362-4f98-959d-11759748f469\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7d89d6e1-db13-46ad-9b88-a39d70246160\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/9c1cc96cb5d464128954f8bdd79c849dd0c06144d8d719c511b91520742ea538/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.562937 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.565509 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.637296 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 22:22:55.652419 master-0 kubenswrapper[36504]: I1203 22:22:55.639850 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 22:22:55.668669 master-0 kubenswrapper[36504]: I1203 22:22:55.655863 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.668669 master-0 kubenswrapper[36504]: I1203 22:22:55.658688 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 22:22:55.693275 master-0 kubenswrapper[36504]: I1203 22:22:55.679735 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 22:22:55.693275 master-0 kubenswrapper[36504]: I1203 22:22:55.684685 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 22:22:55.697352 master-0 kubenswrapper[36504]: I1203 22:22:55.697301 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh8dh\" (UniqueName: \"kubernetes.io/projected/09ea7cea-94f9-4bdb-ad61-24281b0ee1ed-kube-api-access-hh8dh\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:55.765633 master-0 kubenswrapper[36504]: I1203 22:22:55.765518 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-kolla-config\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.765633 master-0 kubenswrapper[36504]: I1203 22:22:55.765591 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.766526 master-0 kubenswrapper[36504]: I1203 22:22:55.765671 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dlfr\" (UniqueName: \"kubernetes.io/projected/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-kube-api-access-7dlfr\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.766526 master-0 kubenswrapper[36504]: I1203 22:22:55.765718 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.766526 master-0 kubenswrapper[36504]: I1203 22:22:55.765758 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-config-data\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.788006 master-0 kubenswrapper[36504]: I1203 22:22:55.786782 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 22:22:55.891966 master-0 kubenswrapper[36504]: I1203 22:22:55.891780 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dlfr\" (UniqueName: \"kubernetes.io/projected/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-kube-api-access-7dlfr\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.892994 master-0 kubenswrapper[36504]: I1203 22:22:55.892971 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.894319 master-0 kubenswrapper[36504]: I1203 22:22:55.894292 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-config-data\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.897172 master-0 kubenswrapper[36504]: I1203 22:22:55.897111 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-kolla-config\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.897462 master-0 kubenswrapper[36504]: I1203 22:22:55.897443 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.900245 master-0 kubenswrapper[36504]: I1203 22:22:55.900183 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.900874 master-0 kubenswrapper[36504]: I1203 22:22:55.900833 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-config-data\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.901807 master-0 kubenswrapper[36504]: I1203 22:22:55.901738 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-kolla-config\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.906074 master-0 kubenswrapper[36504]: I1203 22:22:55.905917 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.949744 master-0 kubenswrapper[36504]: I1203 22:22:55.949580 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dlfr\" (UniqueName: \"kubernetes.io/projected/8b81647a-8849-4ff3-ad1b-08f0aaaa5657-kube-api-access-7dlfr\") pod \"memcached-0\" (UID: 
\"8b81647a-8849-4ff3-ad1b-08f0aaaa5657\") " pod="openstack/memcached-0" Dec 03 22:22:55.991462 master-0 kubenswrapper[36504]: I1203 22:22:55.991380 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 22:22:57.815870 master-0 kubenswrapper[36504]: I1203 22:22:57.815725 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-06d82782-2362-4f98-959d-11759748f469\" (UniqueName: \"kubernetes.io/csi/topolvm.io^7d89d6e1-db13-46ad-9b88-a39d70246160\") pod \"rabbitmq-server-0\" (UID: \"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed\") " pod="openstack/rabbitmq-server-0" Dec 03 22:22:58.039353 master-0 kubenswrapper[36504]: I1203 22:22:58.039177 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 22:22:58.137590 master-0 kubenswrapper[36504]: I1203 22:22:58.137487 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:22:58.143986 master-0 kubenswrapper[36504]: I1203 22:22:58.141747 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:22:58.183313 master-0 kubenswrapper[36504]: I1203 22:22:58.169263 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:22:58.264138 master-0 kubenswrapper[36504]: I1203 22:22:58.263914 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ksr5\" (UniqueName: \"kubernetes.io/projected/1284fac2-9956-456c-9781-135e637e85bd-kube-api-access-4ksr5\") pod \"kube-state-metrics-0\" (UID: \"1284fac2-9956-456c-9781-135e637e85bd\") " pod="openstack/kube-state-metrics-0" Dec 03 22:22:58.381564 master-0 kubenswrapper[36504]: I1203 22:22:58.375881 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ksr5\" (UniqueName: \"kubernetes.io/projected/1284fac2-9956-456c-9781-135e637e85bd-kube-api-access-4ksr5\") pod \"kube-state-metrics-0\" (UID: \"1284fac2-9956-456c-9781-135e637e85bd\") " pod="openstack/kube-state-metrics-0" Dec 03 22:22:58.421189 master-0 kubenswrapper[36504]: I1203 22:22:58.421114 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ksr5\" (UniqueName: \"kubernetes.io/projected/1284fac2-9956-456c-9781-135e637e85bd-kube-api-access-4ksr5\") pod \"kube-state-metrics-0\" (UID: \"1284fac2-9956-456c-9781-135e637e85bd\") " pod="openstack/kube-state-metrics-0" Dec 03 22:22:58.532494 master-0 kubenswrapper[36504]: I1203 22:22:58.532340 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:22:58.851568 master-0 kubenswrapper[36504]: I1203 22:22:58.851502 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zqx7b"] Dec 03 22:22:58.860496 master-0 kubenswrapper[36504]: I1203 22:22:58.858501 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:58.867328 master-0 kubenswrapper[36504]: I1203 22:22:58.867262 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 22:22:58.869943 master-0 kubenswrapper[36504]: I1203 22:22:58.867967 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 22:22:58.905391 master-0 kubenswrapper[36504]: I1203 22:22:58.905164 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnd6n\" (UniqueName: \"kubernetes.io/projected/13962c28-06ea-4b66-aa46-f00d50e29eda-kube-api-access-lnd6n\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:58.905391 master-0 kubenswrapper[36504]: I1203 22:22:58.905288 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13962c28-06ea-4b66-aa46-f00d50e29eda-var-log-ovn\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:58.905391 master-0 kubenswrapper[36504]: I1203 22:22:58.905355 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/13962c28-06ea-4b66-aa46-f00d50e29eda-ovn-controller-tls-certs\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:58.905391 master-0 kubenswrapper[36504]: I1203 22:22:58.905384 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13962c28-06ea-4b66-aa46-f00d50e29eda-scripts\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:58.905391 master-0 kubenswrapper[36504]: I1203 22:22:58.905411 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13962c28-06ea-4b66-aa46-f00d50e29eda-combined-ca-bundle\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:58.905974 master-0 kubenswrapper[36504]: I1203 22:22:58.905433 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13962c28-06ea-4b66-aa46-f00d50e29eda-var-run-ovn\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:58.905974 master-0 kubenswrapper[36504]: I1203 22:22:58.905480 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13962c28-06ea-4b66-aa46-f00d50e29eda-var-run\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:58.913207 master-0 kubenswrapper[36504]: I1203 22:22:58.913118 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-54zjs"] Dec 03 22:22:58.920175 master-0 kubenswrapper[36504]: I1203 22:22:58.919645 36504 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:58.944197 master-0 kubenswrapper[36504]: I1203 22:22:58.944106 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zqx7b"] Dec 03 22:22:59.011189 master-0 kubenswrapper[36504]: I1203 22:22:59.007434 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-scripts\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.013196 master-0 kubenswrapper[36504]: I1203 22:22:59.012279 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13962c28-06ea-4b66-aa46-f00d50e29eda-var-log-ovn\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.013196 master-0 kubenswrapper[36504]: I1203 22:22:59.012760 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/13962c28-06ea-4b66-aa46-f00d50e29eda-ovn-controller-tls-certs\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.013394 master-0 kubenswrapper[36504]: I1203 22:22:59.013232 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13962c28-06ea-4b66-aa46-f00d50e29eda-scripts\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.013394 master-0 kubenswrapper[36504]: I1203 22:22:59.013266 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13962c28-06ea-4b66-aa46-f00d50e29eda-combined-ca-bundle\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.013394 master-0 kubenswrapper[36504]: I1203 22:22:59.013288 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13962c28-06ea-4b66-aa46-f00d50e29eda-var-run-ovn\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.013394 master-0 kubenswrapper[36504]: I1203 22:22:59.013365 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-var-log\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.013564 master-0 kubenswrapper[36504]: I1203 22:22:59.013416 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkkwf\" (UniqueName: \"kubernetes.io/projected/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-kube-api-access-qkkwf\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.013564 master-0 kubenswrapper[36504]: I1203 22:22:59.013456 36504 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13962c28-06ea-4b66-aa46-f00d50e29eda-var-run\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.013564 master-0 kubenswrapper[36504]: I1203 22:22:59.013496 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-var-run\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.013564 master-0 kubenswrapper[36504]: I1203 22:22:59.013515 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-var-lib\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.013927 master-0 kubenswrapper[36504]: I1203 22:22:59.013625 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnd6n\" (UniqueName: \"kubernetes.io/projected/13962c28-06ea-4b66-aa46-f00d50e29eda-kube-api-access-lnd6n\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.013927 master-0 kubenswrapper[36504]: I1203 22:22:59.013700 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13962c28-06ea-4b66-aa46-f00d50e29eda-var-log-ovn\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.013927 master-0 kubenswrapper[36504]: I1203 22:22:59.013718 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-etc-ovs\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.025529 master-0 kubenswrapper[36504]: I1203 22:22:59.016725 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13962c28-06ea-4b66-aa46-f00d50e29eda-scripts\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.025529 master-0 kubenswrapper[36504]: I1203 22:22:59.019950 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/13962c28-06ea-4b66-aa46-f00d50e29eda-ovn-controller-tls-certs\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.025529 master-0 kubenswrapper[36504]: I1203 22:22:59.020183 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13962c28-06ea-4b66-aa46-f00d50e29eda-var-run\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.025529 master-0 kubenswrapper[36504]: I1203 22:22:59.021045 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/13962c28-06ea-4b66-aa46-f00d50e29eda-var-run-ovn\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.036131 master-0 kubenswrapper[36504]: I1203 22:22:59.036002 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-54zjs"] Dec 03 22:22:59.046009 master-0 kubenswrapper[36504]: I1203 22:22:59.043492 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13962c28-06ea-4b66-aa46-f00d50e29eda-combined-ca-bundle\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.071433 master-0 kubenswrapper[36504]: I1203 22:22:59.071321 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnd6n\" (UniqueName: \"kubernetes.io/projected/13962c28-06ea-4b66-aa46-f00d50e29eda-kube-api-access-lnd6n\") pod \"ovn-controller-zqx7b\" (UID: \"13962c28-06ea-4b66-aa46-f00d50e29eda\") " pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.119543 master-0 kubenswrapper[36504]: I1203 22:22:59.118576 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-var-log\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.119543 master-0 kubenswrapper[36504]: I1203 22:22:59.119199 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkkwf\" (UniqueName: \"kubernetes.io/projected/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-kube-api-access-qkkwf\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.120346 master-0 kubenswrapper[36504]: I1203 22:22:59.120323 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-var-run\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.120440 master-0 kubenswrapper[36504]: I1203 22:22:59.120356 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-var-lib\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.120757 master-0 kubenswrapper[36504]: I1203 22:22:59.120478 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-etc-ovs\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.120757 master-0 kubenswrapper[36504]: I1203 22:22:59.120561 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-var-run\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.120757 master-0 kubenswrapper[36504]: I1203 22:22:59.120584 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-scripts\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.127352 master-0 kubenswrapper[36504]: I1203 22:22:59.127298 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-etc-ovs\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.127657 master-0 kubenswrapper[36504]: I1203 22:22:59.127417 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-var-log\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.129112 master-0 kubenswrapper[36504]: I1203 22:22:59.129084 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-var-lib\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.141747 master-0 kubenswrapper[36504]: I1203 22:22:59.139794 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-scripts\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.184407 master-0 kubenswrapper[36504]: I1203 22:22:59.184167 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkkwf\" (UniqueName: \"kubernetes.io/projected/1aa8ce82-8c7e-4f58-8585-8ca5680b6951-kube-api-access-qkkwf\") pod \"ovn-controller-ovs-54zjs\" (UID: \"1aa8ce82-8c7e-4f58-8585-8ca5680b6951\") " pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.192083 master-0 kubenswrapper[36504]: I1203 22:22:59.191956 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zqx7b" Dec 03 22:22:59.198600 master-0 kubenswrapper[36504]: I1203 22:22:59.198402 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 22:22:59.203681 master-0 kubenswrapper[36504]: I1203 22:22:59.203616 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.214490 master-0 kubenswrapper[36504]: I1203 22:22:59.211664 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 03 22:22:59.214490 master-0 kubenswrapper[36504]: I1203 22:22:59.212048 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 03 22:22:59.214490 master-0 kubenswrapper[36504]: I1203 22:22:59.214386 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 03 22:22:59.221074 master-0 kubenswrapper[36504]: I1203 22:22:59.220921 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 03 22:22:59.264244 master-0 kubenswrapper[36504]: I1203 22:22:59.264152 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55ac1860-363d-42ff-90dd-3b6bf2e78864-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.266440 master-0 kubenswrapper[36504]: I1203 22:22:59.264821 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55ac1860-363d-42ff-90dd-3b6bf2e78864-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.268554 master-0 kubenswrapper[36504]: I1203 22:22:59.268490 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7gv7\" (UniqueName: \"kubernetes.io/projected/55ac1860-363d-42ff-90dd-3b6bf2e78864-kube-api-access-b7gv7\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.271893 master-0 kubenswrapper[36504]: I1203 22:22:59.271813 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/55ac1860-363d-42ff-90dd-3b6bf2e78864-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.272271 master-0 kubenswrapper[36504]: I1203 22:22:59.272221 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55ac1860-363d-42ff-90dd-3b6bf2e78864-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.272577 master-0 kubenswrapper[36504]: I1203 22:22:59.272557 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55ac1860-363d-42ff-90dd-3b6bf2e78864-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.272877 master-0 kubenswrapper[36504]: I1203 22:22:59.272833 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55ac1860-363d-42ff-90dd-3b6bf2e78864-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.287469 master-0 kubenswrapper[36504]: I1203 22:22:59.287320 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 22:22:59.316919 master-0 kubenswrapper[36504]: I1203 22:22:59.316410 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:22:59.328323 master-0 kubenswrapper[36504]: I1203 22:22:59.328238 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 22:22:59.410595 master-0 kubenswrapper[36504]: I1203 22:22:59.410501 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7gv7\" (UniqueName: \"kubernetes.io/projected/55ac1860-363d-42ff-90dd-3b6bf2e78864-kube-api-access-b7gv7\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.410986 master-0 kubenswrapper[36504]: I1203 22:22:59.410651 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/55ac1860-363d-42ff-90dd-3b6bf2e78864-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.410986 master-0 kubenswrapper[36504]: I1203 22:22:59.410705 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55ac1860-363d-42ff-90dd-3b6bf2e78864-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.410986 master-0 kubenswrapper[36504]: I1203 22:22:59.410753 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55ac1860-363d-42ff-90dd-3b6bf2e78864-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.410986 master-0 kubenswrapper[36504]: I1203 22:22:59.410802 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55ac1860-363d-42ff-90dd-3b6bf2e78864-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.410986 master-0 kubenswrapper[36504]: I1203 22:22:59.410863 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55ac1860-363d-42ff-90dd-3b6bf2e78864-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.410986 master-0 kubenswrapper[36504]: I1203 22:22:59.410885 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/55ac1860-363d-42ff-90dd-3b6bf2e78864-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.425640 master-0 kubenswrapper[36504]: I1203 22:22:59.420028 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/55ac1860-363d-42ff-90dd-3b6bf2e78864-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.425640 master-0 kubenswrapper[36504]: I1203 22:22:59.424484 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 22:22:59.425640 master-0 kubenswrapper[36504]: I1203 22:22:59.424540 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 22:22:59.425640 master-0 kubenswrapper[36504]: I1203 22:22:59.424856 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.428726 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.429577 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55ac1860-363d-42ff-90dd-3b6bf2e78864-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.429889 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.429936 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.430148 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.430183 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.430168 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55ac1860-363d-42ff-90dd-3b6bf2e78864-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.430294 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.431423 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 22:22:59.435096 master-0 kubenswrapper[36504]: I1203 22:22:59.434380 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 22:22:59.440296 master-0 kubenswrapper[36504]: I1203 22:22:59.435317 36504 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 22:22:59.440296 master-0 kubenswrapper[36504]: I1203 22:22:59.436550 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 22:22:59.440296 master-0 kubenswrapper[36504]: I1203 22:22:59.437817 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55ac1860-363d-42ff-90dd-3b6bf2e78864-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.440296 master-0 kubenswrapper[36504]: I1203 22:22:59.439347 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55ac1860-363d-42ff-90dd-3b6bf2e78864-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:22:59.460347 master-0 kubenswrapper[36504]: I1203 22:22:59.460141 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55ac1860-363d-42ff-90dd-3b6bf2e78864-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:23:00.650072 master-0 kubenswrapper[36504]: I1203 22:23:00.648914 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 22:23:01.495159 master-0 kubenswrapper[36504]: I1203 22:23:01.492097 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7gv7\" (UniqueName: \"kubernetes.io/projected/55ac1860-363d-42ff-90dd-3b6bf2e78864-kube-api-access-b7gv7\") pod \"alertmanager-metric-storage-0\" (UID: \"55ac1860-363d-42ff-90dd-3b6bf2e78864\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610175 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6be1c526-0db0-4e85-9d73-8c78c20e4273-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610289 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ed8445-bac4-4e69-9462-c1e33f646315-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610349 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610446 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1ed8445-bac4-4e69-9462-c1e33f646315-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610503 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ed8445-bac4-4e69-9462-c1e33f646315-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610534 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6be1c526-0db0-4e85-9d73-8c78c20e4273-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610558 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610575 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610622 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-095faa9e-409b-48e9-8678-291111c98615\" (UniqueName: \"kubernetes.io/csi/topolvm.io^05a7a762-489b-438f-bebc-5cc01a276c64\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610666 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6be1c526-0db0-4e85-9d73-8c78c20e4273-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610714 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtgwk\" (UniqueName: \"kubernetes.io/projected/6be1c526-0db0-4e85-9d73-8c78c20e4273-kube-api-access-vtgwk\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.610848 master-0 kubenswrapper[36504]: I1203 22:23:01.610870 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5e84b778-c4f9-4bee-975e-4ea838dd3fbb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^781d4877-3b76-4d48-8c6b-fea84eab440c\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.613036 master-0 kubenswrapper[36504]: I1203 22:23:01.611713 36504 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1ed8445-bac4-4e69-9462-c1e33f646315-kolla-config\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.613036 master-0 kubenswrapper[36504]: I1203 22:23:01.611848 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6be1c526-0db0-4e85-9d73-8c78c20e4273-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.613036 master-0 kubenswrapper[36504]: I1203 22:23:01.611910 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1ed8445-bac4-4e69-9462-c1e33f646315-config-data-default\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.613036 master-0 kubenswrapper[36504]: I1203 22:23:01.612010 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6be1c526-0db0-4e85-9d73-8c78c20e4273-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.613036 master-0 kubenswrapper[36504]: I1203 22:23:01.612080 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.613036 master-0 kubenswrapper[36504]: I1203 22:23:01.612203 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nvlx\" (UniqueName: \"kubernetes.io/projected/a1ed8445-bac4-4e69-9462-c1e33f646315-kube-api-access-7nvlx\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.613036 master-0 kubenswrapper[36504]: I1203 22:23:01.612238 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ed8445-bac4-4e69-9462-c1e33f646315-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.676657 master-0 kubenswrapper[36504]: I1203 22:23:01.674135 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 22:23:01.718406 master-0 kubenswrapper[36504]: I1203 22:23:01.718321 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6be1c526-0db0-4e85-9d73-8c78c20e4273-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.718762 master-0 kubenswrapper[36504]: I1203 22:23:01.718445 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtgwk\" (UniqueName: \"kubernetes.io/projected/6be1c526-0db0-4e85-9d73-8c78c20e4273-kube-api-access-vtgwk\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.718762 master-0 kubenswrapper[36504]: I1203 22:23:01.718529 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5e84b778-c4f9-4bee-975e-4ea838dd3fbb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^781d4877-3b76-4d48-8c6b-fea84eab440c\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.718762 master-0 kubenswrapper[36504]: I1203 22:23:01.718618 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1ed8445-bac4-4e69-9462-c1e33f646315-kolla-config\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.718762 master-0 kubenswrapper[36504]: I1203 22:23:01.718660 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6be1c526-0db0-4e85-9d73-8c78c20e4273-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.718762 master-0 kubenswrapper[36504]: I1203 22:23:01.718687 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1ed8445-bac4-4e69-9462-c1e33f646315-config-data-default\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.718762 master-0 kubenswrapper[36504]: I1203 22:23:01.718728 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6be1c526-0db0-4e85-9d73-8c78c20e4273-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.718990 master-0 kubenswrapper[36504]: I1203 22:23:01.718762 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.718990 master-0 kubenswrapper[36504]: I1203 22:23:01.718824 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nvlx\" (UniqueName: \"kubernetes.io/projected/a1ed8445-bac4-4e69-9462-c1e33f646315-kube-api-access-7nvlx\") pod \"openstack-galera-0\" (UID: 
\"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.718990 master-0 kubenswrapper[36504]: I1203 22:23:01.718851 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ed8445-bac4-4e69-9462-c1e33f646315-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.718990 master-0 kubenswrapper[36504]: I1203 22:23:01.718905 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6be1c526-0db0-4e85-9d73-8c78c20e4273-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.718990 master-0 kubenswrapper[36504]: I1203 22:23:01.718940 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ed8445-bac4-4e69-9462-c1e33f646315-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.718990 master-0 kubenswrapper[36504]: I1203 22:23:01.718972 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.719180 master-0 kubenswrapper[36504]: I1203 22:23:01.719019 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1ed8445-bac4-4e69-9462-c1e33f646315-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.719180 master-0 kubenswrapper[36504]: I1203 22:23:01.719056 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ed8445-bac4-4e69-9462-c1e33f646315-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.719180 master-0 kubenswrapper[36504]: I1203 22:23:01.719079 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6be1c526-0db0-4e85-9d73-8c78c20e4273-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.719180 master-0 kubenswrapper[36504]: I1203 22:23:01.719100 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.719180 master-0 kubenswrapper[36504]: I1203 22:23:01.719117 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.719180 master-0 kubenswrapper[36504]: I1203 22:23:01.719160 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-095faa9e-409b-48e9-8678-291111c98615\" (UniqueName: \"kubernetes.io/csi/topolvm.io^05a7a762-489b-438f-bebc-5cc01a276c64\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.720641 master-0 kubenswrapper[36504]: I1203 22:23:01.720495 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6be1c526-0db0-4e85-9d73-8c78c20e4273-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.722595 master-0 kubenswrapper[36504]: I1203 22:23:01.722364 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.722935 master-0 kubenswrapper[36504]: I1203 22:23:01.722887 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6be1c526-0db0-4e85-9d73-8c78c20e4273-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.723142 master-0 kubenswrapper[36504]: I1203 22:23:01.723050 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6be1c526-0db0-4e85-9d73-8c78c20e4273-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.723834 master-0 kubenswrapper[36504]: I1203 22:23:01.723800 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:23:01.723915 master-0 kubenswrapper[36504]: I1203 22:23:01.723852 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-095faa9e-409b-48e9-8678-291111c98615\" (UniqueName: \"kubernetes.io/csi/topolvm.io^05a7a762-489b-438f-bebc-5cc01a276c64\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/c456651b3777ed2eaacc7e230205864a0c596738ebdbbf94714a1d1d49b7a140/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.724000 master-0 kubenswrapper[36504]: I1203 22:23:01.723919 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a1ed8445-bac4-4e69-9462-c1e33f646315-kolla-config\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.724000 master-0 kubenswrapper[36504]: I1203 22:23:01.723966 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a1ed8445-bac4-4e69-9462-c1e33f646315-config-data-default\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.724485 master-0 kubenswrapper[36504]: I1203 22:23:01.724418 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.725047 master-0 kubenswrapper[36504]: I1203 22:23:01.724994 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a1ed8445-bac4-4e69-9462-c1e33f646315-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.725360 master-0 kubenswrapper[36504]: I1203 22:23:01.725280 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
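For the topolvm-backed PVCs the entries show the usual two-phase CSI flow: MountVolume.MountDevice records the staging (globalmount) path under /var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/..., and MountVolume.SetUp later publishes the volume into the pod's own volumes directory. Since the attacher reports the STAGE_UNSTAGE_VOLUME capability as not set, the staging mount itself is skipped and only the path is logged. One way to see what is actually mounted on the node is to scan /proc/self/mountinfo for those paths; the sketch below does that for a driver name given on the command line, defaulting to the topolvm.io seen above, and it has to run on the node itself (for example from a debug shell).

    // csimounts.go: list mount points belonging to a CSI driver by scanning
    // /proc/self/mountinfo. Sketch only; run it on the node, not remotely.
    package main

    import (
        "bufio"
        "fmt"
        "os"
        "strings"
    )

    func main() {
        driver := "topolvm.io" // driver name as logged above; override via argv[1]
        if len(os.Args) > 1 {
            driver = os.Args[1]
        }

        f, err := os.Open("/proc/self/mountinfo")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        defer f.Close()

        sc := bufio.NewScanner(f)
        for sc.Scan() {
            fields := strings.Fields(sc.Text())
            if len(fields) < 5 {
                continue
            }
            mountPoint := fields[4] // field 5 of mountinfo is the mount point
            // Staging mounts typically live at .../plugins/kubernetes.io/csi/<driver>/<hash>/globalmount,
            // per-pod publish mounts at .../pods/<uid>/volumes/kubernetes.io~csi/<pv-name>/mount.
            if strings.Contains(mountPoint, "kubernetes.io/csi/"+driver) ||
                strings.Contains(mountPoint, "kubernetes.io~csi") {
                fmt.Println(mountPoint)
            }
        }
    }

With staging skipped for this driver, only the per-pod publish mounts are expected to appear for these volumes; the globalmount path in the MountDevice message is recorded but not itself mounted.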
Dec 03 22:23:01.725360 master-0 kubenswrapper[36504]: I1203 22:23:01.725310 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5e84b778-c4f9-4bee-975e-4ea838dd3fbb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^781d4877-3b76-4d48-8c6b-fea84eab440c\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/9e746ef09a0925e575a99a20eb0af703537626faa84e80972528fbd50f010d80/globalmount\"" pod="openstack/openstack-galera-0" Dec 03 22:23:01.725792 master-0 kubenswrapper[36504]: I1203 22:23:01.725745 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1ed8445-bac4-4e69-9462-c1e33f646315-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.727057 master-0 kubenswrapper[36504]: I1203 22:23:01.727026 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6be1c526-0db0-4e85-9d73-8c78c20e4273-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.728844 master-0 kubenswrapper[36504]: I1203 22:23:01.728379 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.729749 master-0 kubenswrapper[36504]: I1203 22:23:01.729690 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1ed8445-bac4-4e69-9462-c1e33f646315-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.730929 master-0 kubenswrapper[36504]: I1203 22:23:01.730798 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6be1c526-0db0-4e85-9d73-8c78c20e4273-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.733151 master-0 kubenswrapper[36504]: I1203 22:23:01.732953 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1ed8445-bac4-4e69-9462-c1e33f646315-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:01.739477 master-0 kubenswrapper[36504]: I1203 22:23:01.739140 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6be1c526-0db0-4e85-9d73-8c78c20e4273-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.743713 master-0 kubenswrapper[36504]: I1203 22:23:01.743626 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtgwk\" (UniqueName: \"kubernetes.io/projected/6be1c526-0db0-4e85-9d73-8c78c20e4273-kube-api-access-vtgwk\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:01.744678 master-0 kubenswrapper[36504]: I1203 22:23:01.744622 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nvlx\" (UniqueName: \"kubernetes.io/projected/a1ed8445-bac4-4e69-9462-c1e33f646315-kube-api-access-7nvlx\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:02.875500 master-0 kubenswrapper[36504]: I1203 22:23:02.875407 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 22:23:02.878593 master-0 kubenswrapper[36504]: I1203 22:23:02.878519 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:02.882896 master-0 kubenswrapper[36504]: I1203 22:23:02.882794 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 22:23:02.883166 master-0 kubenswrapper[36504]: I1203 22:23:02.882808 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 22:23:02.898805 master-0 kubenswrapper[36504]: I1203 22:23:02.891096 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 22:23:02.898805 master-0 kubenswrapper[36504]: I1203 22:23:02.894328 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 22:23:03.008879 master-0 kubenswrapper[36504]: I1203 22:23:03.008781 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16589858-9fd3-467e-9b55-e3cefb7d7a1a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.009334 master-0 kubenswrapper[36504]: I1203 22:23:03.008922 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16589858-9fd3-467e-9b55-e3cefb7d7a1a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.009334 master-0 kubenswrapper[36504]: I1203 22:23:03.008974 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16589858-9fd3-467e-9b55-e3cefb7d7a1a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.009445 master-0 kubenswrapper[36504]: I1203 22:23:03.009358 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16589858-9fd3-467e-9b55-e3cefb7d7a1a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.009946 master-0 kubenswrapper[36504]: I1203 22:23:03.009495 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16589858-9fd3-467e-9b55-e3cefb7d7a1a-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.009946 master-0 kubenswrapper[36504]: I1203 22:23:03.009695 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8fa5c2d0-faab-4f51-a709-cf046187b31a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3e9d533b-1962-440e-9d2b-cef636003e2f\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.009946 master-0 kubenswrapper[36504]: I1203 22:23:03.009828 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwpnx\" (UniqueName: \"kubernetes.io/projected/16589858-9fd3-467e-9b55-e3cefb7d7a1a-kube-api-access-fwpnx\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.009946 master-0 kubenswrapper[36504]: I1203 22:23:03.009902 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16589858-9fd3-467e-9b55-e3cefb7d7a1a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.114753 master-0 kubenswrapper[36504]: I1203 22:23:03.113828 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8fa5c2d0-faab-4f51-a709-cf046187b31a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3e9d533b-1962-440e-9d2b-cef636003e2f\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.114753 master-0 kubenswrapper[36504]: I1203 22:23:03.113920 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwpnx\" (UniqueName: \"kubernetes.io/projected/16589858-9fd3-467e-9b55-e3cefb7d7a1a-kube-api-access-fwpnx\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.114753 master-0 kubenswrapper[36504]: I1203 22:23:03.113961 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16589858-9fd3-467e-9b55-e3cefb7d7a1a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.114753 master-0 kubenswrapper[36504]: I1203 22:23:03.114016 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16589858-9fd3-467e-9b55-e3cefb7d7a1a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.114753 master-0 kubenswrapper[36504]: I1203 22:23:03.114047 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16589858-9fd3-467e-9b55-e3cefb7d7a1a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.116429 master-0 kubenswrapper[36504]: I1203 22:23:03.116288 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16589858-9fd3-467e-9b55-e3cefb7d7a1a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.117616 master-0 kubenswrapper[36504]: I1203 22:23:03.116978 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16589858-9fd3-467e-9b55-e3cefb7d7a1a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.117616 master-0 kubenswrapper[36504]: I1203 22:23:03.117122 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16589858-9fd3-467e-9b55-e3cefb7d7a1a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.117724 master-0 kubenswrapper[36504]: I1203 22:23:03.117677 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16589858-9fd3-467e-9b55-e3cefb7d7a1a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.118185 master-0 kubenswrapper[36504]: I1203 22:23:03.118140 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/16589858-9fd3-467e-9b55-e3cefb7d7a1a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.119470 master-0 kubenswrapper[36504]: I1203 22:23:03.119171 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/16589858-9fd3-467e-9b55-e3cefb7d7a1a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.120195 master-0 kubenswrapper[36504]: I1203 22:23:03.120126 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:23:03.120195 master-0 kubenswrapper[36504]: I1203 22:23:03.120163 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8fa5c2d0-faab-4f51-a709-cf046187b31a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3e9d533b-1962-440e-9d2b-cef636003e2f\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/96e698e975d703a9580dd2a4506d814ad4903051413d89e522654918e50f83c2/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.120367 master-0 kubenswrapper[36504]: I1203 22:23:03.120325 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/16589858-9fd3-467e-9b55-e3cefb7d7a1a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.124529 master-0 kubenswrapper[36504]: I1203 22:23:03.124482 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/16589858-9fd3-467e-9b55-e3cefb7d7a1a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.128905 master-0 kubenswrapper[36504]: I1203 22:23:03.128766 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16589858-9fd3-467e-9b55-e3cefb7d7a1a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.135984 master-0 kubenswrapper[36504]: I1203 22:23:03.135205 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwpnx\" (UniqueName: \"kubernetes.io/projected/16589858-9fd3-467e-9b55-e3cefb7d7a1a-kube-api-access-fwpnx\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:03.308644 master-0 kubenswrapper[36504]: I1203 22:23:03.308596 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-095faa9e-409b-48e9-8678-291111c98615\" (UniqueName: \"kubernetes.io/csi/topolvm.io^05a7a762-489b-438f-bebc-5cc01a276c64\") pod \"rabbitmq-cell1-server-0\" (UID: \"6be1c526-0db0-4e85-9d73-8c78c20e4273\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:03.390969 master-0 kubenswrapper[36504]: I1203 22:23:03.386503 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:23:04.424265 master-0 kubenswrapper[36504]: I1203 22:23:04.421133 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:23:04.427594 master-0 kubenswrapper[36504]: I1203 22:23:04.426839 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.432408 master-0 kubenswrapper[36504]: I1203 22:23:04.430586 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 22:23:04.432408 master-0 kubenswrapper[36504]: I1203 22:23:04.430722 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 22:23:04.432408 master-0 kubenswrapper[36504]: I1203 22:23:04.430839 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 22:23:04.435056 master-0 kubenswrapper[36504]: I1203 22:23:04.433825 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 22:23:04.436825 master-0 kubenswrapper[36504]: I1203 22:23:04.436051 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:23:04.502537 master-0 kubenswrapper[36504]: I1203 22:23:04.502348 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 22:23:04.527505 master-0 kubenswrapper[36504]: I1203 22:23:04.527399 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5e84b778-c4f9-4bee-975e-4ea838dd3fbb\" (UniqueName: \"kubernetes.io/csi/topolvm.io^781d4877-3b76-4d48-8c6b-fea84eab440c\") pod \"openstack-galera-0\" (UID: \"a1ed8445-bac4-4e69-9462-c1e33f646315\") " pod="openstack/openstack-galera-0" Dec 03 22:23:04.581742 master-0 kubenswrapper[36504]: I1203 22:23:04.581161 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.581742 master-0 kubenswrapper[36504]: I1203 22:23:04.581277 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.581742 master-0 kubenswrapper[36504]: I1203 22:23:04.581360 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.581742 master-0 kubenswrapper[36504]: I1203 22:23:04.581428 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.581742 master-0 kubenswrapper[36504]: I1203 22:23:04.581456 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/19bb9bba-1e31-4f41-9d29-202764cfd498-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.581742 master-0 kubenswrapper[36504]: I1203 22:23:04.581507 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-config\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.581742 master-0 kubenswrapper[36504]: I1203 22:23:04.581540 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19bb9bba-1e31-4f41-9d29-202764cfd498-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.581742 master-0 kubenswrapper[36504]: I1203 22:23:04.581584 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pfbj\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-kube-api-access-5pfbj\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.598115 master-0 kubenswrapper[36504]: I1203 22:23:04.598004 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 22:23:04.684876 master-0 kubenswrapper[36504]: I1203 22:23:04.684628 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pfbj\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-kube-api-access-5pfbj\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.684876 master-0 kubenswrapper[36504]: I1203 22:23:04.684752 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.684876 master-0 kubenswrapper[36504]: I1203 22:23:04.684833 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.684876 master-0 kubenswrapper[36504]: I1203 22:23:04.684883 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.685603 master-0 kubenswrapper[36504]: I1203 22:23:04.685574 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.687098 master-0 kubenswrapper[36504]: I1203 22:23:04.686330 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19bb9bba-1e31-4f41-9d29-202764cfd498-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.687098 master-0 kubenswrapper[36504]: I1203 22:23:04.686419 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-config\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.687098 master-0 kubenswrapper[36504]: I1203 22:23:04.686518 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19bb9bba-1e31-4f41-9d29-202764cfd498-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.687523 master-0 kubenswrapper[36504]: I1203 22:23:04.687480 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19bb9bba-1e31-4f41-9d29-202764cfd498-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.688414 master-0 kubenswrapper[36504]: I1203 22:23:04.688388 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:23:04.688502 master-0 kubenswrapper[36504]: I1203 22:23:04.688433 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/605be9bb1c2aead55319b869602437907f88623b26353397612d1a0c6a1e99b3/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.689435 master-0 kubenswrapper[36504]: I1203 22:23:04.689407 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.693857 master-0 kubenswrapper[36504]: I1203 22:23:04.690666 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-config\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.693857 master-0 kubenswrapper[36504]: I1203 22:23:04.690794 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.693857 master-0 kubenswrapper[36504]: I1203 22:23:04.692020 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19bb9bba-1e31-4f41-9d29-202764cfd498-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.698755 master-0 kubenswrapper[36504]: I1203 22:23:04.698714 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.721548 master-0 kubenswrapper[36504]: I1203 22:23:04.721469 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pfbj\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-kube-api-access-5pfbj\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:04.946213 master-0 kubenswrapper[36504]: I1203 22:23:04.946021 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 22:23:04.948543 master-0 kubenswrapper[36504]: I1203 22:23:04.948479 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:04.962990 master-0 kubenswrapper[36504]: I1203 22:23:04.956078 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 22:23:04.962990 master-0 kubenswrapper[36504]: I1203 22:23:04.956206 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 22:23:04.987016 master-0 kubenswrapper[36504]: I1203 22:23:04.986946 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 22:23:04.987364 master-0 kubenswrapper[36504]: I1203 22:23:04.987214 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 22:23:05.008580 master-0 kubenswrapper[36504]: I1203 22:23:05.008492 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 22:23:05.212045 master-0 kubenswrapper[36504]: I1203 22:23:05.209133 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6523a229-5417-4476-bf50-cdade89475cd-config\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.212045 master-0 kubenswrapper[36504]: I1203 22:23:05.209200 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6523a229-5417-4476-bf50-cdade89475cd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.212045 master-0 kubenswrapper[36504]: I1203 22:23:05.209237 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-055690ef-36e0-4f8e-84d6-fd0ca8558109\" (UniqueName: \"kubernetes.io/csi/topolvm.io^bb4a9c60-bf21-4084-86f0-bf11320d2412\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.216096 master-0 kubenswrapper[36504]: I1203 22:23:05.216051 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twt6c\" (UniqueName: \"kubernetes.io/projected/6523a229-5417-4476-bf50-cdade89475cd-kube-api-access-twt6c\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.216229 master-0 kubenswrapper[36504]: I1203 22:23:05.216164 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523a229-5417-4476-bf50-cdade89475cd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.216291 master-0 kubenswrapper[36504]: I1203 22:23:05.216260 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523a229-5417-4476-bf50-cdade89475cd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.216329 master-0 kubenswrapper[36504]: I1203 22:23:05.216294 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6523a229-5417-4476-bf50-cdade89475cd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.216378 master-0 kubenswrapper[36504]: I1203 22:23:05.216343 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6523a229-5417-4476-bf50-cdade89475cd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.318664 master-0 kubenswrapper[36504]: I1203 22:23:05.318554 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523a229-5417-4476-bf50-cdade89475cd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.318664 master-0 kubenswrapper[36504]: I1203 22:23:05.318651 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523a229-5417-4476-bf50-cdade89475cd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.319145 master-0 kubenswrapper[36504]: I1203 22:23:05.318687 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6523a229-5417-4476-bf50-cdade89475cd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.319145 master-0 kubenswrapper[36504]: I1203 22:23:05.318723 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6523a229-5417-4476-bf50-cdade89475cd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.319493 master-0 kubenswrapper[36504]: I1203 22:23:05.319436 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6523a229-5417-4476-bf50-cdade89475cd-config\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.319493 master-0 kubenswrapper[36504]: I1203 22:23:05.319486 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6523a229-5417-4476-bf50-cdade89475cd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.319670 master-0 kubenswrapper[36504]: I1203 22:23:05.319547 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-055690ef-36e0-4f8e-84d6-fd0ca8558109\" (UniqueName: \"kubernetes.io/csi/topolvm.io^bb4a9c60-bf21-4084-86f0-bf11320d2412\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.319755 master-0 kubenswrapper[36504]: I1203 22:23:05.319704 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twt6c\" (UniqueName: 
\"kubernetes.io/projected/6523a229-5417-4476-bf50-cdade89475cd-kube-api-access-twt6c\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.322760 master-0 kubenswrapper[36504]: I1203 22:23:05.320780 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6523a229-5417-4476-bf50-cdade89475cd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.322760 master-0 kubenswrapper[36504]: I1203 22:23:05.322217 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6523a229-5417-4476-bf50-cdade89475cd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.323705 master-0 kubenswrapper[36504]: I1203 22:23:05.323584 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6523a229-5417-4476-bf50-cdade89475cd-config\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.325441 master-0 kubenswrapper[36504]: I1203 22:23:05.324944 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:23:05.325441 master-0 kubenswrapper[36504]: I1203 22:23:05.324983 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-055690ef-36e0-4f8e-84d6-fd0ca8558109\" (UniqueName: \"kubernetes.io/csi/topolvm.io^bb4a9c60-bf21-4084-86f0-bf11320d2412\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/16a45463a9568220de4744b7ebe80219bae163c8b618915715cc0bb95db1bbab/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.325566 master-0 kubenswrapper[36504]: I1203 22:23:05.325456 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6523a229-5417-4476-bf50-cdade89475cd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.327265 master-0 kubenswrapper[36504]: I1203 22:23:05.327204 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523a229-5417-4476-bf50-cdade89475cd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.333178 master-0 kubenswrapper[36504]: I1203 22:23:05.333137 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6523a229-5417-4476-bf50-cdade89475cd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.618428 master-0 kubenswrapper[36504]: I1203 22:23:05.618306 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twt6c\" (UniqueName: \"kubernetes.io/projected/6523a229-5417-4476-bf50-cdade89475cd-kube-api-access-twt6c\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " 
pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:05.952209 master-0 kubenswrapper[36504]: I1203 22:23:05.945593 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8fa5c2d0-faab-4f51-a709-cf046187b31a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3e9d533b-1962-440e-9d2b-cef636003e2f\") pod \"openstack-cell1-galera-0\" (UID: \"16589858-9fd3-467e-9b55-e3cefb7d7a1a\") " pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:06.222566 master-0 kubenswrapper[36504]: I1203 22:23:06.222184 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:07.449521 master-0 kubenswrapper[36504]: I1203 22:23:07.449420 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") pod \"prometheus-metric-storage-0\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:07.518617 master-0 kubenswrapper[36504]: I1203 22:23:07.518558 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:23:08.650658 master-0 kubenswrapper[36504]: I1203 22:23:08.650548 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 22:23:08.654669 master-0 kubenswrapper[36504]: I1203 22:23:08.654542 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.664484 master-0 kubenswrapper[36504]: I1203 22:23:08.662089 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 22:23:08.664484 master-0 kubenswrapper[36504]: I1203 22:23:08.662261 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 22:23:08.664484 master-0 kubenswrapper[36504]: I1203 22:23:08.663010 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 22:23:08.672641 master-0 kubenswrapper[36504]: I1203 22:23:08.672561 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 22:23:08.756805 master-0 kubenswrapper[36504]: I1203 22:23:08.756698 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-config\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.756805 master-0 kubenswrapper[36504]: I1203 22:23:08.756804 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.757160 master-0 kubenswrapper[36504]: I1203 22:23:08.756889 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.757160 master-0 kubenswrapper[36504]: I1203 
22:23:08.756958 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.757340 master-0 kubenswrapper[36504]: I1203 22:23:08.757276 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cnhv\" (UniqueName: \"kubernetes.io/projected/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-kube-api-access-6cnhv\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.757474 master-0 kubenswrapper[36504]: I1203 22:23:08.757446 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.757735 master-0 kubenswrapper[36504]: I1203 22:23:08.757697 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3a2a2bf3-6578-4231-bd46-365ce985fdb6\" (UniqueName: \"kubernetes.io/csi/topolvm.io^72fb1ff3-62b9-4f45-8dfe-ebb9254fc9d3\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.757949 master-0 kubenswrapper[36504]: I1203 22:23:08.757919 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.860498 master-0 kubenswrapper[36504]: I1203 22:23:08.860415 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.860498 master-0 kubenswrapper[36504]: I1203 22:23:08.860501 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-config\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.860883 master-0 kubenswrapper[36504]: I1203 22:23:08.860531 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.860883 master-0 kubenswrapper[36504]: I1203 22:23:08.860574 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.860883 master-0 kubenswrapper[36504]: 
I1203 22:23:08.860626 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.860883 master-0 kubenswrapper[36504]: I1203 22:23:08.860724 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cnhv\" (UniqueName: \"kubernetes.io/projected/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-kube-api-access-6cnhv\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.860883 master-0 kubenswrapper[36504]: I1203 22:23:08.860802 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.860883 master-0 kubenswrapper[36504]: I1203 22:23:08.860880 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3a2a2bf3-6578-4231-bd46-365ce985fdb6\" (UniqueName: \"kubernetes.io/csi/topolvm.io^72fb1ff3-62b9-4f45-8dfe-ebb9254fc9d3\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.862331 master-0 kubenswrapper[36504]: I1203 22:23:08.862263 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.862683 master-0 kubenswrapper[36504]: I1203 22:23:08.862638 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.864048 master-0 kubenswrapper[36504]: I1203 22:23:08.863979 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-config\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.864349 master-0 kubenswrapper[36504]: I1203 22:23:08.864308 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:23:08.864491 master-0 kubenswrapper[36504]: I1203 22:23:08.864376 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3a2a2bf3-6578-4231-bd46-365ce985fdb6\" (UniqueName: \"kubernetes.io/csi/topolvm.io^72fb1ff3-62b9-4f45-8dfe-ebb9254fc9d3\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a21aa49d5c64e1af7e4fff35d96f9756b6e19d2ec2821beae22a63c5dd618c02/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.866981 master-0 kubenswrapper[36504]: I1203 22:23:08.866595 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.867295 master-0 kubenswrapper[36504]: I1203 22:23:08.867242 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.881066 master-0 kubenswrapper[36504]: I1203 22:23:08.881004 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:08.887950 master-0 kubenswrapper[36504]: I1203 22:23:08.887817 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cnhv\" (UniqueName: \"kubernetes.io/projected/d26acd77-0fe5-4aa6-990e-e62fc1cd1da7-kube-api-access-6cnhv\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:09.067481 master-0 kubenswrapper[36504]: I1203 22:23:09.067409 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-055690ef-36e0-4f8e-84d6-fd0ca8558109\" (UniqueName: \"kubernetes.io/csi/topolvm.io^bb4a9c60-bf21-4084-86f0-bf11320d2412\") pod \"ovsdbserver-nb-0\" (UID: \"6523a229-5417-4476-bf50-cdade89475cd\") " pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:09.209220 master-0 kubenswrapper[36504]: I1203 22:23:09.209142 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:10.567680 master-0 kubenswrapper[36504]: I1203 22:23:10.567610 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3a2a2bf3-6578-4231-bd46-365ce985fdb6\" (UniqueName: \"kubernetes.io/csi/topolvm.io^72fb1ff3-62b9-4f45-8dfe-ebb9254fc9d3\") pod \"ovsdbserver-sb-0\" (UID: \"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:10.797587 master-0 kubenswrapper[36504]: I1203 22:23:10.797255 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:19.084636 master-0 kubenswrapper[36504]: I1203 22:23:19.083343 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 22:23:19.316832 master-0 kubenswrapper[36504]: I1203 22:23:19.315820 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 22:23:19.657853 master-0 kubenswrapper[36504]: I1203 22:23:19.655981 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" event={"ID":"4339d191-869f-43be-80f4-92e25c8f6a7a","Type":"ContainerStarted","Data":"ffb65200d1a41c4029474da0ba9177300ae0f84bf709ff97c7f828181a9c86a6"} Dec 03 22:23:19.659084 master-0 kubenswrapper[36504]: I1203 22:23:19.658199 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" event={"ID":"fa0d6984-46f5-4f29-ba6e-75ac1a90606b","Type":"ContainerStarted","Data":"9cd7b0042a846b98339790dfd1bf87d7afae20db96b6cdd86ceeead0504b8f0a"} Dec 03 22:23:19.659579 master-0 kubenswrapper[36504]: I1203 22:23:19.659537 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8b81647a-8849-4ff3-ad1b-08f0aaaa5657","Type":"ContainerStarted","Data":"7dfe33afea56c497ed6b8443c36bfcfba8b9928d75332e6ccccb6bd6152ed222"} Dec 03 22:23:19.661984 master-0 kubenswrapper[36504]: I1203 22:23:19.661941 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6be1c526-0db0-4e85-9d73-8c78c20e4273","Type":"ContainerStarted","Data":"959b0632c5b37baaf5937897d55ebc8e25e0d309ce407356a1595fdb5b777a68"} Dec 03 22:23:19.668870 master-0 kubenswrapper[36504]: I1203 22:23:19.665118 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" event={"ID":"b6d8084e-5235-4191-8b31-5dd067f81b91","Type":"ContainerStarted","Data":"934a2e85ade5d8389f5dc5a3cd93c6b375c4354c4125315faec6c63fd3664c07"} Dec 03 22:23:19.668870 master-0 kubenswrapper[36504]: I1203 22:23:19.665297 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" podUID="b6d8084e-5235-4191-8b31-5dd067f81b91" containerName="init" containerID="cri-o://934a2e85ade5d8389f5dc5a3cd93c6b375c4354c4125315faec6c63fd3664c07" gracePeriod=10 Dec 03 22:23:19.668870 master-0 kubenswrapper[36504]: I1203 22:23:19.668188 36504 generic.go:334] "Generic (PLEG): container finished" podID="9d60333c-6dc0-4509-9201-7e296daf6f73" containerID="56d0065ecfcd151ed05857ec02e3f63a4e702e026fb93babf430c335683b0e1f" exitCode=0 Dec 03 22:23:19.668870 master-0 kubenswrapper[36504]: I1203 22:23:19.668220 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" event={"ID":"9d60333c-6dc0-4509-9201-7e296daf6f73","Type":"ContainerDied","Data":"56d0065ecfcd151ed05857ec02e3f63a4e702e026fb93babf430c335683b0e1f"} Dec 03 22:23:19.816295 master-0 kubenswrapper[36504]: I1203 22:23:19.816135 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:23:19.830561 master-0 kubenswrapper[36504]: I1203 22:23:19.829028 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 22:23:19.856185 master-0 kubenswrapper[36504]: I1203 22:23:19.853863 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 22:23:20.097263 master-0 kubenswrapper[36504]: I1203 
22:23:20.097189 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:23:20.174697 master-0 kubenswrapper[36504]: I1203 22:23:20.174610 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 22:23:20.210101 master-0 kubenswrapper[36504]: W1203 22:23:20.210019 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1ed8445_bac4_4e69_9462_c1e33f646315.slice/crio-5123652617e4f66cdbb4aef614734cd4f3dfed34fa65c4564425b31229dfb96e WatchSource:0}: Error finding container 5123652617e4f66cdbb4aef614734cd4f3dfed34fa65c4564425b31229dfb96e: Status 404 returned error can't find the container with id 5123652617e4f66cdbb4aef614734cd4f3dfed34fa65c4564425b31229dfb96e Dec 03 22:23:20.349707 master-0 kubenswrapper[36504]: I1203 22:23:20.349626 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 22:23:20.362314 master-0 kubenswrapper[36504]: W1203 22:23:20.361128 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6523a229_5417_4476_bf50_cdade89475cd.slice/crio-ac07f862d019ef889311d487a1ac0b20f536738ad8d0098513270456bad5fab2 WatchSource:0}: Error finding container ac07f862d019ef889311d487a1ac0b20f536738ad8d0098513270456bad5fab2: Status 404 returned error can't find the container with id ac07f862d019ef889311d487a1ac0b20f536738ad8d0098513270456bad5fab2 Dec 03 22:23:20.588383 master-0 kubenswrapper[36504]: I1203 22:23:20.588261 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-54zjs"] Dec 03 22:23:20.664010 master-0 kubenswrapper[36504]: I1203 22:23:20.663922 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:23:20.687837 master-0 kubenswrapper[36504]: I1203 22:23:20.687721 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 22:23:20.697239 master-0 kubenswrapper[36504]: I1203 22:23:20.697148 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a1ed8445-bac4-4e69-9462-c1e33f646315","Type":"ContainerStarted","Data":"5123652617e4f66cdbb4aef614734cd4f3dfed34fa65c4564425b31229dfb96e"} Dec 03 22:23:20.700343 master-0 kubenswrapper[36504]: I1203 22:23:20.700280 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed","Type":"ContainerStarted","Data":"e2e74a9bd14f19efa3c1bf6514a57b8a7c25715a330c263fac54084c70ea3363"} Dec 03 22:23:20.710399 master-0 kubenswrapper[36504]: I1203 22:23:20.710221 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zqx7b"] Dec 03 22:23:20.725150 master-0 kubenswrapper[36504]: I1203 22:23:20.725085 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 22:23:20.726245 master-0 kubenswrapper[36504]: I1203 22:23:20.726183 36504 generic.go:334] "Generic (PLEG): container finished" podID="4339d191-869f-43be-80f4-92e25c8f6a7a" containerID="ffb65200d1a41c4029474da0ba9177300ae0f84bf709ff97c7f828181a9c86a6" exitCode=0 Dec 03 22:23:20.726423 master-0 kubenswrapper[36504]: I1203 22:23:20.726358 36504 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" event={"ID":"4339d191-869f-43be-80f4-92e25c8f6a7a","Type":"ContainerDied","Data":"ffb65200d1a41c4029474da0ba9177300ae0f84bf709ff97c7f828181a9c86a6"} Dec 03 22:23:20.734323 master-0 kubenswrapper[36504]: I1203 22:23:20.734223 36504 generic.go:334] "Generic (PLEG): container finished" podID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" containerID="9cd7b0042a846b98339790dfd1bf87d7afae20db96b6cdd86ceeead0504b8f0a" exitCode=0 Dec 03 22:23:20.734452 master-0 kubenswrapper[36504]: I1203 22:23:20.734345 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" event={"ID":"fa0d6984-46f5-4f29-ba6e-75ac1a90606b","Type":"ContainerDied","Data":"9cd7b0042a846b98339790dfd1bf87d7afae20db96b6cdd86ceeead0504b8f0a"} Dec 03 22:23:20.738313 master-0 kubenswrapper[36504]: I1203 22:23:20.738262 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1284fac2-9956-456c-9781-135e637e85bd","Type":"ContainerStarted","Data":"754d83a0e0adc7afafd87a3e7e658d96fe36ba2927baa6e1c4dc55a4f13e4606"} Dec 03 22:23:20.740551 master-0 kubenswrapper[36504]: I1203 22:23:20.740514 36504 generic.go:334] "Generic (PLEG): container finished" podID="b6d8084e-5235-4191-8b31-5dd067f81b91" containerID="934a2e85ade5d8389f5dc5a3cd93c6b375c4354c4125315faec6c63fd3664c07" exitCode=0 Dec 03 22:23:20.740615 master-0 kubenswrapper[36504]: I1203 22:23:20.740582 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" event={"ID":"b6d8084e-5235-4191-8b31-5dd067f81b91","Type":"ContainerDied","Data":"934a2e85ade5d8389f5dc5a3cd93c6b375c4354c4125315faec6c63fd3664c07"} Dec 03 22:23:20.747932 master-0 kubenswrapper[36504]: I1203 22:23:20.743070 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6523a229-5417-4476-bf50-cdade89475cd","Type":"ContainerStarted","Data":"ac07f862d019ef889311d487a1ac0b20f536738ad8d0098513270456bad5fab2"} Dec 03 22:23:20.749358 master-0 kubenswrapper[36504]: W1203 22:23:20.749304 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13962c28_06ea_4b66_aa46_f00d50e29eda.slice/crio-64e1246dea738c4f28e7e1315e2b80518963136b4c00d203588ce7d99763efdf WatchSource:0}: Error finding container 64e1246dea738c4f28e7e1315e2b80518963136b4c00d203588ce7d99763efdf: Status 404 returned error can't find the container with id 64e1246dea738c4f28e7e1315e2b80518963136b4c00d203588ce7d99763efdf Dec 03 22:23:20.749991 master-0 kubenswrapper[36504]: I1203 22:23:20.749848 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16589858-9fd3-467e-9b55-e3cefb7d7a1a","Type":"ContainerStarted","Data":"00afb6aa12680bf661e32eae5fbdd35af44dbab7d3617b1040304b4b6105730f"} Dec 03 22:23:20.760816 master-0 kubenswrapper[36504]: W1203 22:23:20.760725 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd26acd77_0fe5_4aa6_990e_e62fc1cd1da7.slice/crio-3eda914e25e5a38eed13720a5ae39bcbef828a106687628e803cf108fa63523c WatchSource:0}: Error finding container 3eda914e25e5a38eed13720a5ae39bcbef828a106687628e803cf108fa63523c: Status 404 returned error can't find the container with id 3eda914e25e5a38eed13720a5ae39bcbef828a106687628e803cf108fa63523c Dec 03 22:23:20.776850 master-0 kubenswrapper[36504]: W1203 
22:23:20.764960 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19bb9bba_1e31_4f41_9d29_202764cfd498.slice/crio-9005232759f9c32eb3d5b2ba7135ad9c8f275ae755a707e46ffaee9d3335daab WatchSource:0}: Error finding container 9005232759f9c32eb3d5b2ba7135ad9c8f275ae755a707e46ffaee9d3335daab: Status 404 returned error can't find the container with id 9005232759f9c32eb3d5b2ba7135ad9c8f275ae755a707e46ffaee9d3335daab Dec 03 22:23:20.776850 master-0 kubenswrapper[36504]: W1203 22:23:20.767274 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa8ce82_8c7e_4f58_8585_8ca5680b6951.slice/crio-a4e009fa5221e0809b0b37dd4c41c6093eb56ffc347ddabb2e27de5b93ad8269 WatchSource:0}: Error finding container a4e009fa5221e0809b0b37dd4c41c6093eb56ffc347ddabb2e27de5b93ad8269: Status 404 returned error can't find the container with id a4e009fa5221e0809b0b37dd4c41c6093eb56ffc347ddabb2e27de5b93ad8269 Dec 03 22:23:20.884740 master-0 kubenswrapper[36504]: I1203 22:23:20.884669 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:23:20.894015 master-0 kubenswrapper[36504]: I1203 22:23:20.893722 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:23:20.999913 master-0 kubenswrapper[36504]: I1203 22:23:20.998296 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-config\") pod \"b6d8084e-5235-4191-8b31-5dd067f81b91\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " Dec 03 22:23:21.000235 master-0 kubenswrapper[36504]: I1203 22:23:21.000058 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrd6g\" (UniqueName: \"kubernetes.io/projected/9d60333c-6dc0-4509-9201-7e296daf6f73-kube-api-access-mrd6g\") pod \"9d60333c-6dc0-4509-9201-7e296daf6f73\" (UID: \"9d60333c-6dc0-4509-9201-7e296daf6f73\") " Dec 03 22:23:21.000235 master-0 kubenswrapper[36504]: I1203 22:23:21.000084 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvf9l\" (UniqueName: \"kubernetes.io/projected/b6d8084e-5235-4191-8b31-5dd067f81b91-kube-api-access-pvf9l\") pod \"b6d8084e-5235-4191-8b31-5dd067f81b91\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " Dec 03 22:23:21.000320 master-0 kubenswrapper[36504]: I1203 22:23:21.000291 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d60333c-6dc0-4509-9201-7e296daf6f73-config\") pod \"9d60333c-6dc0-4509-9201-7e296daf6f73\" (UID: \"9d60333c-6dc0-4509-9201-7e296daf6f73\") " Dec 03 22:23:21.000354 master-0 kubenswrapper[36504]: I1203 22:23:21.000339 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-dns-svc\") pod \"b6d8084e-5235-4191-8b31-5dd067f81b91\" (UID: \"b6d8084e-5235-4191-8b31-5dd067f81b91\") " Dec 03 22:23:21.007443 master-0 kubenswrapper[36504]: I1203 22:23:21.007148 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d8084e-5235-4191-8b31-5dd067f81b91-kube-api-access-pvf9l" (OuterVolumeSpecName: 
"kube-api-access-pvf9l") pod "b6d8084e-5235-4191-8b31-5dd067f81b91" (UID: "b6d8084e-5235-4191-8b31-5dd067f81b91"). InnerVolumeSpecName "kube-api-access-pvf9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:21.010827 master-0 kubenswrapper[36504]: I1203 22:23:21.010513 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d60333c-6dc0-4509-9201-7e296daf6f73-kube-api-access-mrd6g" (OuterVolumeSpecName: "kube-api-access-mrd6g") pod "9d60333c-6dc0-4509-9201-7e296daf6f73" (UID: "9d60333c-6dc0-4509-9201-7e296daf6f73"). InnerVolumeSpecName "kube-api-access-mrd6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:21.032461 master-0 kubenswrapper[36504]: I1203 22:23:21.032359 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d60333c-6dc0-4509-9201-7e296daf6f73-config" (OuterVolumeSpecName: "config") pod "9d60333c-6dc0-4509-9201-7e296daf6f73" (UID: "9d60333c-6dc0-4509-9201-7e296daf6f73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:21.037962 master-0 kubenswrapper[36504]: I1203 22:23:21.037833 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-config" (OuterVolumeSpecName: "config") pod "b6d8084e-5235-4191-8b31-5dd067f81b91" (UID: "b6d8084e-5235-4191-8b31-5dd067f81b91"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:21.046312 master-0 kubenswrapper[36504]: I1203 22:23:21.046274 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6d8084e-5235-4191-8b31-5dd067f81b91" (UID: "b6d8084e-5235-4191-8b31-5dd067f81b91"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:21.103681 master-0 kubenswrapper[36504]: I1203 22:23:21.103617 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:21.103681 master-0 kubenswrapper[36504]: I1203 22:23:21.103671 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrd6g\" (UniqueName: \"kubernetes.io/projected/9d60333c-6dc0-4509-9201-7e296daf6f73-kube-api-access-mrd6g\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:21.103681 master-0 kubenswrapper[36504]: I1203 22:23:21.103685 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvf9l\" (UniqueName: \"kubernetes.io/projected/b6d8084e-5235-4191-8b31-5dd067f81b91-kube-api-access-pvf9l\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:21.103681 master-0 kubenswrapper[36504]: I1203 22:23:21.103696 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d60333c-6dc0-4509-9201-7e296daf6f73-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:21.104375 master-0 kubenswrapper[36504]: I1203 22:23:21.103706 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6d8084e-5235-4191-8b31-5dd067f81b91-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:21.765989 master-0 kubenswrapper[36504]: I1203 22:23:21.765903 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zqx7b" event={"ID":"13962c28-06ea-4b66-aa46-f00d50e29eda","Type":"ContainerStarted","Data":"64e1246dea738c4f28e7e1315e2b80518963136b4c00d203588ce7d99763efdf"} Dec 03 22:23:21.769687 master-0 kubenswrapper[36504]: I1203 22:23:21.769653 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7","Type":"ContainerStarted","Data":"3eda914e25e5a38eed13720a5ae39bcbef828a106687628e803cf108fa63523c"} Dec 03 22:23:21.772501 master-0 kubenswrapper[36504]: I1203 22:23:21.772432 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55ac1860-363d-42ff-90dd-3b6bf2e78864","Type":"ContainerStarted","Data":"e22b41565866c3b6018341c80206f0a0b1459224c9b5944a806fe11cace70980"} Dec 03 22:23:21.778126 master-0 kubenswrapper[36504]: I1203 22:23:21.778065 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" event={"ID":"b6d8084e-5235-4191-8b31-5dd067f81b91","Type":"ContainerDied","Data":"10b6c784cdbfb8fb6dc169103d560854c3e42b32c382f7a23dc1ee820525e3be"} Dec 03 22:23:21.778259 master-0 kubenswrapper[36504]: I1203 22:23:21.778165 36504 scope.go:117] "RemoveContainer" containerID="934a2e85ade5d8389f5dc5a3cd93c6b375c4354c4125315faec6c63fd3664c07" Dec 03 22:23:21.778259 master-0 kubenswrapper[36504]: I1203 22:23:21.778091 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d7c5dbd7-b4xzc" Dec 03 22:23:21.783526 master-0 kubenswrapper[36504]: I1203 22:23:21.783470 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" event={"ID":"9d60333c-6dc0-4509-9201-7e296daf6f73","Type":"ContainerDied","Data":"88e4eeeff208261b587dfdf7d52e5989dce926d1cfb0b2ccc0c4961b7607a043"} Dec 03 22:23:21.783678 master-0 kubenswrapper[36504]: I1203 22:23:21.783592 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dbfd7c4bf-tq96q" Dec 03 22:23:21.787349 master-0 kubenswrapper[36504]: I1203 22:23:21.786253 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-54zjs" event={"ID":"1aa8ce82-8c7e-4f58-8585-8ca5680b6951","Type":"ContainerStarted","Data":"a4e009fa5221e0809b0b37dd4c41c6093eb56ffc347ddabb2e27de5b93ad8269"} Dec 03 22:23:21.789513 master-0 kubenswrapper[36504]: I1203 22:23:21.789455 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerStarted","Data":"9005232759f9c32eb3d5b2ba7135ad9c8f275ae755a707e46ffaee9d3335daab"} Dec 03 22:23:21.857189 master-0 kubenswrapper[36504]: I1203 22:23:21.855039 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-b4xzc"] Dec 03 22:23:21.882445 master-0 kubenswrapper[36504]: I1203 22:23:21.882361 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75d7c5dbd7-b4xzc"] Dec 03 22:23:21.957967 master-0 kubenswrapper[36504]: I1203 22:23:21.957885 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-tq96q"] Dec 03 22:23:21.967511 master-0 kubenswrapper[36504]: I1203 22:23:21.967413 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dbfd7c4bf-tq96q"] Dec 03 22:23:22.811687 master-0 kubenswrapper[36504]: I1203 22:23:22.811630 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" event={"ID":"fa0d6984-46f5-4f29-ba6e-75ac1a90606b","Type":"ContainerStarted","Data":"b280b189253083a97441e0ef06b856672bdc7de7774b799c182ebdffa8bed6bf"} Dec 03 22:23:22.814428 master-0 kubenswrapper[36504]: I1203 22:23:22.813325 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:23:22.851097 master-0 kubenswrapper[36504]: I1203 22:23:22.850856 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" podStartSLOduration=5.857601822 podStartE2EDuration="31.850823279s" podCreationTimestamp="2025-12-03 22:22:51 +0000 UTC" firstStartedPulling="2025-12-03 22:22:52.915480873 +0000 UTC m=+738.135252880" lastFinishedPulling="2025-12-03 22:23:18.90870233 +0000 UTC m=+764.128474337" observedRunningTime="2025-12-03 22:23:22.845020876 +0000 UTC m=+768.064792903" watchObservedRunningTime="2025-12-03 22:23:22.850823279 +0000 UTC m=+768.070595316" Dec 03 22:23:23.109302 master-0 kubenswrapper[36504]: I1203 22:23:23.109126 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d60333c-6dc0-4509-9201-7e296daf6f73" path="/var/lib/kubelet/pods/9d60333c-6dc0-4509-9201-7e296daf6f73/volumes" Dec 03 22:23:23.114113 master-0 kubenswrapper[36504]: I1203 22:23:23.109846 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b6d8084e-5235-4191-8b31-5dd067f81b91" path="/var/lib/kubelet/pods/b6d8084e-5235-4191-8b31-5dd067f81b91/volumes" Dec 03 22:23:24.411593 master-0 kubenswrapper[36504]: I1203 22:23:24.411503 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-c8s82"] Dec 03 22:23:24.412373 master-0 kubenswrapper[36504]: E1203 22:23:24.412075 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d60333c-6dc0-4509-9201-7e296daf6f73" containerName="init" Dec 03 22:23:24.412373 master-0 kubenswrapper[36504]: I1203 22:23:24.412092 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d60333c-6dc0-4509-9201-7e296daf6f73" containerName="init" Dec 03 22:23:24.412373 master-0 kubenswrapper[36504]: E1203 22:23:24.412148 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d8084e-5235-4191-8b31-5dd067f81b91" containerName="init" Dec 03 22:23:24.412373 master-0 kubenswrapper[36504]: I1203 22:23:24.412155 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d8084e-5235-4191-8b31-5dd067f81b91" containerName="init" Dec 03 22:23:24.412373 master-0 kubenswrapper[36504]: I1203 22:23:24.412358 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d60333c-6dc0-4509-9201-7e296daf6f73" containerName="init" Dec 03 22:23:24.412536 master-0 kubenswrapper[36504]: I1203 22:23:24.412382 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d8084e-5235-4191-8b31-5dd067f81b91" containerName="init" Dec 03 22:23:24.431423 master-0 kubenswrapper[36504]: I1203 22:23:24.413164 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.431423 master-0 kubenswrapper[36504]: I1203 22:23:24.418795 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 22:23:24.431423 master-0 kubenswrapper[36504]: I1203 22:23:24.426246 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-c8s82"] Dec 03 22:23:24.508190 master-0 kubenswrapper[36504]: I1203 22:23:24.504743 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ca94a5-8da8-4b10-85f1-5dace852dcf4-combined-ca-bundle\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.508190 master-0 kubenswrapper[36504]: I1203 22:23:24.504835 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/41ca94a5-8da8-4b10-85f1-5dace852dcf4-ovn-rundir\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.508190 master-0 kubenswrapper[36504]: I1203 22:23:24.504893 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ca94a5-8da8-4b10-85f1-5dace852dcf4-config\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.508190 master-0 kubenswrapper[36504]: I1203 22:23:24.504919 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/41ca94a5-8da8-4b10-85f1-5dace852dcf4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.508190 master-0 kubenswrapper[36504]: I1203 22:23:24.505044 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/41ca94a5-8da8-4b10-85f1-5dace852dcf4-ovs-rundir\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.508190 master-0 kubenswrapper[36504]: I1203 22:23:24.505185 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94k7s\" (UniqueName: \"kubernetes.io/projected/41ca94a5-8da8-4b10-85f1-5dace852dcf4-kube-api-access-94k7s\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.608953 master-0 kubenswrapper[36504]: I1203 22:23:24.608783 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/41ca94a5-8da8-4b10-85f1-5dace852dcf4-ovs-rundir\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.608953 master-0 kubenswrapper[36504]: I1203 22:23:24.608969 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94k7s\" (UniqueName: \"kubernetes.io/projected/41ca94a5-8da8-4b10-85f1-5dace852dcf4-kube-api-access-94k7s\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.609346 master-0 kubenswrapper[36504]: I1203 22:23:24.609018 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ca94a5-8da8-4b10-85f1-5dace852dcf4-combined-ca-bundle\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.612812 master-0 kubenswrapper[36504]: I1203 22:23:24.610154 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/41ca94a5-8da8-4b10-85f1-5dace852dcf4-ovn-rundir\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.612812 master-0 kubenswrapper[36504]: I1203 22:23:24.610236 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ca94a5-8da8-4b10-85f1-5dace852dcf4-config\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.612812 master-0 kubenswrapper[36504]: I1203 22:23:24.610269 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ca94a5-8da8-4b10-85f1-5dace852dcf4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.612812 master-0 kubenswrapper[36504]: I1203 
22:23:24.609695 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/41ca94a5-8da8-4b10-85f1-5dace852dcf4-ovs-rundir\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.612812 master-0 kubenswrapper[36504]: I1203 22:23:24.610905 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/41ca94a5-8da8-4b10-85f1-5dace852dcf4-ovn-rundir\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.614863 master-0 kubenswrapper[36504]: I1203 22:23:24.614764 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ca94a5-8da8-4b10-85f1-5dace852dcf4-config\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.615169 master-0 kubenswrapper[36504]: I1203 22:23:24.615107 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ca94a5-8da8-4b10-85f1-5dace852dcf4-combined-ca-bundle\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.623871 master-0 kubenswrapper[36504]: I1203 22:23:24.618955 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ca94a5-8da8-4b10-85f1-5dace852dcf4-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.643727 master-0 kubenswrapper[36504]: I1203 22:23:24.643652 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94k7s\" (UniqueName: \"kubernetes.io/projected/41ca94a5-8da8-4b10-85f1-5dace852dcf4-kube-api-access-94k7s\") pod \"ovn-controller-metrics-c8s82\" (UID: \"41ca94a5-8da8-4b10-85f1-5dace852dcf4\") " pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.662526 master-0 kubenswrapper[36504]: I1203 22:23:24.662342 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f856ff86c-xkdfs"] Dec 03 22:23:24.714403 master-0 kubenswrapper[36504]: I1203 22:23:24.714198 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-xg7f9"] Dec 03 22:23:24.720553 master-0 kubenswrapper[36504]: I1203 22:23:24.720478 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:24.737173 master-0 kubenswrapper[36504]: I1203 22:23:24.732869 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 22:23:24.737173 master-0 kubenswrapper[36504]: I1203 22:23:24.735116 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-xg7f9"] Dec 03 22:23:24.786898 master-0 kubenswrapper[36504]: I1203 22:23:24.785495 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-c8s82" Dec 03 22:23:24.878546 master-0 kubenswrapper[36504]: I1203 22:23:24.878478 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" event={"ID":"4339d191-869f-43be-80f4-92e25c8f6a7a","Type":"ContainerStarted","Data":"c86a9efb2e0dc139556d5ddfc93cb88b46254b7d4ef587f8640e95dc5e4f165d"} Dec 03 22:23:24.881041 master-0 kubenswrapper[36504]: I1203 22:23:24.878724 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:23:24.940455 master-0 kubenswrapper[36504]: I1203 22:23:24.940225 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" podStartSLOduration=8.222489686 podStartE2EDuration="34.940195486s" podCreationTimestamp="2025-12-03 22:22:50 +0000 UTC" firstStartedPulling="2025-12-03 22:22:52.042294716 +0000 UTC m=+737.262066723" lastFinishedPulling="2025-12-03 22:23:18.760000516 +0000 UTC m=+763.979772523" observedRunningTime="2025-12-03 22:23:24.902854103 +0000 UTC m=+770.122626130" watchObservedRunningTime="2025-12-03 22:23:24.940195486 +0000 UTC m=+770.159967493" Dec 03 22:23:24.954044 master-0 kubenswrapper[36504]: I1203 22:23:24.950230 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-dns-svc\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:24.954044 master-0 kubenswrapper[36504]: I1203 22:23:24.950505 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qqjt\" (UniqueName: \"kubernetes.io/projected/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-kube-api-access-9qqjt\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:24.954044 master-0 kubenswrapper[36504]: I1203 22:23:24.950751 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:24.954044 master-0 kubenswrapper[36504]: I1203 22:23:24.950955 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-config\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:24.985807 master-0 kubenswrapper[36504]: I1203 22:23:24.979243 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-z6l5h"] Dec 03 22:23:24.985807 master-0 kubenswrapper[36504]: I1203 22:23:24.979656 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" podUID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" containerName="dnsmasq-dns" containerID="cri-o://b280b189253083a97441e0ef06b856672bdc7de7774b799c182ebdffa8bed6bf" gracePeriod=10 Dec 03 22:23:25.000983 master-0 kubenswrapper[36504]: I1203 22:23:24.998650 36504 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-5556b9959f-bv6bf"] Dec 03 22:23:25.000983 master-0 kubenswrapper[36504]: I1203 22:23:25.000827 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.005034 master-0 kubenswrapper[36504]: I1203 22:23:25.004295 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 22:23:25.016083 master-0 kubenswrapper[36504]: I1203 22:23:25.015442 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5556b9959f-bv6bf"] Dec 03 22:23:25.055012 master-0 kubenswrapper[36504]: I1203 22:23:25.054944 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:25.057490 master-0 kubenswrapper[36504]: I1203 22:23:25.055609 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-dns-svc\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.060628 master-0 kubenswrapper[36504]: I1203 22:23:25.056657 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-ovsdbserver-nb\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:25.060793 master-0 kubenswrapper[36504]: I1203 22:23:25.057738 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vqrc\" (UniqueName: \"kubernetes.io/projected/2be06279-64ce-428e-b10e-6242685edf0f-kube-api-access-4vqrc\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.061676 master-0 kubenswrapper[36504]: I1203 22:23:25.061655 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-config\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:25.061825 master-0 kubenswrapper[36504]: I1203 22:23:25.061805 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-sb\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.061930 master-0 kubenswrapper[36504]: I1203 22:23:25.061917 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-nb\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.063464 master-0 kubenswrapper[36504]: I1203 
22:23:25.063445 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-dns-svc\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:25.063688 master-0 kubenswrapper[36504]: I1203 22:23:25.063674 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-config\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.063811 master-0 kubenswrapper[36504]: I1203 22:23:25.063796 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qqjt\" (UniqueName: \"kubernetes.io/projected/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-kube-api-access-9qqjt\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:25.064031 master-0 kubenswrapper[36504]: I1203 22:23:25.063944 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-config\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:25.065658 master-0 kubenswrapper[36504]: I1203 22:23:25.065637 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-dns-svc\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:25.086704 master-0 kubenswrapper[36504]: I1203 22:23:25.086646 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qqjt\" (UniqueName: \"kubernetes.io/projected/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-kube-api-access-9qqjt\") pod \"dnsmasq-dns-5b4bcf6ff-xg7f9\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:25.158106 master-0 kubenswrapper[36504]: I1203 22:23:25.156139 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:25.173254 master-0 kubenswrapper[36504]: I1203 22:23:25.173168 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-dns-svc\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.173514 master-0 kubenswrapper[36504]: I1203 22:23:25.173388 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vqrc\" (UniqueName: \"kubernetes.io/projected/2be06279-64ce-428e-b10e-6242685edf0f-kube-api-access-4vqrc\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.173576 master-0 kubenswrapper[36504]: I1203 22:23:25.173560 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-sb\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.173632 master-0 kubenswrapper[36504]: I1203 22:23:25.173590 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-nb\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.174073 master-0 kubenswrapper[36504]: I1203 22:23:25.174044 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-config\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.176258 master-0 kubenswrapper[36504]: I1203 22:23:25.175989 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-dns-svc\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.176519 master-0 kubenswrapper[36504]: I1203 22:23:25.176481 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-config\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.178196 master-0 kubenswrapper[36504]: I1203 22:23:25.177001 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-sb\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.178196 master-0 kubenswrapper[36504]: I1203 22:23:25.177251 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-nb\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: 
\"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.198231 master-0 kubenswrapper[36504]: I1203 22:23:25.196295 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vqrc\" (UniqueName: \"kubernetes.io/projected/2be06279-64ce-428e-b10e-6242685edf0f-kube-api-access-4vqrc\") pod \"dnsmasq-dns-5556b9959f-bv6bf\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.442091 master-0 kubenswrapper[36504]: I1203 22:23:25.441996 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:25.780783 master-0 kubenswrapper[36504]: I1203 22:23:25.780714 36504 scope.go:117] "RemoveContainer" containerID="56d0065ecfcd151ed05857ec02e3f63a4e702e026fb93babf430c335683b0e1f" Dec 03 22:23:25.908862 master-0 kubenswrapper[36504]: I1203 22:23:25.908280 36504 generic.go:334] "Generic (PLEG): container finished" podID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" containerID="b280b189253083a97441e0ef06b856672bdc7de7774b799c182ebdffa8bed6bf" exitCode=0 Dec 03 22:23:25.909783 master-0 kubenswrapper[36504]: I1203 22:23:25.909733 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" podUID="4339d191-869f-43be-80f4-92e25c8f6a7a" containerName="dnsmasq-dns" containerID="cri-o://c86a9efb2e0dc139556d5ddfc93cb88b46254b7d4ef587f8640e95dc5e4f165d" gracePeriod=10 Dec 03 22:23:25.913251 master-0 kubenswrapper[36504]: I1203 22:23:25.913090 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" event={"ID":"fa0d6984-46f5-4f29-ba6e-75ac1a90606b","Type":"ContainerDied","Data":"b280b189253083a97441e0ef06b856672bdc7de7774b799c182ebdffa8bed6bf"} Dec 03 22:23:26.930230 master-0 kubenswrapper[36504]: I1203 22:23:26.930151 36504 generic.go:334] "Generic (PLEG): container finished" podID="4339d191-869f-43be-80f4-92e25c8f6a7a" containerID="c86a9efb2e0dc139556d5ddfc93cb88b46254b7d4ef587f8640e95dc5e4f165d" exitCode=0 Dec 03 22:23:26.930230 master-0 kubenswrapper[36504]: I1203 22:23:26.930227 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" event={"ID":"4339d191-869f-43be-80f4-92e25c8f6a7a","Type":"ContainerDied","Data":"c86a9efb2e0dc139556d5ddfc93cb88b46254b7d4ef587f8640e95dc5e4f165d"} Dec 03 22:23:28.096513 master-0 kubenswrapper[36504]: I1203 22:23:28.096422 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:23:29.508139 master-0 kubenswrapper[36504]: I1203 22:23:29.508072 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:23:29.630153 master-0 kubenswrapper[36504]: I1203 22:23:29.627378 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-config\") pod \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " Dec 03 22:23:29.630153 master-0 kubenswrapper[36504]: I1203 22:23:29.627602 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqswd\" (UniqueName: \"kubernetes.io/projected/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-kube-api-access-vqswd\") pod \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " Dec 03 22:23:29.630153 master-0 kubenswrapper[36504]: I1203 22:23:29.627669 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-dns-svc\") pod \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\" (UID: \"fa0d6984-46f5-4f29-ba6e-75ac1a90606b\") " Dec 03 22:23:29.632120 master-0 kubenswrapper[36504]: I1203 22:23:29.632065 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-kube-api-access-vqswd" (OuterVolumeSpecName: "kube-api-access-vqswd") pod "fa0d6984-46f5-4f29-ba6e-75ac1a90606b" (UID: "fa0d6984-46f5-4f29-ba6e-75ac1a90606b"). InnerVolumeSpecName "kube-api-access-vqswd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:29.711060 master-0 kubenswrapper[36504]: I1203 22:23:29.710998 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa0d6984-46f5-4f29-ba6e-75ac1a90606b" (UID: "fa0d6984-46f5-4f29-ba6e-75ac1a90606b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:29.722413 master-0 kubenswrapper[36504]: I1203 22:23:29.722336 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-config" (OuterVolumeSpecName: "config") pod "fa0d6984-46f5-4f29-ba6e-75ac1a90606b" (UID: "fa0d6984-46f5-4f29-ba6e-75ac1a90606b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:29.730641 master-0 kubenswrapper[36504]: I1203 22:23:29.730374 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:29.730641 master-0 kubenswrapper[36504]: I1203 22:23:29.730432 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqswd\" (UniqueName: \"kubernetes.io/projected/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-kube-api-access-vqswd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:29.730641 master-0 kubenswrapper[36504]: I1203 22:23:29.730443 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa0d6984-46f5-4f29-ba6e-75ac1a90606b-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:29.791565 master-0 kubenswrapper[36504]: I1203 22:23:29.791333 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:23:29.832187 master-0 kubenswrapper[36504]: I1203 22:23:29.832116 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brv6f\" (UniqueName: \"kubernetes.io/projected/4339d191-869f-43be-80f4-92e25c8f6a7a-kube-api-access-brv6f\") pod \"4339d191-869f-43be-80f4-92e25c8f6a7a\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " Dec 03 22:23:29.832404 master-0 kubenswrapper[36504]: I1203 22:23:29.832385 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-dns-svc\") pod \"4339d191-869f-43be-80f4-92e25c8f6a7a\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " Dec 03 22:23:29.832603 master-0 kubenswrapper[36504]: I1203 22:23:29.832527 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-config\") pod \"4339d191-869f-43be-80f4-92e25c8f6a7a\" (UID: \"4339d191-869f-43be-80f4-92e25c8f6a7a\") " Dec 03 22:23:29.844828 master-0 kubenswrapper[36504]: I1203 22:23:29.838798 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4339d191-869f-43be-80f4-92e25c8f6a7a-kube-api-access-brv6f" (OuterVolumeSpecName: "kube-api-access-brv6f") pod "4339d191-869f-43be-80f4-92e25c8f6a7a" (UID: "4339d191-869f-43be-80f4-92e25c8f6a7a"). InnerVolumeSpecName "kube-api-access-brv6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:29.915888 master-0 kubenswrapper[36504]: I1203 22:23:29.915796 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-config" (OuterVolumeSpecName: "config") pod "4339d191-869f-43be-80f4-92e25c8f6a7a" (UID: "4339d191-869f-43be-80f4-92e25c8f6a7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:29.941043 master-0 kubenswrapper[36504]: I1203 22:23:29.937901 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:29.941043 master-0 kubenswrapper[36504]: I1203 22:23:29.937948 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brv6f\" (UniqueName: \"kubernetes.io/projected/4339d191-869f-43be-80f4-92e25c8f6a7a-kube-api-access-brv6f\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:29.973374 master-0 kubenswrapper[36504]: I1203 22:23:29.973298 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" event={"ID":"4339d191-869f-43be-80f4-92e25c8f6a7a","Type":"ContainerDied","Data":"59ab8c5cb00a4eddfb83f74749073c881f2e5326523c7199bf205a43053e31ed"} Dec 03 22:23:29.973374 master-0 kubenswrapper[36504]: I1203 22:23:29.973362 36504 scope.go:117] "RemoveContainer" containerID="c86a9efb2e0dc139556d5ddfc93cb88b46254b7d4ef587f8640e95dc5e4f165d" Dec 03 22:23:29.974197 master-0 kubenswrapper[36504]: I1203 22:23:29.973455 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f856ff86c-xkdfs" Dec 03 22:23:29.977166 master-0 kubenswrapper[36504]: I1203 22:23:29.977123 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" event={"ID":"fa0d6984-46f5-4f29-ba6e-75ac1a90606b","Type":"ContainerDied","Data":"84d4c45a49ae3fed1fea680d50ca3895ef535b76410e8604a237cb58315c914e"} Dec 03 22:23:29.977268 master-0 kubenswrapper[36504]: I1203 22:23:29.977198 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" Dec 03 22:23:29.993344 master-0 kubenswrapper[36504]: I1203 22:23:29.993173 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4339d191-869f-43be-80f4-92e25c8f6a7a" (UID: "4339d191-869f-43be-80f4-92e25c8f6a7a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:30.042798 master-0 kubenswrapper[36504]: I1203 22:23:30.039888 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4339d191-869f-43be-80f4-92e25c8f6a7a-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:30.042798 master-0 kubenswrapper[36504]: I1203 22:23:30.040135 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-z6l5h"] Dec 03 22:23:30.049451 master-0 kubenswrapper[36504]: I1203 22:23:30.049385 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658bb5765c-z6l5h"] Dec 03 22:23:30.204579 master-0 kubenswrapper[36504]: I1203 22:23:30.204501 36504 scope.go:117] "RemoveContainer" containerID="ffb65200d1a41c4029474da0ba9177300ae0f84bf709ff97c7f828181a9c86a6" Dec 03 22:23:30.248639 master-0 kubenswrapper[36504]: I1203 22:23:30.248558 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-xg7f9"] Dec 03 22:23:30.354020 master-0 kubenswrapper[36504]: I1203 22:23:30.351226 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5556b9959f-bv6bf"] Dec 03 22:23:30.433344 master-0 kubenswrapper[36504]: I1203 22:23:30.433260 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-c8s82"] Dec 03 22:23:30.485249 master-0 kubenswrapper[36504]: I1203 22:23:30.485150 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f856ff86c-xkdfs"] Dec 03 22:23:30.501278 master-0 kubenswrapper[36504]: I1203 22:23:30.501185 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f856ff86c-xkdfs"] Dec 03 22:23:30.997445 master-0 kubenswrapper[36504]: I1203 22:23:30.997339 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8b81647a-8849-4ff3-ad1b-08f0aaaa5657","Type":"ContainerStarted","Data":"2028c936613cc91bc5da2403d407a8fb21d74d03bb8c5d20898873d3bcc81c2c"} Dec 03 22:23:30.998179 master-0 kubenswrapper[36504]: I1203 22:23:30.997573 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 22:23:31.027932 master-0 kubenswrapper[36504]: I1203 22:23:31.027790 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=29.521870019 podStartE2EDuration="36.027740605s" podCreationTimestamp="2025-12-03 22:22:55 +0000 UTC" 
firstStartedPulling="2025-12-03 22:23:19.230551695 +0000 UTC m=+764.450323702" lastFinishedPulling="2025-12-03 22:23:25.736422281 +0000 UTC m=+770.956194288" observedRunningTime="2025-12-03 22:23:31.0218459 +0000 UTC m=+776.241617907" watchObservedRunningTime="2025-12-03 22:23:31.027740605 +0000 UTC m=+776.247512612" Dec 03 22:23:31.112242 master-0 kubenswrapper[36504]: I1203 22:23:31.112146 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4339d191-869f-43be-80f4-92e25c8f6a7a" path="/var/lib/kubelet/pods/4339d191-869f-43be-80f4-92e25c8f6a7a/volumes" Dec 03 22:23:31.113307 master-0 kubenswrapper[36504]: I1203 22:23:31.113229 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" path="/var/lib/kubelet/pods/fa0d6984-46f5-4f29-ba6e-75ac1a90606b/volumes" Dec 03 22:23:31.358407 master-0 kubenswrapper[36504]: W1203 22:23:31.358215 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2be06279_64ce_428e_b10e_6242685edf0f.slice/crio-ebf58ac35deff604e242dc6b7a149aa538ec7a2b63bae6cba8353546f7567f6f WatchSource:0}: Error finding container ebf58ac35deff604e242dc6b7a149aa538ec7a2b63bae6cba8353546f7567f6f: Status 404 returned error can't find the container with id ebf58ac35deff604e242dc6b7a149aa538ec7a2b63bae6cba8353546f7567f6f Dec 03 22:23:31.446650 master-0 kubenswrapper[36504]: W1203 22:23:31.446445 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41ca94a5_8da8_4b10_85f1_5dace852dcf4.slice/crio-dbac95ad7664bc548180bd96c1332d8323f209d972742f87b54e8f78183c40ec WatchSource:0}: Error finding container dbac95ad7664bc548180bd96c1332d8323f209d972742f87b54e8f78183c40ec: Status 404 returned error can't find the container with id dbac95ad7664bc548180bd96c1332d8323f209d972742f87b54e8f78183c40ec Dec 03 22:23:31.545890 master-0 kubenswrapper[36504]: I1203 22:23:31.545835 36504 scope.go:117] "RemoveContainer" containerID="b280b189253083a97441e0ef06b856672bdc7de7774b799c182ebdffa8bed6bf" Dec 03 22:23:31.880373 master-0 kubenswrapper[36504]: I1203 22:23:31.880306 36504 scope.go:117] "RemoveContainer" containerID="9cd7b0042a846b98339790dfd1bf87d7afae20db96b6cdd86ceeead0504b8f0a" Dec 03 22:23:32.023672 master-0 kubenswrapper[36504]: I1203 22:23:32.023579 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6be1c526-0db0-4e85-9d73-8c78c20e4273","Type":"ContainerStarted","Data":"d931b4a9ee9e6c7444f8d71c51d35068dc1dd765108eeb277a4ed95fe606739e"} Dec 03 22:23:32.038437 master-0 kubenswrapper[36504]: I1203 22:23:32.025593 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" event={"ID":"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4","Type":"ContainerStarted","Data":"0f8f1ede98817c234333c78ed48ba735a8bc850c2807cf5a8129235e4bbc43f8"} Dec 03 22:23:32.038437 master-0 kubenswrapper[36504]: I1203 22:23:32.028517 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-c8s82" event={"ID":"41ca94a5-8da8-4b10-85f1-5dace852dcf4","Type":"ContainerStarted","Data":"dbac95ad7664bc548180bd96c1332d8323f209d972742f87b54e8f78183c40ec"} Dec 03 22:23:32.038437 master-0 kubenswrapper[36504]: I1203 22:23:32.029674 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" 
event={"ID":"2be06279-64ce-428e-b10e-6242685edf0f","Type":"ContainerStarted","Data":"ebf58ac35deff604e242dc6b7a149aa538ec7a2b63bae6cba8353546f7567f6f"} Dec 03 22:23:32.229246 master-0 kubenswrapper[36504]: I1203 22:23:32.229167 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-658bb5765c-z6l5h" podUID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.176:5353: i/o timeout" Dec 03 22:23:33.048868 master-0 kubenswrapper[36504]: I1203 22:23:33.047659 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-54zjs" event={"ID":"1aa8ce82-8c7e-4f58-8585-8ca5680b6951","Type":"ContainerStarted","Data":"770bfd32bb7ed6c972e8ca4d18ad424b1f95738c4b3164e03be72be317c344f5"} Dec 03 22:23:33.054790 master-0 kubenswrapper[36504]: I1203 22:23:33.053052 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6523a229-5417-4476-bf50-cdade89475cd","Type":"ContainerStarted","Data":"a2cc072293a4b8b4452b0b1f0be6de5f5e88dcfd08d8b209a9f1ac8f349210bb"} Dec 03 22:23:33.058291 master-0 kubenswrapper[36504]: I1203 22:23:33.057186 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zqx7b" event={"ID":"13962c28-06ea-4b66-aa46-f00d50e29eda","Type":"ContainerStarted","Data":"581df5167490695d9913981693836f4b8861f3c86a9cedb8d266a4625a28f246"} Dec 03 22:23:33.058291 master-0 kubenswrapper[36504]: I1203 22:23:33.057348 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zqx7b" Dec 03 22:23:33.060379 master-0 kubenswrapper[36504]: I1203 22:23:33.060218 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a1ed8445-bac4-4e69-9462-c1e33f646315","Type":"ContainerStarted","Data":"1dbdb4deba8c8efcf57012a3290c56b7fd84aa1c288d8d4282152edafe1396c4"} Dec 03 22:23:33.062480 master-0 kubenswrapper[36504]: I1203 22:23:33.062434 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7","Type":"ContainerStarted","Data":"e918f84a2ce973f71264121a8ce485cdbc138f1ce94cf99b6cb1531244036d45"} Dec 03 22:23:33.064757 master-0 kubenswrapper[36504]: I1203 22:23:33.064718 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1284fac2-9956-456c-9781-135e637e85bd","Type":"ContainerStarted","Data":"8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3"} Dec 03 22:23:33.064910 master-0 kubenswrapper[36504]: I1203 22:23:33.064859 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 22:23:33.066835 master-0 kubenswrapper[36504]: I1203 22:23:33.066752 36504 generic.go:334] "Generic (PLEG): container finished" podID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerID="6429a275acb5013bead8f19bcf570490cb7516920ad031acc6331769aa93b329" exitCode=0 Dec 03 22:23:33.066900 master-0 kubenswrapper[36504]: I1203 22:23:33.066873 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" event={"ID":"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4","Type":"ContainerDied","Data":"6429a275acb5013bead8f19bcf570490cb7516920ad031acc6331769aa93b329"} Dec 03 22:23:33.084367 master-0 kubenswrapper[36504]: I1203 22:23:33.074448 36504 generic.go:334] "Generic (PLEG): container finished" podID="2be06279-64ce-428e-b10e-6242685edf0f" 
containerID="fe98476cce556dd3262d8fcd8494f71da6bedc788c482f94c3b98b79cf8b31b7" exitCode=0 Dec 03 22:23:33.084367 master-0 kubenswrapper[36504]: I1203 22:23:33.074605 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" event={"ID":"2be06279-64ce-428e-b10e-6242685edf0f","Type":"ContainerDied","Data":"fe98476cce556dd3262d8fcd8494f71da6bedc788c482f94c3b98b79cf8b31b7"} Dec 03 22:23:33.091939 master-0 kubenswrapper[36504]: I1203 22:23:33.090370 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16589858-9fd3-467e-9b55-e3cefb7d7a1a","Type":"ContainerStarted","Data":"96b15bf05903c9566054a8d8794d08a4b5a5875086901bedf858cb90c951aaec"} Dec 03 22:23:33.121986 master-0 kubenswrapper[36504]: I1203 22:23:33.121843 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=23.351873357 podStartE2EDuration="35.121813511s" podCreationTimestamp="2025-12-03 22:22:58 +0000 UTC" firstStartedPulling="2025-12-03 22:23:19.896809585 +0000 UTC m=+765.116581592" lastFinishedPulling="2025-12-03 22:23:31.666749739 +0000 UTC m=+776.886521746" observedRunningTime="2025-12-03 22:23:33.103077142 +0000 UTC m=+778.322849159" watchObservedRunningTime="2025-12-03 22:23:33.121813511 +0000 UTC m=+778.341585518" Dec 03 22:23:33.170438 master-0 kubenswrapper[36504]: I1203 22:23:33.169905 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zqx7b" podStartSLOduration=26.352818716 podStartE2EDuration="35.169877881s" podCreationTimestamp="2025-12-03 22:22:58 +0000 UTC" firstStartedPulling="2025-12-03 22:23:20.756720852 +0000 UTC m=+765.976492859" lastFinishedPulling="2025-12-03 22:23:29.573780017 +0000 UTC m=+774.793552024" observedRunningTime="2025-12-03 22:23:33.156894113 +0000 UTC m=+778.376666130" watchObservedRunningTime="2025-12-03 22:23:33.169877881 +0000 UTC m=+778.389649888" Dec 03 22:23:34.112475 master-0 kubenswrapper[36504]: I1203 22:23:34.112410 36504 generic.go:334] "Generic (PLEG): container finished" podID="1aa8ce82-8c7e-4f58-8585-8ca5680b6951" containerID="770bfd32bb7ed6c972e8ca4d18ad424b1f95738c4b3164e03be72be317c344f5" exitCode=0 Dec 03 22:23:34.113428 master-0 kubenswrapper[36504]: I1203 22:23:34.112576 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-54zjs" event={"ID":"1aa8ce82-8c7e-4f58-8585-8ca5680b6951","Type":"ContainerDied","Data":"770bfd32bb7ed6c972e8ca4d18ad424b1f95738c4b3164e03be72be317c344f5"} Dec 03 22:23:35.130564 master-0 kubenswrapper[36504]: I1203 22:23:35.130371 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed","Type":"ContainerStarted","Data":"b06d77b769597bccba710e9daadeadfb465fe9daf309858d47da3df25d06f5b3"} Dec 03 22:23:35.135987 master-0 kubenswrapper[36504]: I1203 22:23:35.135605 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" event={"ID":"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4","Type":"ContainerStarted","Data":"6346a4ab8837b2f87a2edc9db97d7e0202ead3dccab811f25aa2838ccb62d005"} Dec 03 22:23:35.135987 master-0 kubenswrapper[36504]: I1203 22:23:35.135740 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:35.141135 master-0 kubenswrapper[36504]: I1203 22:23:35.139493 36504 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" event={"ID":"2be06279-64ce-428e-b10e-6242685edf0f","Type":"ContainerStarted","Data":"69dbd6c6a3c44d811956ccd24dfc2d79504d3799decacf2b66102791b8253b91"} Dec 03 22:23:35.141135 master-0 kubenswrapper[36504]: I1203 22:23:35.140522 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:35.144150 master-0 kubenswrapper[36504]: I1203 22:23:35.144102 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-54zjs" event={"ID":"1aa8ce82-8c7e-4f58-8585-8ca5680b6951","Type":"ContainerStarted","Data":"07c3289effb04107e5f1f4c3955a5c38e4b19ee65daabd7345d25eb4b7db80f7"} Dec 03 22:23:35.206596 master-0 kubenswrapper[36504]: I1203 22:23:35.206435 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" podStartSLOduration=11.206403738 podStartE2EDuration="11.206403738s" podCreationTimestamp="2025-12-03 22:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:23:35.19915475 +0000 UTC m=+780.418926767" watchObservedRunningTime="2025-12-03 22:23:35.206403738 +0000 UTC m=+780.426175745" Dec 03 22:23:35.224987 master-0 kubenswrapper[36504]: I1203 22:23:35.224742 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" podStartSLOduration=11.224712754 podStartE2EDuration="11.224712754s" podCreationTimestamp="2025-12-03 22:23:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:23:35.224328542 +0000 UTC m=+780.444100569" watchObservedRunningTime="2025-12-03 22:23:35.224712754 +0000 UTC m=+780.444484761" Dec 03 22:23:35.717803 master-0 kubenswrapper[36504]: E1203 22:23:35.714245 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:23:35.994436 master-0 kubenswrapper[36504]: I1203 22:23:35.994271 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 22:23:36.168322 master-0 kubenswrapper[36504]: I1203 22:23:36.168244 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerStarted","Data":"da97bf42436e285c9bb845abd9c57693d0e89d6eab143891326d1104005396f2"} Dec 03 22:23:36.177417 master-0 kubenswrapper[36504]: I1203 22:23:36.177346 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55ac1860-363d-42ff-90dd-3b6bf2e78864","Type":"ContainerStarted","Data":"01bb597e9de0c339f23df271e698add5189147ab7875d6d3bf0c690e20096ea1"} Dec 03 22:23:36.272465 master-0 kubenswrapper[36504]: I1203 22:23:36.271048 36504 trace.go:236] Trace[2084927500]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (03-Dec-2025 22:23:35.089) (total time: 1181ms): Dec 03 22:23:36.272465 master-0 
kubenswrapper[36504]: Trace[2084927500]: [1.181426661s] [1.181426661s] END Dec 03 22:23:36.431830 master-0 kubenswrapper[36504]: I1203 22:23:36.427731 36504 trace.go:236] Trace[2131560969]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (03-Dec-2025 22:23:35.090) (total time: 1337ms): Dec 03 22:23:36.431830 master-0 kubenswrapper[36504]: Trace[2131560969]: [1.337524778s] [1.337524778s] END Dec 03 22:23:36.608261 master-0 kubenswrapper[36504]: I1203 22:23:36.608212 36504 trace.go:236] Trace[1451175685]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (03-Dec-2025 22:23:35.088) (total time: 1519ms): Dec 03 22:23:36.608261 master-0 kubenswrapper[36504]: Trace[1451175685]: [1.519358024s] [1.519358024s] END Dec 03 22:23:36.770919 master-0 kubenswrapper[36504]: I1203 22:23:36.770875 36504 trace.go:236] Trace[1005924112]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (03-Dec-2025 22:23:35.087) (total time: 1683ms): Dec 03 22:23:36.770919 master-0 kubenswrapper[36504]: Trace[1005924112]: [1.683298074s] [1.683298074s] END Dec 03 22:23:37.191724 master-0 kubenswrapper[36504]: I1203 22:23:37.191366 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6523a229-5417-4476-bf50-cdade89475cd","Type":"ContainerStarted","Data":"202cbf7425dc46a2572d023744cacc0d90f07ecb4dad6a8bc5b8005ae1433b3c"} Dec 03 22:23:37.193130 master-0 kubenswrapper[36504]: I1203 22:23:37.193105 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"d26acd77-0fe5-4aa6-990e-e62fc1cd1da7","Type":"ContainerStarted","Data":"bc8f5ebdc75be30b6ea99e6e939f928fed688d091ee3fa7aa69cee36dae69a09"} Dec 03 22:23:37.195363 master-0 kubenswrapper[36504]: I1203 22:23:37.195333 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-c8s82" event={"ID":"41ca94a5-8da8-4b10-85f1-5dace852dcf4","Type":"ContainerStarted","Data":"2974b9311f4ddc554391daab69dec7cb9e83f0eefe45a261fe7928a839c58a99"} Dec 03 22:23:37.199671 master-0 kubenswrapper[36504]: I1203 22:23:37.199591 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-54zjs" event={"ID":"1aa8ce82-8c7e-4f58-8585-8ca5680b6951","Type":"ContainerStarted","Data":"3a224b08304137f223b7ac75642cf7629d9791e304e7dacde83cd9cd2acd7732"} Dec 03 22:23:37.200261 master-0 kubenswrapper[36504]: I1203 22:23:37.200236 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:23:37.200323 master-0 kubenswrapper[36504]: I1203 22:23:37.200269 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:23:37.222935 master-0 kubenswrapper[36504]: I1203 22:23:37.222814 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.104468251 podStartE2EDuration="38.222784352s" podCreationTimestamp="2025-12-03 22:22:59 +0000 UTC" firstStartedPulling="2025-12-03 22:23:20.366196748 +0000 UTC m=+765.585968755" lastFinishedPulling="2025-12-03 22:23:36.484512849 +0000 UTC m=+781.704284856" observedRunningTime="2025-12-03 22:23:37.220588663 +0000 UTC m=+782.440360730" watchObservedRunningTime="2025-12-03 22:23:37.222784352 +0000 UTC m=+782.442556369" Dec 03 22:23:37.260952 master-0 kubenswrapper[36504]: I1203 22:23:37.260835 36504 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-controller-ovs-54zjs" podStartSLOduration=30.654534217 podStartE2EDuration="39.260813337s" podCreationTimestamp="2025-12-03 22:22:58 +0000 UTC" firstStartedPulling="2025-12-03 22:23:20.776569106 +0000 UTC m=+765.996341113" lastFinishedPulling="2025-12-03 22:23:29.382848226 +0000 UTC m=+774.602620233" observedRunningTime="2025-12-03 22:23:37.254328194 +0000 UTC m=+782.474100201" watchObservedRunningTime="2025-12-03 22:23:37.260813337 +0000 UTC m=+782.480585354" Dec 03 22:23:37.285987 master-0 kubenswrapper[36504]: I1203 22:23:37.285897 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-c8s82" podStartSLOduration=8.240708118 podStartE2EDuration="13.285824624s" podCreationTimestamp="2025-12-03 22:23:24 +0000 UTC" firstStartedPulling="2025-12-03 22:23:31.450059318 +0000 UTC m=+776.669831325" lastFinishedPulling="2025-12-03 22:23:36.495175824 +0000 UTC m=+781.714947831" observedRunningTime="2025-12-03 22:23:37.272264287 +0000 UTC m=+782.492036294" watchObservedRunningTime="2025-12-03 22:23:37.285824624 +0000 UTC m=+782.505596631" Dec 03 22:23:37.310091 master-0 kubenswrapper[36504]: I1203 22:23:37.309721 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.593685328 podStartE2EDuration="34.309695634s" podCreationTimestamp="2025-12-03 22:23:03 +0000 UTC" firstStartedPulling="2025-12-03 22:23:20.7652367 +0000 UTC m=+765.985008707" lastFinishedPulling="2025-12-03 22:23:36.481246986 +0000 UTC m=+781.701019013" observedRunningTime="2025-12-03 22:23:37.295382964 +0000 UTC m=+782.515155011" watchObservedRunningTime="2025-12-03 22:23:37.309695634 +0000 UTC m=+782.529467641" Dec 03 22:23:37.798010 master-0 kubenswrapper[36504]: I1203 22:23:37.797797 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:37.846524 master-0 kubenswrapper[36504]: I1203 22:23:37.846431 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:38.215885 master-0 kubenswrapper[36504]: I1203 22:23:38.215767 36504 generic.go:334] "Generic (PLEG): container finished" podID="a1ed8445-bac4-4e69-9462-c1e33f646315" containerID="1dbdb4deba8c8efcf57012a3290c56b7fd84aa1c288d8d4282152edafe1396c4" exitCode=0 Dec 03 22:23:38.217069 master-0 kubenswrapper[36504]: I1203 22:23:38.215951 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a1ed8445-bac4-4e69-9462-c1e33f646315","Type":"ContainerDied","Data":"1dbdb4deba8c8efcf57012a3290c56b7fd84aa1c288d8d4282152edafe1396c4"} Dec 03 22:23:38.217652 master-0 kubenswrapper[36504]: I1203 22:23:38.217586 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:38.278160 master-0 kubenswrapper[36504]: I1203 22:23:38.278056 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 22:23:38.542616 master-0 kubenswrapper[36504]: I1203 22:23:38.542511 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 22:23:38.983424 master-0 kubenswrapper[36504]: I1203 22:23:38.978712 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5556b9959f-bv6bf"] Dec 03 22:23:38.983424 master-0 kubenswrapper[36504]: I1203 22:23:38.979117 36504 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" podUID="2be06279-64ce-428e-b10e-6242685edf0f" containerName="dnsmasq-dns" containerID="cri-o://69dbd6c6a3c44d811956ccd24dfc2d79504d3799decacf2b66102791b8253b91" gracePeriod=10 Dec 03 22:23:39.013704 master-0 kubenswrapper[36504]: I1203 22:23:39.013374 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-tzsmz"] Dec 03 22:23:39.013996 master-0 kubenswrapper[36504]: E1203 22:23:39.013853 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4339d191-869f-43be-80f4-92e25c8f6a7a" containerName="init" Dec 03 22:23:39.013996 master-0 kubenswrapper[36504]: I1203 22:23:39.013891 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="4339d191-869f-43be-80f4-92e25c8f6a7a" containerName="init" Dec 03 22:23:39.013996 master-0 kubenswrapper[36504]: E1203 22:23:39.013911 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4339d191-869f-43be-80f4-92e25c8f6a7a" containerName="dnsmasq-dns" Dec 03 22:23:39.013996 master-0 kubenswrapper[36504]: I1203 22:23:39.013920 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="4339d191-869f-43be-80f4-92e25c8f6a7a" containerName="dnsmasq-dns" Dec 03 22:23:39.013996 master-0 kubenswrapper[36504]: E1203 22:23:39.013950 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" containerName="dnsmasq-dns" Dec 03 22:23:39.013996 master-0 kubenswrapper[36504]: I1203 22:23:39.013957 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" containerName="dnsmasq-dns" Dec 03 22:23:39.013996 master-0 kubenswrapper[36504]: E1203 22:23:39.013966 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" containerName="init" Dec 03 22:23:39.013996 master-0 kubenswrapper[36504]: I1203 22:23:39.013972 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" containerName="init" Dec 03 22:23:39.014358 master-0 kubenswrapper[36504]: I1203 22:23:39.014348 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="4339d191-869f-43be-80f4-92e25c8f6a7a" containerName="dnsmasq-dns" Dec 03 22:23:39.014405 master-0 kubenswrapper[36504]: I1203 22:23:39.014377 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0d6984-46f5-4f29-ba6e-75ac1a90606b" containerName="dnsmasq-dns" Dec 03 22:23:39.016179 master-0 kubenswrapper[36504]: I1203 22:23:39.016134 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.076261 master-0 kubenswrapper[36504]: I1203 22:23:39.076180 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-tzsmz"] Dec 03 22:23:39.139265 master-0 kubenswrapper[36504]: I1203 22:23:39.139190 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-dns-svc\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.139482 master-0 kubenswrapper[36504]: I1203 22:23:39.139311 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvtnp\" (UniqueName: \"kubernetes.io/projected/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-kube-api-access-vvtnp\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.140187 master-0 kubenswrapper[36504]: I1203 22:23:39.140104 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-sb\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.140259 master-0 kubenswrapper[36504]: I1203 22:23:39.140234 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-nb\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.140489 master-0 kubenswrapper[36504]: I1203 22:23:39.140459 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-config\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.210176 master-0 kubenswrapper[36504]: I1203 22:23:39.210132 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:39.210478 master-0 kubenswrapper[36504]: I1203 22:23:39.210467 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:39.235956 master-0 kubenswrapper[36504]: I1203 22:23:39.235892 36504 generic.go:334] "Generic (PLEG): container finished" podID="2be06279-64ce-428e-b10e-6242685edf0f" containerID="69dbd6c6a3c44d811956ccd24dfc2d79504d3799decacf2b66102791b8253b91" exitCode=0 Dec 03 22:23:39.236602 master-0 kubenswrapper[36504]: I1203 22:23:39.235992 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" event={"ID":"2be06279-64ce-428e-b10e-6242685edf0f","Type":"ContainerDied","Data":"69dbd6c6a3c44d811956ccd24dfc2d79504d3799decacf2b66102791b8253b91"} Dec 03 22:23:39.238497 master-0 kubenswrapper[36504]: I1203 22:23:39.238318 36504 generic.go:334] "Generic (PLEG): container finished" podID="16589858-9fd3-467e-9b55-e3cefb7d7a1a" 
containerID="96b15bf05903c9566054a8d8794d08a4b5a5875086901bedf858cb90c951aaec" exitCode=0 Dec 03 22:23:39.238497 master-0 kubenswrapper[36504]: I1203 22:23:39.238442 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16589858-9fd3-467e-9b55-e3cefb7d7a1a","Type":"ContainerDied","Data":"96b15bf05903c9566054a8d8794d08a4b5a5875086901bedf858cb90c951aaec"} Dec 03 22:23:39.255894 master-0 kubenswrapper[36504]: I1203 22:23:39.243110 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-dns-svc\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.255894 master-0 kubenswrapper[36504]: I1203 22:23:39.243247 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvtnp\" (UniqueName: \"kubernetes.io/projected/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-kube-api-access-vvtnp\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.255894 master-0 kubenswrapper[36504]: I1203 22:23:39.243523 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-sb\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.255894 master-0 kubenswrapper[36504]: I1203 22:23:39.243558 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-nb\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.255894 master-0 kubenswrapper[36504]: I1203 22:23:39.243703 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-config\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.255894 master-0 kubenswrapper[36504]: I1203 22:23:39.244746 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-config\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.255894 master-0 kubenswrapper[36504]: I1203 22:23:39.245353 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-dns-svc\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.255894 master-0 kubenswrapper[36504]: I1203 22:23:39.247091 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-sb\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " 
pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.255894 master-0 kubenswrapper[36504]: I1203 22:23:39.247688 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-nb\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.267164 master-0 kubenswrapper[36504]: I1203 22:23:39.265634 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a1ed8445-bac4-4e69-9462-c1e33f646315","Type":"ContainerStarted","Data":"921c71870c14d6eef862c44cc0861caf0a00ff6c124da6715daca66b3f583456"} Dec 03 22:23:39.277967 master-0 kubenswrapper[36504]: I1203 22:23:39.270654 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvtnp\" (UniqueName: \"kubernetes.io/projected/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-kube-api-access-vvtnp\") pod \"dnsmasq-dns-7d4d74cb79-tzsmz\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.287169 master-0 kubenswrapper[36504]: I1203 22:23:39.287101 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:39.369795 master-0 kubenswrapper[36504]: I1203 22:23:39.363269 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=37.556643599 podStartE2EDuration="47.363230455s" podCreationTimestamp="2025-12-03 22:22:52 +0000 UTC" firstStartedPulling="2025-12-03 22:23:20.213046374 +0000 UTC m=+765.432818381" lastFinishedPulling="2025-12-03 22:23:30.01963322 +0000 UTC m=+775.239405237" observedRunningTime="2025-12-03 22:23:39.309715184 +0000 UTC m=+784.529487191" watchObservedRunningTime="2025-12-03 22:23:39.363230455 +0000 UTC m=+784.583002462" Dec 03 22:23:39.441813 master-0 kubenswrapper[36504]: I1203 22:23:39.439095 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:39.590913 master-0 kubenswrapper[36504]: I1203 22:23:39.590580 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:39.681681 master-0 kubenswrapper[36504]: I1203 22:23:39.681617 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-config\") pod \"2be06279-64ce-428e-b10e-6242685edf0f\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " Dec 03 22:23:39.682090 master-0 kubenswrapper[36504]: I1203 22:23:39.682075 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vqrc\" (UniqueName: \"kubernetes.io/projected/2be06279-64ce-428e-b10e-6242685edf0f-kube-api-access-4vqrc\") pod \"2be06279-64ce-428e-b10e-6242685edf0f\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " Dec 03 22:23:39.682284 master-0 kubenswrapper[36504]: I1203 22:23:39.682270 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-dns-svc\") pod \"2be06279-64ce-428e-b10e-6242685edf0f\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " Dec 03 22:23:39.682444 master-0 kubenswrapper[36504]: I1203 22:23:39.682431 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-sb\") pod \"2be06279-64ce-428e-b10e-6242685edf0f\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " Dec 03 22:23:39.682539 master-0 kubenswrapper[36504]: I1203 22:23:39.682527 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-nb\") pod \"2be06279-64ce-428e-b10e-6242685edf0f\" (UID: \"2be06279-64ce-428e-b10e-6242685edf0f\") " Dec 03 22:23:39.689474 master-0 kubenswrapper[36504]: I1203 22:23:39.689432 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be06279-64ce-428e-b10e-6242685edf0f-kube-api-access-4vqrc" (OuterVolumeSpecName: "kube-api-access-4vqrc") pod "2be06279-64ce-428e-b10e-6242685edf0f" (UID: "2be06279-64ce-428e-b10e-6242685edf0f"). InnerVolumeSpecName "kube-api-access-4vqrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:39.790498 master-0 kubenswrapper[36504]: I1203 22:23:39.790372 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vqrc\" (UniqueName: \"kubernetes.io/projected/2be06279-64ce-428e-b10e-6242685edf0f-kube-api-access-4vqrc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:39.837050 master-0 kubenswrapper[36504]: I1203 22:23:39.836830 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-config" (OuterVolumeSpecName: "config") pod "2be06279-64ce-428e-b10e-6242685edf0f" (UID: "2be06279-64ce-428e-b10e-6242685edf0f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:39.842666 master-0 kubenswrapper[36504]: I1203 22:23:39.841367 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2be06279-64ce-428e-b10e-6242685edf0f" (UID: "2be06279-64ce-428e-b10e-6242685edf0f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:39.846718 master-0 kubenswrapper[36504]: I1203 22:23:39.846642 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2be06279-64ce-428e-b10e-6242685edf0f" (UID: "2be06279-64ce-428e-b10e-6242685edf0f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:39.866692 master-0 kubenswrapper[36504]: I1203 22:23:39.863461 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2be06279-64ce-428e-b10e-6242685edf0f" (UID: "2be06279-64ce-428e-b10e-6242685edf0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:39.904289 master-0 kubenswrapper[36504]: I1203 22:23:39.899031 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:39.904289 master-0 kubenswrapper[36504]: I1203 22:23:39.899090 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:39.904289 master-0 kubenswrapper[36504]: I1203 22:23:39.899105 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:39.904289 master-0 kubenswrapper[36504]: I1203 22:23:39.899115 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be06279-64ce-428e-b10e-6242685edf0f-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:40.096978 master-0 kubenswrapper[36504]: I1203 22:23:40.096699 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-tzsmz"] Dec 03 22:23:40.159135 master-0 kubenswrapper[36504]: I1203 22:23:40.158971 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:40.282930 master-0 kubenswrapper[36504]: I1203 22:23:40.282865 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" event={"ID":"2be06279-64ce-428e-b10e-6242685edf0f","Type":"ContainerDied","Data":"ebf58ac35deff604e242dc6b7a149aa538ec7a2b63bae6cba8353546f7567f6f"} Dec 03 22:23:40.283566 master-0 kubenswrapper[36504]: I1203 22:23:40.282944 36504 scope.go:117] "RemoveContainer" containerID="69dbd6c6a3c44d811956ccd24dfc2d79504d3799decacf2b66102791b8253b91" Dec 03 22:23:40.283566 master-0 kubenswrapper[36504]: I1203 22:23:40.283108 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5556b9959f-bv6bf" Dec 03 22:23:40.295295 master-0 kubenswrapper[36504]: I1203 22:23:40.295227 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"16589858-9fd3-467e-9b55-e3cefb7d7a1a","Type":"ContainerStarted","Data":"6e6ce4ab3ed31b51b27600b13769fa43103d1d88b50e31c1cca8a5334107487e"} Dec 03 22:23:40.299406 master-0 kubenswrapper[36504]: I1203 22:23:40.299355 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" event={"ID":"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68","Type":"ContainerStarted","Data":"eca8fcfc671d95e8d2d56670c2995b26241db87d626281256605715354de2d8b"} Dec 03 22:23:40.346009 master-0 kubenswrapper[36504]: I1203 22:23:40.345828 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=38.539278074 podStartE2EDuration="48.345798908s" podCreationTimestamp="2025-12-03 22:22:52 +0000 UTC" firstStartedPulling="2025-12-03 22:23:19.931189566 +0000 UTC m=+765.150961573" lastFinishedPulling="2025-12-03 22:23:29.7377104 +0000 UTC m=+774.957482407" observedRunningTime="2025-12-03 22:23:40.333600104 +0000 UTC m=+785.553372121" watchObservedRunningTime="2025-12-03 22:23:40.345798908 +0000 UTC m=+785.565570915" Dec 03 22:23:40.381016 master-0 kubenswrapper[36504]: I1203 22:23:40.379867 36504 scope.go:117] "RemoveContainer" containerID="fe98476cce556dd3262d8fcd8494f71da6bedc788c482f94c3b98b79cf8b31b7" Dec 03 22:23:40.393596 master-0 kubenswrapper[36504]: I1203 22:23:40.393121 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 22:23:40.430014 master-0 kubenswrapper[36504]: I1203 22:23:40.428682 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5556b9959f-bv6bf"] Dec 03 22:23:40.506977 master-0 kubenswrapper[36504]: I1203 22:23:40.506463 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5556b9959f-bv6bf"] Dec 03 22:23:40.625940 master-0 kubenswrapper[36504]: I1203 22:23:40.620293 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 22:23:40.625940 master-0 kubenswrapper[36504]: E1203 22:23:40.621038 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be06279-64ce-428e-b10e-6242685edf0f" containerName="init" Dec 03 22:23:40.625940 master-0 kubenswrapper[36504]: I1203 22:23:40.621060 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be06279-64ce-428e-b10e-6242685edf0f" containerName="init" Dec 03 22:23:40.625940 master-0 kubenswrapper[36504]: E1203 22:23:40.621086 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be06279-64ce-428e-b10e-6242685edf0f" containerName="dnsmasq-dns" Dec 03 22:23:40.625940 master-0 kubenswrapper[36504]: I1203 22:23:40.621096 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be06279-64ce-428e-b10e-6242685edf0f" containerName="dnsmasq-dns" Dec 03 22:23:40.625940 master-0 kubenswrapper[36504]: I1203 22:23:40.621473 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be06279-64ce-428e-b10e-6242685edf0f" containerName="dnsmasq-dns" Dec 03 22:23:40.631053 master-0 kubenswrapper[36504]: I1203 22:23:40.630993 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 22:23:40.634600 master-0 kubenswrapper[36504]: I1203 22:23:40.634554 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 22:23:40.637680 master-0 kubenswrapper[36504]: I1203 22:23:40.635717 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 22:23:40.637680 master-0 kubenswrapper[36504]: I1203 22:23:40.635883 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 22:23:40.671888 master-0 kubenswrapper[36504]: I1203 22:23:40.663091 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 22:23:40.739087 master-0 kubenswrapper[36504]: I1203 22:23:40.738984 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/767eba72-715d-4e13-9116-ac583b611b78-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.739087 master-0 kubenswrapper[36504]: I1203 22:23:40.739082 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/767eba72-715d-4e13-9116-ac583b611b78-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.739445 master-0 kubenswrapper[36504]: I1203 22:23:40.739118 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767eba72-715d-4e13-9116-ac583b611b78-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.739445 master-0 kubenswrapper[36504]: I1203 22:23:40.739149 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/767eba72-715d-4e13-9116-ac583b611b78-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.739445 master-0 kubenswrapper[36504]: I1203 22:23:40.739203 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/767eba72-715d-4e13-9116-ac583b611b78-scripts\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.739445 master-0 kubenswrapper[36504]: I1203 22:23:40.739247 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbfsl\" (UniqueName: \"kubernetes.io/projected/767eba72-715d-4e13-9116-ac583b611b78-kube-api-access-rbfsl\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.739445 master-0 kubenswrapper[36504]: I1203 22:23:40.739272 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/767eba72-715d-4e13-9116-ac583b611b78-config\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.843799 master-0 kubenswrapper[36504]: I1203 
22:23:40.842209 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/767eba72-715d-4e13-9116-ac583b611b78-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.843799 master-0 kubenswrapper[36504]: I1203 22:23:40.842341 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767eba72-715d-4e13-9116-ac583b611b78-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.843799 master-0 kubenswrapper[36504]: I1203 22:23:40.842366 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/767eba72-715d-4e13-9116-ac583b611b78-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.843799 master-0 kubenswrapper[36504]: I1203 22:23:40.842432 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/767eba72-715d-4e13-9116-ac583b611b78-scripts\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.843799 master-0 kubenswrapper[36504]: I1203 22:23:40.842492 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbfsl\" (UniqueName: \"kubernetes.io/projected/767eba72-715d-4e13-9116-ac583b611b78-kube-api-access-rbfsl\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.843799 master-0 kubenswrapper[36504]: I1203 22:23:40.842522 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/767eba72-715d-4e13-9116-ac583b611b78-config\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.843799 master-0 kubenswrapper[36504]: I1203 22:23:40.842615 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/767eba72-715d-4e13-9116-ac583b611b78-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.847247 master-0 kubenswrapper[36504]: I1203 22:23:40.847205 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/767eba72-715d-4e13-9116-ac583b611b78-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.850892 master-0 kubenswrapper[36504]: I1203 22:23:40.850859 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/767eba72-715d-4e13-9116-ac583b611b78-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.854412 master-0 kubenswrapper[36504]: I1203 22:23:40.854371 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/767eba72-715d-4e13-9116-ac583b611b78-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.855217 master-0 kubenswrapper[36504]: I1203 22:23:40.855172 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/767eba72-715d-4e13-9116-ac583b611b78-scripts\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.856149 master-0 kubenswrapper[36504]: I1203 22:23:40.856106 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/767eba72-715d-4e13-9116-ac583b611b78-config\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:40.856464 master-0 kubenswrapper[36504]: I1203 22:23:40.856423 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/767eba72-715d-4e13-9116-ac583b611b78-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:41.102221 master-0 kubenswrapper[36504]: I1203 22:23:41.102047 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbfsl\" (UniqueName: \"kubernetes.io/projected/767eba72-715d-4e13-9116-ac583b611b78-kube-api-access-rbfsl\") pod \"ovn-northd-0\" (UID: \"767eba72-715d-4e13-9116-ac583b611b78\") " pod="openstack/ovn-northd-0" Dec 03 22:23:41.111087 master-0 kubenswrapper[36504]: I1203 22:23:41.111028 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be06279-64ce-428e-b10e-6242685edf0f" path="/var/lib/kubelet/pods/2be06279-64ce-428e-b10e-6242685edf0f/volumes" Dec 03 22:23:41.144801 master-0 kubenswrapper[36504]: I1203 22:23:41.141845 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 22:23:41.155692 master-0 kubenswrapper[36504]: I1203 22:23:41.153791 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 22:23:41.159136 master-0 kubenswrapper[36504]: I1203 22:23:41.158946 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 22:23:41.161567 master-0 kubenswrapper[36504]: I1203 22:23:41.159374 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 22:23:41.161567 master-0 kubenswrapper[36504]: I1203 22:23:41.159675 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 22:23:41.208801 master-0 kubenswrapper[36504]: I1203 22:23:41.207682 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 22:23:41.280422 master-0 kubenswrapper[36504]: I1203 22:23:41.279842 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 22:23:41.288873 master-0 kubenswrapper[36504]: I1203 22:23:41.286708 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bd7edf8c-1bec-444f-8c92-24d875291097\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a3e6022d-4434-445a-9252-9700fe6f6b88\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.288873 master-0 kubenswrapper[36504]: I1203 22:23:41.287029 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.288873 master-0 kubenswrapper[36504]: I1203 22:23:41.287100 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/07b5d783-be96-4a6b-8603-e6f56f13f233-cache\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.288873 master-0 kubenswrapper[36504]: I1203 22:23:41.287171 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/07b5d783-be96-4a6b-8603-e6f56f13f233-lock\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.288873 master-0 kubenswrapper[36504]: I1203 22:23:41.287329 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwdtx\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-kube-api-access-lwdtx\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.322815 master-0 kubenswrapper[36504]: I1203 22:23:41.322681 36504 generic.go:334] "Generic (PLEG): container finished" podID="c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" containerID="c73f5b3280ba614c0f0b119d6e3296fb48fc8dc1df7ae7487f2d4fe89a88941a" exitCode=0 Dec 03 22:23:41.323124 master-0 kubenswrapper[36504]: I1203 22:23:41.322860 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" event={"ID":"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68","Type":"ContainerDied","Data":"c73f5b3280ba614c0f0b119d6e3296fb48fc8dc1df7ae7487f2d4fe89a88941a"} Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: I1203 22:23:41.389701 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: I1203 22:23:41.389800 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/07b5d783-be96-4a6b-8603-e6f56f13f233-cache\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: I1203 22:23:41.389848 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/07b5d783-be96-4a6b-8603-e6f56f13f233-lock\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: I1203 22:23:41.389927 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwdtx\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-kube-api-access-lwdtx\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: I1203 22:23:41.389959 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bd7edf8c-1bec-444f-8c92-24d875291097\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a3e6022d-4434-445a-9252-9700fe6f6b88\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: E1203 22:23:41.390628 36504 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: E1203 22:23:41.390650 36504 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: E1203 22:23:41.390706 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift podName:07b5d783-be96-4a6b-8603-e6f56f13f233 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:41.890690487 +0000 UTC m=+787.110462494 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift") pod "swift-storage-0" (UID: "07b5d783-be96-4a6b-8603-e6f56f13f233") : configmap "swift-ring-files" not found Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: I1203 22:23:41.391295 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/07b5d783-be96-4a6b-8603-e6f56f13f233-cache\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.391805 master-0 kubenswrapper[36504]: I1203 22:23:41.391536 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/07b5d783-be96-4a6b-8603-e6f56f13f233-lock\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.412809 master-0 kubenswrapper[36504]: I1203 22:23:41.412724 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:23:41.412809 master-0 kubenswrapper[36504]: I1203 22:23:41.412810 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bd7edf8c-1bec-444f-8c92-24d875291097\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a3e6022d-4434-445a-9252-9700fe6f6b88\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/1d93a95bca1dea8f2e31874c14abe500b767dc9ce08f3b63cdb3dd551ba171e9/globalmount\"" pod="openstack/swift-storage-0" Dec 03 22:23:41.478710 master-0 kubenswrapper[36504]: I1203 22:23:41.477183 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwdtx\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-kube-api-access-lwdtx\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.906694 master-0 kubenswrapper[36504]: I1203 22:23:41.906463 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:41.907044 master-0 kubenswrapper[36504]: E1203 22:23:41.906757 36504 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:23:41.907044 master-0 kubenswrapper[36504]: E1203 22:23:41.906794 36504 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:23:41.907044 master-0 kubenswrapper[36504]: E1203 22:23:41.906859 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift podName:07b5d783-be96-4a6b-8603-e6f56f13f233 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:42.90683705 +0000 UTC m=+788.126609057 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift") pod "swift-storage-0" (UID: "07b5d783-be96-4a6b-8603-e6f56f13f233") : configmap "swift-ring-files" not found Dec 03 22:23:41.989237 master-0 kubenswrapper[36504]: W1203 22:23:41.989160 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod767eba72_715d_4e13_9116_ac583b611b78.slice/crio-27024a4e5cc5d4bf1fe47058eb94e1a210f848a635beac61ccccb6621873d807 WatchSource:0}: Error finding container 27024a4e5cc5d4bf1fe47058eb94e1a210f848a635beac61ccccb6621873d807: Status 404 returned error can't find the container with id 27024a4e5cc5d4bf1fe47058eb94e1a210f848a635beac61ccccb6621873d807 Dec 03 22:23:41.996618 master-0 kubenswrapper[36504]: I1203 22:23:41.996504 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 22:23:42.380833 master-0 kubenswrapper[36504]: I1203 22:23:42.377146 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" event={"ID":"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68","Type":"ContainerStarted","Data":"80780ce9696d05c93094744c8896979af176bea4c25495509bf907b498957ade"} Dec 03 22:23:42.380833 master-0 kubenswrapper[36504]: I1203 22:23:42.377258 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:42.380833 master-0 kubenswrapper[36504]: I1203 22:23:42.379496 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"767eba72-715d-4e13-9116-ac583b611b78","Type":"ContainerStarted","Data":"27024a4e5cc5d4bf1fe47058eb94e1a210f848a635beac61ccccb6621873d807"} Dec 03 22:23:42.407300 master-0 kubenswrapper[36504]: I1203 22:23:42.407190 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" podStartSLOduration=4.407165495 podStartE2EDuration="4.407165495s" podCreationTimestamp="2025-12-03 22:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:23:42.40602874 +0000 UTC m=+787.625800747" watchObservedRunningTime="2025-12-03 22:23:42.407165495 +0000 UTC m=+787.626937502" Dec 03 22:23:42.880407 master-0 kubenswrapper[36504]: I1203 22:23:42.880327 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bd7edf8c-1bec-444f-8c92-24d875291097\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a3e6022d-4434-445a-9252-9700fe6f6b88\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:42.936616 master-0 kubenswrapper[36504]: I1203 22:23:42.936503 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:42.936929 master-0 kubenswrapper[36504]: E1203 22:23:42.936875 36504 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:23:42.936999 master-0 kubenswrapper[36504]: E1203 22:23:42.936931 36504 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 
03 22:23:42.937292 master-0 kubenswrapper[36504]: E1203 22:23:42.937252 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift podName:07b5d783-be96-4a6b-8603-e6f56f13f233 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:44.937193634 +0000 UTC m=+790.156965641 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift") pod "swift-storage-0" (UID: "07b5d783-be96-4a6b-8603-e6f56f13f233") : configmap "swift-ring-files" not found Dec 03 22:23:44.345694 master-0 kubenswrapper[36504]: I1203 22:23:44.345562 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-4grtq"] Dec 03 22:23:44.348137 master-0 kubenswrapper[36504]: I1203 22:23:44.348084 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.351540 master-0 kubenswrapper[36504]: I1203 22:23:44.351473 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 22:23:44.352967 master-0 kubenswrapper[36504]: I1203 22:23:44.352942 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 22:23:44.356887 master-0 kubenswrapper[36504]: I1203 22:23:44.356816 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 22:23:44.389549 master-0 kubenswrapper[36504]: I1203 22:23:44.389468 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-dispersionconf\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.389895 master-0 kubenswrapper[36504]: I1203 22:23:44.389575 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzpc5\" (UniqueName: \"kubernetes.io/projected/e8dbecb3-17d6-45fd-9045-ee54e5009171-kube-api-access-tzpc5\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.389895 master-0 kubenswrapper[36504]: I1203 22:23:44.389660 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e8dbecb3-17d6-45fd-9045-ee54e5009171-etc-swift\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.390057 master-0 kubenswrapper[36504]: I1203 22:23:44.389990 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-scripts\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.390114 master-0 kubenswrapper[36504]: I1203 22:23:44.390062 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-ring-data-devices\") pod \"swift-ring-rebalance-4grtq\" (UID: 
\"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.390178 master-0 kubenswrapper[36504]: I1203 22:23:44.390152 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-swiftconf\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.390283 master-0 kubenswrapper[36504]: I1203 22:23:44.390254 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-combined-ca-bundle\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.410362 master-0 kubenswrapper[36504]: I1203 22:23:44.410275 36504 generic.go:334] "Generic (PLEG): container finished" podID="55ac1860-363d-42ff-90dd-3b6bf2e78864" containerID="01bb597e9de0c339f23df271e698add5189147ab7875d6d3bf0c690e20096ea1" exitCode=0 Dec 03 22:23:44.410695 master-0 kubenswrapper[36504]: I1203 22:23:44.410387 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55ac1860-363d-42ff-90dd-3b6bf2e78864","Type":"ContainerDied","Data":"01bb597e9de0c339f23df271e698add5189147ab7875d6d3bf0c690e20096ea1"} Dec 03 22:23:44.414828 master-0 kubenswrapper[36504]: I1203 22:23:44.414739 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"767eba72-715d-4e13-9116-ac583b611b78","Type":"ContainerStarted","Data":"1dc5a551f794ad265ea5e2339fb7b4867324263ab86bf52495b394235ffd3f32"} Dec 03 22:23:44.414828 master-0 kubenswrapper[36504]: I1203 22:23:44.414832 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"767eba72-715d-4e13-9116-ac583b611b78","Type":"ContainerStarted","Data":"fe4536f14272a28e55173e24d635c684d7294e1e24d57550760a88c2f3b661e1"} Dec 03 22:23:44.415027 master-0 kubenswrapper[36504]: I1203 22:23:44.414972 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 22:23:44.427672 master-0 kubenswrapper[36504]: I1203 22:23:44.426994 36504 generic.go:334] "Generic (PLEG): container finished" podID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerID="da97bf42436e285c9bb845abd9c57693d0e89d6eab143891326d1104005396f2" exitCode=0 Dec 03 22:23:44.427672 master-0 kubenswrapper[36504]: I1203 22:23:44.427084 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerDied","Data":"da97bf42436e285c9bb845abd9c57693d0e89d6eab143891326d1104005396f2"} Dec 03 22:23:44.450883 master-0 kubenswrapper[36504]: I1203 22:23:44.446960 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4grtq"] Dec 03 22:23:44.493113 master-0 kubenswrapper[36504]: I1203 22:23:44.493027 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-dispersionconf\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.493543 master-0 kubenswrapper[36504]: I1203 
22:23:44.493154 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzpc5\" (UniqueName: \"kubernetes.io/projected/e8dbecb3-17d6-45fd-9045-ee54e5009171-kube-api-access-tzpc5\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.493543 master-0 kubenswrapper[36504]: I1203 22:23:44.493211 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e8dbecb3-17d6-45fd-9045-ee54e5009171-etc-swift\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.493543 master-0 kubenswrapper[36504]: I1203 22:23:44.493281 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-scripts\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.493543 master-0 kubenswrapper[36504]: I1203 22:23:44.493308 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-ring-data-devices\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.493543 master-0 kubenswrapper[36504]: I1203 22:23:44.493345 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-swiftconf\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.493543 master-0 kubenswrapper[36504]: I1203 22:23:44.493378 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-combined-ca-bundle\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.494953 master-0 kubenswrapper[36504]: I1203 22:23:44.494010 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e8dbecb3-17d6-45fd-9045-ee54e5009171-etc-swift\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.494953 master-0 kubenswrapper[36504]: I1203 22:23:44.494864 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-scripts\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.494953 master-0 kubenswrapper[36504]: I1203 22:23:44.494861 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-ring-data-devices\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.507273 master-0 kubenswrapper[36504]: I1203 22:23:44.507185 
36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-dispersionconf\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.507910 master-0 kubenswrapper[36504]: I1203 22:23:44.507785 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-swiftconf\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.508388 master-0 kubenswrapper[36504]: I1203 22:23:44.508344 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-combined-ca-bundle\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.533786 master-0 kubenswrapper[36504]: I1203 22:23:44.533707 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzpc5\" (UniqueName: \"kubernetes.io/projected/e8dbecb3-17d6-45fd-9045-ee54e5009171-kube-api-access-tzpc5\") pod \"swift-ring-rebalance-4grtq\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.601002 master-0 kubenswrapper[36504]: I1203 22:23:44.600840 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 22:23:44.601002 master-0 kubenswrapper[36504]: I1203 22:23:44.600908 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 22:23:44.669994 master-0 kubenswrapper[36504]: I1203 22:23:44.669839 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:23:44.836168 master-0 kubenswrapper[36504]: I1203 22:23:44.836011 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.5523062960000003 podStartE2EDuration="4.835979342s" podCreationTimestamp="2025-12-03 22:23:40 +0000 UTC" firstStartedPulling="2025-12-03 22:23:41.992503832 +0000 UTC m=+787.212275829" lastFinishedPulling="2025-12-03 22:23:43.276176868 +0000 UTC m=+788.495948875" observedRunningTime="2025-12-03 22:23:44.821950441 +0000 UTC m=+790.041722458" watchObservedRunningTime="2025-12-03 22:23:44.835979342 +0000 UTC m=+790.055751349" Dec 03 22:23:45.012837 master-0 kubenswrapper[36504]: I1203 22:23:45.011139 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:45.013589 master-0 kubenswrapper[36504]: E1203 22:23:45.011981 36504 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:23:45.013589 master-0 kubenswrapper[36504]: E1203 22:23:45.013219 36504 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:23:45.013589 master-0 kubenswrapper[36504]: E1203 22:23:45.013382 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift podName:07b5d783-be96-4a6b-8603-e6f56f13f233 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:49.013355636 +0000 UTC m=+794.233127643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift") pod "swift-storage-0" (UID: "07b5d783-be96-4a6b-8603-e6f56f13f233") : configmap "swift-ring-files" not found Dec 03 22:23:45.488820 master-0 kubenswrapper[36504]: I1203 22:23:45.488178 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-4grtq"] Dec 03 22:23:45.634922 master-0 kubenswrapper[36504]: I1203 22:23:45.630976 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 22:23:45.745010 master-0 kubenswrapper[36504]: I1203 22:23:45.744859 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 22:23:46.146601 master-0 kubenswrapper[36504]: I1203 22:23:46.146488 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6lvls"] Dec 03 22:23:46.154970 master-0 kubenswrapper[36504]: I1203 22:23:46.154882 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6lvls" Dec 03 22:23:46.175378 master-0 kubenswrapper[36504]: I1203 22:23:46.172692 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6lvls"] Dec 03 22:23:46.222970 master-0 kubenswrapper[36504]: I1203 22:23:46.222563 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:46.222970 master-0 kubenswrapper[36504]: I1203 22:23:46.222644 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:46.288602 master-0 kubenswrapper[36504]: I1203 22:23:46.288484 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0ba6-account-create-update-shlcw"] Dec 03 22:23:46.291822 master-0 kubenswrapper[36504]: I1203 22:23:46.291758 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:46.294298 master-0 kubenswrapper[36504]: I1203 22:23:46.294191 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 22:23:46.364456 master-0 kubenswrapper[36504]: I1203 22:23:46.364294 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-operator-scripts\") pod \"placement-db-create-6lvls\" (UID: \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\") " pod="openstack/placement-db-create-6lvls" Dec 03 22:23:46.364820 master-0 kubenswrapper[36504]: I1203 22:23:46.364494 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r6pq\" (UniqueName: \"kubernetes.io/projected/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-kube-api-access-4r6pq\") pod \"placement-db-create-6lvls\" (UID: \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\") " pod="openstack/placement-db-create-6lvls" Dec 03 22:23:46.365736 master-0 kubenswrapper[36504]: I1203 22:23:46.365686 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:46.419370 master-0 kubenswrapper[36504]: I1203 22:23:46.419201 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0ba6-account-create-update-shlcw"] Dec 03 22:23:46.472549 master-0 kubenswrapper[36504]: I1203 22:23:46.472148 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kn4h\" (UniqueName: \"kubernetes.io/projected/51e40216-8f57-45da-827f-a42c5fc3fb96-kube-api-access-6kn4h\") pod \"placement-0ba6-account-create-update-shlcw\" (UID: \"51e40216-8f57-45da-827f-a42c5fc3fb96\") " pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:46.472549 master-0 kubenswrapper[36504]: I1203 22:23:46.472422 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-operator-scripts\") pod \"placement-db-create-6lvls\" (UID: \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\") " pod="openstack/placement-db-create-6lvls" Dec 03 22:23:46.477801 master-0 kubenswrapper[36504]: I1203 22:23:46.474113 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r6pq\" (UniqueName: 
\"kubernetes.io/projected/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-kube-api-access-4r6pq\") pod \"placement-db-create-6lvls\" (UID: \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\") " pod="openstack/placement-db-create-6lvls" Dec 03 22:23:46.477801 master-0 kubenswrapper[36504]: I1203 22:23:46.474208 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e40216-8f57-45da-827f-a42c5fc3fb96-operator-scripts\") pod \"placement-0ba6-account-create-update-shlcw\" (UID: \"51e40216-8f57-45da-827f-a42c5fc3fb96\") " pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:46.477801 master-0 kubenswrapper[36504]: I1203 22:23:46.474902 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4grtq" event={"ID":"e8dbecb3-17d6-45fd-9045-ee54e5009171","Type":"ContainerStarted","Data":"afef924da9fac45824aad69b6c6e3eea4f5a20b4a6605b8721f066ff11681fb2"} Dec 03 22:23:46.477801 master-0 kubenswrapper[36504]: I1203 22:23:46.475130 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-operator-scripts\") pod \"placement-db-create-6lvls\" (UID: \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\") " pod="openstack/placement-db-create-6lvls" Dec 03 22:23:46.520088 master-0 kubenswrapper[36504]: I1203 22:23:46.520026 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r6pq\" (UniqueName: \"kubernetes.io/projected/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-kube-api-access-4r6pq\") pod \"placement-db-create-6lvls\" (UID: \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\") " pod="openstack/placement-db-create-6lvls" Dec 03 22:23:46.532145 master-0 kubenswrapper[36504]: I1203 22:23:46.529742 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-v6gf7"] Dec 03 22:23:46.532145 master-0 kubenswrapper[36504]: I1203 22:23:46.532023 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:46.578413 master-0 kubenswrapper[36504]: I1203 22:23:46.577750 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e40216-8f57-45da-827f-a42c5fc3fb96-operator-scripts\") pod \"placement-0ba6-account-create-update-shlcw\" (UID: \"51e40216-8f57-45da-827f-a42c5fc3fb96\") " pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:46.578413 master-0 kubenswrapper[36504]: I1203 22:23:46.577936 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kn4h\" (UniqueName: \"kubernetes.io/projected/51e40216-8f57-45da-827f-a42c5fc3fb96-kube-api-access-6kn4h\") pod \"placement-0ba6-account-create-update-shlcw\" (UID: \"51e40216-8f57-45da-827f-a42c5fc3fb96\") " pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:46.581987 master-0 kubenswrapper[36504]: I1203 22:23:46.581024 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e40216-8f57-45da-827f-a42c5fc3fb96-operator-scripts\") pod \"placement-0ba6-account-create-update-shlcw\" (UID: \"51e40216-8f57-45da-827f-a42c5fc3fb96\") " pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:46.757826 master-0 kubenswrapper[36504]: I1203 22:23:46.605237 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v6gf7"] Dec 03 22:23:46.757826 master-0 kubenswrapper[36504]: I1203 22:23:46.628741 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-84f8-account-create-update-ds2dl"] Dec 03 22:23:46.757826 master-0 kubenswrapper[36504]: I1203 22:23:46.632220 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:46.757826 master-0 kubenswrapper[36504]: I1203 22:23:46.752029 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 22:23:46.757826 master-0 kubenswrapper[36504]: I1203 22:23:46.753235 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 22:23:46.757826 master-0 kubenswrapper[36504]: I1203 22:23:46.753295 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kn4h\" (UniqueName: \"kubernetes.io/projected/51e40216-8f57-45da-827f-a42c5fc3fb96-kube-api-access-6kn4h\") pod \"placement-0ba6-account-create-update-shlcw\" (UID: \"51e40216-8f57-45da-827f-a42c5fc3fb96\") " pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:46.757826 master-0 kubenswrapper[36504]: I1203 22:23:46.755619 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-84f8-account-create-update-ds2dl"] Dec 03 22:23:46.782926 master-0 kubenswrapper[36504]: I1203 22:23:46.782845 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:46.787033 master-0 kubenswrapper[36504]: I1203 22:23:46.786984 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6lvls" Dec 03 22:23:46.850592 master-0 kubenswrapper[36504]: I1203 22:23:46.850522 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa046c9-0892-46ec-8325-4739744f5c72-operator-scripts\") pod \"glance-db-create-v6gf7\" (UID: \"afa046c9-0892-46ec-8325-4739744f5c72\") " pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:46.850971 master-0 kubenswrapper[36504]: I1203 22:23:46.850613 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-operator-scripts\") pod \"glance-84f8-account-create-update-ds2dl\" (UID: \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\") " pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:46.850971 master-0 kubenswrapper[36504]: I1203 22:23:46.850664 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89cxn\" (UniqueName: \"kubernetes.io/projected/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-kube-api-access-89cxn\") pod \"glance-84f8-account-create-update-ds2dl\" (UID: \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\") " pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:46.850971 master-0 kubenswrapper[36504]: I1203 22:23:46.850745 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nplk2\" (UniqueName: \"kubernetes.io/projected/afa046c9-0892-46ec-8325-4739744f5c72-kube-api-access-nplk2\") pod \"glance-db-create-v6gf7\" (UID: \"afa046c9-0892-46ec-8325-4739744f5c72\") " pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:46.954380 master-0 kubenswrapper[36504]: I1203 22:23:46.953631 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nplk2\" (UniqueName: \"kubernetes.io/projected/afa046c9-0892-46ec-8325-4739744f5c72-kube-api-access-nplk2\") pod \"glance-db-create-v6gf7\" (UID: \"afa046c9-0892-46ec-8325-4739744f5c72\") " pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:46.954380 master-0 kubenswrapper[36504]: I1203 22:23:46.953827 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa046c9-0892-46ec-8325-4739744f5c72-operator-scripts\") pod \"glance-db-create-v6gf7\" (UID: \"afa046c9-0892-46ec-8325-4739744f5c72\") " pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:46.954380 master-0 kubenswrapper[36504]: I1203 22:23:46.953874 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-operator-scripts\") pod \"glance-84f8-account-create-update-ds2dl\" (UID: \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\") " pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:46.954380 master-0 kubenswrapper[36504]: I1203 22:23:46.953946 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89cxn\" (UniqueName: \"kubernetes.io/projected/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-kube-api-access-89cxn\") pod \"glance-84f8-account-create-update-ds2dl\" (UID: \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\") " pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:46.956299 master-0 kubenswrapper[36504]: I1203 22:23:46.956279 
36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa046c9-0892-46ec-8325-4739744f5c72-operator-scripts\") pod \"glance-db-create-v6gf7\" (UID: \"afa046c9-0892-46ec-8325-4739744f5c72\") " pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:46.957065 master-0 kubenswrapper[36504]: I1203 22:23:46.957046 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-operator-scripts\") pod \"glance-84f8-account-create-update-ds2dl\" (UID: \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\") " pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:46.976456 master-0 kubenswrapper[36504]: I1203 22:23:46.976401 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89cxn\" (UniqueName: \"kubernetes.io/projected/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-kube-api-access-89cxn\") pod \"glance-84f8-account-create-update-ds2dl\" (UID: \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\") " pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:46.977501 master-0 kubenswrapper[36504]: I1203 22:23:46.977476 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nplk2\" (UniqueName: \"kubernetes.io/projected/afa046c9-0892-46ec-8325-4739744f5c72-kube-api-access-nplk2\") pod \"glance-db-create-v6gf7\" (UID: \"afa046c9-0892-46ec-8325-4739744f5c72\") " pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:47.119133 master-0 kubenswrapper[36504]: I1203 22:23:47.119052 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:47.125449 master-0 kubenswrapper[36504]: I1203 22:23:47.124792 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:48.528223 master-0 kubenswrapper[36504]: I1203 22:23:48.528026 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55ac1860-363d-42ff-90dd-3b6bf2e78864","Type":"ContainerStarted","Data":"4d0faef4f447368d78d9c06ec568366ead7b4acc67ae493046ec411ef062195e"} Dec 03 22:23:48.670900 master-0 kubenswrapper[36504]: I1203 22:23:48.670389 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0ba6-account-create-update-shlcw"] Dec 03 22:23:48.684461 master-0 kubenswrapper[36504]: I1203 22:23:48.683289 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v6gf7"] Dec 03 22:23:48.701113 master-0 kubenswrapper[36504]: I1203 22:23:48.700892 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6lvls"] Dec 03 22:23:48.804059 master-0 kubenswrapper[36504]: I1203 22:23:48.803964 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-84f8-account-create-update-ds2dl"] Dec 03 22:23:48.808401 master-0 kubenswrapper[36504]: W1203 22:23:48.808064 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e8ecdd_ac48_4859_9bc3_a17949d9879c.slice/crio-a8d52d214d07c7d103bad6cd22804d6912d90f7e525602fd1910c7c714d5516a WatchSource:0}: Error finding container a8d52d214d07c7d103bad6cd22804d6912d90f7e525602fd1910c7c714d5516a: Status 404 returned error can't find the container with id a8d52d214d07c7d103bad6cd22804d6912d90f7e525602fd1910c7c714d5516a Dec 03 22:23:49.075811 master-0 kubenswrapper[36504]: I1203 22:23:49.075706 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:49.076706 master-0 kubenswrapper[36504]: E1203 22:23:49.076669 36504 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:23:49.076706 master-0 kubenswrapper[36504]: E1203 22:23:49.076700 36504 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:23:49.077210 master-0 kubenswrapper[36504]: E1203 22:23:49.076983 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift podName:07b5d783-be96-4a6b-8603-e6f56f13f233 nodeName:}" failed. No retries permitted until 2025-12-03 22:23:57.076751847 +0000 UTC m=+802.296523854 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift") pod "swift-storage-0" (UID: "07b5d783-be96-4a6b-8603-e6f56f13f233") : configmap "swift-ring-files" not found Dec 03 22:23:49.443418 master-0 kubenswrapper[36504]: I1203 22:23:49.443156 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:23:49.604753 master-0 kubenswrapper[36504]: I1203 22:23:49.604616 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6lvls" event={"ID":"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8","Type":"ContainerStarted","Data":"834322f81c10cc9b200b6b0795cef57bcd622a98f1882bc1e3cc706b92e74906"} Dec 03 22:23:49.604753 master-0 kubenswrapper[36504]: I1203 22:23:49.604702 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6lvls" event={"ID":"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8","Type":"ContainerStarted","Data":"01e5e8ac757a37844be242d60561a8c600fbe5de21fb148b4eaf408eb39e8360"} Dec 03 22:23:49.619320 master-0 kubenswrapper[36504]: I1203 22:23:49.619232 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-xg7f9"] Dec 03 22:23:49.619854 master-0 kubenswrapper[36504]: I1203 22:23:49.619636 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" podUID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerName="dnsmasq-dns" containerID="cri-o://6346a4ab8837b2f87a2edc9db97d7e0202ead3dccab811f25aa2838ccb62d005" gracePeriod=10 Dec 03 22:23:49.625134 master-0 kubenswrapper[36504]: I1203 22:23:49.624705 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v6gf7" event={"ID":"afa046c9-0892-46ec-8325-4739744f5c72","Type":"ContainerStarted","Data":"c0506e45f23e5b48d58dc5df76675eee7a757807270faeadfc27b651e298570b"} Dec 03 22:23:49.625134 master-0 kubenswrapper[36504]: I1203 22:23:49.624793 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v6gf7" event={"ID":"afa046c9-0892-46ec-8325-4739744f5c72","Type":"ContainerStarted","Data":"a0ca1a962c814b62be64713df08b90b88875687a804cac7ccd11996f904b65b7"} Dec 03 22:23:49.633271 master-0 kubenswrapper[36504]: I1203 22:23:49.633187 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-84f8-account-create-update-ds2dl" event={"ID":"d5e8ecdd-ac48-4859-9bc3-a17949d9879c","Type":"ContainerStarted","Data":"3f7f81cf5e2c3ae297d2b11e646a0fe97de4ef4507232b9045ef00bc1f3f3453"} Dec 03 22:23:49.633271 master-0 kubenswrapper[36504]: I1203 22:23:49.633276 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-84f8-account-create-update-ds2dl" event={"ID":"d5e8ecdd-ac48-4859-9bc3-a17949d9879c","Type":"ContainerStarted","Data":"a8d52d214d07c7d103bad6cd22804d6912d90f7e525602fd1910c7c714d5516a"} Dec 03 22:23:49.669166 master-0 kubenswrapper[36504]: I1203 22:23:49.669076 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0ba6-account-create-update-shlcw" event={"ID":"51e40216-8f57-45da-827f-a42c5fc3fb96","Type":"ContainerStarted","Data":"7686b747fcda6baeebfab44a169cdaf949623061781dd9e89816ab06feee6aeb"} Dec 03 22:23:49.669166 master-0 kubenswrapper[36504]: I1203 22:23:49.669161 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0ba6-account-create-update-shlcw" 
event={"ID":"51e40216-8f57-45da-827f-a42c5fc3fb96","Type":"ContainerStarted","Data":"5c486fffcfa582781cac51efe99c10241f509c308987297821d97fbc8e8baa17"} Dec 03 22:23:49.680578 master-0 kubenswrapper[36504]: I1203 22:23:49.680471 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-6lvls" podStartSLOduration=3.68042856 podStartE2EDuration="3.68042856s" podCreationTimestamp="2025-12-03 22:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:23:49.644603484 +0000 UTC m=+794.864375491" watchObservedRunningTime="2025-12-03 22:23:49.68042856 +0000 UTC m=+794.900200567" Dec 03 22:23:49.779996 master-0 kubenswrapper[36504]: I1203 22:23:49.779873 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-84f8-account-create-update-ds2dl" podStartSLOduration=3.779846105 podStartE2EDuration="3.779846105s" podCreationTimestamp="2025-12-03 22:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:23:49.684737146 +0000 UTC m=+794.904509173" watchObservedRunningTime="2025-12-03 22:23:49.779846105 +0000 UTC m=+794.999618112" Dec 03 22:23:49.837178 master-0 kubenswrapper[36504]: I1203 22:23:49.837057 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-0ba6-account-create-update-shlcw" podStartSLOduration=3.837018361 podStartE2EDuration="3.837018361s" podCreationTimestamp="2025-12-03 22:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:23:49.764567954 +0000 UTC m=+794.984339961" watchObservedRunningTime="2025-12-03 22:23:49.837018361 +0000 UTC m=+795.056790368" Dec 03 22:23:50.157640 master-0 kubenswrapper[36504]: I1203 22:23:50.157559 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" podUID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.190:5353: connect: connection refused" Dec 03 22:23:50.684371 master-0 kubenswrapper[36504]: I1203 22:23:50.684282 36504 generic.go:334] "Generic (PLEG): container finished" podID="d5e8ecdd-ac48-4859-9bc3-a17949d9879c" containerID="3f7f81cf5e2c3ae297d2b11e646a0fe97de4ef4507232b9045ef00bc1f3f3453" exitCode=0 Dec 03 22:23:50.685225 master-0 kubenswrapper[36504]: I1203 22:23:50.684416 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-84f8-account-create-update-ds2dl" event={"ID":"d5e8ecdd-ac48-4859-9bc3-a17949d9879c","Type":"ContainerDied","Data":"3f7f81cf5e2c3ae297d2b11e646a0fe97de4ef4507232b9045ef00bc1f3f3453"} Dec 03 22:23:50.687136 master-0 kubenswrapper[36504]: I1203 22:23:50.687097 36504 generic.go:334] "Generic (PLEG): container finished" podID="51e40216-8f57-45da-827f-a42c5fc3fb96" containerID="7686b747fcda6baeebfab44a169cdaf949623061781dd9e89816ab06feee6aeb" exitCode=0 Dec 03 22:23:50.687198 master-0 kubenswrapper[36504]: I1203 22:23:50.687166 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0ba6-account-create-update-shlcw" event={"ID":"51e40216-8f57-45da-827f-a42c5fc3fb96","Type":"ContainerDied","Data":"7686b747fcda6baeebfab44a169cdaf949623061781dd9e89816ab06feee6aeb"} Dec 03 22:23:50.689849 master-0 kubenswrapper[36504]: I1203 22:23:50.689786 36504 
generic.go:334] "Generic (PLEG): container finished" podID="bad6d333-f7eb-48b5-bce2-53d69d9b2ab8" containerID="834322f81c10cc9b200b6b0795cef57bcd622a98f1882bc1e3cc706b92e74906" exitCode=0 Dec 03 22:23:50.689931 master-0 kubenswrapper[36504]: I1203 22:23:50.689884 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6lvls" event={"ID":"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8","Type":"ContainerDied","Data":"834322f81c10cc9b200b6b0795cef57bcd622a98f1882bc1e3cc706b92e74906"} Dec 03 22:23:50.691826 master-0 kubenswrapper[36504]: I1203 22:23:50.691790 36504 generic.go:334] "Generic (PLEG): container finished" podID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerID="6346a4ab8837b2f87a2edc9db97d7e0202ead3dccab811f25aa2838ccb62d005" exitCode=0 Dec 03 22:23:50.691941 master-0 kubenswrapper[36504]: I1203 22:23:50.691844 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" event={"ID":"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4","Type":"ContainerDied","Data":"6346a4ab8837b2f87a2edc9db97d7e0202ead3dccab811f25aa2838ccb62d005"} Dec 03 22:23:50.693575 master-0 kubenswrapper[36504]: I1203 22:23:50.693536 36504 generic.go:334] "Generic (PLEG): container finished" podID="afa046c9-0892-46ec-8325-4739744f5c72" containerID="c0506e45f23e5b48d58dc5df76675eee7a757807270faeadfc27b651e298570b" exitCode=0 Dec 03 22:23:50.693636 master-0 kubenswrapper[36504]: I1203 22:23:50.693577 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v6gf7" event={"ID":"afa046c9-0892-46ec-8325-4739744f5c72","Type":"ContainerDied","Data":"c0506e45f23e5b48d58dc5df76675eee7a757807270faeadfc27b651e298570b"} Dec 03 22:23:52.723254 master-0 kubenswrapper[36504]: I1203 22:23:52.723155 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55ac1860-363d-42ff-90dd-3b6bf2e78864","Type":"ContainerStarted","Data":"5e12fb67a0b1d254ecf626de003532722b96f534c24a9da055331be8e6175b3f"} Dec 03 22:23:52.724118 master-0 kubenswrapper[36504]: I1203 22:23:52.723498 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 03 22:23:52.728264 master-0 kubenswrapper[36504]: I1203 22:23:52.728197 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 03 22:23:53.060823 master-0 kubenswrapper[36504]: I1203 22:23:53.060725 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:53.235107 master-0 kubenswrapper[36504]: I1203 22:23:53.235006 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-operator-scripts\") pod \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\" (UID: \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\") " Dec 03 22:23:53.235107 master-0 kubenswrapper[36504]: I1203 22:23:53.235108 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89cxn\" (UniqueName: \"kubernetes.io/projected/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-kube-api-access-89cxn\") pod \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\" (UID: \"d5e8ecdd-ac48-4859-9bc3-a17949d9879c\") " Dec 03 22:23:53.235906 master-0 kubenswrapper[36504]: I1203 22:23:53.235797 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5e8ecdd-ac48-4859-9bc3-a17949d9879c" (UID: "d5e8ecdd-ac48-4859-9bc3-a17949d9879c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:53.236338 master-0 kubenswrapper[36504]: I1203 22:23:53.236294 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:53.239268 master-0 kubenswrapper[36504]: I1203 22:23:53.239205 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-kube-api-access-89cxn" (OuterVolumeSpecName: "kube-api-access-89cxn") pod "d5e8ecdd-ac48-4859-9bc3-a17949d9879c" (UID: "d5e8ecdd-ac48-4859-9bc3-a17949d9879c"). InnerVolumeSpecName "kube-api-access-89cxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:53.313332 master-0 kubenswrapper[36504]: I1203 22:23:53.313120 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=28.247792315 podStartE2EDuration="55.313049792s" podCreationTimestamp="2025-12-03 22:22:58 +0000 UTC" firstStartedPulling="2025-12-03 22:23:20.776332679 +0000 UTC m=+765.996104686" lastFinishedPulling="2025-12-03 22:23:47.841590146 +0000 UTC m=+793.061362163" observedRunningTime="2025-12-03 22:23:53.303432399 +0000 UTC m=+798.523204416" watchObservedRunningTime="2025-12-03 22:23:53.313049792 +0000 UTC m=+798.532821799" Dec 03 22:23:53.351197 master-0 kubenswrapper[36504]: I1203 22:23:53.349098 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89cxn\" (UniqueName: \"kubernetes.io/projected/d5e8ecdd-ac48-4859-9bc3-a17949d9879c-kube-api-access-89cxn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:53.767890 master-0 kubenswrapper[36504]: I1203 22:23:53.767727 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-84f8-account-create-update-ds2dl" Dec 03 22:23:53.767890 master-0 kubenswrapper[36504]: I1203 22:23:53.767884 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-84f8-account-create-update-ds2dl" event={"ID":"d5e8ecdd-ac48-4859-9bc3-a17949d9879c","Type":"ContainerDied","Data":"a8d52d214d07c7d103bad6cd22804d6912d90f7e525602fd1910c7c714d5516a"} Dec 03 22:23:53.768747 master-0 kubenswrapper[36504]: I1203 22:23:53.767932 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8d52d214d07c7d103bad6cd22804d6912d90f7e525602fd1910c7c714d5516a" Dec 03 22:23:55.158390 master-0 kubenswrapper[36504]: I1203 22:23:55.158271 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" podUID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.190:5353: connect: connection refused" Dec 03 22:23:55.437923 master-0 kubenswrapper[36504]: I1203 22:23:55.437681 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jcc6q"] Dec 03 22:23:55.438447 master-0 kubenswrapper[36504]: E1203 22:23:55.438406 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e8ecdd-ac48-4859-9bc3-a17949d9879c" containerName="mariadb-account-create-update" Dec 03 22:23:55.438447 master-0 kubenswrapper[36504]: I1203 22:23:55.438437 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e8ecdd-ac48-4859-9bc3-a17949d9879c" containerName="mariadb-account-create-update" Dec 03 22:23:55.438698 master-0 kubenswrapper[36504]: I1203 22:23:55.438670 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e8ecdd-ac48-4859-9bc3-a17949d9879c" containerName="mariadb-account-create-update" Dec 03 22:23:55.440653 master-0 kubenswrapper[36504]: I1203 22:23:55.440572 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:55.462421 master-0 kubenswrapper[36504]: I1203 22:23:55.462310 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jcc6q"] Dec 03 22:23:55.630988 master-0 kubenswrapper[36504]: I1203 22:23:55.630885 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92513a58-839a-4447-97a0-8291a5ba09c9-operator-scripts\") pod \"keystone-db-create-jcc6q\" (UID: \"92513a58-839a-4447-97a0-8291a5ba09c9\") " pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:55.631754 master-0 kubenswrapper[36504]: I1203 22:23:55.631190 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx4hc\" (UniqueName: \"kubernetes.io/projected/92513a58-839a-4447-97a0-8291a5ba09c9-kube-api-access-kx4hc\") pod \"keystone-db-create-jcc6q\" (UID: \"92513a58-839a-4447-97a0-8291a5ba09c9\") " pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:55.732895 master-0 kubenswrapper[36504]: I1203 22:23:55.732799 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx4hc\" (UniqueName: \"kubernetes.io/projected/92513a58-839a-4447-97a0-8291a5ba09c9-kube-api-access-kx4hc\") pod \"keystone-db-create-jcc6q\" (UID: \"92513a58-839a-4447-97a0-8291a5ba09c9\") " pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:55.735655 master-0 kubenswrapper[36504]: I1203 22:23:55.735608 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92513a58-839a-4447-97a0-8291a5ba09c9-operator-scripts\") pod \"keystone-db-create-jcc6q\" (UID: \"92513a58-839a-4447-97a0-8291a5ba09c9\") " pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:55.740336 master-0 kubenswrapper[36504]: I1203 22:23:55.740293 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92513a58-839a-4447-97a0-8291a5ba09c9-operator-scripts\") pod \"keystone-db-create-jcc6q\" (UID: \"92513a58-839a-4447-97a0-8291a5ba09c9\") " pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:55.746484 master-0 kubenswrapper[36504]: I1203 22:23:55.746418 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2d6c-account-create-update-55xwh"] Dec 03 22:23:55.748170 master-0 kubenswrapper[36504]: I1203 22:23:55.748145 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:23:55.752105 master-0 kubenswrapper[36504]: I1203 22:23:55.750961 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 22:23:55.797138 master-0 kubenswrapper[36504]: I1203 22:23:55.797046 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0ba6-account-create-update-shlcw" event={"ID":"51e40216-8f57-45da-827f-a42c5fc3fb96","Type":"ContainerDied","Data":"5c486fffcfa582781cac51efe99c10241f509c308987297821d97fbc8e8baa17"} Dec 03 22:23:55.797138 master-0 kubenswrapper[36504]: I1203 22:23:55.797118 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c486fffcfa582781cac51efe99c10241f509c308987297821d97fbc8e8baa17" Dec 03 22:23:55.800978 master-0 kubenswrapper[36504]: I1203 22:23:55.800885 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6lvls" event={"ID":"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8","Type":"ContainerDied","Data":"01e5e8ac757a37844be242d60561a8c600fbe5de21fb148b4eaf408eb39e8360"} Dec 03 22:23:55.800978 master-0 kubenswrapper[36504]: I1203 22:23:55.800967 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01e5e8ac757a37844be242d60561a8c600fbe5de21fb148b4eaf408eb39e8360" Dec 03 22:23:55.803094 master-0 kubenswrapper[36504]: I1203 22:23:55.803048 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v6gf7" event={"ID":"afa046c9-0892-46ec-8325-4739744f5c72","Type":"ContainerDied","Data":"a0ca1a962c814b62be64713df08b90b88875687a804cac7ccd11996f904b65b7"} Dec 03 22:23:55.803094 master-0 kubenswrapper[36504]: I1203 22:23:55.803088 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0ca1a962c814b62be64713df08b90b88875687a804cac7ccd11996f904b65b7" Dec 03 22:23:55.838982 master-0 kubenswrapper[36504]: I1203 22:23:55.838354 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:55.847176 master-0 kubenswrapper[36504]: I1203 22:23:55.844114 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqzxc\" (UniqueName: \"kubernetes.io/projected/e27ac245-600a-4798-a749-bd7884a3fe95-kube-api-access-fqzxc\") pod \"keystone-2d6c-account-create-update-55xwh\" (UID: \"e27ac245-600a-4798-a749-bd7884a3fe95\") " pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:23:55.847176 master-0 kubenswrapper[36504]: I1203 22:23:55.844410 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27ac245-600a-4798-a749-bd7884a3fe95-operator-scripts\") pod \"keystone-2d6c-account-create-update-55xwh\" (UID: \"e27ac245-600a-4798-a749-bd7884a3fe95\") " pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:23:55.847176 master-0 kubenswrapper[36504]: I1203 22:23:55.845926 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6lvls" Dec 03 22:23:55.949034 master-0 kubenswrapper[36504]: I1203 22:23:55.947572 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e40216-8f57-45da-827f-a42c5fc3fb96-operator-scripts\") pod \"51e40216-8f57-45da-827f-a42c5fc3fb96\" (UID: \"51e40216-8f57-45da-827f-a42c5fc3fb96\") " Dec 03 22:23:55.949034 master-0 kubenswrapper[36504]: I1203 22:23:55.947654 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r6pq\" (UniqueName: \"kubernetes.io/projected/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-kube-api-access-4r6pq\") pod \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\" (UID: \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\") " Dec 03 22:23:55.949034 master-0 kubenswrapper[36504]: I1203 22:23:55.947708 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kn4h\" (UniqueName: \"kubernetes.io/projected/51e40216-8f57-45da-827f-a42c5fc3fb96-kube-api-access-6kn4h\") pod \"51e40216-8f57-45da-827f-a42c5fc3fb96\" (UID: \"51e40216-8f57-45da-827f-a42c5fc3fb96\") " Dec 03 22:23:55.949034 master-0 kubenswrapper[36504]: I1203 22:23:55.948004 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-operator-scripts\") pod \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\" (UID: \"bad6d333-f7eb-48b5-bce2-53d69d9b2ab8\") " Dec 03 22:23:55.949034 master-0 kubenswrapper[36504]: I1203 22:23:55.948526 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27ac245-600a-4798-a749-bd7884a3fe95-operator-scripts\") pod \"keystone-2d6c-account-create-update-55xwh\" (UID: \"e27ac245-600a-4798-a749-bd7884a3fe95\") " pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:23:55.949034 master-0 kubenswrapper[36504]: I1203 22:23:55.948866 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqzxc\" (UniqueName: \"kubernetes.io/projected/e27ac245-600a-4798-a749-bd7884a3fe95-kube-api-access-fqzxc\") pod \"keystone-2d6c-account-create-update-55xwh\" (UID: \"e27ac245-600a-4798-a749-bd7884a3fe95\") " pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:23:55.950129 master-0 kubenswrapper[36504]: I1203 22:23:55.949694 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51e40216-8f57-45da-827f-a42c5fc3fb96-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51e40216-8f57-45da-827f-a42c5fc3fb96" (UID: "51e40216-8f57-45da-827f-a42c5fc3fb96"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:55.951150 master-0 kubenswrapper[36504]: I1203 22:23:55.950354 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bad6d333-f7eb-48b5-bce2-53d69d9b2ab8" (UID: "bad6d333-f7eb-48b5-bce2-53d69d9b2ab8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:55.951906 master-0 kubenswrapper[36504]: I1203 22:23:55.951880 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27ac245-600a-4798-a749-bd7884a3fe95-operator-scripts\") pod \"keystone-2d6c-account-create-update-55xwh\" (UID: \"e27ac245-600a-4798-a749-bd7884a3fe95\") " pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:23:55.952913 master-0 kubenswrapper[36504]: I1203 22:23:55.952887 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-kube-api-access-4r6pq" (OuterVolumeSpecName: "kube-api-access-4r6pq") pod "bad6d333-f7eb-48b5-bce2-53d69d9b2ab8" (UID: "bad6d333-f7eb-48b5-bce2-53d69d9b2ab8"). InnerVolumeSpecName "kube-api-access-4r6pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:55.955086 master-0 kubenswrapper[36504]: I1203 22:23:55.955060 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:55.957301 master-0 kubenswrapper[36504]: I1203 22:23:55.957255 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e40216-8f57-45da-827f-a42c5fc3fb96-kube-api-access-6kn4h" (OuterVolumeSpecName: "kube-api-access-6kn4h") pod "51e40216-8f57-45da-827f-a42c5fc3fb96" (UID: "51e40216-8f57-45da-827f-a42c5fc3fb96"). InnerVolumeSpecName "kube-api-access-6kn4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:56.051731 master-0 kubenswrapper[36504]: I1203 22:23:56.051597 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa046c9-0892-46ec-8325-4739744f5c72-operator-scripts\") pod \"afa046c9-0892-46ec-8325-4739744f5c72\" (UID: \"afa046c9-0892-46ec-8325-4739744f5c72\") " Dec 03 22:23:56.051731 master-0 kubenswrapper[36504]: I1203 22:23:56.051661 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nplk2\" (UniqueName: \"kubernetes.io/projected/afa046c9-0892-46ec-8325-4739744f5c72-kube-api-access-nplk2\") pod \"afa046c9-0892-46ec-8325-4739744f5c72\" (UID: \"afa046c9-0892-46ec-8325-4739744f5c72\") " Dec 03 22:23:56.052478 master-0 kubenswrapper[36504]: I1203 22:23:56.052413 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afa046c9-0892-46ec-8325-4739744f5c72-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afa046c9-0892-46ec-8325-4739744f5c72" (UID: "afa046c9-0892-46ec-8325-4739744f5c72"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:56.053382 master-0 kubenswrapper[36504]: I1203 22:23:56.052852 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51e40216-8f57-45da-827f-a42c5fc3fb96-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.053382 master-0 kubenswrapper[36504]: I1203 22:23:56.052879 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r6pq\" (UniqueName: \"kubernetes.io/projected/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-kube-api-access-4r6pq\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.053382 master-0 kubenswrapper[36504]: I1203 22:23:56.052893 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kn4h\" (UniqueName: \"kubernetes.io/projected/51e40216-8f57-45da-827f-a42c5fc3fb96-kube-api-access-6kn4h\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.053382 master-0 kubenswrapper[36504]: I1203 22:23:56.052904 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.053382 master-0 kubenswrapper[36504]: I1203 22:23:56.052914 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afa046c9-0892-46ec-8325-4739744f5c72-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.055454 master-0 kubenswrapper[36504]: I1203 22:23:56.055410 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa046c9-0892-46ec-8325-4739744f5c72-kube-api-access-nplk2" (OuterVolumeSpecName: "kube-api-access-nplk2") pod "afa046c9-0892-46ec-8325-4739744f5c72" (UID: "afa046c9-0892-46ec-8325-4739744f5c72"). InnerVolumeSpecName "kube-api-access-nplk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:56.105851 master-0 kubenswrapper[36504]: I1203 22:23:56.105001 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:56.155643 master-0 kubenswrapper[36504]: I1203 22:23:56.155570 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nplk2\" (UniqueName: \"kubernetes.io/projected/afa046c9-0892-46ec-8325-4739744f5c72-kube-api-access-nplk2\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.201424 master-0 kubenswrapper[36504]: I1203 22:23:56.201202 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2d6c-account-create-update-55xwh"] Dec 03 22:23:56.259010 master-0 kubenswrapper[36504]: I1203 22:23:56.258181 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qqjt\" (UniqueName: \"kubernetes.io/projected/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-kube-api-access-9qqjt\") pod \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " Dec 03 22:23:56.259010 master-0 kubenswrapper[36504]: I1203 22:23:56.258556 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-dns-svc\") pod \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " Dec 03 22:23:56.259010 master-0 kubenswrapper[36504]: I1203 22:23:56.258815 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-ovsdbserver-nb\") pod \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " Dec 03 22:23:56.259010 master-0 kubenswrapper[36504]: I1203 22:23:56.258868 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-config\") pod \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\" (UID: \"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4\") " Dec 03 22:23:56.262714 master-0 kubenswrapper[36504]: I1203 22:23:56.262596 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-kube-api-access-9qqjt" (OuterVolumeSpecName: "kube-api-access-9qqjt") pod "2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" (UID: "2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4"). InnerVolumeSpecName "kube-api-access-9qqjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:56.305233 master-0 kubenswrapper[36504]: I1203 22:23:56.305154 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx4hc\" (UniqueName: \"kubernetes.io/projected/92513a58-839a-4447-97a0-8291a5ba09c9-kube-api-access-kx4hc\") pod \"keystone-db-create-jcc6q\" (UID: \"92513a58-839a-4447-97a0-8291a5ba09c9\") " pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:56.325144 master-0 kubenswrapper[36504]: I1203 22:23:56.325068 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" (UID: "2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:56.331376 master-0 kubenswrapper[36504]: I1203 22:23:56.331305 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-config" (OuterVolumeSpecName: "config") pod "2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" (UID: "2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:56.331564 master-0 kubenswrapper[36504]: I1203 22:23:56.331543 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" (UID: "2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:56.360134 master-0 kubenswrapper[36504]: I1203 22:23:56.360065 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 22:23:56.362266 master-0 kubenswrapper[36504]: I1203 22:23:56.362179 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.362266 master-0 kubenswrapper[36504]: I1203 22:23:56.362261 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.362413 master-0 kubenswrapper[36504]: I1203 22:23:56.362274 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.362413 master-0 kubenswrapper[36504]: I1203 22:23:56.362286 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qqjt\" (UniqueName: \"kubernetes.io/projected/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4-kube-api-access-9qqjt\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:56.374097 master-0 kubenswrapper[36504]: I1203 22:23:56.374004 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:56.817470 master-0 kubenswrapper[36504]: I1203 22:23:56.817268 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" event={"ID":"2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4","Type":"ContainerDied","Data":"0f8f1ede98817c234333c78ed48ba735a8bc850c2807cf5a8129235e4bbc43f8"} Dec 03 22:23:56.817470 master-0 kubenswrapper[36504]: I1203 22:23:56.817382 36504 scope.go:117] "RemoveContainer" containerID="6346a4ab8837b2f87a2edc9db97d7e0202ead3dccab811f25aa2838ccb62d005" Dec 03 22:23:56.817908 master-0 kubenswrapper[36504]: I1203 22:23:56.817630 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b4bcf6ff-xg7f9" Dec 03 22:23:56.828048 master-0 kubenswrapper[36504]: I1203 22:23:56.827981 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4grtq" event={"ID":"e8dbecb3-17d6-45fd-9045-ee54e5009171","Type":"ContainerStarted","Data":"755a6377cc2999d5ea8104a68f9f3a58d74bd518bcd9daa827ed4c841fdc3c0a"} Dec 03 22:23:56.830213 master-0 kubenswrapper[36504]: I1203 22:23:56.830174 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerStarted","Data":"51f99190829cd0abd5e0817951a586d79cdeeb56f49085a0cae0bd57a18f87de"} Dec 03 22:23:56.830307 master-0 kubenswrapper[36504]: I1203 22:23:56.830260 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0ba6-account-create-update-shlcw" Dec 03 22:23:56.830393 master-0 kubenswrapper[36504]: I1203 22:23:56.830305 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v6gf7" Dec 03 22:23:56.830393 master-0 kubenswrapper[36504]: I1203 22:23:56.830377 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6lvls" Dec 03 22:23:56.881190 master-0 kubenswrapper[36504]: I1203 22:23:56.880143 36504 scope.go:117] "RemoveContainer" containerID="6429a275acb5013bead8f19bcf570490cb7516920ad031acc6331769aa93b329" Dec 03 22:23:56.917150 master-0 kubenswrapper[36504]: I1203 22:23:56.915575 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqzxc\" (UniqueName: \"kubernetes.io/projected/e27ac245-600a-4798-a749-bd7884a3fe95-kube-api-access-fqzxc\") pod \"keystone-2d6c-account-create-update-55xwh\" (UID: \"e27ac245-600a-4798-a749-bd7884a3fe95\") " pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:23:57.088643 master-0 kubenswrapper[36504]: I1203 22:23:57.088455 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:23:57.088942 master-0 kubenswrapper[36504]: E1203 22:23:57.088706 36504 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 22:23:57.088942 master-0 kubenswrapper[36504]: E1203 22:23:57.088744 36504 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 22:23:57.088942 master-0 kubenswrapper[36504]: E1203 22:23:57.088900 36504 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift podName:07b5d783-be96-4a6b-8603-e6f56f13f233 nodeName:}" failed. No retries permitted until 2025-12-03 22:24:13.088873874 +0000 UTC m=+818.308645881 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift") pod "swift-storage-0" (UID: "07b5d783-be96-4a6b-8603-e6f56f13f233") : configmap "swift-ring-files" not found Dec 03 22:23:57.192096 master-0 kubenswrapper[36504]: I1203 22:23:57.192007 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:23:57.218601 master-0 kubenswrapper[36504]: I1203 22:23:57.218496 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jcc6q"] Dec 03 22:23:57.314678 master-0 kubenswrapper[36504]: W1203 22:23:57.314587 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92513a58_839a_4447_97a0_8291a5ba09c9.slice/crio-fa479274150c69571c71da48ec3724ced4cd939ecdd3c3376860025474fa573e WatchSource:0}: Error finding container fa479274150c69571c71da48ec3724ced4cd939ecdd3c3376860025474fa573e: Status 404 returned error can't find the container with id fa479274150c69571c71da48ec3724ced4cd939ecdd3c3376860025474fa573e Dec 03 22:23:57.727423 master-0 kubenswrapper[36504]: I1203 22:23:57.726057 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-xg7f9"] Dec 03 22:23:57.743490 master-0 kubenswrapper[36504]: I1203 22:23:57.743236 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b4bcf6ff-xg7f9"] Dec 03 22:23:57.779015 master-0 kubenswrapper[36504]: I1203 22:23:57.776376 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2d6c-account-create-update-55xwh"] Dec 03 22:23:57.877652 master-0 kubenswrapper[36504]: I1203 22:23:57.877576 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jcc6q" event={"ID":"92513a58-839a-4447-97a0-8291a5ba09c9","Type":"ContainerStarted","Data":"bcb1f1322002bce60ffab99d1283a433117edab66db6e25f19984b4bed5604ab"} Dec 03 22:23:57.877652 master-0 kubenswrapper[36504]: I1203 22:23:57.877651 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jcc6q" event={"ID":"92513a58-839a-4447-97a0-8291a5ba09c9","Type":"ContainerStarted","Data":"fa479274150c69571c71da48ec3724ced4cd939ecdd3c3376860025474fa573e"} Dec 03 22:23:57.879999 master-0 kubenswrapper[36504]: I1203 22:23:57.879959 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2d6c-account-create-update-55xwh" event={"ID":"e27ac245-600a-4798-a749-bd7884a3fe95","Type":"ContainerStarted","Data":"cb258037a97e6f58041ff2af83a999b26fc8e8f1b0514f22737175769579605d"} Dec 03 22:23:57.958337 master-0 kubenswrapper[36504]: I1203 22:23:57.958218 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-4grtq" podStartSLOduration=3.631230255 podStartE2EDuration="13.958167145s" podCreationTimestamp="2025-12-03 22:23:44 +0000 UTC" firstStartedPulling="2025-12-03 22:23:45.513758444 +0000 UTC m=+790.733530451" lastFinishedPulling="2025-12-03 22:23:55.840695334 +0000 UTC m=+801.060467341" observedRunningTime="2025-12-03 22:23:57.937494916 +0000 UTC m=+803.157266933" watchObservedRunningTime="2025-12-03 22:23:57.958167145 +0000 UTC m=+803.177939152" Dec 03 22:23:58.897523 master-0 kubenswrapper[36504]: I1203 22:23:58.897440 36504 generic.go:334] "Generic (PLEG): container finished" podID="e27ac245-600a-4798-a749-bd7884a3fe95" containerID="5f2bde362e7220f54a9e1257828cfea401e9ab086abf600d81456241c3b24b1a" exitCode=0 Dec 03 22:23:58.898337 master-0 kubenswrapper[36504]: I1203 22:23:58.897557 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2d6c-account-create-update-55xwh" 
event={"ID":"e27ac245-600a-4798-a749-bd7884a3fe95","Type":"ContainerDied","Data":"5f2bde362e7220f54a9e1257828cfea401e9ab086abf600d81456241c3b24b1a"} Dec 03 22:23:58.900613 master-0 kubenswrapper[36504]: I1203 22:23:58.900563 36504 generic.go:334] "Generic (PLEG): container finished" podID="92513a58-839a-4447-97a0-8291a5ba09c9" containerID="bcb1f1322002bce60ffab99d1283a433117edab66db6e25f19984b4bed5604ab" exitCode=0 Dec 03 22:23:58.900613 master-0 kubenswrapper[36504]: I1203 22:23:58.900609 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jcc6q" event={"ID":"92513a58-839a-4447-97a0-8291a5ba09c9","Type":"ContainerDied","Data":"bcb1f1322002bce60ffab99d1283a433117edab66db6e25f19984b4bed5604ab"} Dec 03 22:23:59.153154 master-0 kubenswrapper[36504]: I1203 22:23:59.152789 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" path="/var/lib/kubelet/pods/2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4/volumes" Dec 03 22:23:59.506588 master-0 kubenswrapper[36504]: I1203 22:23:59.506486 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:59.521191 master-0 kubenswrapper[36504]: I1203 22:23:59.521098 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92513a58-839a-4447-97a0-8291a5ba09c9-operator-scripts\") pod \"92513a58-839a-4447-97a0-8291a5ba09c9\" (UID: \"92513a58-839a-4447-97a0-8291a5ba09c9\") " Dec 03 22:23:59.521339 master-0 kubenswrapper[36504]: I1203 22:23:59.521280 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx4hc\" (UniqueName: \"kubernetes.io/projected/92513a58-839a-4447-97a0-8291a5ba09c9-kube-api-access-kx4hc\") pod \"92513a58-839a-4447-97a0-8291a5ba09c9\" (UID: \"92513a58-839a-4447-97a0-8291a5ba09c9\") " Dec 03 22:23:59.522617 master-0 kubenswrapper[36504]: I1203 22:23:59.522503 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92513a58-839a-4447-97a0-8291a5ba09c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92513a58-839a-4447-97a0-8291a5ba09c9" (UID: "92513a58-839a-4447-97a0-8291a5ba09c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:23:59.528552 master-0 kubenswrapper[36504]: I1203 22:23:59.528425 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92513a58-839a-4447-97a0-8291a5ba09c9-kube-api-access-kx4hc" (OuterVolumeSpecName: "kube-api-access-kx4hc") pod "92513a58-839a-4447-97a0-8291a5ba09c9" (UID: "92513a58-839a-4447-97a0-8291a5ba09c9"). InnerVolumeSpecName "kube-api-access-kx4hc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:23:59.623193 master-0 kubenswrapper[36504]: I1203 22:23:59.623095 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92513a58-839a-4447-97a0-8291a5ba09c9-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:59.623193 master-0 kubenswrapper[36504]: I1203 22:23:59.623177 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx4hc\" (UniqueName: \"kubernetes.io/projected/92513a58-839a-4447-97a0-8291a5ba09c9-kube-api-access-kx4hc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:23:59.916214 master-0 kubenswrapper[36504]: I1203 22:23:59.916120 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jcc6q" Dec 03 22:23:59.916996 master-0 kubenswrapper[36504]: I1203 22:23:59.916104 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jcc6q" event={"ID":"92513a58-839a-4447-97a0-8291a5ba09c9","Type":"ContainerDied","Data":"fa479274150c69571c71da48ec3724ced4cd939ecdd3c3376860025474fa573e"} Dec 03 22:23:59.916996 master-0 kubenswrapper[36504]: I1203 22:23:59.916330 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa479274150c69571c71da48ec3724ced4cd939ecdd3c3376860025474fa573e" Dec 03 22:24:00.104462 master-0 kubenswrapper[36504]: E1203 22:24:00.100222 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92513a58_839a_4447_97a0_8291a5ba09c9.slice\": RecentStats: unable to find data in memory cache]" Dec 03 22:24:00.400416 master-0 kubenswrapper[36504]: I1203 22:24:00.399631 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:24:00.447384 master-0 kubenswrapper[36504]: I1203 22:24:00.447185 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27ac245-600a-4798-a749-bd7884a3fe95-operator-scripts\") pod \"e27ac245-600a-4798-a749-bd7884a3fe95\" (UID: \"e27ac245-600a-4798-a749-bd7884a3fe95\") " Dec 03 22:24:00.447384 master-0 kubenswrapper[36504]: I1203 22:24:00.447293 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqzxc\" (UniqueName: \"kubernetes.io/projected/e27ac245-600a-4798-a749-bd7884a3fe95-kube-api-access-fqzxc\") pod \"e27ac245-600a-4798-a749-bd7884a3fe95\" (UID: \"e27ac245-600a-4798-a749-bd7884a3fe95\") " Dec 03 22:24:00.448085 master-0 kubenswrapper[36504]: I1203 22:24:00.448015 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e27ac245-600a-4798-a749-bd7884a3fe95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e27ac245-600a-4798-a749-bd7884a3fe95" (UID: "e27ac245-600a-4798-a749-bd7884a3fe95"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:00.448645 master-0 kubenswrapper[36504]: I1203 22:24:00.448617 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e27ac245-600a-4798-a749-bd7884a3fe95-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:00.487129 master-0 kubenswrapper[36504]: I1203 22:24:00.487044 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e27ac245-600a-4798-a749-bd7884a3fe95-kube-api-access-fqzxc" (OuterVolumeSpecName: "kube-api-access-fqzxc") pod "e27ac245-600a-4798-a749-bd7884a3fe95" (UID: "e27ac245-600a-4798-a749-bd7884a3fe95"). InnerVolumeSpecName "kube-api-access-fqzxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:00.550211 master-0 kubenswrapper[36504]: I1203 22:24:00.550063 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqzxc\" (UniqueName: \"kubernetes.io/projected/e27ac245-600a-4798-a749-bd7884a3fe95-kube-api-access-fqzxc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:00.929797 master-0 kubenswrapper[36504]: I1203 22:24:00.929697 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerStarted","Data":"2041cc760388ff620b91c23e3a377192b215481dbe5edd06049cc6ebfb9ec635"} Dec 03 22:24:00.932113 master-0 kubenswrapper[36504]: I1203 22:24:00.932047 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2d6c-account-create-update-55xwh" event={"ID":"e27ac245-600a-4798-a749-bd7884a3fe95","Type":"ContainerDied","Data":"cb258037a97e6f58041ff2af83a999b26fc8e8f1b0514f22737175769579605d"} Dec 03 22:24:00.932113 master-0 kubenswrapper[36504]: I1203 22:24:00.932106 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2d6c-account-create-update-55xwh" Dec 03 22:24:00.932283 master-0 kubenswrapper[36504]: I1203 22:24:00.932122 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb258037a97e6f58041ff2af83a999b26fc8e8f1b0514f22737175769579605d" Dec 03 22:24:02.285575 master-0 kubenswrapper[36504]: I1203 22:24:02.285465 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8bwst"] Dec 03 22:24:02.286757 master-0 kubenswrapper[36504]: E1203 22:24:02.286713 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27ac245-600a-4798-a749-bd7884a3fe95" containerName="mariadb-account-create-update" Dec 03 22:24:02.286757 master-0 kubenswrapper[36504]: I1203 22:24:02.286745 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27ac245-600a-4798-a749-bd7884a3fe95" containerName="mariadb-account-create-update" Dec 03 22:24:02.286890 master-0 kubenswrapper[36504]: E1203 22:24:02.286784 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa046c9-0892-46ec-8325-4739744f5c72" containerName="mariadb-database-create" Dec 03 22:24:02.286890 master-0 kubenswrapper[36504]: I1203 22:24:02.286882 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa046c9-0892-46ec-8325-4739744f5c72" containerName="mariadb-database-create" Dec 03 22:24:02.286963 master-0 kubenswrapper[36504]: E1203 22:24:02.286898 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad6d333-f7eb-48b5-bce2-53d69d9b2ab8" containerName="mariadb-database-create" Dec 03 22:24:02.286963 master-0 kubenswrapper[36504]: I1203 22:24:02.286907 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad6d333-f7eb-48b5-bce2-53d69d9b2ab8" containerName="mariadb-database-create" Dec 03 22:24:02.287032 master-0 kubenswrapper[36504]: E1203 22:24:02.286995 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51e40216-8f57-45da-827f-a42c5fc3fb96" containerName="mariadb-account-create-update" Dec 03 22:24:02.287032 master-0 kubenswrapper[36504]: I1203 22:24:02.287004 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e40216-8f57-45da-827f-a42c5fc3fb96" containerName="mariadb-account-create-update" Dec 03 22:24:02.287104 master-0 kubenswrapper[36504]: E1203 22:24:02.287037 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerName="init" Dec 03 22:24:02.287104 master-0 kubenswrapper[36504]: I1203 22:24:02.287045 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerName="init" Dec 03 22:24:02.287104 master-0 kubenswrapper[36504]: E1203 22:24:02.287064 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerName="dnsmasq-dns" Dec 03 22:24:02.287104 master-0 kubenswrapper[36504]: I1203 22:24:02.287071 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerName="dnsmasq-dns" Dec 03 22:24:02.287104 master-0 kubenswrapper[36504]: E1203 22:24:02.287084 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92513a58-839a-4447-97a0-8291a5ba09c9" containerName="mariadb-database-create" Dec 03 22:24:02.287104 master-0 kubenswrapper[36504]: I1203 22:24:02.287092 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="92513a58-839a-4447-97a0-8291a5ba09c9" containerName="mariadb-database-create" Dec 03 22:24:02.287373 
master-0 kubenswrapper[36504]: I1203 22:24:02.287347 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="92513a58-839a-4447-97a0-8291a5ba09c9" containerName="mariadb-database-create" Dec 03 22:24:02.287373 master-0 kubenswrapper[36504]: I1203 22:24:02.287367 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad6d333-f7eb-48b5-bce2-53d69d9b2ab8" containerName="mariadb-database-create" Dec 03 22:24:02.287462 master-0 kubenswrapper[36504]: I1203 22:24:02.287386 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b09b918-0cbd-4072-b3f8-4fe0efcfcdd4" containerName="dnsmasq-dns" Dec 03 22:24:02.287462 master-0 kubenswrapper[36504]: I1203 22:24:02.287399 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="51e40216-8f57-45da-827f-a42c5fc3fb96" containerName="mariadb-account-create-update" Dec 03 22:24:02.287462 master-0 kubenswrapper[36504]: I1203 22:24:02.287419 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa046c9-0892-46ec-8325-4739744f5c72" containerName="mariadb-database-create" Dec 03 22:24:02.287462 master-0 kubenswrapper[36504]: I1203 22:24:02.287443 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27ac245-600a-4798-a749-bd7884a3fe95" containerName="mariadb-account-create-update" Dec 03 22:24:02.288458 master-0 kubenswrapper[36504]: I1203 22:24:02.288412 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.293835 master-0 kubenswrapper[36504]: I1203 22:24:02.293799 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-baebb-config-data" Dec 03 22:24:02.311693 master-0 kubenswrapper[36504]: I1203 22:24:02.311566 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8bwst"] Dec 03 22:24:02.429217 master-0 kubenswrapper[36504]: I1203 22:24:02.429133 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-db-sync-config-data\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.429631 master-0 kubenswrapper[36504]: I1203 22:24:02.429307 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-combined-ca-bundle\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.429631 master-0 kubenswrapper[36504]: I1203 22:24:02.429504 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-config-data\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.429631 master-0 kubenswrapper[36504]: I1203 22:24:02.429585 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28jlt\" (UniqueName: \"kubernetes.io/projected/b0d98163-c65a-412b-a4c4-32ce8bff30a8-kube-api-access-28jlt\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.532439 master-0 
kubenswrapper[36504]: I1203 22:24:02.532338 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-db-sync-config-data\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.533683 master-0 kubenswrapper[36504]: I1203 22:24:02.532520 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-combined-ca-bundle\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.533683 master-0 kubenswrapper[36504]: I1203 22:24:02.532797 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-config-data\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.533683 master-0 kubenswrapper[36504]: I1203 22:24:02.532892 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28jlt\" (UniqueName: \"kubernetes.io/projected/b0d98163-c65a-412b-a4c4-32ce8bff30a8-kube-api-access-28jlt\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.537380 master-0 kubenswrapper[36504]: I1203 22:24:02.537280 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-db-sync-config-data\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.537853 master-0 kubenswrapper[36504]: I1203 22:24:02.537736 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-combined-ca-bundle\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.550211 master-0 kubenswrapper[36504]: I1203 22:24:02.550152 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-config-data\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.554391 master-0 kubenswrapper[36504]: I1203 22:24:02.554344 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28jlt\" (UniqueName: \"kubernetes.io/projected/b0d98163-c65a-412b-a4c4-32ce8bff30a8-kube-api-access-28jlt\") pod \"glance-db-sync-8bwst\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:02.616188 master-0 kubenswrapper[36504]: I1203 22:24:02.616098 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:03.302115 master-0 kubenswrapper[36504]: I1203 22:24:03.302017 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8bwst"] Dec 03 22:24:04.245847 master-0 kubenswrapper[36504]: I1203 22:24:04.244942 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zqx7b" podUID="13962c28-06ea-4b66-aa46-f00d50e29eda" containerName="ovn-controller" probeResult="failure" output=< Dec 03 22:24:04.245847 master-0 kubenswrapper[36504]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 22:24:04.245847 master-0 kubenswrapper[36504]: > Dec 03 22:24:04.325735 master-0 kubenswrapper[36504]: W1203 22:24:04.325650 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0d98163_c65a_412b_a4c4_32ce8bff30a8.slice/crio-49c33769a87d2e1f6c9fd86cdfb97c45537b1abdc9f301d010952ec8c7b69f05 WatchSource:0}: Error finding container 49c33769a87d2e1f6c9fd86cdfb97c45537b1abdc9f301d010952ec8c7b69f05: Status 404 returned error can't find the container with id 49c33769a87d2e1f6c9fd86cdfb97c45537b1abdc9f301d010952ec8c7b69f05 Dec 03 22:24:04.381856 master-0 kubenswrapper[36504]: I1203 22:24:04.381787 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:24:05.124381 master-0 kubenswrapper[36504]: I1203 22:24:05.124302 36504 generic.go:334] "Generic (PLEG): container finished" podID="6be1c526-0db0-4e85-9d73-8c78c20e4273" containerID="d931b4a9ee9e6c7444f8d71c51d35068dc1dd765108eeb277a4ed95fe606739e" exitCode=0 Dec 03 22:24:05.124381 master-0 kubenswrapper[36504]: I1203 22:24:05.124387 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6be1c526-0db0-4e85-9d73-8c78c20e4273","Type":"ContainerDied","Data":"d931b4a9ee9e6c7444f8d71c51d35068dc1dd765108eeb277a4ed95fe606739e"} Dec 03 22:24:05.129176 master-0 kubenswrapper[36504]: I1203 22:24:05.129095 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8bwst" event={"ID":"b0d98163-c65a-412b-a4c4-32ce8bff30a8","Type":"ContainerStarted","Data":"49c33769a87d2e1f6c9fd86cdfb97c45537b1abdc9f301d010952ec8c7b69f05"} Dec 03 22:24:05.136359 master-0 kubenswrapper[36504]: I1203 22:24:05.136284 36504 generic.go:334] "Generic (PLEG): container finished" podID="e8dbecb3-17d6-45fd-9045-ee54e5009171" containerID="755a6377cc2999d5ea8104a68f9f3a58d74bd518bcd9daa827ed4c841fdc3c0a" exitCode=0 Dec 03 22:24:05.136571 master-0 kubenswrapper[36504]: I1203 22:24:05.136400 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4grtq" event={"ID":"e8dbecb3-17d6-45fd-9045-ee54e5009171","Type":"ContainerDied","Data":"755a6377cc2999d5ea8104a68f9f3a58d74bd518bcd9daa827ed4c841fdc3c0a"} Dec 03 22:24:05.140841 master-0 kubenswrapper[36504]: I1203 22:24:05.140725 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerStarted","Data":"72ac727550c6b9edefbb8d84d41ee176a75b8a08dfbc158a35a79286e5bf62a1"} Dec 03 22:24:05.197265 master-0 kubenswrapper[36504]: I1203 22:24:05.196952 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.529392486 
podStartE2EDuration="1m7.196928896s" podCreationTimestamp="2025-12-03 22:22:58 +0000 UTC" firstStartedPulling="2025-12-03 22:23:20.78626671 +0000 UTC m=+766.006038717" lastFinishedPulling="2025-12-03 22:24:04.45380312 +0000 UTC m=+809.673575127" observedRunningTime="2025-12-03 22:24:05.186009843 +0000 UTC m=+810.405781860" watchObservedRunningTime="2025-12-03 22:24:05.196928896 +0000 UTC m=+810.416700903" Dec 03 22:24:06.162383 master-0 kubenswrapper[36504]: I1203 22:24:06.162289 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6be1c526-0db0-4e85-9d73-8c78c20e4273","Type":"ContainerStarted","Data":"c32f2de555e4e31c0b2e34c0c2910788a5e69c5eb25a8d131b11cd9c2a806e57"} Dec 03 22:24:06.166386 master-0 kubenswrapper[36504]: I1203 22:24:06.166326 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:24:06.219316 master-0 kubenswrapper[36504]: I1203 22:24:06.219184 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=64.223500875 podStartE2EDuration="1m14.219146604s" podCreationTimestamp="2025-12-03 22:22:52 +0000 UTC" firstStartedPulling="2025-12-03 22:23:19.338245829 +0000 UTC m=+764.558017836" lastFinishedPulling="2025-12-03 22:23:29.333891558 +0000 UTC m=+774.553663565" observedRunningTime="2025-12-03 22:24:06.202227072 +0000 UTC m=+811.421999079" watchObservedRunningTime="2025-12-03 22:24:06.219146604 +0000 UTC m=+811.438918601" Dec 03 22:24:06.686646 master-0 kubenswrapper[36504]: I1203 22:24:06.685280 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:24:06.819162 master-0 kubenswrapper[36504]: I1203 22:24:06.818954 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-combined-ca-bundle\") pod \"e8dbecb3-17d6-45fd-9045-ee54e5009171\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " Dec 03 22:24:06.819162 master-0 kubenswrapper[36504]: I1203 22:24:06.819098 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-swiftconf\") pod \"e8dbecb3-17d6-45fd-9045-ee54e5009171\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " Dec 03 22:24:06.819500 master-0 kubenswrapper[36504]: I1203 22:24:06.819193 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-dispersionconf\") pod \"e8dbecb3-17d6-45fd-9045-ee54e5009171\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " Dec 03 22:24:06.819500 master-0 kubenswrapper[36504]: I1203 22:24:06.819292 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e8dbecb3-17d6-45fd-9045-ee54e5009171-etc-swift\") pod \"e8dbecb3-17d6-45fd-9045-ee54e5009171\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " Dec 03 22:24:06.819500 master-0 kubenswrapper[36504]: I1203 22:24:06.819397 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-scripts\") pod \"e8dbecb3-17d6-45fd-9045-ee54e5009171\" (UID: 
\"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " Dec 03 22:24:06.819500 master-0 kubenswrapper[36504]: I1203 22:24:06.819432 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzpc5\" (UniqueName: \"kubernetes.io/projected/e8dbecb3-17d6-45fd-9045-ee54e5009171-kube-api-access-tzpc5\") pod \"e8dbecb3-17d6-45fd-9045-ee54e5009171\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " Dec 03 22:24:06.819840 master-0 kubenswrapper[36504]: I1203 22:24:06.819525 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-ring-data-devices\") pod \"e8dbecb3-17d6-45fd-9045-ee54e5009171\" (UID: \"e8dbecb3-17d6-45fd-9045-ee54e5009171\") " Dec 03 22:24:06.821178 master-0 kubenswrapper[36504]: I1203 22:24:06.821097 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e8dbecb3-17d6-45fd-9045-ee54e5009171" (UID: "e8dbecb3-17d6-45fd-9045-ee54e5009171"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:06.822282 master-0 kubenswrapper[36504]: I1203 22:24:06.822244 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8dbecb3-17d6-45fd-9045-ee54e5009171-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e8dbecb3-17d6-45fd-9045-ee54e5009171" (UID: "e8dbecb3-17d6-45fd-9045-ee54e5009171"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:24:06.828607 master-0 kubenswrapper[36504]: I1203 22:24:06.827911 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8dbecb3-17d6-45fd-9045-ee54e5009171-kube-api-access-tzpc5" (OuterVolumeSpecName: "kube-api-access-tzpc5") pod "e8dbecb3-17d6-45fd-9045-ee54e5009171" (UID: "e8dbecb3-17d6-45fd-9045-ee54e5009171"). InnerVolumeSpecName "kube-api-access-tzpc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:06.830252 master-0 kubenswrapper[36504]: I1203 22:24:06.830212 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e8dbecb3-17d6-45fd-9045-ee54e5009171" (UID: "e8dbecb3-17d6-45fd-9045-ee54e5009171"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:06.855877 master-0 kubenswrapper[36504]: I1203 22:24:06.855623 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e8dbecb3-17d6-45fd-9045-ee54e5009171" (UID: "e8dbecb3-17d6-45fd-9045-ee54e5009171"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:06.858473 master-0 kubenswrapper[36504]: I1203 22:24:06.858427 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-scripts" (OuterVolumeSpecName: "scripts") pod "e8dbecb3-17d6-45fd-9045-ee54e5009171" (UID: "e8dbecb3-17d6-45fd-9045-ee54e5009171"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:06.861978 master-0 kubenswrapper[36504]: I1203 22:24:06.861893 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8dbecb3-17d6-45fd-9045-ee54e5009171" (UID: "e8dbecb3-17d6-45fd-9045-ee54e5009171"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:06.924615 master-0 kubenswrapper[36504]: I1203 22:24:06.923483 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:06.924615 master-0 kubenswrapper[36504]: I1203 22:24:06.923549 36504 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-swiftconf\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:06.924615 master-0 kubenswrapper[36504]: I1203 22:24:06.923568 36504 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e8dbecb3-17d6-45fd-9045-ee54e5009171-dispersionconf\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:06.924615 master-0 kubenswrapper[36504]: I1203 22:24:06.923581 36504 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e8dbecb3-17d6-45fd-9045-ee54e5009171-etc-swift\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:06.924615 master-0 kubenswrapper[36504]: I1203 22:24:06.923596 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:06.924615 master-0 kubenswrapper[36504]: I1203 22:24:06.923609 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzpc5\" (UniqueName: \"kubernetes.io/projected/e8dbecb3-17d6-45fd-9045-ee54e5009171-kube-api-access-tzpc5\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:06.924615 master-0 kubenswrapper[36504]: I1203 22:24:06.923624 36504 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e8dbecb3-17d6-45fd-9045-ee54e5009171-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:07.181122 master-0 kubenswrapper[36504]: I1203 22:24:07.181025 36504 generic.go:334] "Generic (PLEG): container finished" podID="09ea7cea-94f9-4bdb-ad61-24281b0ee1ed" containerID="b06d77b769597bccba710e9daadeadfb465fe9daf309858d47da3df25d06f5b3" exitCode=0 Dec 03 22:24:07.182009 master-0 kubenswrapper[36504]: I1203 22:24:07.181168 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed","Type":"ContainerDied","Data":"b06d77b769597bccba710e9daadeadfb465fe9daf309858d47da3df25d06f5b3"} Dec 03 22:24:07.184295 master-0 kubenswrapper[36504]: I1203 22:24:07.184213 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-4grtq" Dec 03 22:24:07.184485 master-0 kubenswrapper[36504]: I1203 22:24:07.184427 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-4grtq" event={"ID":"e8dbecb3-17d6-45fd-9045-ee54e5009171","Type":"ContainerDied","Data":"afef924da9fac45824aad69b6c6e3eea4f5a20b4a6605b8721f066ff11681fb2"} Dec 03 22:24:07.184591 master-0 kubenswrapper[36504]: I1203 22:24:07.184568 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afef924da9fac45824aad69b6c6e3eea4f5a20b4a6605b8721f066ff11681fb2" Dec 03 22:24:07.519245 master-0 kubenswrapper[36504]: I1203 22:24:07.519160 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:07.521937 master-0 kubenswrapper[36504]: I1203 22:24:07.521191 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:07.522372 master-0 kubenswrapper[36504]: I1203 22:24:07.522327 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:08.203221 master-0 kubenswrapper[36504]: I1203 22:24:08.203139 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"09ea7cea-94f9-4bdb-ad61-24281b0ee1ed","Type":"ContainerStarted","Data":"587b1e421f8f08cc47bca685b0a3a25c813a850795f882d1e92fd1176a099ab6"} Dec 03 22:24:08.204070 master-0 kubenswrapper[36504]: I1203 22:24:08.203733 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 22:24:08.207056 master-0 kubenswrapper[36504]: I1203 22:24:08.206655 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:08.255809 master-0 kubenswrapper[36504]: I1203 22:24:08.255610 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=68.798528517 podStartE2EDuration="1m18.255576817s" podCreationTimestamp="2025-12-03 22:22:50 +0000 UTC" firstStartedPulling="2025-12-03 22:23:19.924922059 +0000 UTC m=+765.144694066" lastFinishedPulling="2025-12-03 22:23:29.381970359 +0000 UTC m=+774.601742366" observedRunningTime="2025-12-03 22:24:08.230267112 +0000 UTC m=+813.450039129" watchObservedRunningTime="2025-12-03 22:24:08.255576817 +0000 UTC m=+813.475348824" Dec 03 22:24:08.815500 master-0 kubenswrapper[36504]: I1203 22:24:08.814741 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54f54bc9-qrsbs"] Dec 03 22:24:08.816366 master-0 kubenswrapper[36504]: E1203 22:24:08.816322 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8dbecb3-17d6-45fd-9045-ee54e5009171" containerName="swift-ring-rebalance" Dec 03 22:24:08.816366 master-0 kubenswrapper[36504]: I1203 22:24:08.816359 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8dbecb3-17d6-45fd-9045-ee54e5009171" containerName="swift-ring-rebalance" Dec 03 22:24:08.816710 master-0 kubenswrapper[36504]: I1203 22:24:08.816673 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8dbecb3-17d6-45fd-9045-ee54e5009171" containerName="swift-ring-rebalance" Dec 03 22:24:08.818731 master-0 kubenswrapper[36504]: I1203 22:24:08.818576 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:08.825801 master-0 kubenswrapper[36504]: I1203 22:24:08.824721 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"edpm" Dec 03 22:24:08.847927 master-0 kubenswrapper[36504]: I1203 22:24:08.846258 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54f54bc9-qrsbs"] Dec 03 22:24:09.003921 master-0 kubenswrapper[36504]: I1203 22:24:09.001592 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-config\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.003921 master-0 kubenswrapper[36504]: I1203 22:24:09.001723 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-sb\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.003921 master-0 kubenswrapper[36504]: I1203 22:24:09.001840 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-dns-svc\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.003921 master-0 kubenswrapper[36504]: I1203 22:24:09.002045 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-nb\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.003921 master-0 kubenswrapper[36504]: I1203 22:24:09.002068 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlbnh\" (UniqueName: \"kubernetes.io/projected/57cee922-243c-468d-b06e-4b87593b257a-kube-api-access-vlbnh\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.003921 master-0 kubenswrapper[36504]: I1203 22:24:09.002098 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-edpm\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.088852 master-0 kubenswrapper[36504]: I1203 22:24:09.088664 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54f54bc9-qrsbs"] Dec 03 22:24:09.089904 master-0 kubenswrapper[36504]: E1203 22:24:09.089870 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc edpm kube-api-access-vlbnh ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" podUID="57cee922-243c-468d-b06e-4b87593b257a" Dec 03 22:24:09.107617 master-0 kubenswrapper[36504]: I1203 22:24:09.104490 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-config\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.107617 master-0 kubenswrapper[36504]: I1203 22:24:09.104595 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-sb\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.107617 master-0 kubenswrapper[36504]: I1203 22:24:09.104732 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-dns-svc\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.107617 master-0 kubenswrapper[36504]: I1203 22:24:09.104959 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-nb\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.107617 master-0 kubenswrapper[36504]: I1203 22:24:09.104987 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlbnh\" (UniqueName: \"kubernetes.io/projected/57cee922-243c-468d-b06e-4b87593b257a-kube-api-access-vlbnh\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.107617 master-0 kubenswrapper[36504]: I1203 22:24:09.105032 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-edpm\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.107617 master-0 kubenswrapper[36504]: I1203 22:24:09.106935 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-dns-svc\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.108161 master-0 kubenswrapper[36504]: I1203 22:24:09.107711 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-nb\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.108303 master-0 kubenswrapper[36504]: I1203 22:24:09.108250 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-config\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.108369 master-0 kubenswrapper[36504]: I1203 22:24:09.108268 36504 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-sb\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.114679 master-0 kubenswrapper[36504]: I1203 22:24:09.114599 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-edpm\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.153911 master-0 kubenswrapper[36504]: I1203 22:24:09.151118 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlbnh\" (UniqueName: \"kubernetes.io/projected/57cee922-243c-468d-b06e-4b87593b257a-kube-api-access-vlbnh\") pod \"dnsmasq-dns-f54f54bc9-qrsbs\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.153911 master-0 kubenswrapper[36504]: I1203 22:24:09.151230 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff645d44c-2ndhp"] Dec 03 22:24:09.174262 master-0 kubenswrapper[36504]: I1203 22:24:09.158256 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.185637 master-0 kubenswrapper[36504]: I1203 22:24:09.185199 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff645d44c-2ndhp"] Dec 03 22:24:09.235464 master-0 kubenswrapper[36504]: I1203 22:24:09.235375 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.252949 master-0 kubenswrapper[36504]: I1203 22:24:09.252802 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:09.278520 master-0 kubenswrapper[36504]: I1203 22:24:09.276464 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zqx7b" podUID="13962c28-06ea-4b66-aa46-f00d50e29eda" containerName="ovn-controller" probeResult="failure" output=< Dec 03 22:24:09.278520 master-0 kubenswrapper[36504]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 22:24:09.278520 master-0 kubenswrapper[36504]: > Dec 03 22:24:09.312467 master-0 kubenswrapper[36504]: I1203 22:24:09.312268 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-edpm\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.312467 master-0 kubenswrapper[36504]: I1203 22:24:09.312400 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-config\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.312884 master-0 kubenswrapper[36504]: I1203 22:24:09.312646 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-sb\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.312884 master-0 kubenswrapper[36504]: I1203 22:24:09.312839 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-dns-svc\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.313109 master-0 kubenswrapper[36504]: I1203 22:24:09.313042 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-nb\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.313214 master-0 kubenswrapper[36504]: I1203 22:24:09.313137 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg2xb\" (UniqueName: \"kubernetes.io/projected/c33cc971-6403-48df-9951-a01a2e1c92e1-kube-api-access-vg2xb\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.421634 master-0 kubenswrapper[36504]: I1203 22:24:09.421566 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-edpm\") pod \"57cee922-243c-468d-b06e-4b87593b257a\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " Dec 03 22:24:09.421972 master-0 kubenswrapper[36504]: I1203 22:24:09.421694 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-dns-svc\") pod \"57cee922-243c-468d-b06e-4b87593b257a\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " Dec 03 22:24:09.421972 master-0 kubenswrapper[36504]: I1203 22:24:09.421748 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-config\") pod \"57cee922-243c-468d-b06e-4b87593b257a\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " Dec 03 22:24:09.421972 master-0 kubenswrapper[36504]: I1203 22:24:09.421810 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-sb\") pod \"57cee922-243c-468d-b06e-4b87593b257a\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " Dec 03 22:24:09.421972 master-0 kubenswrapper[36504]: I1203 22:24:09.421858 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlbnh\" (UniqueName: \"kubernetes.io/projected/57cee922-243c-468d-b06e-4b87593b257a-kube-api-access-vlbnh\") pod \"57cee922-243c-468d-b06e-4b87593b257a\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " Dec 03 22:24:09.421972 master-0 kubenswrapper[36504]: I1203 22:24:09.421965 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-nb\") pod \"57cee922-243c-468d-b06e-4b87593b257a\" (UID: \"57cee922-243c-468d-b06e-4b87593b257a\") " Dec 03 22:24:09.422544 master-0 kubenswrapper[36504]: I1203 22:24:09.422442 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "57cee922-243c-468d-b06e-4b87593b257a" (UID: "57cee922-243c-468d-b06e-4b87593b257a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:09.422544 master-0 kubenswrapper[36504]: I1203 22:24:09.422485 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-nb\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.422721 master-0 kubenswrapper[36504]: I1203 22:24:09.422698 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg2xb\" (UniqueName: \"kubernetes.io/projected/c33cc971-6403-48df-9951-a01a2e1c92e1-kube-api-access-vg2xb\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.423008 master-0 kubenswrapper[36504]: I1203 22:24:09.422961 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-edpm\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.423074 master-0 kubenswrapper[36504]: I1203 22:24:09.423044 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-config\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.423426 master-0 kubenswrapper[36504]: I1203 22:24:09.423389 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-sb\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.424764 master-0 kubenswrapper[36504]: I1203 22:24:09.423720 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-dns-svc\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.424764 master-0 kubenswrapper[36504]: I1203 22:24:09.423993 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:09.424910 master-0 kubenswrapper[36504]: I1203 22:24:09.424816 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-edpm" (OuterVolumeSpecName: "edpm") pod "57cee922-243c-468d-b06e-4b87593b257a" (UID: "57cee922-243c-468d-b06e-4b87593b257a"). InnerVolumeSpecName "edpm". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:09.426066 master-0 kubenswrapper[36504]: I1203 22:24:09.426003 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "57cee922-243c-468d-b06e-4b87593b257a" (UID: "57cee922-243c-468d-b06e-4b87593b257a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:09.426498 master-0 kubenswrapper[36504]: I1203 22:24:09.426432 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-config" (OuterVolumeSpecName: "config") pod "57cee922-243c-468d-b06e-4b87593b257a" (UID: "57cee922-243c-468d-b06e-4b87593b257a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:09.430319 master-0 kubenswrapper[36504]: I1203 22:24:09.430251 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "57cee922-243c-468d-b06e-4b87593b257a" (UID: "57cee922-243c-468d-b06e-4b87593b257a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:09.431041 master-0 kubenswrapper[36504]: I1203 22:24:09.431008 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-nb\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.431611 master-0 kubenswrapper[36504]: I1203 22:24:09.431574 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-sb\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.431857 master-0 kubenswrapper[36504]: I1203 22:24:09.431793 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-dns-svc\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.433579 master-0 kubenswrapper[36504]: I1203 22:24:09.433533 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-edpm\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.434020 master-0 kubenswrapper[36504]: I1203 22:24:09.433969 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-config\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.434020 master-0 kubenswrapper[36504]: I1203 22:24:09.433983 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57cee922-243c-468d-b06e-4b87593b257a-kube-api-access-vlbnh" (OuterVolumeSpecName: "kube-api-access-vlbnh") pod "57cee922-243c-468d-b06e-4b87593b257a" (UID: "57cee922-243c-468d-b06e-4b87593b257a"). InnerVolumeSpecName "kube-api-access-vlbnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:09.448242 master-0 kubenswrapper[36504]: I1203 22:24:09.448155 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-54zjs" Dec 03 22:24:09.455813 master-0 kubenswrapper[36504]: I1203 22:24:09.455742 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg2xb\" (UniqueName: \"kubernetes.io/projected/c33cc971-6403-48df-9951-a01a2e1c92e1-kube-api-access-vg2xb\") pod \"dnsmasq-dns-ff645d44c-2ndhp\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:09.526921 master-0 kubenswrapper[36504]: I1203 22:24:09.526543 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:09.526921 master-0 kubenswrapper[36504]: I1203 22:24:09.526600 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:09.526921 master-0 kubenswrapper[36504]: I1203 22:24:09.526612 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:09.526921 master-0 kubenswrapper[36504]: I1203 22:24:09.526626 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlbnh\" (UniqueName: \"kubernetes.io/projected/57cee922-243c-468d-b06e-4b87593b257a-kube-api-access-vlbnh\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:09.526921 master-0 kubenswrapper[36504]: I1203 22:24:09.526637 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/57cee922-243c-468d-b06e-4b87593b257a-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:09.533821 master-0 kubenswrapper[36504]: I1203 22:24:09.533741 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:10.247938 master-0 kubenswrapper[36504]: I1203 22:24:10.247854 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54f54bc9-qrsbs" Dec 03 22:24:10.873099 master-0 kubenswrapper[36504]: I1203 22:24:10.873012 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zqx7b-config-sqbpl"] Dec 03 22:24:10.875214 master-0 kubenswrapper[36504]: I1203 22:24:10.875160 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:10.878380 master-0 kubenswrapper[36504]: I1203 22:24:10.878291 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 22:24:11.066903 master-0 kubenswrapper[36504]: I1203 22:24:11.065119 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zqx7b-config-sqbpl"] Dec 03 22:24:11.144568 master-0 kubenswrapper[36504]: I1203 22:24:11.144331 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff645d44c-2ndhp"] Dec 03 22:24:11.171039 master-0 kubenswrapper[36504]: I1203 22:24:11.169741 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54f54bc9-qrsbs"] Dec 03 22:24:11.202254 master-0 kubenswrapper[36504]: I1203 22:24:11.202010 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f54f54bc9-qrsbs"] Dec 03 22:24:11.243940 master-0 kubenswrapper[36504]: I1203 22:24:11.243463 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-scripts\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.243940 master-0 kubenswrapper[36504]: I1203 22:24:11.243636 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.243940 master-0 kubenswrapper[36504]: I1203 22:24:11.243703 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run-ovn\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.243940 master-0 kubenswrapper[36504]: I1203 22:24:11.243875 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shkx6\" (UniqueName: \"kubernetes.io/projected/e9d13cdd-7989-4caa-a457-bac8438d445d-kube-api-access-shkx6\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.244562 master-0 kubenswrapper[36504]: I1203 22:24:11.244000 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-log-ovn\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.244562 master-0 kubenswrapper[36504]: I1203 22:24:11.244039 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-additional-scripts\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" 
Dec 03 22:24:11.279345 master-0 kubenswrapper[36504]: I1203 22:24:11.279249 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" event={"ID":"c33cc971-6403-48df-9951-a01a2e1c92e1","Type":"ContainerStarted","Data":"760221e10aacfaf3d5060cec230fb38aa72a38c625be4fad14d190546b64b60f"} Dec 03 22:24:11.352480 master-0 kubenswrapper[36504]: I1203 22:24:11.352413 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-log-ovn\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.352758 master-0 kubenswrapper[36504]: I1203 22:24:11.352504 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-additional-scripts\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.353102 master-0 kubenswrapper[36504]: I1203 22:24:11.353050 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-log-ovn\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.353876 master-0 kubenswrapper[36504]: I1203 22:24:11.353837 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-additional-scripts\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.372786 master-0 kubenswrapper[36504]: I1203 22:24:11.372629 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-scripts\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.375217 master-0 kubenswrapper[36504]: I1203 22:24:11.372924 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.375217 master-0 kubenswrapper[36504]: I1203 22:24:11.373083 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run-ovn\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.375217 master-0 kubenswrapper[36504]: I1203 22:24:11.373350 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shkx6\" (UniqueName: \"kubernetes.io/projected/e9d13cdd-7989-4caa-a457-bac8438d445d-kube-api-access-shkx6\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " 
pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.375217 master-0 kubenswrapper[36504]: I1203 22:24:11.374840 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.375217 master-0 kubenswrapper[36504]: I1203 22:24:11.374928 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run-ovn\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.387553 master-0 kubenswrapper[36504]: I1203 22:24:11.387499 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-scripts\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.399693 master-0 kubenswrapper[36504]: I1203 22:24:11.399558 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shkx6\" (UniqueName: \"kubernetes.io/projected/e9d13cdd-7989-4caa-a457-bac8438d445d-kube-api-access-shkx6\") pod \"ovn-controller-zqx7b-config-sqbpl\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:11.502244 master-0 kubenswrapper[36504]: I1203 22:24:11.502148 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:12.082883 master-0 kubenswrapper[36504]: I1203 22:24:12.082248 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zqx7b-config-sqbpl"] Dec 03 22:24:12.297529 master-0 kubenswrapper[36504]: I1203 22:24:12.297433 36504 generic.go:334] "Generic (PLEG): container finished" podID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerID="7eabf8c6a3ff450e30776a0cedb3a616a85913114f66d573623b566160209bbe" exitCode=0 Dec 03 22:24:12.297529 master-0 kubenswrapper[36504]: I1203 22:24:12.297486 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" event={"ID":"c33cc971-6403-48df-9951-a01a2e1c92e1","Type":"ContainerDied","Data":"7eabf8c6a3ff450e30776a0cedb3a616a85913114f66d573623b566160209bbe"} Dec 03 22:24:13.112068 master-0 kubenswrapper[36504]: I1203 22:24:13.111985 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57cee922-243c-468d-b06e-4b87593b257a" path="/var/lib/kubelet/pods/57cee922-243c-468d-b06e-4b87593b257a/volumes" Dec 03 22:24:13.139194 master-0 kubenswrapper[36504]: I1203 22:24:13.139101 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift\") pod \"swift-storage-0\" (UID: \"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:24:13.153509 master-0 kubenswrapper[36504]: I1203 22:24:13.153443 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/07b5d783-be96-4a6b-8603-e6f56f13f233-etc-swift\") pod \"swift-storage-0\" (UID: 
\"07b5d783-be96-4a6b-8603-e6f56f13f233\") " pod="openstack/swift-storage-0" Dec 03 22:24:13.387030 master-0 kubenswrapper[36504]: I1203 22:24:13.386955 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 22:24:13.927391 master-0 kubenswrapper[36504]: I1203 22:24:13.927274 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:24:13.927800 master-0 kubenswrapper[36504]: I1203 22:24:13.927688 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="prometheus" containerID="cri-o://51f99190829cd0abd5e0817951a586d79cdeeb56f49085a0cae0bd57a18f87de" gracePeriod=600 Dec 03 22:24:13.927924 master-0 kubenswrapper[36504]: I1203 22:24:13.927887 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="config-reloader" containerID="cri-o://2041cc760388ff620b91c23e3a377192b215481dbe5edd06049cc6ebfb9ec635" gracePeriod=600 Dec 03 22:24:13.928181 master-0 kubenswrapper[36504]: I1203 22:24:13.927988 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="thanos-sidecar" containerID="cri-o://72ac727550c6b9edefbb8d84d41ee176a75b8a08dfbc158a35a79286e5bf62a1" gracePeriod=600 Dec 03 22:24:14.265156 master-0 kubenswrapper[36504]: I1203 22:24:14.265063 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zqx7b" podUID="13962c28-06ea-4b66-aa46-f00d50e29eda" containerName="ovn-controller" probeResult="failure" output=< Dec 03 22:24:14.265156 master-0 kubenswrapper[36504]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 22:24:14.265156 master-0 kubenswrapper[36504]: > Dec 03 22:24:14.347947 master-0 kubenswrapper[36504]: I1203 22:24:14.347588 36504 generic.go:334] "Generic (PLEG): container finished" podID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerID="72ac727550c6b9edefbb8d84d41ee176a75b8a08dfbc158a35a79286e5bf62a1" exitCode=0 Dec 03 22:24:14.347947 master-0 kubenswrapper[36504]: I1203 22:24:14.347640 36504 generic.go:334] "Generic (PLEG): container finished" podID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerID="2041cc760388ff620b91c23e3a377192b215481dbe5edd06049cc6ebfb9ec635" exitCode=0 Dec 03 22:24:14.347947 master-0 kubenswrapper[36504]: I1203 22:24:14.347687 36504 generic.go:334] "Generic (PLEG): container finished" podID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerID="51f99190829cd0abd5e0817951a586d79cdeeb56f49085a0cae0bd57a18f87de" exitCode=0 Dec 03 22:24:14.347947 master-0 kubenswrapper[36504]: I1203 22:24:14.347933 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerDied","Data":"72ac727550c6b9edefbb8d84d41ee176a75b8a08dfbc158a35a79286e5bf62a1"} Dec 03 22:24:14.348267 master-0 kubenswrapper[36504]: I1203 22:24:14.347979 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerDied","Data":"2041cc760388ff620b91c23e3a377192b215481dbe5edd06049cc6ebfb9ec635"} Dec 03 22:24:14.348267 master-0 
kubenswrapper[36504]: I1203 22:24:14.348000 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerDied","Data":"51f99190829cd0abd5e0817951a586d79cdeeb56f49085a0cae0bd57a18f87de"} Dec 03 22:24:17.520168 master-0 kubenswrapper[36504]: I1203 22:24:17.520068 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="prometheus" probeResult="failure" output="Get \"http://10.128.0.186:9090/-/ready\": dial tcp 10.128.0.186:9090: connect: connection refused" Dec 03 22:24:18.043987 master-0 kubenswrapper[36504]: I1203 22:24:18.043842 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="09ea7cea-94f9-4bdb-ad61-24281b0ee1ed" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.128.0.177:5671: connect: connection refused" Dec 03 22:24:19.231146 master-0 kubenswrapper[36504]: I1203 22:24:19.230852 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zqx7b" podUID="13962c28-06ea-4b66-aa46-f00d50e29eda" containerName="ovn-controller" probeResult="failure" output=< Dec 03 22:24:19.231146 master-0 kubenswrapper[36504]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 22:24:19.231146 master-0 kubenswrapper[36504]: > Dec 03 22:24:19.693150 master-0 kubenswrapper[36504]: W1203 22:24:19.693085 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9d13cdd_7989_4caa_a457_bac8438d445d.slice/crio-859fe7b2cef4e9b0e22777471bda10b1d542d37078faaa4895728b5de9b873c6 WatchSource:0}: Error finding container 859fe7b2cef4e9b0e22777471bda10b1d542d37078faaa4895728b5de9b873c6: Status 404 returned error can't find the container with id 859fe7b2cef4e9b0e22777471bda10b1d542d37078faaa4895728b5de9b873c6 Dec 03 22:24:20.161989 master-0 kubenswrapper[36504]: I1203 22:24:20.161936 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.286439 master-0 kubenswrapper[36504]: I1203 22:24:20.275147 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19bb9bba-1e31-4f41-9d29-202764cfd498-prometheus-metric-storage-rulefiles-0\") pod \"19bb9bba-1e31-4f41-9d29-202764cfd498\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " Dec 03 22:24:20.286439 master-0 kubenswrapper[36504]: I1203 22:24:20.275248 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-web-config\") pod \"19bb9bba-1e31-4f41-9d29-202764cfd498\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " Dec 03 22:24:20.286439 master-0 kubenswrapper[36504]: I1203 22:24:20.275283 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-config\") pod \"19bb9bba-1e31-4f41-9d29-202764cfd498\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " Dec 03 22:24:20.286439 master-0 kubenswrapper[36504]: I1203 22:24:20.275341 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pfbj\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-kube-api-access-5pfbj\") pod \"19bb9bba-1e31-4f41-9d29-202764cfd498\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " Dec 03 22:24:20.286439 master-0 kubenswrapper[36504]: I1203 22:24:20.277175 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19bb9bba-1e31-4f41-9d29-202764cfd498-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "19bb9bba-1e31-4f41-9d29-202764cfd498" (UID: "19bb9bba-1e31-4f41-9d29-202764cfd498"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:20.330799 master-0 kubenswrapper[36504]: I1203 22:24:20.326955 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-config" (OuterVolumeSpecName: "config") pod "19bb9bba-1e31-4f41-9d29-202764cfd498" (UID: "19bb9bba-1e31-4f41-9d29-202764cfd498"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:20.330799 master-0 kubenswrapper[36504]: I1203 22:24:20.327211 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-kube-api-access-5pfbj" (OuterVolumeSpecName: "kube-api-access-5pfbj") pod "19bb9bba-1e31-4f41-9d29-202764cfd498" (UID: "19bb9bba-1e31-4f41-9d29-202764cfd498"). InnerVolumeSpecName "kube-api-access-5pfbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:20.335807 master-0 kubenswrapper[36504]: I1203 22:24:20.332548 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-web-config" (OuterVolumeSpecName: "web-config") pod "19bb9bba-1e31-4f41-9d29-202764cfd498" (UID: "19bb9bba-1e31-4f41-9d29-202764cfd498"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:20.406930 master-0 kubenswrapper[36504]: I1203 22:24:20.388017 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") pod \"19bb9bba-1e31-4f41-9d29-202764cfd498\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " Dec 03 22:24:20.409238 master-0 kubenswrapper[36504]: I1203 22:24:20.409199 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-thanos-prometheus-http-client-file\") pod \"19bb9bba-1e31-4f41-9d29-202764cfd498\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " Dec 03 22:24:20.409503 master-0 kubenswrapper[36504]: I1203 22:24:20.409488 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19bb9bba-1e31-4f41-9d29-202764cfd498-config-out\") pod \"19bb9bba-1e31-4f41-9d29-202764cfd498\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " Dec 03 22:24:20.410016 master-0 kubenswrapper[36504]: I1203 22:24:20.409996 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-tls-assets\") pod \"19bb9bba-1e31-4f41-9d29-202764cfd498\" (UID: \"19bb9bba-1e31-4f41-9d29-202764cfd498\") " Dec 03 22:24:20.426421 master-0 kubenswrapper[36504]: I1203 22:24:20.422750 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "19bb9bba-1e31-4f41-9d29-202764cfd498" (UID: "19bb9bba-1e31-4f41-9d29-202764cfd498"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:20.426421 master-0 kubenswrapper[36504]: I1203 22:24:20.423179 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "19bb9bba-1e31-4f41-9d29-202764cfd498" (UID: "19bb9bba-1e31-4f41-9d29-202764cfd498"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:20.426421 master-0 kubenswrapper[36504]: I1203 22:24:20.425610 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19bb9bba-1e31-4f41-9d29-202764cfd498-config-out" (OuterVolumeSpecName: "config-out") pod "19bb9bba-1e31-4f41-9d29-202764cfd498" (UID: "19bb9bba-1e31-4f41-9d29-202764cfd498"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:24:20.439519 master-0 kubenswrapper[36504]: I1203 22:24:20.427719 36504 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:20.439946 master-0 kubenswrapper[36504]: I1203 22:24:20.439894 36504 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/19bb9bba-1e31-4f41-9d29-202764cfd498-config-out\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:20.440042 master-0 kubenswrapper[36504]: I1203 22:24:20.440024 36504 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/19bb9bba-1e31-4f41-9d29-202764cfd498-prometheus-metric-storage-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:20.440124 master-0 kubenswrapper[36504]: I1203 22:24:20.440113 36504 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-web-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:20.440187 master-0 kubenswrapper[36504]: I1203 22:24:20.440177 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/19bb9bba-1e31-4f41-9d29-202764cfd498-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:20.445761 master-0 kubenswrapper[36504]: I1203 22:24:20.445707 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pfbj\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-kube-api-access-5pfbj\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:20.446079 master-0 kubenswrapper[36504]: I1203 22:24:20.446062 36504 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/19bb9bba-1e31-4f41-9d29-202764cfd498-tls-assets\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:20.452524 master-0 kubenswrapper[36504]: I1203 22:24:20.451005 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 22:24:20.466806 master-0 kubenswrapper[36504]: I1203 22:24:20.459226 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "19bb9bba-1e31-4f41-9d29-202764cfd498" (UID: "19bb9bba-1e31-4f41-9d29-202764cfd498"). InnerVolumeSpecName "pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:24:20.550933 master-0 kubenswrapper[36504]: I1203 22:24:20.549490 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" event={"ID":"c33cc971-6403-48df-9951-a01a2e1c92e1","Type":"ContainerStarted","Data":"1e44be388b71c86ea10e47695909fce01dcbdba8ee0d5078a421414b16fc0a44"} Dec 03 22:24:20.550933 master-0 kubenswrapper[36504]: I1203 22:24:20.550022 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:20.550933 master-0 kubenswrapper[36504]: I1203 22:24:20.550171 36504 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") on node \"master-0\" " Dec 03 22:24:20.554926 master-0 kubenswrapper[36504]: I1203 22:24:20.554853 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"7b869e4eea7faea2cc6401e17da651d0a6cd768e384e06f541c85f99556cc4b5"} Dec 03 22:24:20.568114 master-0 kubenswrapper[36504]: I1203 22:24:20.566919 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zqx7b-config-sqbpl" event={"ID":"e9d13cdd-7989-4caa-a457-bac8438d445d","Type":"ContainerStarted","Data":"cb58d2e435b9c236b89f108b32f2d859f2790d96b310dbf4a6308adc863a4a21"} Dec 03 22:24:20.568114 master-0 kubenswrapper[36504]: I1203 22:24:20.566979 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zqx7b-config-sqbpl" event={"ID":"e9d13cdd-7989-4caa-a457-bac8438d445d","Type":"ContainerStarted","Data":"859fe7b2cef4e9b0e22777471bda10b1d542d37078faaa4895728b5de9b873c6"} Dec 03 22:24:20.574716 master-0 kubenswrapper[36504]: I1203 22:24:20.574649 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"19bb9bba-1e31-4f41-9d29-202764cfd498","Type":"ContainerDied","Data":"9005232759f9c32eb3d5b2ba7135ad9c8f275ae755a707e46ffaee9d3335daab"} Dec 03 22:24:20.574862 master-0 kubenswrapper[36504]: I1203 22:24:20.574738 36504 scope.go:117] "RemoveContainer" containerID="72ac727550c6b9edefbb8d84d41ee176a75b8a08dfbc158a35a79286e5bf62a1" Dec 03 22:24:20.575619 master-0 kubenswrapper[36504]: I1203 22:24:20.574921 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.586877 master-0 kubenswrapper[36504]: I1203 22:24:20.586802 36504 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 22:24:20.587135 master-0 kubenswrapper[36504]: I1203 22:24:20.587112 36504 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e" (UniqueName: "kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1") on node "master-0" Dec 03 22:24:20.599174 master-0 kubenswrapper[36504]: I1203 22:24:20.598966 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" podStartSLOduration=11.598936753 podStartE2EDuration="11.598936753s" podCreationTimestamp="2025-12-03 22:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:20.581312549 +0000 UTC m=+825.801084566" watchObservedRunningTime="2025-12-03 22:24:20.598936753 +0000 UTC m=+825.818708760" Dec 03 22:24:20.615808 master-0 kubenswrapper[36504]: I1203 22:24:20.615747 36504 scope.go:117] "RemoveContainer" containerID="2041cc760388ff620b91c23e3a377192b215481dbe5edd06049cc6ebfb9ec635" Dec 03 22:24:20.647511 master-0 kubenswrapper[36504]: I1203 22:24:20.647421 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zqx7b-config-sqbpl" podStartSLOduration=10.647396087 podStartE2EDuration="10.647396087s" podCreationTimestamp="2025-12-03 22:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:20.636504394 +0000 UTC m=+825.856276401" watchObservedRunningTime="2025-12-03 22:24:20.647396087 +0000 UTC m=+825.867168094" Dec 03 22:24:20.652628 master-0 kubenswrapper[36504]: I1203 22:24:20.652579 36504 reconciler_common.go:293] "Volume detached for volume \"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:20.679522 master-0 kubenswrapper[36504]: I1203 22:24:20.678417 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:24:20.710223 master-0 kubenswrapper[36504]: I1203 22:24:20.710101 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:24:20.713816 master-0 kubenswrapper[36504]: I1203 22:24:20.713705 36504 scope.go:117] "RemoveContainer" containerID="51f99190829cd0abd5e0817951a586d79cdeeb56f49085a0cae0bd57a18f87de" Dec 03 22:24:20.730062 master-0 kubenswrapper[36504]: I1203 22:24:20.729299 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:24:20.730062 master-0 kubenswrapper[36504]: E1203 22:24:20.729982 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="init-config-reloader" Dec 03 22:24:20.730062 master-0 kubenswrapper[36504]: I1203 22:24:20.730019 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="init-config-reloader" Dec 03 22:24:20.730062 master-0 kubenswrapper[36504]: E1203 22:24:20.730046 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="thanos-sidecar" Dec 03 22:24:20.730062 master-0 kubenswrapper[36504]: I1203 22:24:20.730052 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="thanos-sidecar" Dec 03 
22:24:20.730324 master-0 kubenswrapper[36504]: E1203 22:24:20.730082 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="prometheus" Dec 03 22:24:20.730324 master-0 kubenswrapper[36504]: I1203 22:24:20.730090 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="prometheus" Dec 03 22:24:20.730324 master-0 kubenswrapper[36504]: E1203 22:24:20.730134 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="config-reloader" Dec 03 22:24:20.730324 master-0 kubenswrapper[36504]: I1203 22:24:20.730142 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="config-reloader" Dec 03 22:24:20.730452 master-0 kubenswrapper[36504]: I1203 22:24:20.730417 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="thanos-sidecar" Dec 03 22:24:20.730488 master-0 kubenswrapper[36504]: I1203 22:24:20.730460 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="prometheus" Dec 03 22:24:20.730488 master-0 kubenswrapper[36504]: I1203 22:24:20.730482 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" containerName="config-reloader" Dec 03 22:24:20.741544 master-0 kubenswrapper[36504]: I1203 22:24:20.741460 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.747960 master-0 kubenswrapper[36504]: I1203 22:24:20.747864 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 22:24:20.748368 master-0 kubenswrapper[36504]: I1203 22:24:20.748342 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 22:24:20.748568 master-0 kubenswrapper[36504]: I1203 22:24:20.748544 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 03 22:24:20.757209 master-0 kubenswrapper[36504]: I1203 22:24:20.756547 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 22:24:20.761843 master-0 kubenswrapper[36504]: I1203 22:24:20.761278 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 22:24:20.761843 master-0 kubenswrapper[36504]: I1203 22:24:20.761603 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 22:24:20.783350 master-0 kubenswrapper[36504]: I1203 22:24:20.783226 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:24:20.823914 master-0 kubenswrapper[36504]: I1203 22:24:20.823850 36504 scope.go:117] "RemoveContainer" containerID="da97bf42436e285c9bb845abd9c57693d0e89d6eab143891326d1104005396f2" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859212 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859401 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859461 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-config\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859500 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23e7beb4-5345-49db-bd3e-aa3a97b7009e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859525 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859663 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859704 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjrhh\" (UniqueName: \"kubernetes.io/projected/23e7beb4-5345-49db-bd3e-aa3a97b7009e-kube-api-access-fjrhh\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859762 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23e7beb4-5345-49db-bd3e-aa3a97b7009e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859825 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859850 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/23e7beb4-5345-49db-bd3e-aa3a97b7009e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.863918 master-0 kubenswrapper[36504]: I1203 22:24:20.859885 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.962807 master-0 kubenswrapper[36504]: I1203 22:24:20.962707 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963135 master-0 kubenswrapper[36504]: I1203 22:24:20.962866 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-config\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963135 master-0 kubenswrapper[36504]: I1203 22:24:20.962912 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23e7beb4-5345-49db-bd3e-aa3a97b7009e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963135 master-0 kubenswrapper[36504]: I1203 22:24:20.962955 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963135 master-0 kubenswrapper[36504]: I1203 22:24:20.963027 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963135 master-0 kubenswrapper[36504]: I1203 22:24:20.963076 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjrhh\" (UniqueName: 
\"kubernetes.io/projected/23e7beb4-5345-49db-bd3e-aa3a97b7009e-kube-api-access-fjrhh\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963401 master-0 kubenswrapper[36504]: I1203 22:24:20.963142 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23e7beb4-5345-49db-bd3e-aa3a97b7009e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963401 master-0 kubenswrapper[36504]: I1203 22:24:20.963180 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963401 master-0 kubenswrapper[36504]: I1203 22:24:20.963218 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/23e7beb4-5345-49db-bd3e-aa3a97b7009e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963401 master-0 kubenswrapper[36504]: I1203 22:24:20.963271 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.963401 master-0 kubenswrapper[36504]: I1203 22:24:20.963319 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.965654 master-0 kubenswrapper[36504]: I1203 22:24:20.965612 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/23e7beb4-5345-49db-bd3e-aa3a97b7009e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.967141 master-0 kubenswrapper[36504]: I1203 22:24:20.967105 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:24:20.967231 master-0 kubenswrapper[36504]: I1203 22:24:20.967152 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/605be9bb1c2aead55319b869602437907f88623b26353397612d1a0c6a1e99b3/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.968847 master-0 kubenswrapper[36504]: I1203 22:24:20.968803 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.970000 master-0 kubenswrapper[36504]: I1203 22:24:20.969928 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.970183 master-0 kubenswrapper[36504]: I1203 22:24:20.970016 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.974423 master-0 kubenswrapper[36504]: I1203 22:24:20.974372 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/23e7beb4-5345-49db-bd3e-aa3a97b7009e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.975117 master-0 kubenswrapper[36504]: I1203 22:24:20.975075 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-config\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.975214 master-0 kubenswrapper[36504]: I1203 22:24:20.975172 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/23e7beb4-5345-49db-bd3e-aa3a97b7009e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.975449 master-0 kubenswrapper[36504]: I1203 22:24:20.975403 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.976102 
master-0 kubenswrapper[36504]: I1203 22:24:20.976017 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/23e7beb4-5345-49db-bd3e-aa3a97b7009e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:20.993598 master-0 kubenswrapper[36504]: I1203 22:24:20.991470 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjrhh\" (UniqueName: \"kubernetes.io/projected/23e7beb4-5345-49db-bd3e-aa3a97b7009e-kube-api-access-fjrhh\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:21.147735 master-0 kubenswrapper[36504]: I1203 22:24:21.145786 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19bb9bba-1e31-4f41-9d29-202764cfd498" path="/var/lib/kubelet/pods/19bb9bba-1e31-4f41-9d29-202764cfd498/volumes" Dec 03 22:24:21.599132 master-0 kubenswrapper[36504]: I1203 22:24:21.598927 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8bwst" event={"ID":"b0d98163-c65a-412b-a4c4-32ce8bff30a8","Type":"ContainerStarted","Data":"e38349a8e2c4760958e3f3aeb3a37d7dc01f25c0108f92de9d4f787600919d20"} Dec 03 22:24:21.602074 master-0 kubenswrapper[36504]: I1203 22:24:21.602003 36504 generic.go:334] "Generic (PLEG): container finished" podID="e9d13cdd-7989-4caa-a457-bac8438d445d" containerID="cb58d2e435b9c236b89f108b32f2d859f2790d96b310dbf4a6308adc863a4a21" exitCode=0 Dec 03 22:24:21.602622 master-0 kubenswrapper[36504]: I1203 22:24:21.602575 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zqx7b-config-sqbpl" event={"ID":"e9d13cdd-7989-4caa-a457-bac8438d445d","Type":"ContainerDied","Data":"cb58d2e435b9c236b89f108b32f2d859f2790d96b310dbf4a6308adc863a4a21"} Dec 03 22:24:21.652813 master-0 kubenswrapper[36504]: I1203 22:24:21.651130 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8bwst" podStartSLOduration=4.118127327 podStartE2EDuration="19.651079821s" podCreationTimestamp="2025-12-03 22:24:02 +0000 UTC" firstStartedPulling="2025-12-03 22:24:04.33068744 +0000 UTC m=+809.550459447" lastFinishedPulling="2025-12-03 22:24:19.863639934 +0000 UTC m=+825.083411941" observedRunningTime="2025-12-03 22:24:21.619294683 +0000 UTC m=+826.839066710" watchObservedRunningTime="2025-12-03 22:24:21.651079821 +0000 UTC m=+826.870851828" Dec 03 22:24:21.845601 master-0 kubenswrapper[36504]: I1203 22:24:21.845511 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7df074df-3b1b-40c0-9dc9-52dab4d17e4e\" (UniqueName: \"kubernetes.io/csi/topolvm.io^5bd16cd8-4b09-4260-a690-e2d58dfb75a1\") pod \"prometheus-metric-storage-0\" (UID: \"23e7beb4-5345-49db-bd3e-aa3a97b7009e\") " pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:22.024311 master-0 kubenswrapper[36504]: I1203 22:24:22.024212 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:22.658719 master-0 kubenswrapper[36504]: I1203 22:24:22.658615 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"d386945a6d677e5805f8d2515201358c0ad378111641e03cccde0eb57aa8dec4"} Dec 03 22:24:22.658719 master-0 kubenswrapper[36504]: I1203 22:24:22.658699 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"55c9ae6f651184afc2a659ff9ac7d1b9fcf7812c2031c89e9e6a66d3033976f7"} Dec 03 22:24:22.658719 master-0 kubenswrapper[36504]: I1203 22:24:22.658710 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"e7365cd9745eb957d860ebb7ee87f8c9849993096fd4cf2abce50e37a4746318"} Dec 03 22:24:22.693995 master-0 kubenswrapper[36504]: I1203 22:24:22.693918 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 22:24:22.737178 master-0 kubenswrapper[36504]: W1203 22:24:22.736677 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e7beb4_5345_49db_bd3e_aa3a97b7009e.slice/crio-b7330632d5a0603b13f2a02dc7fda5428d130ce817ff5b1e360a7f4e3543678b WatchSource:0}: Error finding container b7330632d5a0603b13f2a02dc7fda5428d130ce817ff5b1e360a7f4e3543678b: Status 404 returned error can't find the container with id b7330632d5a0603b13f2a02dc7fda5428d130ce817ff5b1e360a7f4e3543678b Dec 03 22:24:23.166602 master-0 kubenswrapper[36504]: I1203 22:24:23.166471 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:23.262511 master-0 kubenswrapper[36504]: I1203 22:24:23.262430 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shkx6\" (UniqueName: \"kubernetes.io/projected/e9d13cdd-7989-4caa-a457-bac8438d445d-kube-api-access-shkx6\") pod \"e9d13cdd-7989-4caa-a457-bac8438d445d\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " Dec 03 22:24:23.262925 master-0 kubenswrapper[36504]: I1203 22:24:23.262691 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-additional-scripts\") pod \"e9d13cdd-7989-4caa-a457-bac8438d445d\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " Dec 03 22:24:23.262925 master-0 kubenswrapper[36504]: I1203 22:24:23.262904 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-scripts\") pod \"e9d13cdd-7989-4caa-a457-bac8438d445d\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " Dec 03 22:24:23.263041 master-0 kubenswrapper[36504]: I1203 22:24:23.262950 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run-ovn\") pod \"e9d13cdd-7989-4caa-a457-bac8438d445d\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " Dec 03 22:24:23.263078 master-0 kubenswrapper[36504]: I1203 22:24:23.263035 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-log-ovn\") pod \"e9d13cdd-7989-4caa-a457-bac8438d445d\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " Dec 03 22:24:23.263152 master-0 kubenswrapper[36504]: I1203 22:24:23.263129 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run\") pod \"e9d13cdd-7989-4caa-a457-bac8438d445d\" (UID: \"e9d13cdd-7989-4caa-a457-bac8438d445d\") " Dec 03 22:24:23.263203 master-0 kubenswrapper[36504]: I1203 22:24:23.263134 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e9d13cdd-7989-4caa-a457-bac8438d445d" (UID: "e9d13cdd-7989-4caa-a457-bac8438d445d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:24:23.263297 master-0 kubenswrapper[36504]: I1203 22:24:23.263276 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e9d13cdd-7989-4caa-a457-bac8438d445d" (UID: "e9d13cdd-7989-4caa-a457-bac8438d445d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:24:23.263393 master-0 kubenswrapper[36504]: I1203 22:24:23.263372 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run" (OuterVolumeSpecName: "var-run") pod "e9d13cdd-7989-4caa-a457-bac8438d445d" (UID: "e9d13cdd-7989-4caa-a457-bac8438d445d"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:24:23.263616 master-0 kubenswrapper[36504]: I1203 22:24:23.263548 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e9d13cdd-7989-4caa-a457-bac8438d445d" (UID: "e9d13cdd-7989-4caa-a457-bac8438d445d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:23.264113 master-0 kubenswrapper[36504]: I1203 22:24:23.264068 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-scripts" (OuterVolumeSpecName: "scripts") pod "e9d13cdd-7989-4caa-a457-bac8438d445d" (UID: "e9d13cdd-7989-4caa-a457-bac8438d445d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:23.264373 master-0 kubenswrapper[36504]: I1203 22:24:23.264341 36504 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-additional-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:23.264436 master-0 kubenswrapper[36504]: I1203 22:24:23.264376 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9d13cdd-7989-4caa-a457-bac8438d445d-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:23.264436 master-0 kubenswrapper[36504]: I1203 22:24:23.264391 36504 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:23.264436 master-0 kubenswrapper[36504]: I1203 22:24:23.264403 36504 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:23.264436 master-0 kubenswrapper[36504]: I1203 22:24:23.264415 36504 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e9d13cdd-7989-4caa-a457-bac8438d445d-var-run\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:23.266598 master-0 kubenswrapper[36504]: I1203 22:24:23.266558 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d13cdd-7989-4caa-a457-bac8438d445d-kube-api-access-shkx6" (OuterVolumeSpecName: "kube-api-access-shkx6") pod "e9d13cdd-7989-4caa-a457-bac8438d445d" (UID: "e9d13cdd-7989-4caa-a457-bac8438d445d"). InnerVolumeSpecName "kube-api-access-shkx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:23.367050 master-0 kubenswrapper[36504]: I1203 22:24:23.366921 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shkx6\" (UniqueName: \"kubernetes.io/projected/e9d13cdd-7989-4caa-a457-bac8438d445d-kube-api-access-shkx6\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:23.391105 master-0 kubenswrapper[36504]: I1203 22:24:23.391018 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 22:24:23.675567 master-0 kubenswrapper[36504]: I1203 22:24:23.675479 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23e7beb4-5345-49db-bd3e-aa3a97b7009e","Type":"ContainerStarted","Data":"b7330632d5a0603b13f2a02dc7fda5428d130ce817ff5b1e360a7f4e3543678b"} Dec 03 22:24:23.682556 master-0 kubenswrapper[36504]: I1203 22:24:23.682511 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"f97e2d5151185bc4a52c942195fcfb17fb61c50ccebe2aa35932e3e850805cd6"} Dec 03 22:24:23.686833 master-0 kubenswrapper[36504]: I1203 22:24:23.685627 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zqx7b-config-sqbpl" event={"ID":"e9d13cdd-7989-4caa-a457-bac8438d445d","Type":"ContainerDied","Data":"859fe7b2cef4e9b0e22777471bda10b1d542d37078faaa4895728b5de9b873c6"} Dec 03 22:24:23.686833 master-0 kubenswrapper[36504]: I1203 22:24:23.685676 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zqx7b-config-sqbpl" Dec 03 22:24:23.686833 master-0 kubenswrapper[36504]: I1203 22:24:23.685714 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="859fe7b2cef4e9b0e22777471bda10b1d542d37078faaa4895728b5de9b873c6" Dec 03 22:24:23.804744 master-0 kubenswrapper[36504]: I1203 22:24:23.804663 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zqx7b-config-sqbpl"] Dec 03 22:24:23.835897 master-0 kubenswrapper[36504]: I1203 22:24:23.835824 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zqx7b-config-sqbpl"] Dec 03 22:24:23.924504 master-0 kubenswrapper[36504]: I1203 22:24:23.924321 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zqx7b-config-hp7wb"] Dec 03 22:24:23.926164 master-0 kubenswrapper[36504]: E1203 22:24:23.925656 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d13cdd-7989-4caa-a457-bac8438d445d" containerName="ovn-config" Dec 03 22:24:23.926164 master-0 kubenswrapper[36504]: I1203 22:24:23.925685 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d13cdd-7989-4caa-a457-bac8438d445d" containerName="ovn-config" Dec 03 22:24:23.926164 master-0 kubenswrapper[36504]: I1203 22:24:23.926001 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d13cdd-7989-4caa-a457-bac8438d445d" containerName="ovn-config" Dec 03 22:24:23.932351 master-0 kubenswrapper[36504]: I1203 22:24:23.927387 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:23.935441 master-0 kubenswrapper[36504]: I1203 22:24:23.935364 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 22:24:23.973365 master-0 kubenswrapper[36504]: I1203 22:24:23.973278 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zqx7b-config-hp7wb"] Dec 03 22:24:24.092017 master-0 kubenswrapper[36504]: I1203 22:24:24.091967 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-additional-scripts\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.092217 master-0 kubenswrapper[36504]: I1203 22:24:24.092200 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run-ovn\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.092315 master-0 kubenswrapper[36504]: I1203 22:24:24.092295 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-scripts\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.095322 master-0 kubenswrapper[36504]: I1203 22:24:24.092797 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-log-ovn\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.095322 master-0 kubenswrapper[36504]: I1203 22:24:24.092875 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.095322 master-0 kubenswrapper[36504]: I1203 22:24:24.092924 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7scrr\" (UniqueName: \"kubernetes.io/projected/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-kube-api-access-7scrr\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.195761 master-0 kubenswrapper[36504]: I1203 22:24:24.195585 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-log-ovn\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.195761 master-0 kubenswrapper[36504]: I1203 22:24:24.195668 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.195761 master-0 kubenswrapper[36504]: I1203 22:24:24.195729 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7scrr\" (UniqueName: \"kubernetes.io/projected/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-kube-api-access-7scrr\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.196131 master-0 kubenswrapper[36504]: I1203 22:24:24.195915 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-additional-scripts\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.196298 master-0 kubenswrapper[36504]: I1203 22:24:24.196241 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run-ovn\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.196358 master-0 kubenswrapper[36504]: I1203 22:24:24.196325 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-scripts\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.196649 master-0 kubenswrapper[36504]: I1203 22:24:24.196582 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run-ovn\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.198355 master-0 kubenswrapper[36504]: I1203 22:24:24.198328 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.198423 master-0 kubenswrapper[36504]: I1203 22:24:24.198353 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-log-ovn\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.199298 master-0 kubenswrapper[36504]: I1203 22:24:24.199168 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-scripts\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.199554 
master-0 kubenswrapper[36504]: I1203 22:24:24.199521 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-additional-scripts\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.224397 master-0 kubenswrapper[36504]: I1203 22:24:24.224318 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7scrr\" (UniqueName: \"kubernetes.io/projected/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-kube-api-access-7scrr\") pod \"ovn-controller-zqx7b-config-hp7wb\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.256363 master-0 kubenswrapper[36504]: I1203 22:24:24.256303 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zqx7b" Dec 03 22:24:24.277980 master-0 kubenswrapper[36504]: I1203 22:24:24.277846 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:24.714683 master-0 kubenswrapper[36504]: I1203 22:24:24.714575 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"0fc05200b794dfa5038aebce81f7a1f43150515744e795bb877fba658a0561a4"} Dec 03 22:24:24.875528 master-0 kubenswrapper[36504]: I1203 22:24:24.874452 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zqx7b-config-hp7wb"] Dec 03 22:24:25.135461 master-0 kubenswrapper[36504]: I1203 22:24:25.135385 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d13cdd-7989-4caa-a457-bac8438d445d" path="/var/lib/kubelet/pods/e9d13cdd-7989-4caa-a457-bac8438d445d/volumes" Dec 03 22:24:25.731736 master-0 kubenswrapper[36504]: I1203 22:24:25.731533 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"2a4497120ef1bdef636c8ee5ed390bbf4285b6adf5569852099c9c8f6033ed69"} Dec 03 22:24:25.731736 master-0 kubenswrapper[36504]: I1203 22:24:25.731603 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"57ad05643f2dc103371de45e3e6c94b01942aa1227c00556cfa7578c97426fc6"} Dec 03 22:24:25.734091 master-0 kubenswrapper[36504]: I1203 22:24:25.734048 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zqx7b-config-hp7wb" event={"ID":"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c","Type":"ContainerStarted","Data":"71e87d6b780e8a02773a45413651885753b4bdbe704761e0a9a4340b2c5f719e"} Dec 03 22:24:26.756719 master-0 kubenswrapper[36504]: I1203 22:24:26.756618 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"f43e8a164a0a837c529dc77dd2d5d3ada628d8a01d11e5fce872a342f1e44a7a"} Dec 03 22:24:26.764195 master-0 kubenswrapper[36504]: I1203 22:24:26.763582 36504 generic.go:334] "Generic (PLEG): container finished" podID="9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" containerID="4d490f766d003ce3f7cdde5e3f2db16774798f6827797743b163c51406309f61" exitCode=0 Dec 03 22:24:26.764195 master-0 
kubenswrapper[36504]: I1203 22:24:26.763684 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zqx7b-config-hp7wb" event={"ID":"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c","Type":"ContainerDied","Data":"4d490f766d003ce3f7cdde5e3f2db16774798f6827797743b163c51406309f61"} Dec 03 22:24:27.819007 master-0 kubenswrapper[36504]: I1203 22:24:27.818918 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"693b1c7db09a42d754bc61b58a9e2f10b9fa6cb8ff9604fba2c28dfb6c693403"} Dec 03 22:24:27.819007 master-0 kubenswrapper[36504]: I1203 22:24:27.818997 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"b49ced6710bb2a146fc45bfcf58ec4dcb4be000e058bd75b24ada2ef8fd5993a"} Dec 03 22:24:27.830417 master-0 kubenswrapper[36504]: I1203 22:24:27.821986 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23e7beb4-5345-49db-bd3e-aa3a97b7009e","Type":"ContainerStarted","Data":"5633420bb74c899b269328e7ff0442d51cc0c1e2816645337f7401fe43f03c0f"} Dec 03 22:24:28.046554 master-0 kubenswrapper[36504]: I1203 22:24:28.043837 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 22:24:28.411309 master-0 kubenswrapper[36504]: I1203 22:24:28.410723 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:28.556807 master-0 kubenswrapper[36504]: I1203 22:24:28.551264 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-log-ovn\") pod \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " Dec 03 22:24:28.556807 master-0 kubenswrapper[36504]: I1203 22:24:28.551420 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run-ovn\") pod \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " Dec 03 22:24:28.556807 master-0 kubenswrapper[36504]: I1203 22:24:28.551494 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7scrr\" (UniqueName: \"kubernetes.io/projected/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-kube-api-access-7scrr\") pod \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " Dec 03 22:24:28.556807 master-0 kubenswrapper[36504]: I1203 22:24:28.551594 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run\") pod \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " Dec 03 22:24:28.556807 master-0 kubenswrapper[36504]: I1203 22:24:28.551632 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-additional-scripts\") pod \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " Dec 03 22:24:28.556807 master-0 kubenswrapper[36504]: I1203 
22:24:28.551681 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-scripts\") pod \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\" (UID: \"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c\") " Dec 03 22:24:28.556807 master-0 kubenswrapper[36504]: I1203 22:24:28.554240 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-scripts" (OuterVolumeSpecName: "scripts") pod "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" (UID: "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:28.565141 master-0 kubenswrapper[36504]: I1203 22:24:28.565080 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run" (OuterVolumeSpecName: "var-run") pod "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" (UID: "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:24:28.565712 master-0 kubenswrapper[36504]: I1203 22:24:28.565685 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" (UID: "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:28.565798 master-0 kubenswrapper[36504]: I1203 22:24:28.565729 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" (UID: "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:24:28.565798 master-0 kubenswrapper[36504]: I1203 22:24:28.565786 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" (UID: "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:24:28.572127 master-0 kubenswrapper[36504]: I1203 22:24:28.570341 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-kube-api-access-7scrr" (OuterVolumeSpecName: "kube-api-access-7scrr") pod "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" (UID: "9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c"). InnerVolumeSpecName "kube-api-access-7scrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:28.656076 master-0 kubenswrapper[36504]: I1203 22:24:28.655927 36504 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-log-ovn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:28.656076 master-0 kubenswrapper[36504]: I1203 22:24:28.655973 36504 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run-ovn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:28.656076 master-0 kubenswrapper[36504]: I1203 22:24:28.655990 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7scrr\" (UniqueName: \"kubernetes.io/projected/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-kube-api-access-7scrr\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:28.656076 master-0 kubenswrapper[36504]: I1203 22:24:28.656008 36504 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-var-run\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:28.656076 master-0 kubenswrapper[36504]: I1203 22:24:28.656019 36504 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-additional-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:28.656076 master-0 kubenswrapper[36504]: I1203 22:24:28.656033 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:28.860892 master-0 kubenswrapper[36504]: I1203 22:24:28.858417 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-8lq7d"] Dec 03 22:24:28.860892 master-0 kubenswrapper[36504]: E1203 22:24:28.859369 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" containerName="ovn-config" Dec 03 22:24:28.860892 master-0 kubenswrapper[36504]: I1203 22:24:28.859388 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" containerName="ovn-config" Dec 03 22:24:28.860892 master-0 kubenswrapper[36504]: I1203 22:24:28.859686 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" containerName="ovn-config" Dec 03 22:24:28.860892 master-0 kubenswrapper[36504]: I1203 22:24:28.860859 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:28.924176 master-0 kubenswrapper[36504]: I1203 22:24:28.919036 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-8lq7d"] Dec 03 22:24:28.932804 master-0 kubenswrapper[36504]: I1203 22:24:28.925004 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-operator-scripts\") pod \"heat-db-create-8lq7d\" (UID: \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\") " pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:28.932804 master-0 kubenswrapper[36504]: I1203 22:24:28.925219 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7hzb\" (UniqueName: \"kubernetes.io/projected/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-kube-api-access-r7hzb\") pod \"heat-db-create-8lq7d\" (UID: \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\") " pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:28.972074 master-0 kubenswrapper[36504]: I1203 22:24:28.971885 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"3296a1d32d9cb330b22a084ba6450128d748878ad2d7e8bd08cd352df996ff73"} Dec 03 22:24:28.972074 master-0 kubenswrapper[36504]: I1203 22:24:28.971976 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"e22ac34e4dcaf6ab51e5116e2b8e4b5739e38aaa473b17724cbed5316295977b"} Dec 03 22:24:28.972074 master-0 kubenswrapper[36504]: I1203 22:24:28.971991 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"f075e005afd7d7c41d70745a05ea55d12b42eb0f1ba2616e9af94518be3701c4"} Dec 03 22:24:28.986029 master-0 kubenswrapper[36504]: I1203 22:24:28.982983 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zqx7b-config-hp7wb" event={"ID":"9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c","Type":"ContainerDied","Data":"71e87d6b780e8a02773a45413651885753b4bdbe704761e0a9a4340b2c5f719e"} Dec 03 22:24:28.986029 master-0 kubenswrapper[36504]: I1203 22:24:28.983059 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71e87d6b780e8a02773a45413651885753b4bdbe704761e0a9a4340b2c5f719e" Dec 03 22:24:28.986029 master-0 kubenswrapper[36504]: I1203 22:24:28.983087 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zqx7b-config-hp7wb" Dec 03 22:24:29.035551 master-0 kubenswrapper[36504]: I1203 22:24:29.031328 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7hzb\" (UniqueName: \"kubernetes.io/projected/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-kube-api-access-r7hzb\") pod \"heat-db-create-8lq7d\" (UID: \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\") " pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:29.035551 master-0 kubenswrapper[36504]: I1203 22:24:29.031622 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-operator-scripts\") pod \"heat-db-create-8lq7d\" (UID: \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\") " pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:29.046019 master-0 kubenswrapper[36504]: I1203 22:24:29.043627 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-operator-scripts\") pod \"heat-db-create-8lq7d\" (UID: \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\") " pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:29.077918 master-0 kubenswrapper[36504]: I1203 22:24:29.075867 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-113a-account-create-update-6xzl6"] Dec 03 22:24:29.077918 master-0 kubenswrapper[36504]: I1203 22:24:29.077875 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:29.093941 master-0 kubenswrapper[36504]: I1203 22:24:29.088865 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 03 22:24:29.093941 master-0 kubenswrapper[36504]: I1203 22:24:29.089706 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7hzb\" (UniqueName: \"kubernetes.io/projected/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-kube-api-access-r7hzb\") pod \"heat-db-create-8lq7d\" (UID: \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\") " pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:29.097461 master-0 kubenswrapper[36504]: I1203 22:24:29.095138 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-113a-account-create-update-6xzl6"] Dec 03 22:24:29.149940 master-0 kubenswrapper[36504]: I1203 22:24:29.149820 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-66454"] Dec 03 22:24:29.152227 master-0 kubenswrapper[36504]: I1203 22:24:29.152167 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-66454"] Dec 03 22:24:29.152227 master-0 kubenswrapper[36504]: I1203 22:24:29.152209 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cj96j"] Dec 03 22:24:29.156269 master-0 kubenswrapper[36504]: I1203 22:24:29.155992 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-66454" Dec 03 22:24:29.175352 master-0 kubenswrapper[36504]: I1203 22:24:29.166172 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:29.190038 master-0 kubenswrapper[36504]: I1203 22:24:29.189459 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cj96j"] Dec 03 22:24:29.226347 master-0 kubenswrapper[36504]: I1203 22:24:29.226123 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8r7tj"] Dec 03 22:24:29.229428 master-0 kubenswrapper[36504]: I1203 22:24:29.229389 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.235126 master-0 kubenswrapper[36504]: I1203 22:24:29.235039 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:29.261199 master-0 kubenswrapper[36504]: I1203 22:24:29.243457 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 22:24:29.261199 master-0 kubenswrapper[36504]: I1203 22:24:29.243832 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 22:24:29.261199 master-0 kubenswrapper[36504]: I1203 22:24:29.248088 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 22:24:29.262608 master-0 kubenswrapper[36504]: I1203 22:24:29.261418 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr724\" (UniqueName: \"kubernetes.io/projected/6914d837-9818-4960-aa61-a0ea34ae574b-kube-api-access-qr724\") pod \"heat-113a-account-create-update-6xzl6\" (UID: \"6914d837-9818-4960-aa61-a0ea34ae574b\") " pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:29.264147 master-0 kubenswrapper[36504]: I1203 22:24:29.263428 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbce4b8e-8c01-42bf-b475-ec56b388fa80-operator-scripts\") pod \"barbican-db-create-66454\" (UID: \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\") " pod="openstack/barbican-db-create-66454" Dec 03 22:24:29.264147 master-0 kubenswrapper[36504]: I1203 22:24:29.263649 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdnj\" (UniqueName: \"kubernetes.io/projected/dbce4b8e-8c01-42bf-b475-ec56b388fa80-kube-api-access-vjdnj\") pod \"barbican-db-create-66454\" (UID: \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\") " pod="openstack/barbican-db-create-66454" Dec 03 22:24:29.264147 master-0 kubenswrapper[36504]: I1203 22:24:29.263737 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6914d837-9818-4960-aa61-a0ea34ae574b-operator-scripts\") pod \"heat-113a-account-create-update-6xzl6\" (UID: \"6914d837-9818-4960-aa61-a0ea34ae574b\") " pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:29.264147 master-0 kubenswrapper[36504]: I1203 22:24:29.263845 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdrzn\" (UniqueName: \"kubernetes.io/projected/55659879-0543-4282-abb3-d03ca6ecf2ce-kube-api-access-tdrzn\") pod \"cinder-db-create-cj96j\" (UID: \"55659879-0543-4282-abb3-d03ca6ecf2ce\") " pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:29.264147 master-0 kubenswrapper[36504]: I1203 22:24:29.263902 
36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55659879-0543-4282-abb3-d03ca6ecf2ce-operator-scripts\") pod \"cinder-db-create-cj96j\" (UID: \"55659879-0543-4282-abb3-d03ca6ecf2ce\") " pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:29.307741 master-0 kubenswrapper[36504]: I1203 22:24:29.307670 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8r7tj"] Dec 03 22:24:29.355389 master-0 kubenswrapper[36504]: I1203 22:24:29.354280 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d991-account-create-update-n4kt9"] Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.369869 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.370408 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87fjk\" (UniqueName: \"kubernetes.io/projected/d143967f-1161-4824-b45f-67d0b695536f-kube-api-access-87fjk\") pod \"keystone-db-sync-8r7tj\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.370509 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-combined-ca-bundle\") pod \"keystone-db-sync-8r7tj\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.370556 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr724\" (UniqueName: \"kubernetes.io/projected/6914d837-9818-4960-aa61-a0ea34ae574b-kube-api-access-qr724\") pod \"heat-113a-account-create-update-6xzl6\" (UID: \"6914d837-9818-4960-aa61-a0ea34ae574b\") " pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.370675 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-config-data\") pod \"keystone-db-sync-8r7tj\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.370764 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbce4b8e-8c01-42bf-b475-ec56b388fa80-operator-scripts\") pod \"barbican-db-create-66454\" (UID: \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\") " pod="openstack/barbican-db-create-66454" Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.371027 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdnj\" (UniqueName: \"kubernetes.io/projected/dbce4b8e-8c01-42bf-b475-ec56b388fa80-kube-api-access-vjdnj\") pod \"barbican-db-create-66454\" (UID: \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\") " pod="openstack/barbican-db-create-66454" Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.371088 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6914d837-9818-4960-aa61-a0ea34ae574b-operator-scripts\") pod \"heat-113a-account-create-update-6xzl6\" (UID: \"6914d837-9818-4960-aa61-a0ea34ae574b\") " pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.371148 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdrzn\" (UniqueName: \"kubernetes.io/projected/55659879-0543-4282-abb3-d03ca6ecf2ce-kube-api-access-tdrzn\") pod \"cinder-db-create-cj96j\" (UID: \"55659879-0543-4282-abb3-d03ca6ecf2ce\") " pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:29.371887 master-0 kubenswrapper[36504]: I1203 22:24:29.371192 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55659879-0543-4282-abb3-d03ca6ecf2ce-operator-scripts\") pod \"cinder-db-create-cj96j\" (UID: \"55659879-0543-4282-abb3-d03ca6ecf2ce\") " pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:29.373048 master-0 kubenswrapper[36504]: I1203 22:24:29.372150 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbce4b8e-8c01-42bf-b475-ec56b388fa80-operator-scripts\") pod \"barbican-db-create-66454\" (UID: \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\") " pod="openstack/barbican-db-create-66454" Dec 03 22:24:29.373048 master-0 kubenswrapper[36504]: I1203 22:24:29.372424 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55659879-0543-4282-abb3-d03ca6ecf2ce-operator-scripts\") pod \"cinder-db-create-cj96j\" (UID: \"55659879-0543-4282-abb3-d03ca6ecf2ce\") " pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:29.373048 master-0 kubenswrapper[36504]: I1203 22:24:29.372738 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6914d837-9818-4960-aa61-a0ea34ae574b-operator-scripts\") pod \"heat-113a-account-create-update-6xzl6\" (UID: \"6914d837-9818-4960-aa61-a0ea34ae574b\") " pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:29.377856 master-0 kubenswrapper[36504]: I1203 22:24:29.374954 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 22:24:29.396248 master-0 kubenswrapper[36504]: I1203 22:24:29.393673 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdrzn\" (UniqueName: \"kubernetes.io/projected/55659879-0543-4282-abb3-d03ca6ecf2ce-kube-api-access-tdrzn\") pod \"cinder-db-create-cj96j\" (UID: \"55659879-0543-4282-abb3-d03ca6ecf2ce\") " pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:29.398130 master-0 kubenswrapper[36504]: I1203 22:24:29.397927 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdnj\" (UniqueName: \"kubernetes.io/projected/dbce4b8e-8c01-42bf-b475-ec56b388fa80-kube-api-access-vjdnj\") pod \"barbican-db-create-66454\" (UID: \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\") " pod="openstack/barbican-db-create-66454" Dec 03 22:24:29.411891 master-0 kubenswrapper[36504]: I1203 22:24:29.405520 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr724\" (UniqueName: \"kubernetes.io/projected/6914d837-9818-4960-aa61-a0ea34ae574b-kube-api-access-qr724\") pod 
\"heat-113a-account-create-update-6xzl6\" (UID: \"6914d837-9818-4960-aa61-a0ea34ae574b\") " pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:29.423484 master-0 kubenswrapper[36504]: I1203 22:24:29.423182 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d991-account-create-update-n4kt9"] Dec 03 22:24:29.491973 master-0 kubenswrapper[36504]: I1203 22:24:29.489043 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87fjk\" (UniqueName: \"kubernetes.io/projected/d143967f-1161-4824-b45f-67d0b695536f-kube-api-access-87fjk\") pod \"keystone-db-sync-8r7tj\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.491973 master-0 kubenswrapper[36504]: I1203 22:24:29.489254 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-combined-ca-bundle\") pod \"keystone-db-sync-8r7tj\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.491973 master-0 kubenswrapper[36504]: I1203 22:24:29.489425 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c70d18-a17c-4ceb-a58a-6cb354489cc8-operator-scripts\") pod \"neutron-d991-account-create-update-n4kt9\" (UID: \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\") " pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:29.491973 master-0 kubenswrapper[36504]: I1203 22:24:29.490211 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-config-data\") pod \"keystone-db-sync-8r7tj\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.491973 master-0 kubenswrapper[36504]: I1203 22:24:29.490525 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjdtr\" (UniqueName: \"kubernetes.io/projected/39c70d18-a17c-4ceb-a58a-6cb354489cc8-kube-api-access-vjdtr\") pod \"neutron-d991-account-create-update-n4kt9\" (UID: \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\") " pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:29.501183 master-0 kubenswrapper[36504]: I1203 22:24:29.495136 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:29.502081 master-0 kubenswrapper[36504]: I1203 22:24:29.501568 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-config-data\") pod \"keystone-db-sync-8r7tj\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.506439 master-0 kubenswrapper[36504]: I1203 22:24:29.502467 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-sglc2"] Dec 03 22:24:29.508720 master-0 kubenswrapper[36504]: I1203 22:24:29.507576 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:29.519398 master-0 kubenswrapper[36504]: I1203 22:24:29.519330 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-66454" Dec 03 22:24:29.546461 master-0 kubenswrapper[36504]: I1203 22:24:29.545901 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:24:29.624951 master-0 kubenswrapper[36504]: I1203 22:24:29.613971 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-combined-ca-bundle\") pod \"keystone-db-sync-8r7tj\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.651693 master-0 kubenswrapper[36504]: I1203 22:24:29.649431 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c70d18-a17c-4ceb-a58a-6cb354489cc8-operator-scripts\") pod \"neutron-d991-account-create-update-n4kt9\" (UID: \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\") " pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:29.651693 master-0 kubenswrapper[36504]: I1203 22:24:29.649562 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmd84\" (UniqueName: \"kubernetes.io/projected/a27383b2-a1e5-4e55-9709-136413c4c96a-kube-api-access-lmd84\") pod \"neutron-db-create-sglc2\" (UID: \"a27383b2-a1e5-4e55-9709-136413c4c96a\") " pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:29.651693 master-0 kubenswrapper[36504]: I1203 22:24:29.649634 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a27383b2-a1e5-4e55-9709-136413c4c96a-operator-scripts\") pod \"neutron-db-create-sglc2\" (UID: \"a27383b2-a1e5-4e55-9709-136413c4c96a\") " pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:29.651693 master-0 kubenswrapper[36504]: I1203 22:24:29.649790 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjdtr\" (UniqueName: \"kubernetes.io/projected/39c70d18-a17c-4ceb-a58a-6cb354489cc8-kube-api-access-vjdtr\") pod \"neutron-d991-account-create-update-n4kt9\" (UID: \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\") " pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:29.652276 master-0 kubenswrapper[36504]: I1203 22:24:29.652004 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c70d18-a17c-4ceb-a58a-6cb354489cc8-operator-scripts\") pod \"neutron-d991-account-create-update-n4kt9\" (UID: \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\") " pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:29.652276 master-0 kubenswrapper[36504]: I1203 22:24:29.652021 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87fjk\" (UniqueName: \"kubernetes.io/projected/d143967f-1161-4824-b45f-67d0b695536f-kube-api-access-87fjk\") pod \"keystone-db-sync-8r7tj\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.653629 master-0 kubenswrapper[36504]: I1203 22:24:29.653572 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:29.692805 master-0 kubenswrapper[36504]: I1203 22:24:29.683442 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjdtr\" (UniqueName: \"kubernetes.io/projected/39c70d18-a17c-4ceb-a58a-6cb354489cc8-kube-api-access-vjdtr\") pod \"neutron-d991-account-create-update-n4kt9\" (UID: \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\") " pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:29.692805 master-0 kubenswrapper[36504]: I1203 22:24:29.690559 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:29.722259 master-0 kubenswrapper[36504]: I1203 22:24:29.709434 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sglc2"] Dec 03 22:24:29.722259 master-0 kubenswrapper[36504]: I1203 22:24:29.710748 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:29.739154 master-0 kubenswrapper[36504]: I1203 22:24:29.739059 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0539-account-create-update-h6tq8"] Dec 03 22:24:29.755833 master-0 kubenswrapper[36504]: I1203 22:24:29.747377 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:29.755833 master-0 kubenswrapper[36504]: I1203 22:24:29.755681 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 22:24:29.755833 master-0 kubenswrapper[36504]: I1203 22:24:29.755810 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmd84\" (UniqueName: \"kubernetes.io/projected/a27383b2-a1e5-4e55-9709-136413c4c96a-kube-api-access-lmd84\") pod \"neutron-db-create-sglc2\" (UID: \"a27383b2-a1e5-4e55-9709-136413c4c96a\") " pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:29.756298 master-0 kubenswrapper[36504]: I1203 22:24:29.755921 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a27383b2-a1e5-4e55-9709-136413c4c96a-operator-scripts\") pod \"neutron-db-create-sglc2\" (UID: \"a27383b2-a1e5-4e55-9709-136413c4c96a\") " pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:29.772405 master-0 kubenswrapper[36504]: I1203 22:24:29.759729 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a27383b2-a1e5-4e55-9709-136413c4c96a-operator-scripts\") pod \"neutron-db-create-sglc2\" (UID: \"a27383b2-a1e5-4e55-9709-136413c4c96a\") " pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:29.801385 master-0 kubenswrapper[36504]: I1203 22:24:29.792419 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0539-account-create-update-h6tq8"] Dec 03 22:24:29.830216 master-0 kubenswrapper[36504]: I1203 22:24:29.824735 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmd84\" (UniqueName: \"kubernetes.io/projected/a27383b2-a1e5-4e55-9709-136413c4c96a-kube-api-access-lmd84\") pod \"neutron-db-create-sglc2\" (UID: \"a27383b2-a1e5-4e55-9709-136413c4c96a\") " pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:29.874243 master-0 kubenswrapper[36504]: I1203 22:24:29.874151 36504 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-a73c-account-create-update-2m5sr"] Dec 03 22:24:29.885211 master-0 kubenswrapper[36504]: I1203 22:24:29.876480 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:29.885211 master-0 kubenswrapper[36504]: I1203 22:24:29.880173 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw466\" (UniqueName: \"kubernetes.io/projected/f10040d6-faaa-4181-b651-e26b54ae8a33-kube-api-access-vw466\") pod \"cinder-0539-account-create-update-h6tq8\" (UID: \"f10040d6-faaa-4181-b651-e26b54ae8a33\") " pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:29.885211 master-0 kubenswrapper[36504]: I1203 22:24:29.880524 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10040d6-faaa-4181-b651-e26b54ae8a33-operator-scripts\") pod \"cinder-0539-account-create-update-h6tq8\" (UID: \"f10040d6-faaa-4181-b651-e26b54ae8a33\") " pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:29.910804 master-0 kubenswrapper[36504]: I1203 22:24:29.904312 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 22:24:29.989049 master-0 kubenswrapper[36504]: I1203 22:24:29.988910 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw466\" (UniqueName: \"kubernetes.io/projected/f10040d6-faaa-4181-b651-e26b54ae8a33-kube-api-access-vw466\") pod \"cinder-0539-account-create-update-h6tq8\" (UID: \"f10040d6-faaa-4181-b651-e26b54ae8a33\") " pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:29.989361 master-0 kubenswrapper[36504]: I1203 22:24:29.989341 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz658\" (UniqueName: \"kubernetes.io/projected/185d021a-be33-4c90-a8b6-ddd16d53b53d-kube-api-access-wz658\") pod \"barbican-a73c-account-create-update-2m5sr\" (UID: \"185d021a-be33-4c90-a8b6-ddd16d53b53d\") " pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:29.989543 master-0 kubenswrapper[36504]: I1203 22:24:29.989511 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10040d6-faaa-4181-b651-e26b54ae8a33-operator-scripts\") pod \"cinder-0539-account-create-update-h6tq8\" (UID: \"f10040d6-faaa-4181-b651-e26b54ae8a33\") " pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:29.990178 master-0 kubenswrapper[36504]: I1203 22:24:29.990143 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/185d021a-be33-4c90-a8b6-ddd16d53b53d-operator-scripts\") pod \"barbican-a73c-account-create-update-2m5sr\" (UID: \"185d021a-be33-4c90-a8b6-ddd16d53b53d\") " pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:29.991219 master-0 kubenswrapper[36504]: I1203 22:24:29.991173 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10040d6-faaa-4181-b651-e26b54ae8a33-operator-scripts\") pod \"cinder-0539-account-create-update-h6tq8\" (UID: \"f10040d6-faaa-4181-b651-e26b54ae8a33\") " 
pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:30.089228 master-0 kubenswrapper[36504]: I1203 22:24:30.065785 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw466\" (UniqueName: \"kubernetes.io/projected/f10040d6-faaa-4181-b651-e26b54ae8a33-kube-api-access-vw466\") pod \"cinder-0539-account-create-update-h6tq8\" (UID: \"f10040d6-faaa-4181-b651-e26b54ae8a33\") " pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:30.089228 master-0 kubenswrapper[36504]: I1203 22:24:30.083360 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:30.102525 master-0 kubenswrapper[36504]: I1203 22:24:30.102456 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/185d021a-be33-4c90-a8b6-ddd16d53b53d-operator-scripts\") pod \"barbican-a73c-account-create-update-2m5sr\" (UID: \"185d021a-be33-4c90-a8b6-ddd16d53b53d\") " pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:30.103284 master-0 kubenswrapper[36504]: I1203 22:24:30.102636 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz658\" (UniqueName: \"kubernetes.io/projected/185d021a-be33-4c90-a8b6-ddd16d53b53d-kube-api-access-wz658\") pod \"barbican-a73c-account-create-update-2m5sr\" (UID: \"185d021a-be33-4c90-a8b6-ddd16d53b53d\") " pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:30.107625 master-0 kubenswrapper[36504]: I1203 22:24:30.106204 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"894d73e1e4592b5bd8bb5edcad7bb492723fdde8ceaf5516f9661e6c74f489ee"} Dec 03 22:24:30.107625 master-0 kubenswrapper[36504]: I1203 22:24:30.106292 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"07b5d783-be96-4a6b-8603-e6f56f13f233","Type":"ContainerStarted","Data":"aff3cf39e0530396fd5883441b8cfee01c5f8c3d7746dbe142a203717f9218dd"} Dec 03 22:24:30.115435 master-0 kubenswrapper[36504]: I1203 22:24:30.115339 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a73c-account-create-update-2m5sr"] Dec 03 22:24:30.118549 master-0 kubenswrapper[36504]: I1203 22:24:30.118486 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/185d021a-be33-4c90-a8b6-ddd16d53b53d-operator-scripts\") pod \"barbican-a73c-account-create-update-2m5sr\" (UID: \"185d021a-be33-4c90-a8b6-ddd16d53b53d\") " pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:30.131438 master-0 kubenswrapper[36504]: I1203 22:24:30.129455 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:30.137016 master-0 kubenswrapper[36504]: I1203 22:24:30.136886 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz658\" (UniqueName: \"kubernetes.io/projected/185d021a-be33-4c90-a8b6-ddd16d53b53d-kube-api-access-wz658\") pod \"barbican-a73c-account-create-update-2m5sr\" (UID: \"185d021a-be33-4c90-a8b6-ddd16d53b53d\") " pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:30.157165 master-0 kubenswrapper[36504]: I1203 22:24:30.153869 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zqx7b-config-hp7wb"] Dec 03 22:24:30.188009 master-0 kubenswrapper[36504]: I1203 22:24:30.187905 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zqx7b-config-hp7wb"] Dec 03 22:24:30.276072 master-0 kubenswrapper[36504]: I1203 22:24:30.275593 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:30.301797 master-0 kubenswrapper[36504]: I1203 22:24:30.299777 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-tzsmz"] Dec 03 22:24:30.301797 master-0 kubenswrapper[36504]: I1203 22:24:30.300162 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" podUID="c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" containerName="dnsmasq-dns" containerID="cri-o://80780ce9696d05c93094744c8896979af176bea4c25495509bf907b498957ade" gracePeriod=10 Dec 03 22:24:30.383986 master-0 kubenswrapper[36504]: I1203 22:24:30.383887 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-8lq7d"] Dec 03 22:24:30.396460 master-0 kubenswrapper[36504]: I1203 22:24:30.396351 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=44.831201782 podStartE2EDuration="51.39631477s" podCreationTimestamp="2025-12-03 22:23:39 +0000 UTC" firstStartedPulling="2025-12-03 22:24:20.466096799 +0000 UTC m=+825.685868816" lastFinishedPulling="2025-12-03 22:24:27.031209797 +0000 UTC m=+832.250981804" observedRunningTime="2025-12-03 22:24:30.221715713 +0000 UTC m=+835.441487740" watchObservedRunningTime="2025-12-03 22:24:30.39631477 +0000 UTC m=+835.616086777" Dec 03 22:24:30.417690 master-0 kubenswrapper[36504]: I1203 22:24:30.417624 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-113a-account-create-update-6xzl6"] Dec 03 22:24:30.684586 master-0 kubenswrapper[36504]: I1203 22:24:30.684421 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cj96j"] Dec 03 22:24:30.706362 master-0 kubenswrapper[36504]: I1203 22:24:30.706248 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-66454"] Dec 03 22:24:30.756164 master-0 kubenswrapper[36504]: I1203 22:24:30.751292 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589b458997-99rrp"] Dec 03 22:24:30.756164 master-0 kubenswrapper[36504]: I1203 22:24:30.754989 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:30.768922 master-0 kubenswrapper[36504]: I1203 22:24:30.766986 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 22:24:30.958179 master-0 kubenswrapper[36504]: I1203 22:24:30.810605 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-svc\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:30.958179 master-0 kubenswrapper[36504]: I1203 22:24:30.811048 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-sb\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:30.958179 master-0 kubenswrapper[36504]: I1203 22:24:30.811264 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-config\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:30.958179 master-0 kubenswrapper[36504]: I1203 22:24:30.811330 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-edpm\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:30.958179 master-0 kubenswrapper[36504]: I1203 22:24:30.811536 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcnnj\" (UniqueName: \"kubernetes.io/projected/5674bf96-4a9a-40d4-95af-d72defcb57d5-kube-api-access-kcnnj\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:30.958179 master-0 kubenswrapper[36504]: I1203 22:24:30.811903 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-swift-storage-0\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:30.958179 master-0 kubenswrapper[36504]: I1203 22:24:30.812034 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-nb\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:30.958179 master-0 kubenswrapper[36504]: I1203 22:24:30.862363 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589b458997-99rrp"] Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.969079 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-swift-storage-0\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.969204 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-nb\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.969335 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-svc\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.969505 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-sb\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.969586 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-config\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.969621 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-edpm\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.969709 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcnnj\" (UniqueName: \"kubernetes.io/projected/5674bf96-4a9a-40d4-95af-d72defcb57d5-kube-api-access-kcnnj\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.975482 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-svc\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.981465 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-sb\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.982238 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-nb\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.982717 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-swift-storage-0\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.984688 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-edpm\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:30.985262 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-config\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.123474 master-0 kubenswrapper[36504]: I1203 22:24:31.041277 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcnnj\" (UniqueName: \"kubernetes.io/projected/5674bf96-4a9a-40d4-95af-d72defcb57d5-kube-api-access-kcnnj\") pod \"dnsmasq-dns-589b458997-99rrp\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.172951 master-0 kubenswrapper[36504]: I1203 22:24:31.170419 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:31.233322 master-0 kubenswrapper[36504]: I1203 22:24:31.232616 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c" path="/var/lib/kubelet/pods/9ece7391-c8e0-4e2f-92a5-63d5d7aeaf6c/volumes" Dec 03 22:24:31.247393 master-0 kubenswrapper[36504]: I1203 22:24:31.239852 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8r7tj"] Dec 03 22:24:31.247393 master-0 kubenswrapper[36504]: I1203 22:24:31.239930 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-66454" event={"ID":"dbce4b8e-8c01-42bf-b475-ec56b388fa80","Type":"ContainerStarted","Data":"5b65255967698ebc147c46a77bb52ab9b321ae84fd0e4b21699894603035b982"} Dec 03 22:24:31.247393 master-0 kubenswrapper[36504]: I1203 22:24:31.239969 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8lq7d" event={"ID":"9e04fc00-ca89-4a25-9bb4-2f59d9f14388","Type":"ContainerStarted","Data":"a5445dd67e7b4057aee398d2921e091993a4c000eb24d733ebbb65ba1ad6b9b5"} Dec 03 22:24:31.247393 master-0 kubenswrapper[36504]: I1203 22:24:31.239989 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cj96j" event={"ID":"55659879-0543-4282-abb3-d03ca6ecf2ce","Type":"ContainerStarted","Data":"42ede8d90b3723c764cdd621c6cfcbcf02ece0efde1f678f31100998a298211d"} Dec 03 22:24:31.269964 master-0 kubenswrapper[36504]: I1203 22:24:31.268515 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-113a-account-create-update-6xzl6" event={"ID":"6914d837-9818-4960-aa61-a0ea34ae574b","Type":"ContainerStarted","Data":"d3c1e0141de9ac7b4ed64c6626f431eb89ba0592cac0ffb7996d813ad580328b"} Dec 03 22:24:31.302940 master-0 kubenswrapper[36504]: I1203 22:24:31.302778 36504 generic.go:334] "Generic (PLEG): container finished" podID="c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" containerID="80780ce9696d05c93094744c8896979af176bea4c25495509bf907b498957ade" exitCode=0 Dec 03 22:24:31.303293 master-0 kubenswrapper[36504]: I1203 22:24:31.303158 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" event={"ID":"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68","Type":"ContainerDied","Data":"80780ce9696d05c93094744c8896979af176bea4c25495509bf907b498957ade"} Dec 03 22:24:31.619287 master-0 kubenswrapper[36504]: I1203 22:24:31.619221 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d991-account-create-update-n4kt9"] Dec 03 22:24:31.635024 master-0 kubenswrapper[36504]: I1203 22:24:31.634949 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sglc2"] Dec 03 22:24:31.675835 master-0 kubenswrapper[36504]: I1203 22:24:31.661588 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0539-account-create-update-h6tq8"] Dec 03 22:24:31.710727 master-0 kubenswrapper[36504]: W1203 22:24:31.707870 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10040d6_faaa_4181_b651_e26b54ae8a33.slice/crio-735be523a10595e2b5bafbc1db2fccc7b78f80ba2b264e6f9445a30d80f13bad WatchSource:0}: Error finding container 735be523a10595e2b5bafbc1db2fccc7b78f80ba2b264e6f9445a30d80f13bad: Status 404 returned error can't find the container with id 735be523a10595e2b5bafbc1db2fccc7b78f80ba2b264e6f9445a30d80f13bad Dec 03 22:24:31.710727 master-0 
kubenswrapper[36504]: W1203 22:24:31.709960 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda27383b2_a1e5_4e55_9709_136413c4c96a.slice/crio-e70a70832014524c1995fe4e1ab74a7d48acac23f2ab6d69235b2806d5becfcb WatchSource:0}: Error finding container e70a70832014524c1995fe4e1ab74a7d48acac23f2ab6d69235b2806d5becfcb: Status 404 returned error can't find the container with id e70a70832014524c1995fe4e1ab74a7d48acac23f2ab6d69235b2806d5becfcb Dec 03 22:24:31.725690 master-0 kubenswrapper[36504]: I1203 22:24:31.725606 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:24:31.825426 master-0 kubenswrapper[36504]: I1203 22:24:31.825250 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvtnp\" (UniqueName: \"kubernetes.io/projected/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-kube-api-access-vvtnp\") pod \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " Dec 03 22:24:31.826280 master-0 kubenswrapper[36504]: I1203 22:24:31.825535 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-config\") pod \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " Dec 03 22:24:31.826280 master-0 kubenswrapper[36504]: I1203 22:24:31.825609 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-dns-svc\") pod \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " Dec 03 22:24:31.826463 master-0 kubenswrapper[36504]: I1203 22:24:31.825908 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-nb\") pod \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " Dec 03 22:24:31.826633 master-0 kubenswrapper[36504]: I1203 22:24:31.826586 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-sb\") pod \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\" (UID: \"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68\") " Dec 03 22:24:31.827082 master-0 kubenswrapper[36504]: I1203 22:24:31.826946 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a73c-account-create-update-2m5sr"] Dec 03 22:24:31.829147 master-0 kubenswrapper[36504]: I1203 22:24:31.829088 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-kube-api-access-vvtnp" (OuterVolumeSpecName: "kube-api-access-vvtnp") pod "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" (UID: "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68"). InnerVolumeSpecName "kube-api-access-vvtnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:31.839693 master-0 kubenswrapper[36504]: I1203 22:24:31.839631 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvtnp\" (UniqueName: \"kubernetes.io/projected/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-kube-api-access-vvtnp\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:31.921538 master-0 kubenswrapper[36504]: I1203 22:24:31.921469 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" (UID: "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:31.935206 master-0 kubenswrapper[36504]: I1203 22:24:31.935052 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" (UID: "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:31.947073 master-0 kubenswrapper[36504]: I1203 22:24:31.946704 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:31.947073 master-0 kubenswrapper[36504]: I1203 22:24:31.946757 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:31.948483 master-0 kubenswrapper[36504]: I1203 22:24:31.948431 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-config" (OuterVolumeSpecName: "config") pod "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" (UID: "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:32.019669 master-0 kubenswrapper[36504]: I1203 22:24:32.019546 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" (UID: "c62890e3-7d2e-4547-9a7b-8ea03ce1ce68"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:32.024471 master-0 kubenswrapper[36504]: I1203 22:24:32.023848 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589b458997-99rrp"] Dec 03 22:24:32.052099 master-0 kubenswrapper[36504]: I1203 22:24:32.052011 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:32.052099 master-0 kubenswrapper[36504]: I1203 22:24:32.052060 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:32.113180 master-0 kubenswrapper[36504]: W1203 22:24:32.112517 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5674bf96_4a9a_40d4_95af_d72defcb57d5.slice/crio-4db15af2d54ede8b506496f53683b57c50d14c7a62503e7de16e3c519979ee0e WatchSource:0}: Error finding container 4db15af2d54ede8b506496f53683b57c50d14c7a62503e7de16e3c519979ee0e: Status 404 returned error can't find the container with id 4db15af2d54ede8b506496f53683b57c50d14c7a62503e7de16e3c519979ee0e Dec 03 22:24:32.379803 master-0 kubenswrapper[36504]: I1203 22:24:32.379606 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8r7tj" event={"ID":"d143967f-1161-4824-b45f-67d0b695536f","Type":"ContainerStarted","Data":"f4aa05765a72f7dabaeefac5ea6809c3d8c5b2c67ed0c07720234277e8163b40"} Dec 03 22:24:32.416805 master-0 kubenswrapper[36504]: I1203 22:24:32.412588 36504 generic.go:334] "Generic (PLEG): container finished" podID="9e04fc00-ca89-4a25-9bb4-2f59d9f14388" containerID="9d3b1c630b279b2e5edf98141096ff43c7e6717a6d38ed6ddb1d9a68ae164702" exitCode=0 Dec 03 22:24:32.416805 master-0 kubenswrapper[36504]: I1203 22:24:32.412792 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8lq7d" event={"ID":"9e04fc00-ca89-4a25-9bb4-2f59d9f14388","Type":"ContainerDied","Data":"9d3b1c630b279b2e5edf98141096ff43c7e6717a6d38ed6ddb1d9a68ae164702"} Dec 03 22:24:32.451805 master-0 kubenswrapper[36504]: I1203 22:24:32.444226 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sglc2" event={"ID":"a27383b2-a1e5-4e55-9709-136413c4c96a","Type":"ContainerStarted","Data":"aec52144dd58623639f37392efde253da5e0a918f5f8e908536bbfe7320902af"} Dec 03 22:24:32.451805 master-0 kubenswrapper[36504]: I1203 22:24:32.444308 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sglc2" event={"ID":"a27383b2-a1e5-4e55-9709-136413c4c96a","Type":"ContainerStarted","Data":"e70a70832014524c1995fe4e1ab74a7d48acac23f2ab6d69235b2806d5becfcb"} Dec 03 22:24:32.468763 master-0 kubenswrapper[36504]: I1203 22:24:32.464186 36504 generic.go:334] "Generic (PLEG): container finished" podID="55659879-0543-4282-abb3-d03ca6ecf2ce" containerID="2ab2101c3564d8e46df5763baa7b94814c3cc8f0d1b338e27e85182a9d97a871" exitCode=0 Dec 03 22:24:32.468763 master-0 kubenswrapper[36504]: I1203 22:24:32.464341 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cj96j" event={"ID":"55659879-0543-4282-abb3-d03ca6ecf2ce","Type":"ContainerDied","Data":"2ab2101c3564d8e46df5763baa7b94814c3cc8f0d1b338e27e85182a9d97a871"} Dec 03 22:24:32.494836 master-0 kubenswrapper[36504]: 
I1203 22:24:32.494072 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" Dec 03 22:24:32.494836 master-0 kubenswrapper[36504]: I1203 22:24:32.494339 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4d74cb79-tzsmz" event={"ID":"c62890e3-7d2e-4547-9a7b-8ea03ce1ce68","Type":"ContainerDied","Data":"eca8fcfc671d95e8d2d56670c2995b26241db87d626281256605715354de2d8b"} Dec 03 22:24:32.494836 master-0 kubenswrapper[36504]: I1203 22:24:32.494492 36504 scope.go:117] "RemoveContainer" containerID="80780ce9696d05c93094744c8896979af176bea4c25495509bf907b498957ade" Dec 03 22:24:32.516841 master-0 kubenswrapper[36504]: I1203 22:24:32.514015 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d991-account-create-update-n4kt9" event={"ID":"39c70d18-a17c-4ceb-a58a-6cb354489cc8","Type":"ContainerStarted","Data":"97596bfea8c5ebd68ff6974cd790e0daa9d71633b399ed223de65410251a1448"} Dec 03 22:24:32.516841 master-0 kubenswrapper[36504]: I1203 22:24:32.514091 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d991-account-create-update-n4kt9" event={"ID":"39c70d18-a17c-4ceb-a58a-6cb354489cc8","Type":"ContainerStarted","Data":"2a2126ed00c98a0db2adef3ab47af870d2226d0edae993415486d9417db60fad"} Dec 03 22:24:32.548429 master-0 kubenswrapper[36504]: I1203 22:24:32.547084 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589b458997-99rrp" event={"ID":"5674bf96-4a9a-40d4-95af-d72defcb57d5","Type":"ContainerStarted","Data":"4db15af2d54ede8b506496f53683b57c50d14c7a62503e7de16e3c519979ee0e"} Dec 03 22:24:32.584040 master-0 kubenswrapper[36504]: I1203 22:24:32.580978 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-sglc2" podStartSLOduration=3.580946642 podStartE2EDuration="3.580946642s" podCreationTimestamp="2025-12-03 22:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:32.497694105 +0000 UTC m=+837.717466112" watchObservedRunningTime="2025-12-03 22:24:32.580946642 +0000 UTC m=+837.800718649" Dec 03 22:24:32.584040 master-0 kubenswrapper[36504]: I1203 22:24:32.583239 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a73c-account-create-update-2m5sr" event={"ID":"185d021a-be33-4c90-a8b6-ddd16d53b53d","Type":"ContainerStarted","Data":"bc21d4349c616b40b28ce13c6bf39a7d9f4dd91b8968bbaab7c158228c652214"} Dec 03 22:24:32.584040 master-0 kubenswrapper[36504]: I1203 22:24:32.583316 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a73c-account-create-update-2m5sr" event={"ID":"185d021a-be33-4c90-a8b6-ddd16d53b53d","Type":"ContainerStarted","Data":"8b3723bce755638b4c87baae71a97612a13528d1d586efd10e28101c3e2e0a98"} Dec 03 22:24:32.592973 master-0 kubenswrapper[36504]: I1203 22:24:32.586386 36504 scope.go:117] "RemoveContainer" containerID="c73f5b3280ba614c0f0b119d6e3296fb48fc8dc1df7ae7487f2d4fe89a88941a" Dec 03 22:24:32.592973 master-0 kubenswrapper[36504]: I1203 22:24:32.591438 36504 generic.go:334] "Generic (PLEG): container finished" podID="dbce4b8e-8c01-42bf-b475-ec56b388fa80" containerID="24b2dbf2959087f1218c3c20db41bf10628f97b5c6560d2fc0ee9ed39d8c8944" exitCode=0 Dec 03 22:24:32.592973 master-0 kubenswrapper[36504]: I1203 22:24:32.591575 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-create-66454" event={"ID":"dbce4b8e-8c01-42bf-b475-ec56b388fa80","Type":"ContainerDied","Data":"24b2dbf2959087f1218c3c20db41bf10628f97b5c6560d2fc0ee9ed39d8c8944"} Dec 03 22:24:32.599169 master-0 kubenswrapper[36504]: I1203 22:24:32.599106 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0539-account-create-update-h6tq8" event={"ID":"f10040d6-faaa-4181-b651-e26b54ae8a33","Type":"ContainerStarted","Data":"66860b89689a005b87841659b3bc16c01f3a81e68efcd96c5f3f64340f611f25"} Dec 03 22:24:32.599297 master-0 kubenswrapper[36504]: I1203 22:24:32.599172 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0539-account-create-update-h6tq8" event={"ID":"f10040d6-faaa-4181-b651-e26b54ae8a33","Type":"ContainerStarted","Data":"735be523a10595e2b5bafbc1db2fccc7b78f80ba2b264e6f9445a30d80f13bad"} Dec 03 22:24:32.610857 master-0 kubenswrapper[36504]: I1203 22:24:32.610520 36504 generic.go:334] "Generic (PLEG): container finished" podID="6914d837-9818-4960-aa61-a0ea34ae574b" containerID="fe9df3e82d8605ffe93bfdab93d76cc8f20a10afb1ef1485946567c82cdb3209" exitCode=0 Dec 03 22:24:32.610857 master-0 kubenswrapper[36504]: I1203 22:24:32.610605 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-113a-account-create-update-6xzl6" event={"ID":"6914d837-9818-4960-aa61-a0ea34ae574b","Type":"ContainerDied","Data":"fe9df3e82d8605ffe93bfdab93d76cc8f20a10afb1ef1485946567c82cdb3209"} Dec 03 22:24:32.673928 master-0 kubenswrapper[36504]: I1203 22:24:32.673826 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d991-account-create-update-n4kt9" podStartSLOduration=3.673758159 podStartE2EDuration="3.673758159s" podCreationTimestamp="2025-12-03 22:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:32.602978424 +0000 UTC m=+837.822750451" watchObservedRunningTime="2025-12-03 22:24:32.673758159 +0000 UTC m=+837.893530166" Dec 03 22:24:32.710753 master-0 kubenswrapper[36504]: I1203 22:24:32.710060 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-tzsmz"] Dec 03 22:24:32.731616 master-0 kubenswrapper[36504]: I1203 22:24:32.729139 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d4d74cb79-tzsmz"] Dec 03 22:24:32.739633 master-0 kubenswrapper[36504]: I1203 22:24:32.736199 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-a73c-account-create-update-2m5sr" podStartSLOduration=3.73616182 podStartE2EDuration="3.73616182s" podCreationTimestamp="2025-12-03 22:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:32.693475079 +0000 UTC m=+837.913247086" watchObservedRunningTime="2025-12-03 22:24:32.73616182 +0000 UTC m=+837.955933837" Dec 03 22:24:32.745548 master-0 kubenswrapper[36504]: I1203 22:24:32.743734 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-0539-account-create-update-h6tq8" podStartSLOduration=3.743701117 podStartE2EDuration="3.743701117s" podCreationTimestamp="2025-12-03 22:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:32.720299491 +0000 UTC m=+837.940071498" 
watchObservedRunningTime="2025-12-03 22:24:32.743701117 +0000 UTC m=+837.963473144" Dec 03 22:24:33.124655 master-0 kubenswrapper[36504]: I1203 22:24:33.124539 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" path="/var/lib/kubelet/pods/c62890e3-7d2e-4547-9a7b-8ea03ce1ce68/volumes" Dec 03 22:24:33.639808 master-0 kubenswrapper[36504]: I1203 22:24:33.637289 36504 generic.go:334] "Generic (PLEG): container finished" podID="a27383b2-a1e5-4e55-9709-136413c4c96a" containerID="aec52144dd58623639f37392efde253da5e0a918f5f8e908536bbfe7320902af" exitCode=0 Dec 03 22:24:33.639808 master-0 kubenswrapper[36504]: I1203 22:24:33.637424 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sglc2" event={"ID":"a27383b2-a1e5-4e55-9709-136413c4c96a","Type":"ContainerDied","Data":"aec52144dd58623639f37392efde253da5e0a918f5f8e908536bbfe7320902af"} Dec 03 22:24:33.646094 master-0 kubenswrapper[36504]: I1203 22:24:33.642892 36504 generic.go:334] "Generic (PLEG): container finished" podID="39c70d18-a17c-4ceb-a58a-6cb354489cc8" containerID="97596bfea8c5ebd68ff6974cd790e0daa9d71633b399ed223de65410251a1448" exitCode=0 Dec 03 22:24:33.646094 master-0 kubenswrapper[36504]: I1203 22:24:33.643133 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d991-account-create-update-n4kt9" event={"ID":"39c70d18-a17c-4ceb-a58a-6cb354489cc8","Type":"ContainerDied","Data":"97596bfea8c5ebd68ff6974cd790e0daa9d71633b399ed223de65410251a1448"} Dec 03 22:24:33.665328 master-0 kubenswrapper[36504]: I1203 22:24:33.665233 36504 generic.go:334] "Generic (PLEG): container finished" podID="5674bf96-4a9a-40d4-95af-d72defcb57d5" containerID="ac2dab53b10f3857fc282ddda373afaaaced91b15fa6a4e9c877b1110acac1ea" exitCode=0 Dec 03 22:24:33.665567 master-0 kubenswrapper[36504]: I1203 22:24:33.665351 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589b458997-99rrp" event={"ID":"5674bf96-4a9a-40d4-95af-d72defcb57d5","Type":"ContainerDied","Data":"ac2dab53b10f3857fc282ddda373afaaaced91b15fa6a4e9c877b1110acac1ea"} Dec 03 22:24:33.673307 master-0 kubenswrapper[36504]: I1203 22:24:33.672121 36504 generic.go:334] "Generic (PLEG): container finished" podID="185d021a-be33-4c90-a8b6-ddd16d53b53d" containerID="bc21d4349c616b40b28ce13c6bf39a7d9f4dd91b8968bbaab7c158228c652214" exitCode=0 Dec 03 22:24:33.673307 master-0 kubenswrapper[36504]: I1203 22:24:33.672220 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a73c-account-create-update-2m5sr" event={"ID":"185d021a-be33-4c90-a8b6-ddd16d53b53d","Type":"ContainerDied","Data":"bc21d4349c616b40b28ce13c6bf39a7d9f4dd91b8968bbaab7c158228c652214"} Dec 03 22:24:33.676850 master-0 kubenswrapper[36504]: I1203 22:24:33.676793 36504 generic.go:334] "Generic (PLEG): container finished" podID="f10040d6-faaa-4181-b651-e26b54ae8a33" containerID="66860b89689a005b87841659b3bc16c01f3a81e68efcd96c5f3f64340f611f25" exitCode=0 Dec 03 22:24:33.677214 master-0 kubenswrapper[36504]: I1203 22:24:33.677154 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0539-account-create-update-h6tq8" event={"ID":"f10040d6-faaa-4181-b651-e26b54ae8a33","Type":"ContainerDied","Data":"66860b89689a005b87841659b3bc16c01f3a81e68efcd96c5f3f64340f611f25"} Dec 03 22:24:34.360166 master-0 kubenswrapper[36504]: I1203 22:24:34.357633 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:34.489644 master-0 kubenswrapper[36504]: I1203 22:24:34.489566 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr724\" (UniqueName: \"kubernetes.io/projected/6914d837-9818-4960-aa61-a0ea34ae574b-kube-api-access-qr724\") pod \"6914d837-9818-4960-aa61-a0ea34ae574b\" (UID: \"6914d837-9818-4960-aa61-a0ea34ae574b\") " Dec 03 22:24:34.490391 master-0 kubenswrapper[36504]: I1203 22:24:34.490025 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6914d837-9818-4960-aa61-a0ea34ae574b-operator-scripts\") pod \"6914d837-9818-4960-aa61-a0ea34ae574b\" (UID: \"6914d837-9818-4960-aa61-a0ea34ae574b\") " Dec 03 22:24:34.491195 master-0 kubenswrapper[36504]: I1203 22:24:34.491161 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6914d837-9818-4960-aa61-a0ea34ae574b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6914d837-9818-4960-aa61-a0ea34ae574b" (UID: "6914d837-9818-4960-aa61-a0ea34ae574b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:34.494691 master-0 kubenswrapper[36504]: I1203 22:24:34.494624 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6914d837-9818-4960-aa61-a0ea34ae574b-kube-api-access-qr724" (OuterVolumeSpecName: "kube-api-access-qr724") pod "6914d837-9818-4960-aa61-a0ea34ae574b" (UID: "6914d837-9818-4960-aa61-a0ea34ae574b"). InnerVolumeSpecName "kube-api-access-qr724". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:34.594513 master-0 kubenswrapper[36504]: I1203 22:24:34.594112 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6914d837-9818-4960-aa61-a0ea34ae574b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:34.594513 master-0 kubenswrapper[36504]: I1203 22:24:34.594177 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr724\" (UniqueName: \"kubernetes.io/projected/6914d837-9818-4960-aa61-a0ea34ae574b-kube-api-access-qr724\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:34.698103 master-0 kubenswrapper[36504]: I1203 22:24:34.697989 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-113a-account-create-update-6xzl6" event={"ID":"6914d837-9818-4960-aa61-a0ea34ae574b","Type":"ContainerDied","Data":"d3c1e0141de9ac7b4ed64c6626f431eb89ba0592cac0ffb7996d813ad580328b"} Dec 03 22:24:34.698585 master-0 kubenswrapper[36504]: I1203 22:24:34.698122 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3c1e0141de9ac7b4ed64c6626f431eb89ba0592cac0ffb7996d813ad580328b" Dec 03 22:24:34.698585 master-0 kubenswrapper[36504]: I1203 22:24:34.698053 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-113a-account-create-update-6xzl6" Dec 03 22:24:34.710826 master-0 kubenswrapper[36504]: I1203 22:24:34.710712 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589b458997-99rrp" event={"ID":"5674bf96-4a9a-40d4-95af-d72defcb57d5","Type":"ContainerStarted","Data":"24dc447bfbb56d9e65863654c0be66507c85852e991e4b0ae8c4fc7eaa084827"} Dec 03 22:24:34.711248 master-0 kubenswrapper[36504]: I1203 22:24:34.711220 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:34.713877 master-0 kubenswrapper[36504]: I1203 22:24:34.713732 36504 generic.go:334] "Generic (PLEG): container finished" podID="b0d98163-c65a-412b-a4c4-32ce8bff30a8" containerID="e38349a8e2c4760958e3f3aeb3a37d7dc01f25c0108f92de9d4f787600919d20" exitCode=0 Dec 03 22:24:34.713877 master-0 kubenswrapper[36504]: I1203 22:24:34.713829 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8bwst" event={"ID":"b0d98163-c65a-412b-a4c4-32ce8bff30a8","Type":"ContainerDied","Data":"e38349a8e2c4760958e3f3aeb3a37d7dc01f25c0108f92de9d4f787600919d20"} Dec 03 22:24:34.770330 master-0 kubenswrapper[36504]: I1203 22:24:34.768482 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589b458997-99rrp" podStartSLOduration=4.768441164 podStartE2EDuration="4.768441164s" podCreationTimestamp="2025-12-03 22:24:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:34.755381004 +0000 UTC m=+839.975153031" watchObservedRunningTime="2025-12-03 22:24:34.768441164 +0000 UTC m=+839.988213181" Dec 03 22:24:35.728582 master-0 kubenswrapper[36504]: E1203 22:24:35.728443 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:24:36.758403 master-0 kubenswrapper[36504]: I1203 22:24:36.758314 36504 generic.go:334] "Generic (PLEG): container finished" podID="23e7beb4-5345-49db-bd3e-aa3a97b7009e" containerID="5633420bb74c899b269328e7ff0442d51cc0c1e2816645337f7401fe43f03c0f" exitCode=0 Dec 03 22:24:36.758403 master-0 kubenswrapper[36504]: I1203 22:24:36.758397 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23e7beb4-5345-49db-bd3e-aa3a97b7009e","Type":"ContainerDied","Data":"5633420bb74c899b269328e7ff0442d51cc0c1e2816645337f7401fe43f03c0f"} Dec 03 22:24:37.596629 master-0 kubenswrapper[36504]: I1203 22:24:37.596563 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:37.616937 master-0 kubenswrapper[36504]: I1203 22:24:37.616875 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:37.644272 master-0 kubenswrapper[36504]: I1203 22:24:37.643075 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-66454" Dec 03 22:24:37.681069 master-0 kubenswrapper[36504]: I1203 22:24:37.679895 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:37.715806 master-0 kubenswrapper[36504]: I1203 22:24:37.697813 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:37.735802 master-0 kubenswrapper[36504]: I1203 22:24:37.721900 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:37.765101 master-0 kubenswrapper[36504]: I1203 22:24:37.765030 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjdtr\" (UniqueName: \"kubernetes.io/projected/39c70d18-a17c-4ceb-a58a-6cb354489cc8-kube-api-access-vjdtr\") pod \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\" (UID: \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\") " Dec 03 22:24:37.765710 master-0 kubenswrapper[36504]: I1203 22:24:37.765156 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c70d18-a17c-4ceb-a58a-6cb354489cc8-operator-scripts\") pod \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\" (UID: \"39c70d18-a17c-4ceb-a58a-6cb354489cc8\") " Dec 03 22:24:37.765710 master-0 kubenswrapper[36504]: I1203 22:24:37.765456 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a27383b2-a1e5-4e55-9709-136413c4c96a-operator-scripts\") pod \"a27383b2-a1e5-4e55-9709-136413c4c96a\" (UID: \"a27383b2-a1e5-4e55-9709-136413c4c96a\") " Dec 03 22:24:37.765710 master-0 kubenswrapper[36504]: I1203 22:24:37.765529 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjdnj\" (UniqueName: \"kubernetes.io/projected/dbce4b8e-8c01-42bf-b475-ec56b388fa80-kube-api-access-vjdnj\") pod \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\" (UID: \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\") " Dec 03 22:24:37.765710 master-0 kubenswrapper[36504]: I1203 22:24:37.765600 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmd84\" (UniqueName: \"kubernetes.io/projected/a27383b2-a1e5-4e55-9709-136413c4c96a-kube-api-access-lmd84\") pod \"a27383b2-a1e5-4e55-9709-136413c4c96a\" (UID: \"a27383b2-a1e5-4e55-9709-136413c4c96a\") " Dec 03 22:24:37.765921 master-0 kubenswrapper[36504]: I1203 22:24:37.765735 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbce4b8e-8c01-42bf-b475-ec56b388fa80-operator-scripts\") pod \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\" (UID: \"dbce4b8e-8c01-42bf-b475-ec56b388fa80\") " Dec 03 22:24:37.766022 master-0 kubenswrapper[36504]: I1203 22:24:37.765970 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39c70d18-a17c-4ceb-a58a-6cb354489cc8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39c70d18-a17c-4ceb-a58a-6cb354489cc8" (UID: "39c70d18-a17c-4ceb-a58a-6cb354489cc8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:37.766103 master-0 kubenswrapper[36504]: I1203 22:24:37.766045 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27383b2-a1e5-4e55-9709-136413c4c96a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a27383b2-a1e5-4e55-9709-136413c4c96a" (UID: "a27383b2-a1e5-4e55-9709-136413c4c96a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:37.768484 master-0 kubenswrapper[36504]: I1203 22:24:37.768421 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:37.769920 master-0 kubenswrapper[36504]: I1203 22:24:37.769532 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c70d18-a17c-4ceb-a58a-6cb354489cc8-kube-api-access-vjdtr" (OuterVolumeSpecName: "kube-api-access-vjdtr") pod "39c70d18-a17c-4ceb-a58a-6cb354489cc8" (UID: "39c70d18-a17c-4ceb-a58a-6cb354489cc8"). InnerVolumeSpecName "kube-api-access-vjdtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:37.773587 master-0 kubenswrapper[36504]: I1203 22:24:37.773420 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbce4b8e-8c01-42bf-b475-ec56b388fa80-kube-api-access-vjdnj" (OuterVolumeSpecName: "kube-api-access-vjdnj") pod "dbce4b8e-8c01-42bf-b475-ec56b388fa80" (UID: "dbce4b8e-8c01-42bf-b475-ec56b388fa80"). InnerVolumeSpecName "kube-api-access-vjdnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:37.785004 master-0 kubenswrapper[36504]: I1203 22:24:37.783472 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a27383b2-a1e5-4e55-9709-136413c4c96a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:37.785004 master-0 kubenswrapper[36504]: I1203 22:24:37.783580 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjdnj\" (UniqueName: \"kubernetes.io/projected/dbce4b8e-8c01-42bf-b475-ec56b388fa80-kube-api-access-vjdnj\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:37.785004 master-0 kubenswrapper[36504]: I1203 22:24:37.783609 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjdtr\" (UniqueName: \"kubernetes.io/projected/39c70d18-a17c-4ceb-a58a-6cb354489cc8-kube-api-access-vjdtr\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:37.785004 master-0 kubenswrapper[36504]: I1203 22:24:37.783653 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39c70d18-a17c-4ceb-a58a-6cb354489cc8-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:37.785004 master-0 kubenswrapper[36504]: I1203 22:24:37.784649 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:37.786051 master-0 kubenswrapper[36504]: I1203 22:24:37.785213 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27383b2-a1e5-4e55-9709-136413c4c96a-kube-api-access-lmd84" (OuterVolumeSpecName: "kube-api-access-lmd84") pod "a27383b2-a1e5-4e55-9709-136413c4c96a" (UID: "a27383b2-a1e5-4e55-9709-136413c4c96a"). InnerVolumeSpecName "kube-api-access-lmd84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:37.803201 master-0 kubenswrapper[36504]: I1203 22:24:37.799107 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbce4b8e-8c01-42bf-b475-ec56b388fa80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbce4b8e-8c01-42bf-b475-ec56b388fa80" (UID: "dbce4b8e-8c01-42bf-b475-ec56b388fa80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:37.879948 master-0 kubenswrapper[36504]: I1203 22:24:37.879616 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sglc2" event={"ID":"a27383b2-a1e5-4e55-9709-136413c4c96a","Type":"ContainerDied","Data":"e70a70832014524c1995fe4e1ab74a7d48acac23f2ab6d69235b2806d5becfcb"} Dec 03 22:24:37.879948 master-0 kubenswrapper[36504]: I1203 22:24:37.879964 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e70a70832014524c1995fe4e1ab74a7d48acac23f2ab6d69235b2806d5becfcb" Dec 03 22:24:37.882138 master-0 kubenswrapper[36504]: I1203 22:24:37.881753 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sglc2" Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.892641 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55659879-0543-4282-abb3-d03ca6ecf2ce-operator-scripts\") pod \"55659879-0543-4282-abb3-d03ca6ecf2ce\" (UID: \"55659879-0543-4282-abb3-d03ca6ecf2ce\") " Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.892717 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz658\" (UniqueName: \"kubernetes.io/projected/185d021a-be33-4c90-a8b6-ddd16d53b53d-kube-api-access-wz658\") pod \"185d021a-be33-4c90-a8b6-ddd16d53b53d\" (UID: \"185d021a-be33-4c90-a8b6-ddd16d53b53d\") " Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.892796 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/185d021a-be33-4c90-a8b6-ddd16d53b53d-operator-scripts\") pod \"185d021a-be33-4c90-a8b6-ddd16d53b53d\" (UID: \"185d021a-be33-4c90-a8b6-ddd16d53b53d\") " Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.892865 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdrzn\" (UniqueName: \"kubernetes.io/projected/55659879-0543-4282-abb3-d03ca6ecf2ce-kube-api-access-tdrzn\") pod \"55659879-0543-4282-abb3-d03ca6ecf2ce\" (UID: \"55659879-0543-4282-abb3-d03ca6ecf2ce\") " Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.892898 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7hzb\" (UniqueName: \"kubernetes.io/projected/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-kube-api-access-r7hzb\") pod \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\" (UID: \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\") " Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.892930 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28jlt\" (UniqueName: \"kubernetes.io/projected/b0d98163-c65a-412b-a4c4-32ce8bff30a8-kube-api-access-28jlt\") pod \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " Dec 03 
22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.893038 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-config-data\") pod \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.893118 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-db-sync-config-data\") pod \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.893189 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-operator-scripts\") pod \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\" (UID: \"9e04fc00-ca89-4a25-9bb4-2f59d9f14388\") " Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.893231 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-combined-ca-bundle\") pod \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\" (UID: \"b0d98163-c65a-412b-a4c4-32ce8bff30a8\") " Dec 03 22:24:37.893926 master-0 kubenswrapper[36504]: I1203 22:24:37.893923 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmd84\" (UniqueName: \"kubernetes.io/projected/a27383b2-a1e5-4e55-9709-136413c4c96a-kube-api-access-lmd84\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:37.894291 master-0 kubenswrapper[36504]: I1203 22:24:37.894021 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbce4b8e-8c01-42bf-b475-ec56b388fa80-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:37.894987 master-0 kubenswrapper[36504]: I1203 22:24:37.894906 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55659879-0543-4282-abb3-d03ca6ecf2ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55659879-0543-4282-abb3-d03ca6ecf2ce" (UID: "55659879-0543-4282-abb3-d03ca6ecf2ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:37.897831 master-0 kubenswrapper[36504]: I1203 22:24:37.897784 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/185d021a-be33-4c90-a8b6-ddd16d53b53d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "185d021a-be33-4c90-a8b6-ddd16d53b53d" (UID: "185d021a-be33-4c90-a8b6-ddd16d53b53d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:37.897979 master-0 kubenswrapper[36504]: I1203 22:24:37.897947 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e04fc00-ca89-4a25-9bb4-2f59d9f14388" (UID: "9e04fc00-ca89-4a25-9bb4-2f59d9f14388"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:37.898807 master-0 kubenswrapper[36504]: I1203 22:24:37.898780 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-kube-api-access-r7hzb" (OuterVolumeSpecName: "kube-api-access-r7hzb") pod "9e04fc00-ca89-4a25-9bb4-2f59d9f14388" (UID: "9e04fc00-ca89-4a25-9bb4-2f59d9f14388"). InnerVolumeSpecName "kube-api-access-r7hzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:37.898878 master-0 kubenswrapper[36504]: I1203 22:24:37.898834 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/185d021a-be33-4c90-a8b6-ddd16d53b53d-kube-api-access-wz658" (OuterVolumeSpecName: "kube-api-access-wz658") pod "185d021a-be33-4c90-a8b6-ddd16d53b53d" (UID: "185d021a-be33-4c90-a8b6-ddd16d53b53d"). InnerVolumeSpecName "kube-api-access-wz658". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:37.900587 master-0 kubenswrapper[36504]: I1203 22:24:37.900533 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d98163-c65a-412b-a4c4-32ce8bff30a8-kube-api-access-28jlt" (OuterVolumeSpecName: "kube-api-access-28jlt") pod "b0d98163-c65a-412b-a4c4-32ce8bff30a8" (UID: "b0d98163-c65a-412b-a4c4-32ce8bff30a8"). InnerVolumeSpecName "kube-api-access-28jlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:37.903650 master-0 kubenswrapper[36504]: I1203 22:24:37.903611 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55659879-0543-4282-abb3-d03ca6ecf2ce-kube-api-access-tdrzn" (OuterVolumeSpecName: "kube-api-access-tdrzn") pod "55659879-0543-4282-abb3-d03ca6ecf2ce" (UID: "55659879-0543-4282-abb3-d03ca6ecf2ce"). InnerVolumeSpecName "kube-api-access-tdrzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:37.904364 master-0 kubenswrapper[36504]: I1203 22:24:37.904324 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d991-account-create-update-n4kt9" event={"ID":"39c70d18-a17c-4ceb-a58a-6cb354489cc8","Type":"ContainerDied","Data":"2a2126ed00c98a0db2adef3ab47af870d2226d0edae993415486d9417db60fad"} Dec 03 22:24:37.904488 master-0 kubenswrapper[36504]: I1203 22:24:37.904456 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a2126ed00c98a0db2adef3ab47af870d2226d0edae993415486d9417db60fad" Dec 03 22:24:37.904707 master-0 kubenswrapper[36504]: I1203 22:24:37.904692 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d991-account-create-update-n4kt9" Dec 03 22:24:37.905529 master-0 kubenswrapper[36504]: I1203 22:24:37.905476 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b0d98163-c65a-412b-a4c4-32ce8bff30a8" (UID: "b0d98163-c65a-412b-a4c4-32ce8bff30a8"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:37.907714 master-0 kubenswrapper[36504]: I1203 22:24:37.907670 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8bwst" event={"ID":"b0d98163-c65a-412b-a4c4-32ce8bff30a8","Type":"ContainerDied","Data":"49c33769a87d2e1f6c9fd86cdfb97c45537b1abdc9f301d010952ec8c7b69f05"} Dec 03 22:24:37.907896 master-0 kubenswrapper[36504]: I1203 22:24:37.907723 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49c33769a87d2e1f6c9fd86cdfb97c45537b1abdc9f301d010952ec8c7b69f05" Dec 03 22:24:37.907953 master-0 kubenswrapper[36504]: I1203 22:24:37.907927 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8bwst" Dec 03 22:24:37.916437 master-0 kubenswrapper[36504]: I1203 22:24:37.916363 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8lq7d" event={"ID":"9e04fc00-ca89-4a25-9bb4-2f59d9f14388","Type":"ContainerDied","Data":"a5445dd67e7b4057aee398d2921e091993a4c000eb24d733ebbb65ba1ad6b9b5"} Dec 03 22:24:37.916671 master-0 kubenswrapper[36504]: I1203 22:24:37.916477 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5445dd67e7b4057aee398d2921e091993a4c000eb24d733ebbb65ba1ad6b9b5" Dec 03 22:24:37.916671 master-0 kubenswrapper[36504]: I1203 22:24:37.916556 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8lq7d" Dec 03 22:24:37.918730 master-0 kubenswrapper[36504]: I1203 22:24:37.918702 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0539-account-create-update-h6tq8" event={"ID":"f10040d6-faaa-4181-b651-e26b54ae8a33","Type":"ContainerDied","Data":"735be523a10595e2b5bafbc1db2fccc7b78f80ba2b264e6f9445a30d80f13bad"} Dec 03 22:24:37.918730 master-0 kubenswrapper[36504]: I1203 22:24:37.918730 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="735be523a10595e2b5bafbc1db2fccc7b78f80ba2b264e6f9445a30d80f13bad" Dec 03 22:24:37.919192 master-0 kubenswrapper[36504]: I1203 22:24:37.918788 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0539-account-create-update-h6tq8" Dec 03 22:24:37.924746 master-0 kubenswrapper[36504]: I1203 22:24:37.923655 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-66454" event={"ID":"dbce4b8e-8c01-42bf-b475-ec56b388fa80","Type":"ContainerDied","Data":"5b65255967698ebc147c46a77bb52ab9b321ae84fd0e4b21699894603035b982"} Dec 03 22:24:37.924746 master-0 kubenswrapper[36504]: I1203 22:24:37.923689 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b65255967698ebc147c46a77bb52ab9b321ae84fd0e4b21699894603035b982" Dec 03 22:24:37.924746 master-0 kubenswrapper[36504]: I1203 22:24:37.924033 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-66454" Dec 03 22:24:37.978175 master-0 kubenswrapper[36504]: I1203 22:24:37.978063 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cj96j" event={"ID":"55659879-0543-4282-abb3-d03ca6ecf2ce","Type":"ContainerDied","Data":"42ede8d90b3723c764cdd621c6cfcbcf02ece0efde1f678f31100998a298211d"} Dec 03 22:24:37.978175 master-0 kubenswrapper[36504]: I1203 22:24:37.978124 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42ede8d90b3723c764cdd621c6cfcbcf02ece0efde1f678f31100998a298211d" Dec 03 22:24:37.978540 master-0 kubenswrapper[36504]: I1203 22:24:37.978497 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cj96j" Dec 03 22:24:37.980438 master-0 kubenswrapper[36504]: I1203 22:24:37.980280 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0d98163-c65a-412b-a4c4-32ce8bff30a8" (UID: "b0d98163-c65a-412b-a4c4-32ce8bff30a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:37.987909 master-0 kubenswrapper[36504]: I1203 22:24:37.987838 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a73c-account-create-update-2m5sr" event={"ID":"185d021a-be33-4c90-a8b6-ddd16d53b53d","Type":"ContainerDied","Data":"8b3723bce755638b4c87baae71a97612a13528d1d586efd10e28101c3e2e0a98"} Dec 03 22:24:37.988019 master-0 kubenswrapper[36504]: I1203 22:24:37.987974 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b3723bce755638b4c87baae71a97612a13528d1d586efd10e28101c3e2e0a98" Dec 03 22:24:37.988297 master-0 kubenswrapper[36504]: I1203 22:24:37.988263 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a73c-account-create-update-2m5sr" Dec 03 22:24:38.000296 master-0 kubenswrapper[36504]: I1203 22:24:38.000244 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw466\" (UniqueName: \"kubernetes.io/projected/f10040d6-faaa-4181-b651-e26b54ae8a33-kube-api-access-vw466\") pod \"f10040d6-faaa-4181-b651-e26b54ae8a33\" (UID: \"f10040d6-faaa-4181-b651-e26b54ae8a33\") " Dec 03 22:24:38.000575 master-0 kubenswrapper[36504]: I1203 22:24:38.000550 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10040d6-faaa-4181-b651-e26b54ae8a33-operator-scripts\") pod \"f10040d6-faaa-4181-b651-e26b54ae8a33\" (UID: \"f10040d6-faaa-4181-b651-e26b54ae8a33\") " Dec 03 22:24:38.001787 master-0 kubenswrapper[36504]: I1203 22:24:38.001750 36504 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.001787 master-0 kubenswrapper[36504]: I1203 22:24:38.001786 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.001892 master-0 kubenswrapper[36504]: I1203 22:24:38.001798 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.001892 master-0 kubenswrapper[36504]: I1203 22:24:38.001810 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55659879-0543-4282-abb3-d03ca6ecf2ce-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.001892 master-0 kubenswrapper[36504]: I1203 22:24:38.001826 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz658\" (UniqueName: \"kubernetes.io/projected/185d021a-be33-4c90-a8b6-ddd16d53b53d-kube-api-access-wz658\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.001892 master-0 kubenswrapper[36504]: I1203 22:24:38.001840 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/185d021a-be33-4c90-a8b6-ddd16d53b53d-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.001892 master-0 kubenswrapper[36504]: I1203 22:24:38.001852 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdrzn\" (UniqueName: \"kubernetes.io/projected/55659879-0543-4282-abb3-d03ca6ecf2ce-kube-api-access-tdrzn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.001892 master-0 kubenswrapper[36504]: I1203 22:24:38.001862 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7hzb\" (UniqueName: \"kubernetes.io/projected/9e04fc00-ca89-4a25-9bb4-2f59d9f14388-kube-api-access-r7hzb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.001892 master-0 kubenswrapper[36504]: I1203 22:24:38.001872 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28jlt\" (UniqueName: \"kubernetes.io/projected/b0d98163-c65a-412b-a4c4-32ce8bff30a8-kube-api-access-28jlt\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.002556 master-0 kubenswrapper[36504]: 
I1203 22:24:38.002475 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f10040d6-faaa-4181-b651-e26b54ae8a33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f10040d6-faaa-4181-b651-e26b54ae8a33" (UID: "f10040d6-faaa-4181-b651-e26b54ae8a33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:38.004135 master-0 kubenswrapper[36504]: I1203 22:24:38.004101 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23e7beb4-5345-49db-bd3e-aa3a97b7009e","Type":"ContainerStarted","Data":"788360a57b907889342c662558c231a66dbde2e8d10e120452959e540470929b"} Dec 03 22:24:38.005751 master-0 kubenswrapper[36504]: I1203 22:24:38.005677 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-config-data" (OuterVolumeSpecName: "config-data") pod "b0d98163-c65a-412b-a4c4-32ce8bff30a8" (UID: "b0d98163-c65a-412b-a4c4-32ce8bff30a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:38.029839 master-0 kubenswrapper[36504]: I1203 22:24:38.029750 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10040d6-faaa-4181-b651-e26b54ae8a33-kube-api-access-vw466" (OuterVolumeSpecName: "kube-api-access-vw466") pod "f10040d6-faaa-4181-b651-e26b54ae8a33" (UID: "f10040d6-faaa-4181-b651-e26b54ae8a33"). InnerVolumeSpecName "kube-api-access-vw466". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:38.105290 master-0 kubenswrapper[36504]: I1203 22:24:38.105231 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f10040d6-faaa-4181-b651-e26b54ae8a33-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.105290 master-0 kubenswrapper[36504]: I1203 22:24:38.105278 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw466\" (UniqueName: \"kubernetes.io/projected/f10040d6-faaa-4181-b651-e26b54ae8a33-kube-api-access-vw466\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:38.105290 master-0 kubenswrapper[36504]: I1203 22:24:38.105292 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0d98163-c65a-412b-a4c4-32ce8bff30a8-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:39.139833 master-0 kubenswrapper[36504]: I1203 22:24:39.139147 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8r7tj" event={"ID":"d143967f-1161-4824-b45f-67d0b695536f","Type":"ContainerStarted","Data":"0f9ea37d9377d62f31c3bf6f416c2a3014c34f4af1eb0915aa7afcfc799796bb"} Dec 03 22:24:39.140501 master-0 kubenswrapper[36504]: I1203 22:24:39.140159 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8r7tj" podStartSLOduration=3.908174136 podStartE2EDuration="10.140128043s" podCreationTimestamp="2025-12-03 22:24:29 +0000 UTC" firstStartedPulling="2025-12-03 22:24:31.163553244 +0000 UTC m=+836.383325251" lastFinishedPulling="2025-12-03 22:24:37.395507151 +0000 UTC m=+842.615279158" observedRunningTime="2025-12-03 22:24:39.118302148 +0000 UTC m=+844.338074175" watchObservedRunningTime="2025-12-03 22:24:39.140128043 +0000 UTC m=+844.359900050" Dec 03 22:24:39.412799 master-0 kubenswrapper[36504]: I1203 22:24:39.411142 
36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589b458997-99rrp"] Dec 03 22:24:39.412799 master-0 kubenswrapper[36504]: I1203 22:24:39.411828 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589b458997-99rrp" podUID="5674bf96-4a9a-40d4-95af-d72defcb57d5" containerName="dnsmasq-dns" containerID="cri-o://24dc447bfbb56d9e65863654c0be66507c85852e991e4b0ae8c4fc7eaa084827" gracePeriod=10 Dec 03 22:24:39.416093 master-0 kubenswrapper[36504]: I1203 22:24:39.416032 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:39.449701 master-0 kubenswrapper[36504]: I1203 22:24:39.449562 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-598448668f-fn8jr"] Dec 03 22:24:39.456310 master-0 kubenswrapper[36504]: E1203 22:24:39.456241 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbce4b8e-8c01-42bf-b475-ec56b388fa80" containerName="mariadb-database-create" Dec 03 22:24:39.456310 master-0 kubenswrapper[36504]: I1203 22:24:39.456306 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbce4b8e-8c01-42bf-b475-ec56b388fa80" containerName="mariadb-database-create" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: E1203 22:24:39.456357 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e04fc00-ca89-4a25-9bb4-2f59d9f14388" containerName="mariadb-database-create" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: I1203 22:24:39.456373 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e04fc00-ca89-4a25-9bb4-2f59d9f14388" containerName="mariadb-database-create" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: E1203 22:24:39.456397 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6914d837-9818-4960-aa61-a0ea34ae574b" containerName="mariadb-account-create-update" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: I1203 22:24:39.456406 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6914d837-9818-4960-aa61-a0ea34ae574b" containerName="mariadb-account-create-update" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: E1203 22:24:39.456418 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d98163-c65a-412b-a4c4-32ce8bff30a8" containerName="glance-db-sync" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: I1203 22:24:39.456425 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d98163-c65a-412b-a4c4-32ce8bff30a8" containerName="glance-db-sync" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: E1203 22:24:39.456445 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c70d18-a17c-4ceb-a58a-6cb354489cc8" containerName="mariadb-account-create-update" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: I1203 22:24:39.456452 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c70d18-a17c-4ceb-a58a-6cb354489cc8" containerName="mariadb-account-create-update" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: E1203 22:24:39.456469 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10040d6-faaa-4181-b651-e26b54ae8a33" containerName="mariadb-account-create-update" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: I1203 22:24:39.456477 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10040d6-faaa-4181-b651-e26b54ae8a33" containerName="mariadb-account-create-update" Dec 03 22:24:39.456535 master-0 
kubenswrapper[36504]: E1203 22:24:39.456523 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27383b2-a1e5-4e55-9709-136413c4c96a" containerName="mariadb-database-create" Dec 03 22:24:39.456535 master-0 kubenswrapper[36504]: I1203 22:24:39.456532 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27383b2-a1e5-4e55-9709-136413c4c96a" containerName="mariadb-database-create" Dec 03 22:24:39.457052 master-0 kubenswrapper[36504]: E1203 22:24:39.456553 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" containerName="init" Dec 03 22:24:39.457052 master-0 kubenswrapper[36504]: I1203 22:24:39.456563 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" containerName="init" Dec 03 22:24:39.457052 master-0 kubenswrapper[36504]: E1203 22:24:39.456582 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="185d021a-be33-4c90-a8b6-ddd16d53b53d" containerName="mariadb-account-create-update" Dec 03 22:24:39.457052 master-0 kubenswrapper[36504]: I1203 22:24:39.456589 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="185d021a-be33-4c90-a8b6-ddd16d53b53d" containerName="mariadb-account-create-update" Dec 03 22:24:39.457052 master-0 kubenswrapper[36504]: E1203 22:24:39.456601 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55659879-0543-4282-abb3-d03ca6ecf2ce" containerName="mariadb-database-create" Dec 03 22:24:39.457052 master-0 kubenswrapper[36504]: I1203 22:24:39.456608 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="55659879-0543-4282-abb3-d03ca6ecf2ce" containerName="mariadb-database-create" Dec 03 22:24:39.457052 master-0 kubenswrapper[36504]: E1203 22:24:39.456618 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" containerName="dnsmasq-dns" Dec 03 22:24:39.457052 master-0 kubenswrapper[36504]: I1203 22:24:39.456626 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" containerName="dnsmasq-dns" Dec 03 22:24:39.457391 master-0 kubenswrapper[36504]: I1203 22:24:39.457369 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27383b2-a1e5-4e55-9709-136413c4c96a" containerName="mariadb-database-create" Dec 03 22:24:39.457483 master-0 kubenswrapper[36504]: I1203 22:24:39.457448 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6914d837-9818-4960-aa61-a0ea34ae574b" containerName="mariadb-account-create-update" Dec 03 22:24:39.457537 master-0 kubenswrapper[36504]: I1203 22:24:39.457477 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="c62890e3-7d2e-4547-9a7b-8ea03ce1ce68" containerName="dnsmasq-dns" Dec 03 22:24:39.457588 master-0 kubenswrapper[36504]: I1203 22:24:39.457539 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d98163-c65a-412b-a4c4-32ce8bff30a8" containerName="glance-db-sync" Dec 03 22:24:39.457638 master-0 kubenswrapper[36504]: I1203 22:24:39.457619 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e04fc00-ca89-4a25-9bb4-2f59d9f14388" containerName="mariadb-database-create" Dec 03 22:24:39.457638 master-0 kubenswrapper[36504]: I1203 22:24:39.457636 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c70d18-a17c-4ceb-a58a-6cb354489cc8" containerName="mariadb-account-create-update" Dec 03 22:24:39.457731 master-0 kubenswrapper[36504]: I1203 22:24:39.457645 36504 
memory_manager.go:354] "RemoveStaleState removing state" podUID="dbce4b8e-8c01-42bf-b475-ec56b388fa80" containerName="mariadb-database-create" Dec 03 22:24:39.457731 master-0 kubenswrapper[36504]: I1203 22:24:39.457688 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="185d021a-be33-4c90-a8b6-ddd16d53b53d" containerName="mariadb-account-create-update" Dec 03 22:24:39.457731 master-0 kubenswrapper[36504]: I1203 22:24:39.457702 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="55659879-0543-4282-abb3-d03ca6ecf2ce" containerName="mariadb-database-create" Dec 03 22:24:39.457731 master-0 kubenswrapper[36504]: I1203 22:24:39.457712 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10040d6-faaa-4181-b651-e26b54ae8a33" containerName="mariadb-account-create-update" Dec 03 22:24:39.460030 master-0 kubenswrapper[36504]: I1203 22:24:39.459985 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.491804 master-0 kubenswrapper[36504]: I1203 22:24:39.481290 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-598448668f-fn8jr"] Dec 03 22:24:39.526790 master-0 kubenswrapper[36504]: I1203 22:24:39.512917 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-config\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.526790 master-0 kubenswrapper[36504]: I1203 22:24:39.513037 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-svc\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.526790 master-0 kubenswrapper[36504]: I1203 22:24:39.513089 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-swift-storage-0\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.526790 master-0 kubenswrapper[36504]: I1203 22:24:39.513261 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nmpq\" (UniqueName: \"kubernetes.io/projected/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-kube-api-access-8nmpq\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.526790 master-0 kubenswrapper[36504]: I1203 22:24:39.513336 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-nb\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.526790 master-0 kubenswrapper[36504]: I1203 22:24:39.513364 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-sb\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.526790 master-0 kubenswrapper[36504]: I1203 22:24:39.513478 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-edpm\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.616897 master-0 kubenswrapper[36504]: I1203 22:24:39.616756 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-config\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.617173 master-0 kubenswrapper[36504]: I1203 22:24:39.616910 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-svc\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.617173 master-0 kubenswrapper[36504]: I1203 22:24:39.616944 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-swift-storage-0\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.617173 master-0 kubenswrapper[36504]: I1203 22:24:39.617035 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nmpq\" (UniqueName: \"kubernetes.io/projected/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-kube-api-access-8nmpq\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.617173 master-0 kubenswrapper[36504]: I1203 22:24:39.617074 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-nb\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.617173 master-0 kubenswrapper[36504]: I1203 22:24:39.617101 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-sb\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.617173 master-0 kubenswrapper[36504]: I1203 22:24:39.617155 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-edpm\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.618281 master-0 kubenswrapper[36504]: I1203 22:24:39.618257 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-edpm\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.618924 master-0 kubenswrapper[36504]: I1203 22:24:39.618894 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-config\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.619497 master-0 kubenswrapper[36504]: I1203 22:24:39.619470 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-svc\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.621674 master-0 kubenswrapper[36504]: I1203 22:24:39.620165 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-swift-storage-0\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.621839 master-0 kubenswrapper[36504]: I1203 22:24:39.621807 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-nb\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.622521 master-0 kubenswrapper[36504]: I1203 22:24:39.622485 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-sb\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.653332 master-0 kubenswrapper[36504]: I1203 22:24:39.653244 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nmpq\" (UniqueName: \"kubernetes.io/projected/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-kube-api-access-8nmpq\") pod \"dnsmasq-dns-598448668f-fn8jr\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:39.840962 master-0 kubenswrapper[36504]: I1203 22:24:39.840808 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:40.127799 master-0 kubenswrapper[36504]: I1203 22:24:40.127710 36504 generic.go:334] "Generic (PLEG): container finished" podID="5674bf96-4a9a-40d4-95af-d72defcb57d5" containerID="24dc447bfbb56d9e65863654c0be66507c85852e991e4b0ae8c4fc7eaa084827" exitCode=0 Dec 03 22:24:40.130034 master-0 kubenswrapper[36504]: I1203 22:24:40.129962 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589b458997-99rrp" event={"ID":"5674bf96-4a9a-40d4-95af-d72defcb57d5","Type":"ContainerDied","Data":"24dc447bfbb56d9e65863654c0be66507c85852e991e4b0ae8c4fc7eaa084827"} Dec 03 22:24:40.774959 master-0 kubenswrapper[36504]: I1203 22:24:40.774814 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-598448668f-fn8jr"] Dec 03 22:24:40.783796 master-0 kubenswrapper[36504]: I1203 22:24:40.779166 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:40.929960 master-0 kubenswrapper[36504]: I1203 22:24:40.929857 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-edpm\") pod \"5674bf96-4a9a-40d4-95af-d72defcb57d5\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " Dec 03 22:24:40.930358 master-0 kubenswrapper[36504]: I1203 22:24:40.930104 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-config\") pod \"5674bf96-4a9a-40d4-95af-d72defcb57d5\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " Dec 03 22:24:40.930358 master-0 kubenswrapper[36504]: I1203 22:24:40.930172 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-swift-storage-0\") pod \"5674bf96-4a9a-40d4-95af-d72defcb57d5\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " Dec 03 22:24:40.930358 master-0 kubenswrapper[36504]: I1203 22:24:40.930243 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-svc\") pod \"5674bf96-4a9a-40d4-95af-d72defcb57d5\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " Dec 03 22:24:40.930358 master-0 kubenswrapper[36504]: I1203 22:24:40.930324 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-sb\") pod \"5674bf96-4a9a-40d4-95af-d72defcb57d5\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " Dec 03 22:24:40.930497 master-0 kubenswrapper[36504]: I1203 22:24:40.930402 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-nb\") pod \"5674bf96-4a9a-40d4-95af-d72defcb57d5\" (UID: \"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " Dec 03 22:24:40.930497 master-0 kubenswrapper[36504]: I1203 22:24:40.930463 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcnnj\" (UniqueName: \"kubernetes.io/projected/5674bf96-4a9a-40d4-95af-d72defcb57d5-kube-api-access-kcnnj\") pod \"5674bf96-4a9a-40d4-95af-d72defcb57d5\" (UID: 
\"5674bf96-4a9a-40d4-95af-d72defcb57d5\") " Dec 03 22:24:41.171803 master-0 kubenswrapper[36504]: I1203 22:24:41.170454 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5674bf96-4a9a-40d4-95af-d72defcb57d5-kube-api-access-kcnnj" (OuterVolumeSpecName: "kube-api-access-kcnnj") pod "5674bf96-4a9a-40d4-95af-d72defcb57d5" (UID: "5674bf96-4a9a-40d4-95af-d72defcb57d5"). InnerVolumeSpecName "kube-api-access-kcnnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:41.201791 master-0 kubenswrapper[36504]: I1203 22:24:41.197322 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589b458997-99rrp" Dec 03 22:24:41.251794 master-0 kubenswrapper[36504]: I1203 22:24:41.250243 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcnnj\" (UniqueName: \"kubernetes.io/projected/5674bf96-4a9a-40d4-95af-d72defcb57d5-kube-api-access-kcnnj\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:41.349361 master-0 kubenswrapper[36504]: I1203 22:24:41.348906 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-598448668f-fn8jr" event={"ID":"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f","Type":"ContainerStarted","Data":"897cdac6d78a80b16b45d8f014d0819c700cb250d9704b92e350cf9b09d4d24f"} Dec 03 22:24:41.349361 master-0 kubenswrapper[36504]: I1203 22:24:41.348986 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589b458997-99rrp" event={"ID":"5674bf96-4a9a-40d4-95af-d72defcb57d5","Type":"ContainerDied","Data":"4db15af2d54ede8b506496f53683b57c50d14c7a62503e7de16e3c519979ee0e"} Dec 03 22:24:41.349361 master-0 kubenswrapper[36504]: I1203 22:24:41.349048 36504 scope.go:117] "RemoveContainer" containerID="24dc447bfbb56d9e65863654c0be66507c85852e991e4b0ae8c4fc7eaa084827" Dec 03 22:24:41.359161 master-0 kubenswrapper[36504]: I1203 22:24:41.359014 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5674bf96-4a9a-40d4-95af-d72defcb57d5" (UID: "5674bf96-4a9a-40d4-95af-d72defcb57d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:41.368538 master-0 kubenswrapper[36504]: I1203 22:24:41.368445 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-edpm" (OuterVolumeSpecName: "edpm") pod "5674bf96-4a9a-40d4-95af-d72defcb57d5" (UID: "5674bf96-4a9a-40d4-95af-d72defcb57d5"). InnerVolumeSpecName "edpm". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:41.383093 master-0 kubenswrapper[36504]: I1203 22:24:41.378863 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-config" (OuterVolumeSpecName: "config") pod "5674bf96-4a9a-40d4-95af-d72defcb57d5" (UID: "5674bf96-4a9a-40d4-95af-d72defcb57d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:41.383673 master-0 kubenswrapper[36504]: I1203 22:24:41.383538 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5674bf96-4a9a-40d4-95af-d72defcb57d5" (UID: "5674bf96-4a9a-40d4-95af-d72defcb57d5"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:41.386830 master-0 kubenswrapper[36504]: I1203 22:24:41.386731 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5674bf96-4a9a-40d4-95af-d72defcb57d5" (UID: "5674bf96-4a9a-40d4-95af-d72defcb57d5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:41.395817 master-0 kubenswrapper[36504]: I1203 22:24:41.395680 36504 scope.go:117] "RemoveContainer" containerID="ac2dab53b10f3857fc282ddda373afaaaced91b15fa6a4e9c877b1110acac1ea" Dec 03 22:24:41.398134 master-0 kubenswrapper[36504]: I1203 22:24:41.397977 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5674bf96-4a9a-40d4-95af-d72defcb57d5" (UID: "5674bf96-4a9a-40d4-95af-d72defcb57d5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:41.456715 master-0 kubenswrapper[36504]: I1203 22:24:41.455114 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:41.456715 master-0 kubenswrapper[36504]: I1203 22:24:41.455157 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:41.456715 master-0 kubenswrapper[36504]: I1203 22:24:41.455170 36504 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:41.456715 master-0 kubenswrapper[36504]: I1203 22:24:41.455182 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:41.456715 master-0 kubenswrapper[36504]: I1203 22:24:41.455192 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:41.456715 master-0 kubenswrapper[36504]: I1203 22:24:41.455202 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5674bf96-4a9a-40d4-95af-d72defcb57d5-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:41.546872 master-0 kubenswrapper[36504]: I1203 22:24:41.546788 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589b458997-99rrp"] Dec 03 22:24:41.560755 master-0 kubenswrapper[36504]: I1203 22:24:41.560677 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589b458997-99rrp"] Dec 03 22:24:42.247922 master-0 kubenswrapper[36504]: I1203 22:24:42.247818 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"23e7beb4-5345-49db-bd3e-aa3a97b7009e","Type":"ContainerStarted","Data":"6016a5e07dde449cf54835ae46ec6dc6d13924b0f8462caa6b9fdb57ea759fc1"} Dec 03 22:24:42.270632 master-0 kubenswrapper[36504]: I1203 22:24:42.269703 36504 generic.go:334] "Generic (PLEG): container finished" podID="e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" containerID="92b60014c7f58066fbc6795c56f40c2d4b3caa0e38bdbb5e30a4bb343bf964f0" exitCode=0 Dec 03 22:24:42.270632 master-0 kubenswrapper[36504]: I1203 22:24:42.269893 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-598448668f-fn8jr" event={"ID":"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f","Type":"ContainerDied","Data":"92b60014c7f58066fbc6795c56f40c2d4b3caa0e38bdbb5e30a4bb343bf964f0"} Dec 03 22:24:43.124822 master-0 kubenswrapper[36504]: I1203 22:24:43.119548 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5674bf96-4a9a-40d4-95af-d72defcb57d5" path="/var/lib/kubelet/pods/5674bf96-4a9a-40d4-95af-d72defcb57d5/volumes" Dec 03 22:24:43.297073 master-0 kubenswrapper[36504]: I1203 22:24:43.296975 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-598448668f-fn8jr" event={"ID":"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f","Type":"ContainerStarted","Data":"c353bef980ade599afb552391d03fc758a0b06aa9895dd267cecb9fa61dd845a"} Dec 03 22:24:43.297073 master-0 kubenswrapper[36504]: I1203 22:24:43.297066 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:43.302755 master-0 kubenswrapper[36504]: I1203 22:24:43.302644 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"23e7beb4-5345-49db-bd3e-aa3a97b7009e","Type":"ContainerStarted","Data":"abdee1174b3e9268c42669bfe09180f06dab2119850e48a509e27353975bb834"} Dec 03 22:24:43.333660 master-0 kubenswrapper[36504]: I1203 22:24:43.333543 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-598448668f-fn8jr" podStartSLOduration=4.333517738 podStartE2EDuration="4.333517738s" podCreationTimestamp="2025-12-03 22:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:43.322706848 +0000 UTC m=+848.542478855" watchObservedRunningTime="2025-12-03 22:24:43.333517738 +0000 UTC m=+848.553289745" Dec 03 22:24:43.378960 master-0 kubenswrapper[36504]: I1203 22:24:43.378673 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.378644127 podStartE2EDuration="23.378644127s" podCreationTimestamp="2025-12-03 22:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:43.356602344 +0000 UTC m=+848.576374381" watchObservedRunningTime="2025-12-03 22:24:43.378644127 +0000 UTC m=+848.598416124" Dec 03 22:24:44.316876 master-0 kubenswrapper[36504]: I1203 22:24:44.316794 36504 generic.go:334] "Generic (PLEG): container finished" podID="d143967f-1161-4824-b45f-67d0b695536f" containerID="0f9ea37d9377d62f31c3bf6f416c2a3014c34f4af1eb0915aa7afcfc799796bb" exitCode=0 Dec 03 22:24:44.318412 master-0 kubenswrapper[36504]: I1203 22:24:44.318366 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8r7tj" 
event={"ID":"d143967f-1161-4824-b45f-67d0b695536f","Type":"ContainerDied","Data":"0f9ea37d9377d62f31c3bf6f416c2a3014c34f4af1eb0915aa7afcfc799796bb"} Dec 03 22:24:45.856099 master-0 kubenswrapper[36504]: I1203 22:24:45.856024 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:46.023084 master-0 kubenswrapper[36504]: I1203 22:24:46.022807 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87fjk\" (UniqueName: \"kubernetes.io/projected/d143967f-1161-4824-b45f-67d0b695536f-kube-api-access-87fjk\") pod \"d143967f-1161-4824-b45f-67d0b695536f\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " Dec 03 22:24:46.023084 master-0 kubenswrapper[36504]: I1203 22:24:46.022961 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-config-data\") pod \"d143967f-1161-4824-b45f-67d0b695536f\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " Dec 03 22:24:46.023569 master-0 kubenswrapper[36504]: I1203 22:24:46.023283 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-combined-ca-bundle\") pod \"d143967f-1161-4824-b45f-67d0b695536f\" (UID: \"d143967f-1161-4824-b45f-67d0b695536f\") " Dec 03 22:24:46.027584 master-0 kubenswrapper[36504]: I1203 22:24:46.027433 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d143967f-1161-4824-b45f-67d0b695536f-kube-api-access-87fjk" (OuterVolumeSpecName: "kube-api-access-87fjk") pod "d143967f-1161-4824-b45f-67d0b695536f" (UID: "d143967f-1161-4824-b45f-67d0b695536f"). InnerVolumeSpecName "kube-api-access-87fjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:46.057405 master-0 kubenswrapper[36504]: I1203 22:24:46.057315 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d143967f-1161-4824-b45f-67d0b695536f" (UID: "d143967f-1161-4824-b45f-67d0b695536f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:46.085820 master-0 kubenswrapper[36504]: I1203 22:24:46.085364 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-config-data" (OuterVolumeSpecName: "config-data") pod "d143967f-1161-4824-b45f-67d0b695536f" (UID: "d143967f-1161-4824-b45f-67d0b695536f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:46.096226 master-0 kubenswrapper[36504]: I1203 22:24:46.096141 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:24:46.127717 master-0 kubenswrapper[36504]: I1203 22:24:46.127639 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:46.127717 master-0 kubenswrapper[36504]: I1203 22:24:46.127683 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87fjk\" (UniqueName: \"kubernetes.io/projected/d143967f-1161-4824-b45f-67d0b695536f-kube-api-access-87fjk\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:46.127717 master-0 kubenswrapper[36504]: I1203 22:24:46.127698 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d143967f-1161-4824-b45f-67d0b695536f-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:46.356641 master-0 kubenswrapper[36504]: I1203 22:24:46.356464 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8r7tj" event={"ID":"d143967f-1161-4824-b45f-67d0b695536f","Type":"ContainerDied","Data":"f4aa05765a72f7dabaeefac5ea6809c3d8c5b2c67ed0c07720234277e8163b40"} Dec 03 22:24:46.356641 master-0 kubenswrapper[36504]: I1203 22:24:46.356538 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4aa05765a72f7dabaeefac5ea6809c3d8c5b2c67ed0c07720234277e8163b40" Dec 03 22:24:46.356641 master-0 kubenswrapper[36504]: I1203 22:24:46.356562 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8r7tj" Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: I1203 22:24:46.800866 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-57lcw"] Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: E1203 22:24:46.801594 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5674bf96-4a9a-40d4-95af-d72defcb57d5" containerName="dnsmasq-dns" Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: I1203 22:24:46.801610 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="5674bf96-4a9a-40d4-95af-d72defcb57d5" containerName="dnsmasq-dns" Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: E1203 22:24:46.801628 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5674bf96-4a9a-40d4-95af-d72defcb57d5" containerName="init" Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: I1203 22:24:46.801635 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="5674bf96-4a9a-40d4-95af-d72defcb57d5" containerName="init" Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: E1203 22:24:46.801652 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d143967f-1161-4824-b45f-67d0b695536f" containerName="keystone-db-sync" Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: I1203 22:24:46.801660 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d143967f-1161-4824-b45f-67d0b695536f" containerName="keystone-db-sync" Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: I1203 22:24:46.802231 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="5674bf96-4a9a-40d4-95af-d72defcb57d5" containerName="dnsmasq-dns" Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: I1203 22:24:46.802257 36504 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d143967f-1161-4824-b45f-67d0b695536f" containerName="keystone-db-sync" Dec 03 22:24:46.803377 master-0 kubenswrapper[36504]: I1203 22:24:46.803246 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:46.814679 master-0 kubenswrapper[36504]: I1203 22:24:46.812619 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 22:24:46.814679 master-0 kubenswrapper[36504]: I1203 22:24:46.812998 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 22:24:46.814679 master-0 kubenswrapper[36504]: I1203 22:24:46.813287 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 22:24:46.814679 master-0 kubenswrapper[36504]: I1203 22:24:46.813594 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 22:24:46.865663 master-0 kubenswrapper[36504]: I1203 22:24:46.865415 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-598448668f-fn8jr"] Dec 03 22:24:46.866428 master-0 kubenswrapper[36504]: I1203 22:24:46.865758 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-598448668f-fn8jr" podUID="e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" containerName="dnsmasq-dns" containerID="cri-o://c353bef980ade599afb552391d03fc758a0b06aa9895dd267cecb9fa61dd845a" gracePeriod=10 Dec 03 22:24:46.901148 master-0 kubenswrapper[36504]: I1203 22:24:46.900204 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-config-data\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:46.901148 master-0 kubenswrapper[36504]: I1203 22:24:46.900281 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-credential-keys\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:46.901148 master-0 kubenswrapper[36504]: I1203 22:24:46.900304 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-combined-ca-bundle\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:46.901148 master-0 kubenswrapper[36504]: I1203 22:24:46.900334 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-fernet-keys\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:46.901148 master-0 kubenswrapper[36504]: I1203 22:24:46.900458 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-scripts\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" 
Dec 03 22:24:46.901148 master-0 kubenswrapper[36504]: I1203 22:24:46.900600 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68m67\" (UniqueName: \"kubernetes.io/projected/4616bb3b-141e-4185-a336-920a4e986750-kube-api-access-68m67\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:46.915373 master-0 kubenswrapper[36504]: I1203 22:24:46.915259 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-57lcw"] Dec 03 22:24:47.026123 master-0 kubenswrapper[36504]: I1203 22:24:47.026028 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:47.035190 master-0 kubenswrapper[36504]: I1203 22:24:47.035055 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-scripts\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.035631 master-0 kubenswrapper[36504]: I1203 22:24:47.035579 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68m67\" (UniqueName: \"kubernetes.io/projected/4616bb3b-141e-4185-a336-920a4e986750-kube-api-access-68m67\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.035875 master-0 kubenswrapper[36504]: I1203 22:24:47.035665 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-config-data\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.035875 master-0 kubenswrapper[36504]: I1203 22:24:47.035709 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-credential-keys\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.035875 master-0 kubenswrapper[36504]: I1203 22:24:47.035724 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-combined-ca-bundle\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.035875 master-0 kubenswrapper[36504]: I1203 22:24:47.035750 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-fernet-keys\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.042301 master-0 kubenswrapper[36504]: I1203 22:24:47.042243 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-config-data\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.042865 master-0 
kubenswrapper[36504]: I1203 22:24:47.042835 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-fernet-keys\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.043333 master-0 kubenswrapper[36504]: I1203 22:24:47.043297 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-scripts\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.090841 master-0 kubenswrapper[36504]: I1203 22:24:47.056232 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-credential-keys\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.090841 master-0 kubenswrapper[36504]: I1203 22:24:47.060923 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-combined-ca-bundle\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.176110 master-0 kubenswrapper[36504]: I1203 22:24:47.175935 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-hghcd"] Dec 03 22:24:47.176722 master-0 kubenswrapper[36504]: I1203 22:24:47.176678 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68m67\" (UniqueName: \"kubernetes.io/projected/4616bb3b-141e-4185-a336-920a4e986750-kube-api-access-68m67\") pod \"keystone-bootstrap-57lcw\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.194259 master-0 kubenswrapper[36504]: I1203 22:24:47.193485 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.209437 master-0 kubenswrapper[36504]: I1203 22:24:47.201422 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 22:24:47.267113 master-0 kubenswrapper[36504]: I1203 22:24:47.266994 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b69454d77-lfkfc"] Dec 03 22:24:47.353955 master-0 kubenswrapper[36504]: I1203 22:24:47.314140 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-hghcd"] Dec 03 22:24:47.353955 master-0 kubenswrapper[36504]: I1203 22:24:47.350914 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b69454d77-lfkfc"] Dec 03 22:24:47.353955 master-0 kubenswrapper[36504]: I1203 22:24:47.315226 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.353955 master-0 kubenswrapper[36504]: I1203 22:24:47.353016 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-combined-ca-bundle\") pod \"heat-db-sync-hghcd\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.353955 master-0 kubenswrapper[36504]: I1203 22:24:47.353114 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-config-data\") pod \"heat-db-sync-hghcd\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.353955 master-0 kubenswrapper[36504]: I1203 22:24:47.353252 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jqwl\" (UniqueName: \"kubernetes.io/projected/79800ee3-07d8-43e7-9263-15e8cdeef26d-kube-api-access-7jqwl\") pod \"heat-db-sync-hghcd\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.359192 master-0 kubenswrapper[36504]: I1203 22:24:47.359127 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:47.409544 master-0 kubenswrapper[36504]: I1203 22:24:47.409489 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-n4ht5"] Dec 03 22:24:47.411678 master-0 kubenswrapper[36504]: I1203 22:24:47.411645 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:47.421591 master-0 kubenswrapper[36504]: I1203 22:24:47.421540 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 22:24:47.427097 master-0 kubenswrapper[36504]: I1203 22:24:47.426964 36504 generic.go:334] "Generic (PLEG): container finished" podID="e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" containerID="c353bef980ade599afb552391d03fc758a0b06aa9895dd267cecb9fa61dd845a" exitCode=0 Dec 03 22:24:47.427097 master-0 kubenswrapper[36504]: I1203 22:24:47.427035 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-598448668f-fn8jr" event={"ID":"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f","Type":"ContainerDied","Data":"c353bef980ade599afb552391d03fc758a0b06aa9895dd267cecb9fa61dd845a"} Dec 03 22:24:47.428653 master-0 kubenswrapper[36504]: I1203 22:24:47.428619 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456223 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-config\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456344 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jqwl\" (UniqueName: \"kubernetes.io/projected/79800ee3-07d8-43e7-9263-15e8cdeef26d-kube-api-access-7jqwl\") pod \"heat-db-sync-hghcd\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " 
pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456407 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456439 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456465 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456492 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-svc\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456508 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56h8j\" (UniqueName: \"kubernetes.io/projected/ce7e6878-72c7-411c-b399-2c890b83bb8f-kube-api-access-56h8j\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456565 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-combined-ca-bundle\") pod \"heat-db-sync-hghcd\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456623 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-config-data\") pod \"heat-db-sync-hghcd\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.460840 master-0 kubenswrapper[36504]: I1203 22:24:47.456661 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-edpm\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.474797 master-0 kubenswrapper[36504]: I1203 22:24:47.470355 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-config-data\") pod \"heat-db-sync-hghcd\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.478797 master-0 kubenswrapper[36504]: I1203 22:24:47.475347 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-combined-ca-bundle\") pod \"heat-db-sync-hghcd\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.528798 master-0 kubenswrapper[36504]: I1203 22:24:47.526036 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jqwl\" (UniqueName: \"kubernetes.io/projected/79800ee3-07d8-43e7-9263-15e8cdeef26d-kube-api-access-7jqwl\") pod \"heat-db-sync-hghcd\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.566801 master-0 kubenswrapper[36504]: I1203 22:24:47.564226 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-edpm\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.566801 master-0 kubenswrapper[36504]: I1203 22:24:47.564629 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-config\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.566801 master-0 kubenswrapper[36504]: I1203 22:24:47.564950 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.566801 master-0 kubenswrapper[36504]: I1203 22:24:47.565029 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.566801 master-0 kubenswrapper[36504]: I1203 22:24:47.565074 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.566801 master-0 kubenswrapper[36504]: I1203 22:24:47.565134 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-svc\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.566801 master-0 kubenswrapper[36504]: I1203 22:24:47.565156 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56h8j\" (UniqueName: 
\"kubernetes.io/projected/ce7e6878-72c7-411c-b399-2c890b83bb8f-kube-api-access-56h8j\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.583805 master-0 kubenswrapper[36504]: I1203 22:24:47.568446 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-edpm\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.583805 master-0 kubenswrapper[36504]: I1203 22:24:47.569223 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-config\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.583805 master-0 kubenswrapper[36504]: I1203 22:24:47.570299 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-swift-storage-0\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.583805 master-0 kubenswrapper[36504]: I1203 22:24:47.570913 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-baebb-db-sync-wq9c6"] Dec 03 22:24:47.583805 master-0 kubenswrapper[36504]: I1203 22:24:47.571011 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-sb\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.583805 master-0 kubenswrapper[36504]: I1203 22:24:47.572477 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-svc\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.583805 master-0 kubenswrapper[36504]: I1203 22:24:47.573217 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-nb\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.583805 master-0 kubenswrapper[36504]: I1203 22:24:47.574875 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.609802 master-0 kubenswrapper[36504]: I1203 22:24:47.607287 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-hghcd" Dec 03 22:24:47.609802 master-0 kubenswrapper[36504]: I1203 22:24:47.607486 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-config-data" Dec 03 22:24:47.609802 master-0 kubenswrapper[36504]: I1203 22:24:47.607757 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-scripts" Dec 03 22:24:47.619632 master-0 kubenswrapper[36504]: I1203 22:24:47.618872 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56h8j\" (UniqueName: \"kubernetes.io/projected/ce7e6878-72c7-411c-b399-2c890b83bb8f-kube-api-access-56h8j\") pod \"dnsmasq-dns-6b69454d77-lfkfc\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.690141 master-0 kubenswrapper[36504]: I1203 22:24:47.679451 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-config-data\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.690141 master-0 kubenswrapper[36504]: I1203 22:24:47.679539 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-db-sync-config-data\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.690141 master-0 kubenswrapper[36504]: I1203 22:24:47.679654 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftpf7\" (UniqueName: \"kubernetes.io/projected/99d0cdcb-2d58-4530-a135-0e89f8c77065-kube-api-access-ftpf7\") pod \"neutron-db-sync-n4ht5\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:47.690141 master-0 kubenswrapper[36504]: I1203 22:24:47.679713 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-config\") pod \"neutron-db-sync-n4ht5\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:47.690141 master-0 kubenswrapper[36504]: I1203 22:24:47.679927 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-scripts\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.690141 master-0 kubenswrapper[36504]: I1203 22:24:47.680043 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4f03542-de5f-457c-8843-01ea5d2febce-etc-machine-id\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.771808 master-0 kubenswrapper[36504]: I1203 22:24:47.758430 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9622j\" (UniqueName: 
\"kubernetes.io/projected/c4f03542-de5f-457c-8843-01ea5d2febce-kube-api-access-9622j\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.771808 master-0 kubenswrapper[36504]: I1203 22:24:47.758523 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-combined-ca-bundle\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.771808 master-0 kubenswrapper[36504]: I1203 22:24:47.758630 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-combined-ca-bundle\") pod \"neutron-db-sync-n4ht5\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:47.869635 master-0 kubenswrapper[36504]: I1203 22:24:47.855572 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:47.870713 master-0 kubenswrapper[36504]: I1203 22:24:47.869779 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-n4ht5"] Dec 03 22:24:47.886019 master-0 kubenswrapper[36504]: I1203 22:24:47.885052 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-combined-ca-bundle\") pod \"neutron-db-sync-n4ht5\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:47.886019 master-0 kubenswrapper[36504]: I1203 22:24:47.885190 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-config-data\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.886019 master-0 kubenswrapper[36504]: I1203 22:24:47.885220 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-db-sync-config-data\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.886019 master-0 kubenswrapper[36504]: I1203 22:24:47.885267 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftpf7\" (UniqueName: \"kubernetes.io/projected/99d0cdcb-2d58-4530-a135-0e89f8c77065-kube-api-access-ftpf7\") pod \"neutron-db-sync-n4ht5\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:47.886019 master-0 kubenswrapper[36504]: I1203 22:24:47.885306 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-config\") pod \"neutron-db-sync-n4ht5\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:47.886019 master-0 kubenswrapper[36504]: I1203 22:24:47.885390 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-scripts\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.886019 master-0 kubenswrapper[36504]: I1203 22:24:47.885450 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4f03542-de5f-457c-8843-01ea5d2febce-etc-machine-id\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.886019 master-0 kubenswrapper[36504]: I1203 22:24:47.885569 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9622j\" (UniqueName: \"kubernetes.io/projected/c4f03542-de5f-457c-8843-01ea5d2febce-kube-api-access-9622j\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.886019 master-0 kubenswrapper[36504]: I1203 22:24:47.885596 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-combined-ca-bundle\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.886939 master-0 kubenswrapper[36504]: I1203 22:24:47.886585 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4f03542-de5f-457c-8843-01ea5d2febce-etc-machine-id\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.895236 master-0 kubenswrapper[36504]: I1203 22:24:47.893612 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-combined-ca-bundle\") pod \"neutron-db-sync-n4ht5\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:47.896258 master-0 kubenswrapper[36504]: I1203 22:24:47.896178 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-combined-ca-bundle\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.898337 master-0 kubenswrapper[36504]: I1203 22:24:47.898284 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-config\") pod \"neutron-db-sync-n4ht5\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:47.899692 master-0 kubenswrapper[36504]: I1203 22:24:47.899664 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-config-data\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.901848 master-0 kubenswrapper[36504]: I1203 22:24:47.901809 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-db-sync-config-data\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.903638 master-0 kubenswrapper[36504]: I1203 22:24:47.903607 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-scripts\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:47.915654 master-0 kubenswrapper[36504]: I1203 22:24:47.914994 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-db-sync-wq9c6"] Dec 03 22:24:47.938954 master-0 kubenswrapper[36504]: I1203 22:24:47.938829 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:24:47.945137 master-0 kubenswrapper[36504]: I1203 22:24:47.944958 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:24:47.950040 master-0 kubenswrapper[36504]: I1203 22:24:47.949639 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:24:47.950480 master-0 kubenswrapper[36504]: I1203 22:24:47.950439 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:24:47.954614 master-0 kubenswrapper[36504]: I1203 22:24:47.954546 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:24:47.982065 master-0 kubenswrapper[36504]: I1203 22:24:47.981868 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-v4x4d"] Dec 03 22:24:47.984112 master-0 kubenswrapper[36504]: I1203 22:24:47.984070 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:47.988566 master-0 kubenswrapper[36504]: I1203 22:24:47.988471 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 22:24:47.989049 master-0 kubenswrapper[36504]: I1203 22:24:47.989010 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 22:24:47.995032 master-0 kubenswrapper[36504]: I1203 22:24:47.994977 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-v4x4d"] Dec 03 22:24:48.007736 master-0 kubenswrapper[36504]: I1203 22:24:48.007632 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b69454d77-lfkfc"] Dec 03 22:24:48.019341 master-0 kubenswrapper[36504]: I1203 22:24:48.019192 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8lw82"] Dec 03 22:24:48.022792 master-0 kubenswrapper[36504]: I1203 22:24:48.022699 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.027394 master-0 kubenswrapper[36504]: I1203 22:24:48.026338 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 22:24:48.041437 master-0 kubenswrapper[36504]: I1203 22:24:48.039229 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8lw82"] Dec 03 22:24:48.055576 master-0 kubenswrapper[36504]: I1203 22:24:48.055485 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848b5d5dd5-k7qdq"] Dec 03 22:24:48.062121 master-0 kubenswrapper[36504]: I1203 22:24:48.062051 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.079642 master-0 kubenswrapper[36504]: I1203 22:24:48.078292 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848b5d5dd5-k7qdq"] Dec 03 22:24:48.092168 master-0 kubenswrapper[36504]: I1203 22:24:48.091666 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-scripts\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.092168 master-0 kubenswrapper[36504]: I1203 22:24:48.091878 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-scripts\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.092168 master-0 kubenswrapper[36504]: I1203 22:24:48.091941 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lfb4\" (UniqueName: \"kubernetes.io/projected/50ff2a15-1a3d-459a-80fe-440d20765350-kube-api-access-5lfb4\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.092168 master-0 kubenswrapper[36504]: I1203 22:24:48.092024 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-run-httpd\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.092168 master-0 kubenswrapper[36504]: I1203 22:24:48.092048 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b12b6b8-6e84-4e56-b403-d4d31d309852-logs\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.092168 master-0 kubenswrapper[36504]: I1203 22:24:48.092084 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.092168 master-0 kubenswrapper[36504]: I1203 22:24:48.092105 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-log-httpd\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.092168 master-0 kubenswrapper[36504]: I1203 22:24:48.092178 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-combined-ca-bundle\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.092563 master-0 kubenswrapper[36504]: I1203 22:24:48.092205 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rth2j\" (UniqueName: \"kubernetes.io/projected/3b12b6b8-6e84-4e56-b403-d4d31d309852-kube-api-access-rth2j\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.092563 master-0 kubenswrapper[36504]: I1203 22:24:48.092257 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.092563 master-0 kubenswrapper[36504]: I1203 22:24:48.092292 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-config-data\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.092563 master-0 kubenswrapper[36504]: I1203 22:24:48.092341 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-config-data\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.114860 master-0 kubenswrapper[36504]: I1203 22:24:48.114794 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftpf7\" (UniqueName: \"kubernetes.io/projected/99d0cdcb-2d58-4530-a135-0e89f8c77065-kube-api-access-ftpf7\") pod \"neutron-db-sync-n4ht5\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:48.116240 master-0 kubenswrapper[36504]: I1203 22:24:48.116189 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9622j\" (UniqueName: \"kubernetes.io/projected/c4f03542-de5f-457c-8843-01ea5d2febce-kube-api-access-9622j\") pod \"cinder-baebb-db-sync-wq9c6\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:48.179792 master-0 kubenswrapper[36504]: I1203 22:24:48.120883 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:24:48.182700 master-0 kubenswrapper[36504]: I1203 22:24:48.182553 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.255545 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-edpm\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.255645 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-run-httpd\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.255681 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b12b6b8-6e84-4e56-b403-d4d31d309852-logs\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256097 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256185 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-log-httpd\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256332 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-combined-ca-bundle\") pod \"barbican-db-sync-8lw82\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256358 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b12b6b8-6e84-4e56-b403-d4d31d309852-logs\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256370 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-svc\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256490 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-combined-ca-bundle\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.257661 
master-0 kubenswrapper[36504]: I1203 22:24:48.256549 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rth2j\" (UniqueName: \"kubernetes.io/projected/3b12b6b8-6e84-4e56-b403-d4d31d309852-kube-api-access-rth2j\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256579 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-nb\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256697 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-run-httpd\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256697 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256759 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-config-data\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256801 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-config-data\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256946 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-scripts\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.256989 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwcpw\" (UniqueName: \"kubernetes.io/projected/c52e6803-ca6e-4687-a71c-3029e0cb2253-kube-api-access-kwcpw\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.257036 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-sb\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 
22:24:48.257060 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-config\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.257414 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmb28\" (UniqueName: \"kubernetes.io/projected/486e7352-51c4-4e75-9ff9-ead9eb721c74-kube-api-access-jmb28\") pod \"barbican-db-sync-8lw82\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.257478 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-db-sync-config-data\") pod \"barbican-db-sync-8lw82\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.257534 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-scripts\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.257593 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lfb4\" (UniqueName: \"kubernetes.io/projected/50ff2a15-1a3d-459a-80fe-440d20765350-kube-api-access-5lfb4\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.257661 master-0 kubenswrapper[36504]: I1203 22:24:48.257611 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-swift-storage-0\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.259401 master-0 kubenswrapper[36504]: I1203 22:24:48.258601 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-log-httpd\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.265949 master-0 kubenswrapper[36504]: I1203 22:24:48.265889 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-config-data\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.272651 master-0 kubenswrapper[36504]: I1203 22:24:48.269212 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.272651 master-0 kubenswrapper[36504]: I1203 22:24:48.270943 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-scripts\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.274399 master-0 kubenswrapper[36504]: I1203 22:24:48.274348 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-config-data\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.279700 master-0 kubenswrapper[36504]: I1203 22:24:48.279640 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.282064 master-0 kubenswrapper[36504]: I1203 22:24:48.281747 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-scripts\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.282547 master-0 kubenswrapper[36504]: I1203 22:24:48.282502 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-57lcw"] Dec 03 22:24:48.282677 master-0 kubenswrapper[36504]: I1203 22:24:48.282619 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-combined-ca-bundle\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.302140 master-0 kubenswrapper[36504]: I1203 22:24:48.302097 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lfb4\" (UniqueName: \"kubernetes.io/projected/50ff2a15-1a3d-459a-80fe-440d20765350-kube-api-access-5lfb4\") pod \"ceilometer-0\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " pod="openstack/ceilometer-0" Dec 03 22:24:48.306636 master-0 kubenswrapper[36504]: I1203 22:24:48.306583 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rth2j\" (UniqueName: \"kubernetes.io/projected/3b12b6b8-6e84-4e56-b403-d4d31d309852-kube-api-access-rth2j\") pod \"placement-db-sync-v4x4d\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.324129 master-0 kubenswrapper[36504]: I1203 22:24:48.321744 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.358993 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-edpm\") pod \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.359640 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-nb\") pod \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.359711 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nmpq\" (UniqueName: \"kubernetes.io/projected/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-kube-api-access-8nmpq\") pod \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.359750 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-sb\") pod \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.359940 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-config\") pod \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.360023 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-swift-storage-0\") pod \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.360169 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-svc\") pod \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\" (UID: \"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f\") " Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.360913 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwcpw\" (UniqueName: \"kubernetes.io/projected/c52e6803-ca6e-4687-a71c-3029e0cb2253-kube-api-access-kwcpw\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.360968 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-sb\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.360997 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-config\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.361030 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmb28\" (UniqueName: \"kubernetes.io/projected/486e7352-51c4-4e75-9ff9-ead9eb721c74-kube-api-access-jmb28\") pod \"barbican-db-sync-8lw82\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.361075 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-db-sync-config-data\") pod \"barbican-db-sync-8lw82\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.361219 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-swift-storage-0\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.361375 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-edpm\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.361516 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-combined-ca-bundle\") pod \"barbican-db-sync-8lw82\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.361544 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-svc\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.361859 master-0 kubenswrapper[36504]: I1203 22:24:48.361637 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-nb\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.365527 master-0 kubenswrapper[36504]: I1203 22:24:48.364305 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-svc\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.365527 master-0 kubenswrapper[36504]: I1203 22:24:48.364816 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-edpm\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.365527 master-0 kubenswrapper[36504]: I1203 22:24:48.364982 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-nb\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.365527 master-0 kubenswrapper[36504]: I1203 22:24:48.365304 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-sb\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.365527 master-0 kubenswrapper[36504]: I1203 22:24:48.365456 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-swift-storage-0\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.366132 master-0 kubenswrapper[36504]: I1203 22:24:48.365796 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-config\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.403896 master-0 kubenswrapper[36504]: I1203 22:24:48.387030 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmb28\" (UniqueName: \"kubernetes.io/projected/486e7352-51c4-4e75-9ff9-ead9eb721c74-kube-api-access-jmb28\") pod \"barbican-db-sync-8lw82\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.403896 master-0 kubenswrapper[36504]: I1203 22:24:48.389511 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwcpw\" (UniqueName: \"kubernetes.io/projected/c52e6803-ca6e-4687-a71c-3029e0cb2253-kube-api-access-kwcpw\") pod \"dnsmasq-dns-848b5d5dd5-k7qdq\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.403896 master-0 kubenswrapper[36504]: I1203 22:24:48.391177 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-kube-api-access-8nmpq" (OuterVolumeSpecName: "kube-api-access-8nmpq") pod "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" (UID: "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f"). InnerVolumeSpecName "kube-api-access-8nmpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:48.403896 master-0 kubenswrapper[36504]: I1203 22:24:48.398095 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-combined-ca-bundle\") pod \"barbican-db-sync-8lw82\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.412083 master-0 kubenswrapper[36504]: I1203 22:24:48.409370 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:24:48.420366 master-0 kubenswrapper[36504]: I1203 22:24:48.420048 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-db-sync-config-data\") pod \"barbican-db-sync-8lw82\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.471346 master-0 kubenswrapper[36504]: I1203 22:24:48.465675 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nmpq\" (UniqueName: \"kubernetes.io/projected/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-kube-api-access-8nmpq\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:48.489088 master-0 kubenswrapper[36504]: I1203 22:24:48.488950 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-598448668f-fn8jr" Dec 03 22:24:48.489088 master-0 kubenswrapper[36504]: I1203 22:24:48.489039 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-598448668f-fn8jr" event={"ID":"e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f","Type":"ContainerDied","Data":"897cdac6d78a80b16b45d8f014d0819c700cb250d9704b92e350cf9b09d4d24f"} Dec 03 22:24:48.489388 master-0 kubenswrapper[36504]: I1203 22:24:48.489144 36504 scope.go:117] "RemoveContainer" containerID="c353bef980ade599afb552391d03fc758a0b06aa9895dd267cecb9fa61dd845a" Dec 03 22:24:48.509453 master-0 kubenswrapper[36504]: I1203 22:24:48.507790 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-57lcw" event={"ID":"4616bb3b-141e-4185-a336-920a4e986750","Type":"ContainerStarted","Data":"39a6d6390e8b07cc9af73639b049cc52189a8dc46201e6f81a48ceff211747cf"} Dec 03 22:24:48.536559 master-0 kubenswrapper[36504]: I1203 22:24:48.536075 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" (UID: "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:48.560870 master-0 kubenswrapper[36504]: I1203 22:24:48.557976 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b69454d77-lfkfc"] Dec 03 22:24:48.569084 master-0 kubenswrapper[36504]: I1203 22:24:48.569033 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:48.570739 master-0 kubenswrapper[36504]: I1203 22:24:48.570320 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" (UID: "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:48.605450 master-0 kubenswrapper[36504]: I1203 22:24:48.585765 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v4x4d" Dec 03 22:24:48.605450 master-0 kubenswrapper[36504]: I1203 22:24:48.591536 36504 scope.go:117] "RemoveContainer" containerID="92b60014c7f58066fbc6795c56f40c2d4b3caa0e38bdbb5e30a4bb343bf964f0" Dec 03 22:24:48.612958 master-0 kubenswrapper[36504]: W1203 22:24:48.612874 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce7e6878_72c7_411c_b399_2c890b83bb8f.slice/crio-5c1e6967e5d2b68ac63290a277caf8074a46a3886181d5462445c8e41ee8b180 WatchSource:0}: Error finding container 5c1e6967e5d2b68ac63290a277caf8074a46a3886181d5462445c8e41ee8b180: Status 404 returned error can't find the container with id 5c1e6967e5d2b68ac63290a277caf8074a46a3886181d5462445c8e41ee8b180 Dec 03 22:24:48.617162 master-0 kubenswrapper[36504]: I1203 22:24:48.617076 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8lw82" Dec 03 22:24:48.624055 master-0 kubenswrapper[36504]: I1203 22:24:48.623951 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" (UID: "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:48.633460 master-0 kubenswrapper[36504]: I1203 22:24:48.632445 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-edpm" (OuterVolumeSpecName: "edpm") pod "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" (UID: "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f"). InnerVolumeSpecName "edpm". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:48.647330 master-0 kubenswrapper[36504]: I1203 22:24:48.641369 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:48.673332 master-0 kubenswrapper[36504]: I1203 22:24:48.672195 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:48.673332 master-0 kubenswrapper[36504]: I1203 22:24:48.672252 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:48.673332 master-0 kubenswrapper[36504]: I1203 22:24:48.672271 36504 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:48.710043 master-0 kubenswrapper[36504]: I1203 22:24:48.709955 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-config" (OuterVolumeSpecName: "config") pod "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" (UID: "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:48.726705 master-0 kubenswrapper[36504]: I1203 22:24:48.726625 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" (UID: "e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:48.735815 master-0 kubenswrapper[36504]: I1203 22:24:48.735624 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-hghcd"] Dec 03 22:24:48.785375 master-0 kubenswrapper[36504]: I1203 22:24:48.785318 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:48.785375 master-0 kubenswrapper[36504]: I1203 22:24:48.785375 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:48.958092 master-0 kubenswrapper[36504]: I1203 22:24:48.935889 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-598448668f-fn8jr"] Dec 03 22:24:48.958092 master-0 kubenswrapper[36504]: I1203 22:24:48.950882 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-598448668f-fn8jr"] Dec 03 22:24:49.099632 master-0 kubenswrapper[36504]: I1203 22:24:49.099574 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:24:49.218790 master-0 kubenswrapper[36504]: I1203 22:24:49.218681 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" path="/var/lib/kubelet/pods/e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f/volumes" Dec 03 22:24:49.221029 master-0 kubenswrapper[36504]: I1203 22:24:49.220918 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-n4ht5"] Dec 03 22:24:49.321406 master-0 kubenswrapper[36504]: I1203 22:24:49.321319 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-db-sync-wq9c6"] Dec 03 22:24:49.557501 master-0 kubenswrapper[36504]: I1203 22:24:49.557145 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-57lcw" event={"ID":"4616bb3b-141e-4185-a336-920a4e986750","Type":"ContainerStarted","Data":"243fb3e7052bb47c1d450eafc9fdd3cc50be6dd1c068b3d430b5cfc2cf787cf2"} Dec 03 22:24:49.578852 master-0 kubenswrapper[36504]: I1203 22:24:49.578713 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:24:49.597878 master-0 kubenswrapper[36504]: I1203 22:24:49.597758 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n4ht5" event={"ID":"99d0cdcb-2d58-4530-a135-0e89f8c77065","Type":"ContainerStarted","Data":"f60609604a14c53dade4a1414b76d4bd121fd07c6374999a281da3768fcc094f"} Dec 03 22:24:49.653548 master-0 kubenswrapper[36504]: I1203 22:24:49.653476 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hghcd" event={"ID":"79800ee3-07d8-43e7-9263-15e8cdeef26d","Type":"ContainerStarted","Data":"b1b54fb4feb60c0f8bfe2f4add76f9fa15c437460cd7532f88265a9ed7c177dc"} Dec 03 22:24:49.656298 master-0 kubenswrapper[36504]: I1203 22:24:49.656225 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-db-sync-wq9c6" event={"ID":"c4f03542-de5f-457c-8843-01ea5d2febce","Type":"ContainerStarted","Data":"81fd3294112ed9b71d0e6bd3037f08801b2717de17a5a7de2d203880301da72f"} Dec 03 22:24:49.680835 master-0 kubenswrapper[36504]: I1203 22:24:49.676721 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-57lcw" podStartSLOduration=3.676694972 podStartE2EDuration="3.676694972s" podCreationTimestamp="2025-12-03 22:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:49.630918643 +0000 UTC m=+854.850690670" watchObservedRunningTime="2025-12-03 22:24:49.676694972 +0000 UTC m=+854.896466979" Dec 03 22:24:49.680835 master-0 kubenswrapper[36504]: I1203 22:24:49.677544 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" event={"ID":"ce7e6878-72c7-411c-b399-2c890b83bb8f","Type":"ContainerStarted","Data":"31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734"} Dec 03 22:24:49.680835 master-0 kubenswrapper[36504]: I1203 22:24:49.677595 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" event={"ID":"ce7e6878-72c7-411c-b399-2c890b83bb8f","Type":"ContainerStarted","Data":"5c1e6967e5d2b68ac63290a277caf8074a46a3886181d5462445c8e41ee8b180"} Dec 03 22:24:49.680835 master-0 kubenswrapper[36504]: I1203 22:24:49.680823 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-sync-v4x4d"] Dec 03 22:24:49.702868 master-0 kubenswrapper[36504]: I1203 22:24:49.701196 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8lw82"] Dec 03 22:24:49.944797 master-0 kubenswrapper[36504]: W1203 22:24:49.943987 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc52e6803_ca6e_4687_a71c_3029e0cb2253.slice/crio-68ef88e1870f4fa944edf9ee7e2c26aceba0ba41c4cc26b3302c2a24909a709f WatchSource:0}: Error finding container 68ef88e1870f4fa944edf9ee7e2c26aceba0ba41c4cc26b3302c2a24909a709f: Status 404 returned error can't find the container with id 68ef88e1870f4fa944edf9ee7e2c26aceba0ba41c4cc26b3302c2a24909a709f Dec 03 22:24:50.032080 master-0 kubenswrapper[36504]: I1203 22:24:49.996249 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848b5d5dd5-k7qdq"] Dec 03 22:24:50.180711 master-0 kubenswrapper[36504]: I1203 22:24:50.180618 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:24:50.392874 master-0 kubenswrapper[36504]: I1203 22:24:50.392812 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:50.548558 master-0 kubenswrapper[36504]: I1203 22:24:50.548310 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-edpm\") pod \"ce7e6878-72c7-411c-b399-2c890b83bb8f\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " Dec 03 22:24:50.548558 master-0 kubenswrapper[36504]: I1203 22:24:50.548440 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-nb\") pod \"ce7e6878-72c7-411c-b399-2c890b83bb8f\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " Dec 03 22:24:50.548558 master-0 kubenswrapper[36504]: I1203 22:24:50.548500 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-svc\") pod \"ce7e6878-72c7-411c-b399-2c890b83bb8f\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " Dec 03 22:24:50.548558 master-0 kubenswrapper[36504]: I1203 22:24:50.548551 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-sb\") pod \"ce7e6878-72c7-411c-b399-2c890b83bb8f\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " Dec 03 22:24:50.549043 master-0 kubenswrapper[36504]: I1203 22:24:50.548621 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-config\") pod \"ce7e6878-72c7-411c-b399-2c890b83bb8f\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " Dec 03 22:24:50.549043 master-0 kubenswrapper[36504]: I1203 22:24:50.548835 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-swift-storage-0\") pod \"ce7e6878-72c7-411c-b399-2c890b83bb8f\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " Dec 03 22:24:50.549043 master-0 kubenswrapper[36504]: I1203 
22:24:50.548900 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56h8j\" (UniqueName: \"kubernetes.io/projected/ce7e6878-72c7-411c-b399-2c890b83bb8f-kube-api-access-56h8j\") pod \"ce7e6878-72c7-411c-b399-2c890b83bb8f\" (UID: \"ce7e6878-72c7-411c-b399-2c890b83bb8f\") " Dec 03 22:24:50.586962 master-0 kubenswrapper[36504]: I1203 22:24:50.586210 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7e6878-72c7-411c-b399-2c890b83bb8f-kube-api-access-56h8j" (OuterVolumeSpecName: "kube-api-access-56h8j") pod "ce7e6878-72c7-411c-b399-2c890b83bb8f" (UID: "ce7e6878-72c7-411c-b399-2c890b83bb8f"). InnerVolumeSpecName "kube-api-access-56h8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:50.657011 master-0 kubenswrapper[36504]: I1203 22:24:50.656099 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56h8j\" (UniqueName: \"kubernetes.io/projected/ce7e6878-72c7-411c-b399-2c890b83bb8f-kube-api-access-56h8j\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:50.690941 master-0 kubenswrapper[36504]: I1203 22:24:50.690717 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce7e6878-72c7-411c-b399-2c890b83bb8f" (UID: "ce7e6878-72c7-411c-b399-2c890b83bb8f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:50.707950 master-0 kubenswrapper[36504]: I1203 22:24:50.705914 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v4x4d" event={"ID":"3b12b6b8-6e84-4e56-b403-d4d31d309852","Type":"ContainerStarted","Data":"68845500dd34cb983cb332a073c0df96de1ea58e0dec7d8a03145e09aa990a9c"} Dec 03 22:24:50.715351 master-0 kubenswrapper[36504]: I1203 22:24:50.715211 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n4ht5" event={"ID":"99d0cdcb-2d58-4530-a135-0e89f8c77065","Type":"ContainerStarted","Data":"4ba7fbc701630d14c229d5082ae0c81d6fecad586043aa1c771966dd3808b291"} Dec 03 22:24:50.722296 master-0 kubenswrapper[36504]: I1203 22:24:50.721982 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-edpm" (OuterVolumeSpecName: "edpm") pod "ce7e6878-72c7-411c-b399-2c890b83bb8f" (UID: "ce7e6878-72c7-411c-b399-2c890b83bb8f"). InnerVolumeSpecName "edpm". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:50.724554 master-0 kubenswrapper[36504]: I1203 22:24:50.724496 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce7e6878-72c7-411c-b399-2c890b83bb8f" (UID: "ce7e6878-72c7-411c-b399-2c890b83bb8f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:50.726025 master-0 kubenswrapper[36504]: I1203 22:24:50.725981 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" event={"ID":"c52e6803-ca6e-4687-a71c-3029e0cb2253","Type":"ContainerStarted","Data":"68ef88e1870f4fa944edf9ee7e2c26aceba0ba41c4cc26b3302c2a24909a709f"} Dec 03 22:24:50.727662 master-0 kubenswrapper[36504]: I1203 22:24:50.727586 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-config" (OuterVolumeSpecName: "config") pod "ce7e6878-72c7-411c-b399-2c890b83bb8f" (UID: "ce7e6878-72c7-411c-b399-2c890b83bb8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:50.738728 master-0 kubenswrapper[36504]: I1203 22:24:50.738408 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce7e6878-72c7-411c-b399-2c890b83bb8f" (UID: "ce7e6878-72c7-411c-b399-2c890b83bb8f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:50.750850 master-0 kubenswrapper[36504]: I1203 22:24:50.743150 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8lw82" event={"ID":"486e7352-51c4-4e75-9ff9-ead9eb721c74","Type":"ContainerStarted","Data":"978129855c4971acdba6cce10364a4534166a91b4f69c59af280a65210ee8c56"} Dec 03 22:24:50.754977 master-0 kubenswrapper[36504]: I1203 22:24:50.753995 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce7e6878-72c7-411c-b399-2c890b83bb8f" (UID: "ce7e6878-72c7-411c-b399-2c890b83bb8f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:24:50.762466 master-0 kubenswrapper[36504]: I1203 22:24:50.762116 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:50.762466 master-0 kubenswrapper[36504]: I1203 22:24:50.762177 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:50.762466 master-0 kubenswrapper[36504]: I1203 22:24:50.762191 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:50.762466 master-0 kubenswrapper[36504]: I1203 22:24:50.762201 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:50.762466 master-0 kubenswrapper[36504]: I1203 22:24:50.762212 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:50.762466 master-0 kubenswrapper[36504]: I1203 22:24:50.762224 36504 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce7e6878-72c7-411c-b399-2c890b83bb8f-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:50.777963 master-0 kubenswrapper[36504]: I1203 22:24:50.777730 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-n4ht5" podStartSLOduration=4.777707745 podStartE2EDuration="4.777707745s" podCreationTimestamp="2025-12-03 22:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:50.775507976 +0000 UTC m=+855.995279983" watchObservedRunningTime="2025-12-03 22:24:50.777707745 +0000 UTC m=+855.997479752" Dec 03 22:24:50.781751 master-0 kubenswrapper[36504]: I1203 22:24:50.781502 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerStarted","Data":"bbbc5f1def52cbb213f07748935b6e988143707bbe3c4849f4061f99ab5ee4fa"} Dec 03 22:24:50.785627 master-0 kubenswrapper[36504]: I1203 22:24:50.785578 36504 generic.go:334] "Generic (PLEG): container finished" podID="ce7e6878-72c7-411c-b399-2c890b83bb8f" containerID="31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734" exitCode=0 Dec 03 22:24:50.788086 master-0 kubenswrapper[36504]: I1203 22:24:50.788056 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" Dec 03 22:24:50.795230 master-0 kubenswrapper[36504]: I1203 22:24:50.790742 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" event={"ID":"ce7e6878-72c7-411c-b399-2c890b83bb8f","Type":"ContainerDied","Data":"31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734"} Dec 03 22:24:50.795230 master-0 kubenswrapper[36504]: I1203 22:24:50.790823 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b69454d77-lfkfc" event={"ID":"ce7e6878-72c7-411c-b399-2c890b83bb8f","Type":"ContainerDied","Data":"5c1e6967e5d2b68ac63290a277caf8074a46a3886181d5462445c8e41ee8b180"} Dec 03 22:24:50.795230 master-0 kubenswrapper[36504]: I1203 22:24:50.790843 36504 scope.go:117] "RemoveContainer" containerID="31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734" Dec 03 22:24:50.897582 master-0 kubenswrapper[36504]: I1203 22:24:50.896854 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b69454d77-lfkfc"] Dec 03 22:24:50.913263 master-0 kubenswrapper[36504]: I1203 22:24:50.913182 36504 scope.go:117] "RemoveContainer" containerID="31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734" Dec 03 22:24:50.916550 master-0 kubenswrapper[36504]: E1203 22:24:50.916154 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734\": container with ID starting with 31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734 not found: ID does not exist" containerID="31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734" Dec 03 22:24:50.916550 master-0 kubenswrapper[36504]: I1203 22:24:50.916264 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734"} err="failed to get container status \"31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734\": rpc error: code = NotFound desc = could not find container \"31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734\": container with ID starting with 31ed3c9b09978233816b3c71be4568c6407bff574ab74f30eb0a5c2b5c33d734 not found: ID does not exist" Dec 03 22:24:50.948950 master-0 kubenswrapper[36504]: I1203 22:24:50.948064 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b69454d77-lfkfc"] Dec 03 22:24:51.160946 master-0 kubenswrapper[36504]: I1203 22:24:51.160370 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce7e6878-72c7-411c-b399-2c890b83bb8f" path="/var/lib/kubelet/pods/ce7e6878-72c7-411c-b399-2c890b83bb8f/volumes" Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: I1203 22:24:51.184438 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-baebb-default-external-api-0"] Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: E1203 22:24:51.185118 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce7e6878-72c7-411c-b399-2c890b83bb8f" containerName="init" Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: I1203 22:24:51.185136 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7e6878-72c7-411c-b399-2c890b83bb8f" containerName="init" Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: E1203 22:24:51.185171 36504 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" containerName="init" Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: I1203 22:24:51.185177 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" containerName="init" Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: E1203 22:24:51.185221 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" containerName="dnsmasq-dns" Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: I1203 22:24:51.185227 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" containerName="dnsmasq-dns" Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: I1203 22:24:51.185732 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce7e6878-72c7-411c-b399-2c890b83bb8f" containerName="init" Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: I1203 22:24:51.185805 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6bd7fe5-c4d0-4d17-9adc-06b03ab4522f" containerName="dnsmasq-dns" Dec 03 22:24:51.188407 master-0 kubenswrapper[36504]: I1203 22:24:51.187281 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.193792 master-0 kubenswrapper[36504]: I1203 22:24:51.191062 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-baebb-default-external-config-data" Dec 03 22:24:51.193792 master-0 kubenswrapper[36504]: I1203 22:24:51.191821 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 22:24:51.193945 master-0 kubenswrapper[36504]: I1203 22:24:51.193814 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 22:24:51.250799 master-0 kubenswrapper[36504]: I1203 22:24:51.249861 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-baebb-default-external-api-0"] Dec 03 22:24:51.289805 master-0 kubenswrapper[36504]: I1203 22:24:51.286457 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-scripts\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.289805 master-0 kubenswrapper[36504]: I1203 22:24:51.286547 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-httpd-run\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.289805 master-0 kubenswrapper[36504]: I1203 22:24:51.286681 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-config-data\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.289805 master-0 kubenswrapper[36504]: I1203 22:24:51.286704 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-logs\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.289805 master-0 kubenswrapper[36504]: I1203 22:24:51.286736 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-combined-ca-bundle\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.289805 master-0 kubenswrapper[36504]: I1203 22:24:51.286840 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.289805 master-0 kubenswrapper[36504]: I1203 22:24:51.286944 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2hz\" (UniqueName: \"kubernetes.io/projected/dc35c451-c612-46b7-8328-9c8bfa7891e7-kube-api-access-hv2hz\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.289805 master-0 kubenswrapper[36504]: I1203 22:24:51.286980 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-public-tls-certs\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.390804 master-0 kubenswrapper[36504]: I1203 22:24:51.390077 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-config-data\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.390804 master-0 kubenswrapper[36504]: I1203 22:24:51.390130 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-logs\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.390804 master-0 kubenswrapper[36504]: I1203 22:24:51.390162 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-combined-ca-bundle\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.390804 master-0 kubenswrapper[36504]: I1203 22:24:51.390201 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") pod \"glance-baebb-default-external-api-0\" 
(UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.390804 master-0 kubenswrapper[36504]: I1203 22:24:51.390307 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2hz\" (UniqueName: \"kubernetes.io/projected/dc35c451-c612-46b7-8328-9c8bfa7891e7-kube-api-access-hv2hz\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.390804 master-0 kubenswrapper[36504]: I1203 22:24:51.390343 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-public-tls-certs\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.390804 master-0 kubenswrapper[36504]: I1203 22:24:51.390378 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-scripts\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.390804 master-0 kubenswrapper[36504]: I1203 22:24:51.390411 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-httpd-run\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.391463 master-0 kubenswrapper[36504]: I1203 22:24:51.391028 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-httpd-run\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.394794 master-0 kubenswrapper[36504]: I1203 22:24:51.392092 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-logs\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.398086 master-0 kubenswrapper[36504]: I1203 22:24:51.396597 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:24:51.398086 master-0 kubenswrapper[36504]: I1203 22:24:51.396685 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a7e57e719e859925dd772afc983e918084ef377c946533a5bd384e3451c0939d/globalmount\"" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.412859 master-0 kubenswrapper[36504]: I1203 22:24:51.400386 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-scripts\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.412859 master-0 kubenswrapper[36504]: I1203 22:24:51.401525 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-combined-ca-bundle\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.412859 master-0 kubenswrapper[36504]: I1203 22:24:51.402224 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-config-data\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.412859 master-0 kubenswrapper[36504]: I1203 22:24:51.404815 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-public-tls-certs\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.435810 master-0 kubenswrapper[36504]: I1203 22:24:51.417993 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2hz\" (UniqueName: \"kubernetes.io/projected/dc35c451-c612-46b7-8328-9c8bfa7891e7-kube-api-access-hv2hz\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:51.860653 master-0 kubenswrapper[36504]: I1203 22:24:51.860485 36504 generic.go:334] "Generic (PLEG): container finished" podID="c52e6803-ca6e-4687-a71c-3029e0cb2253" containerID="ef067c318d9cb82534ae69782df516e6360c57080d949cfd3845fbf5b90718af" exitCode=0 Dec 03 22:24:51.861566 master-0 kubenswrapper[36504]: I1203 22:24:51.860697 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" event={"ID":"c52e6803-ca6e-4687-a71c-3029e0cb2253","Type":"ContainerDied","Data":"ef067c318d9cb82534ae69782df516e6360c57080d949cfd3845fbf5b90718af"} Dec 03 22:24:52.025209 master-0 kubenswrapper[36504]: I1203 22:24:52.025134 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:52.035560 master-0 kubenswrapper[36504]: I1203 22:24:52.035123 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:52.258211 master-0 kubenswrapper[36504]: I1203 22:24:52.254872 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-baebb-default-internal-api-0"] Dec 03 22:24:52.310201 master-0 kubenswrapper[36504]: I1203 22:24:52.260469 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.310201 master-0 kubenswrapper[36504]: I1203 22:24:52.266053 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 22:24:52.310201 master-0 kubenswrapper[36504]: I1203 22:24:52.267140 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-baebb-default-internal-config-data" Dec 03 22:24:52.310201 master-0 kubenswrapper[36504]: I1203 22:24:52.291615 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-baebb-default-internal-api-0"] Dec 03 22:24:52.369406 master-0 kubenswrapper[36504]: I1203 22:24:52.369331 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-combined-ca-bundle\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.369738 master-0 kubenswrapper[36504]: I1203 22:24:52.369437 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-config-data\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.369738 master-0 kubenswrapper[36504]: I1203 22:24:52.369520 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-572qt\" (UniqueName: \"kubernetes.io/projected/6cb8747f-1989-4028-b62a-5bf9ea57f609-kube-api-access-572qt\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.369738 master-0 kubenswrapper[36504]: I1203 22:24:52.369660 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-httpd-run\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.369738 master-0 kubenswrapper[36504]: I1203 22:24:52.369730 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-internal-tls-certs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.369926 master-0 kubenswrapper[36504]: I1203 22:24:52.369810 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") pod 
\"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.370003 master-0 kubenswrapper[36504]: I1203 22:24:52.369977 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-logs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.370815 master-0 kubenswrapper[36504]: I1203 22:24:52.370779 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-scripts\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.475068 master-0 kubenswrapper[36504]: I1203 22:24:52.474515 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-combined-ca-bundle\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.475068 master-0 kubenswrapper[36504]: I1203 22:24:52.474589 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-config-data\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.475068 master-0 kubenswrapper[36504]: I1203 22:24:52.474724 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-572qt\" (UniqueName: \"kubernetes.io/projected/6cb8747f-1989-4028-b62a-5bf9ea57f609-kube-api-access-572qt\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.475068 master-0 kubenswrapper[36504]: I1203 22:24:52.474860 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-httpd-run\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.475068 master-0 kubenswrapper[36504]: I1203 22:24:52.474923 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-internal-tls-certs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.475068 master-0 kubenswrapper[36504]: I1203 22:24:52.474999 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.475671 master-0 kubenswrapper[36504]: 
I1203 22:24:52.475112 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-logs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.475671 master-0 kubenswrapper[36504]: I1203 22:24:52.475217 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-scripts\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.475895 master-0 kubenswrapper[36504]: I1203 22:24:52.475746 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-httpd-run\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.476815 master-0 kubenswrapper[36504]: I1203 22:24:52.476781 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-logs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.479158 master-0 kubenswrapper[36504]: I1203 22:24:52.479094 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 22:24:52.479158 master-0 kubenswrapper[36504]: I1203 22:24:52.479140 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2072068328f8c9fdd4d126c82ed7a5705e28f66beec6cc4a6e19712168c74b28/globalmount\"" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.481362 master-0 kubenswrapper[36504]: I1203 22:24:52.480180 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-scripts\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.481362 master-0 kubenswrapper[36504]: I1203 22:24:52.480210 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-internal-tls-certs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.481362 master-0 kubenswrapper[36504]: I1203 22:24:52.481255 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-config-data\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.503392 
master-0 kubenswrapper[36504]: I1203 22:24:52.482970 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-combined-ca-bundle\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.642918 master-0 kubenswrapper[36504]: I1203 22:24:52.641412 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-572qt\" (UniqueName: \"kubernetes.io/projected/6cb8747f-1989-4028-b62a-5bf9ea57f609-kube-api-access-572qt\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:52.876430 master-0 kubenswrapper[36504]: I1203 22:24:52.876360 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") pod \"glance-baebb-default-external-api-0\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:52.880363 master-0 kubenswrapper[36504]: I1203 22:24:52.880302 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" event={"ID":"c52e6803-ca6e-4687-a71c-3029e0cb2253","Type":"ContainerStarted","Data":"9d300920b20d02c109bc28af15be4798547015038d64adc4899c559e8b4aa8b4"} Dec 03 22:24:52.886459 master-0 kubenswrapper[36504]: I1203 22:24:52.886408 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 22:24:52.997179 master-0 kubenswrapper[36504]: I1203 22:24:52.992020 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" podStartSLOduration=5.991968078 podStartE2EDuration="5.991968078s" podCreationTimestamp="2025-12-03 22:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:24:52.961260133 +0000 UTC m=+858.181032300" watchObservedRunningTime="2025-12-03 22:24:52.991968078 +0000 UTC m=+858.211740105" Dec 03 22:24:53.029860 master-0 kubenswrapper[36504]: I1203 22:24:53.029609 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:24:53.649805 master-0 kubenswrapper[36504]: I1203 22:24:53.642957 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:53.916111 master-0 kubenswrapper[36504]: I1203 22:24:53.915949 36504 generic.go:334] "Generic (PLEG): container finished" podID="4616bb3b-141e-4185-a336-920a4e986750" containerID="243fb3e7052bb47c1d450eafc9fdd3cc50be6dd1c068b3d430b5cfc2cf787cf2" exitCode=0 Dec 03 22:24:53.916399 master-0 kubenswrapper[36504]: I1203 22:24:53.916040 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-57lcw" event={"ID":"4616bb3b-141e-4185-a336-920a4e986750","Type":"ContainerDied","Data":"243fb3e7052bb47c1d450eafc9fdd3cc50be6dd1c068b3d430b5cfc2cf787cf2"} Dec 03 22:24:54.817078 master-0 kubenswrapper[36504]: I1203 22:24:54.817015 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") pod \"glance-baebb-default-internal-api-0\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:55.065901 master-0 kubenswrapper[36504]: I1203 22:24:55.065760 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:24:56.405451 master-0 kubenswrapper[36504]: I1203 22:24:56.405404 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:56.582740 master-0 kubenswrapper[36504]: I1203 22:24:56.582551 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-credential-keys\") pod \"4616bb3b-141e-4185-a336-920a4e986750\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " Dec 03 22:24:56.582740 master-0 kubenswrapper[36504]: I1203 22:24:56.582690 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68m67\" (UniqueName: \"kubernetes.io/projected/4616bb3b-141e-4185-a336-920a4e986750-kube-api-access-68m67\") pod \"4616bb3b-141e-4185-a336-920a4e986750\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " Dec 03 22:24:56.582740 master-0 kubenswrapper[36504]: I1203 22:24:56.582724 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-config-data\") pod \"4616bb3b-141e-4185-a336-920a4e986750\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " Dec 03 22:24:56.582740 master-0 kubenswrapper[36504]: I1203 22:24:56.582742 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-fernet-keys\") pod \"4616bb3b-141e-4185-a336-920a4e986750\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " Dec 03 22:24:56.583202 master-0 kubenswrapper[36504]: I1203 22:24:56.582944 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-scripts\") pod \"4616bb3b-141e-4185-a336-920a4e986750\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " Dec 03 22:24:56.583202 master-0 
kubenswrapper[36504]: I1203 22:24:56.583008 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-combined-ca-bundle\") pod \"4616bb3b-141e-4185-a336-920a4e986750\" (UID: \"4616bb3b-141e-4185-a336-920a4e986750\") " Dec 03 22:24:56.587499 master-0 kubenswrapper[36504]: I1203 22:24:56.587194 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4616bb3b-141e-4185-a336-920a4e986750" (UID: "4616bb3b-141e-4185-a336-920a4e986750"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:56.588548 master-0 kubenswrapper[36504]: I1203 22:24:56.588482 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4616bb3b-141e-4185-a336-920a4e986750" (UID: "4616bb3b-141e-4185-a336-920a4e986750"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:56.588663 master-0 kubenswrapper[36504]: I1203 22:24:56.588600 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4616bb3b-141e-4185-a336-920a4e986750-kube-api-access-68m67" (OuterVolumeSpecName: "kube-api-access-68m67") pod "4616bb3b-141e-4185-a336-920a4e986750" (UID: "4616bb3b-141e-4185-a336-920a4e986750"). InnerVolumeSpecName "kube-api-access-68m67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:24:56.588991 master-0 kubenswrapper[36504]: I1203 22:24:56.588945 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-scripts" (OuterVolumeSpecName: "scripts") pod "4616bb3b-141e-4185-a336-920a4e986750" (UID: "4616bb3b-141e-4185-a336-920a4e986750"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:56.621198 master-0 kubenswrapper[36504]: I1203 22:24:56.621056 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-config-data" (OuterVolumeSpecName: "config-data") pod "4616bb3b-141e-4185-a336-920a4e986750" (UID: "4616bb3b-141e-4185-a336-920a4e986750"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:56.621556 master-0 kubenswrapper[36504]: I1203 22:24:56.621457 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4616bb3b-141e-4185-a336-920a4e986750" (UID: "4616bb3b-141e-4185-a336-920a4e986750"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:24:56.689238 master-0 kubenswrapper[36504]: I1203 22:24:56.689165 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:56.689238 master-0 kubenswrapper[36504]: I1203 22:24:56.689222 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:56.689238 master-0 kubenswrapper[36504]: I1203 22:24:56.689250 36504 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-credential-keys\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:56.689688 master-0 kubenswrapper[36504]: I1203 22:24:56.689267 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68m67\" (UniqueName: \"kubernetes.io/projected/4616bb3b-141e-4185-a336-920a4e986750-kube-api-access-68m67\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:56.689688 master-0 kubenswrapper[36504]: I1203 22:24:56.689282 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:56.689688 master-0 kubenswrapper[36504]: I1203 22:24:56.689294 36504 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4616bb3b-141e-4185-a336-920a4e986750-fernet-keys\") on node \"master-0\" DevicePath \"\"" Dec 03 22:24:56.972235 master-0 kubenswrapper[36504]: I1203 22:24:56.972147 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-57lcw" event={"ID":"4616bb3b-141e-4185-a336-920a4e986750","Type":"ContainerDied","Data":"39a6d6390e8b07cc9af73639b049cc52189a8dc46201e6f81a48ceff211747cf"} Dec 03 22:24:56.972235 master-0 kubenswrapper[36504]: I1203 22:24:56.972224 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a6d6390e8b07cc9af73639b049cc52189a8dc46201e6f81a48ceff211747cf" Dec 03 22:24:56.972613 master-0 kubenswrapper[36504]: I1203 22:24:56.972311 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-57lcw" Dec 03 22:24:57.634266 master-0 kubenswrapper[36504]: I1203 22:24:57.634094 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-57lcw"] Dec 03 22:24:57.650848 master-0 kubenswrapper[36504]: I1203 22:24:57.650750 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-57lcw"] Dec 03 22:24:57.750269 master-0 kubenswrapper[36504]: I1203 22:24:57.750178 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hlr94"] Dec 03 22:24:57.751020 master-0 kubenswrapper[36504]: E1203 22:24:57.750984 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4616bb3b-141e-4185-a336-920a4e986750" containerName="keystone-bootstrap" Dec 03 22:24:57.751020 master-0 kubenswrapper[36504]: I1203 22:24:57.751011 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="4616bb3b-141e-4185-a336-920a4e986750" containerName="keystone-bootstrap" Dec 03 22:24:57.751356 master-0 kubenswrapper[36504]: I1203 22:24:57.751324 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="4616bb3b-141e-4185-a336-920a4e986750" containerName="keystone-bootstrap" Dec 03 22:24:57.752502 master-0 kubenswrapper[36504]: I1203 22:24:57.752464 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.757407 master-0 kubenswrapper[36504]: I1203 22:24:57.757336 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 22:24:57.757860 master-0 kubenswrapper[36504]: I1203 22:24:57.757723 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 22:24:57.758520 master-0 kubenswrapper[36504]: I1203 22:24:57.758115 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 22:24:57.758520 master-0 kubenswrapper[36504]: I1203 22:24:57.758339 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 22:24:57.771735 master-0 kubenswrapper[36504]: I1203 22:24:57.771654 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hlr94"] Dec 03 22:24:57.849953 master-0 kubenswrapper[36504]: I1203 22:24:57.849837 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-scripts\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.850484 master-0 kubenswrapper[36504]: I1203 22:24:57.850044 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-combined-ca-bundle\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.851868 master-0 kubenswrapper[36504]: I1203 22:24:57.851815 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-fernet-keys\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.851944 master-0 
kubenswrapper[36504]: I1203 22:24:57.851868 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5fcq\" (UniqueName: \"kubernetes.io/projected/22075cd6-19d1-4b03-8517-a9d678a23e23-kube-api-access-h5fcq\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.852177 master-0 kubenswrapper[36504]: I1203 22:24:57.852103 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-config-data\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.852243 master-0 kubenswrapper[36504]: I1203 22:24:57.852228 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-credential-keys\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.956655 master-0 kubenswrapper[36504]: I1203 22:24:57.956376 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-config-data\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.956655 master-0 kubenswrapper[36504]: I1203 22:24:57.956517 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-credential-keys\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.956977 master-0 kubenswrapper[36504]: I1203 22:24:57.956669 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-scripts\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.956977 master-0 kubenswrapper[36504]: I1203 22:24:57.956754 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-combined-ca-bundle\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.956977 master-0 kubenswrapper[36504]: I1203 22:24:57.956864 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-fernet-keys\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.956977 master-0 kubenswrapper[36504]: I1203 22:24:57.956895 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5fcq\" (UniqueName: \"kubernetes.io/projected/22075cd6-19d1-4b03-8517-a9d678a23e23-kube-api-access-h5fcq\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " 
pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.960644 master-0 kubenswrapper[36504]: I1203 22:24:57.960563 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-scripts\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.962171 master-0 kubenswrapper[36504]: I1203 22:24:57.962102 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-fernet-keys\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.963546 master-0 kubenswrapper[36504]: I1203 22:24:57.963078 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-credential-keys\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.964919 master-0 kubenswrapper[36504]: I1203 22:24:57.964361 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-config-data\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.972670 master-0 kubenswrapper[36504]: I1203 22:24:57.972629 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-combined-ca-bundle\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:57.976339 master-0 kubenswrapper[36504]: I1203 22:24:57.976295 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5fcq\" (UniqueName: \"kubernetes.io/projected/22075cd6-19d1-4b03-8517-a9d678a23e23-kube-api-access-h5fcq\") pod \"keystone-bootstrap-hlr94\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:58.082845 master-0 kubenswrapper[36504]: I1203 22:24:58.082752 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:24:58.644413 master-0 kubenswrapper[36504]: I1203 22:24:58.644136 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:24:58.795140 master-0 kubenswrapper[36504]: I1203 22:24:58.793569 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff645d44c-2ndhp"] Dec 03 22:24:58.796085 master-0 kubenswrapper[36504]: I1203 22:24:58.795589 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="dnsmasq-dns" containerID="cri-o://1e44be388b71c86ea10e47695909fce01dcbdba8ee0d5078a421414b16fc0a44" gracePeriod=10 Dec 03 22:24:59.013949 master-0 kubenswrapper[36504]: I1203 22:24:59.013764 36504 generic.go:334] "Generic (PLEG): container finished" podID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerID="1e44be388b71c86ea10e47695909fce01dcbdba8ee0d5078a421414b16fc0a44" exitCode=0 Dec 03 22:24:59.013949 master-0 kubenswrapper[36504]: I1203 22:24:59.013824 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" event={"ID":"c33cc971-6403-48df-9951-a01a2e1c92e1","Type":"ContainerDied","Data":"1e44be388b71c86ea10e47695909fce01dcbdba8ee0d5078a421414b16fc0a44"} Dec 03 22:24:59.121485 master-0 kubenswrapper[36504]: I1203 22:24:59.121387 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4616bb3b-141e-4185-a336-920a4e986750" path="/var/lib/kubelet/pods/4616bb3b-141e-4185-a336-920a4e986750/volumes" Dec 03 22:24:59.534008 master-0 kubenswrapper[36504]: I1203 22:24:59.533916 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.204:5353: connect: connection refused" Dec 03 22:25:09.533567 master-0 kubenswrapper[36504]: I1203 22:25:09.533426 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.204:5353: i/o timeout" Dec 03 22:25:11.879516 master-0 kubenswrapper[36504]: I1203 22:25:11.879390 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-dataplane-os-edpm-s7gzj"] Dec 03 22:25:11.882575 master-0 kubenswrapper[36504]: I1203 22:25:11.881810 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:11.887889 master-0 kubenswrapper[36504]: I1203 22:25:11.887708 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:25:11.889374 master-0 kubenswrapper[36504]: I1203 22:25:11.889159 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:25:11.890712 master-0 kubenswrapper[36504]: I1203 22:25:11.890674 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:25:11.897746 master-0 kubenswrapper[36504]: I1203 22:25:11.895738 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-dataplane-os-edpm-s7gzj"] Dec 03 22:25:11.971237 master-0 kubenswrapper[36504]: I1203 22:25:11.971017 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-ssh-key\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:11.971983 master-0 kubenswrapper[36504]: I1203 22:25:11.971953 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bflp5\" (UniqueName: \"kubernetes.io/projected/e4c57b50-c8ce-423b-9d25-b659e26c03f8-kube-api-access-bflp5\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:11.972119 master-0 kubenswrapper[36504]: I1203 22:25:11.972101 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-bootstrap-combined-ca-bundle\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:11.972278 master-0 kubenswrapper[36504]: I1203 22:25:11.972261 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-inventory\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:12.075591 master-0 kubenswrapper[36504]: I1203 22:25:12.075472 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-bootstrap-combined-ca-bundle\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:12.075983 master-0 kubenswrapper[36504]: I1203 22:25:12.075631 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-inventory\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:12.075983 master-0 kubenswrapper[36504]: I1203 22:25:12.075861 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-ssh-key\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:12.075983 master-0 kubenswrapper[36504]: I1203 22:25:12.075945 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bflp5\" (UniqueName: \"kubernetes.io/projected/e4c57b50-c8ce-423b-9d25-b659e26c03f8-kube-api-access-bflp5\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:12.080131 master-0 kubenswrapper[36504]: I1203 22:25:12.080076 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-bootstrap-combined-ca-bundle\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:12.080229 master-0 kubenswrapper[36504]: I1203 22:25:12.080180 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-ssh-key\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:12.080339 master-0 kubenswrapper[36504]: I1203 22:25:12.080207 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-inventory\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:12.132671 master-0 kubenswrapper[36504]: I1203 22:25:12.132490 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bflp5\" (UniqueName: \"kubernetes.io/projected/e4c57b50-c8ce-423b-9d25-b659e26c03f8-kube-api-access-bflp5\") pod \"bootstrap-dataplane-os-edpm-s7gzj\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:12.205384 master-0 kubenswrapper[36504]: I1203 22:25:12.205301 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:25:14.299088 master-0 kubenswrapper[36504]: I1203 22:25:14.299016 36504 generic.go:334] "Generic (PLEG): container finished" podID="99d0cdcb-2d58-4530-a135-0e89f8c77065" containerID="4ba7fbc701630d14c229d5082ae0c81d6fecad586043aa1c771966dd3808b291" exitCode=0 Dec 03 22:25:14.299088 master-0 kubenswrapper[36504]: I1203 22:25:14.299087 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n4ht5" event={"ID":"99d0cdcb-2d58-4530-a135-0e89f8c77065","Type":"ContainerDied","Data":"4ba7fbc701630d14c229d5082ae0c81d6fecad586043aa1c771966dd3808b291"} Dec 03 22:25:14.534519 master-0 kubenswrapper[36504]: I1203 22:25:14.534233 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.204:5353: i/o timeout" Dec 03 22:25:14.535259 master-0 kubenswrapper[36504]: I1203 22:25:14.535177 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:25:15.355605 master-0 kubenswrapper[36504]: I1203 22:25:15.355531 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" event={"ID":"c33cc971-6403-48df-9951-a01a2e1c92e1","Type":"ContainerDied","Data":"760221e10aacfaf3d5060cec230fb38aa72a38c625be4fad14d190546b64b60f"} Dec 03 22:25:15.355605 master-0 kubenswrapper[36504]: I1203 22:25:15.355601 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="760221e10aacfaf3d5060cec230fb38aa72a38c625be4fad14d190546b64b60f" Dec 03 22:25:15.411422 master-0 kubenswrapper[36504]: I1203 22:25:15.409924 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:25:15.588660 master-0 kubenswrapper[36504]: I1203 22:25:15.588091 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-sb\") pod \"c33cc971-6403-48df-9951-a01a2e1c92e1\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " Dec 03 22:25:15.589025 master-0 kubenswrapper[36504]: I1203 22:25:15.588884 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-nb\") pod \"c33cc971-6403-48df-9951-a01a2e1c92e1\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " Dec 03 22:25:15.589065 master-0 kubenswrapper[36504]: I1203 22:25:15.589018 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg2xb\" (UniqueName: \"kubernetes.io/projected/c33cc971-6403-48df-9951-a01a2e1c92e1-kube-api-access-vg2xb\") pod \"c33cc971-6403-48df-9951-a01a2e1c92e1\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " Dec 03 22:25:15.590277 master-0 kubenswrapper[36504]: I1203 22:25:15.589151 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-edpm\") pod \"c33cc971-6403-48df-9951-a01a2e1c92e1\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " Dec 03 22:25:15.590277 master-0 kubenswrapper[36504]: I1203 22:25:15.589223 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-dns-svc\") pod \"c33cc971-6403-48df-9951-a01a2e1c92e1\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " Dec 03 22:25:15.590277 master-0 kubenswrapper[36504]: I1203 22:25:15.589415 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-config\") pod \"c33cc971-6403-48df-9951-a01a2e1c92e1\" (UID: \"c33cc971-6403-48df-9951-a01a2e1c92e1\") " Dec 03 22:25:15.596033 master-0 kubenswrapper[36504]: I1203 22:25:15.595199 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c33cc971-6403-48df-9951-a01a2e1c92e1-kube-api-access-vg2xb" (OuterVolumeSpecName: "kube-api-access-vg2xb") pod "c33cc971-6403-48df-9951-a01a2e1c92e1" (UID: "c33cc971-6403-48df-9951-a01a2e1c92e1"). InnerVolumeSpecName "kube-api-access-vg2xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:15.670857 master-0 kubenswrapper[36504]: I1203 22:25:15.669366 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c33cc971-6403-48df-9951-a01a2e1c92e1" (UID: "c33cc971-6403-48df-9951-a01a2e1c92e1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:15.678139 master-0 kubenswrapper[36504]: I1203 22:25:15.678066 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c33cc971-6403-48df-9951-a01a2e1c92e1" (UID: "c33cc971-6403-48df-9951-a01a2e1c92e1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:15.688288 master-0 kubenswrapper[36504]: I1203 22:25:15.688056 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c33cc971-6403-48df-9951-a01a2e1c92e1" (UID: "c33cc971-6403-48df-9951-a01a2e1c92e1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:15.696235 master-0 kubenswrapper[36504]: I1203 22:25:15.696166 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-config" (OuterVolumeSpecName: "config") pod "c33cc971-6403-48df-9951-a01a2e1c92e1" (UID: "c33cc971-6403-48df-9951-a01a2e1c92e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:15.697957 master-0 kubenswrapper[36504]: I1203 22:25:15.697910 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:15.698053 master-0 kubenswrapper[36504]: I1203 22:25:15.697959 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:15.698053 master-0 kubenswrapper[36504]: I1203 22:25:15.697981 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:15.698053 master-0 kubenswrapper[36504]: I1203 22:25:15.698000 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg2xb\" (UniqueName: \"kubernetes.io/projected/c33cc971-6403-48df-9951-a01a2e1c92e1-kube-api-access-vg2xb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:15.698053 master-0 kubenswrapper[36504]: I1203 22:25:15.698013 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:15.734548 master-0 kubenswrapper[36504]: I1203 22:25:15.734030 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-edpm" (OuterVolumeSpecName: "edpm") pod "c33cc971-6403-48df-9951-a01a2e1c92e1" (UID: "c33cc971-6403-48df-9951-a01a2e1c92e1"). InnerVolumeSpecName "edpm". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:15.822021 master-0 kubenswrapper[36504]: I1203 22:25:15.821809 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c33cc971-6403-48df-9951-a01a2e1c92e1-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:16.141944 master-0 kubenswrapper[36504]: I1203 22:25:16.141553 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-baebb-default-external-api-0"] Dec 03 22:25:16.366755 master-0 kubenswrapper[36504]: I1203 22:25:16.366695 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" Dec 03 22:25:16.416902 master-0 kubenswrapper[36504]: I1203 22:25:16.416808 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff645d44c-2ndhp"] Dec 03 22:25:16.438856 master-0 kubenswrapper[36504]: I1203 22:25:16.438283 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ff645d44c-2ndhp"] Dec 03 22:25:17.116115 master-0 kubenswrapper[36504]: I1203 22:25:17.116043 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" path="/var/lib/kubelet/pods/c33cc971-6403-48df-9951-a01a2e1c92e1/volumes" Dec 03 22:25:17.388793 master-0 kubenswrapper[36504]: I1203 22:25:17.388526 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-external-api-0" event={"ID":"dc35c451-c612-46b7-8328-9c8bfa7891e7","Type":"ContainerStarted","Data":"11a26d88261e713a0b7c6cbe9e545c9bc81ce59c9cc3b35e5b72318e70b98f24"} Dec 03 22:25:17.402312 master-0 kubenswrapper[36504]: I1203 22:25:17.402231 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n4ht5" event={"ID":"99d0cdcb-2d58-4530-a135-0e89f8c77065","Type":"ContainerDied","Data":"f60609604a14c53dade4a1414b76d4bd121fd07c6374999a281da3768fcc094f"} Dec 03 22:25:17.402312 master-0 kubenswrapper[36504]: I1203 22:25:17.402310 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f60609604a14c53dade4a1414b76d4bd121fd07c6374999a281da3768fcc094f" Dec 03 22:25:17.504924 master-0 kubenswrapper[36504]: I1203 22:25:17.504876 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:25:17.614345 master-0 kubenswrapper[36504]: I1203 22:25:17.613799 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftpf7\" (UniqueName: \"kubernetes.io/projected/99d0cdcb-2d58-4530-a135-0e89f8c77065-kube-api-access-ftpf7\") pod \"99d0cdcb-2d58-4530-a135-0e89f8c77065\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " Dec 03 22:25:17.614724 master-0 kubenswrapper[36504]: I1203 22:25:17.614447 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-config\") pod \"99d0cdcb-2d58-4530-a135-0e89f8c77065\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " Dec 03 22:25:17.614724 master-0 kubenswrapper[36504]: I1203 22:25:17.614532 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-combined-ca-bundle\") pod \"99d0cdcb-2d58-4530-a135-0e89f8c77065\" (UID: \"99d0cdcb-2d58-4530-a135-0e89f8c77065\") " Dec 03 22:25:17.623318 master-0 kubenswrapper[36504]: I1203 22:25:17.623017 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99d0cdcb-2d58-4530-a135-0e89f8c77065-kube-api-access-ftpf7" (OuterVolumeSpecName: "kube-api-access-ftpf7") pod "99d0cdcb-2d58-4530-a135-0e89f8c77065" (UID: "99d0cdcb-2d58-4530-a135-0e89f8c77065"). InnerVolumeSpecName "kube-api-access-ftpf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:17.696089 master-0 kubenswrapper[36504]: I1203 22:25:17.695999 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-baebb-default-internal-api-0"] Dec 03 22:25:17.719274 master-0 kubenswrapper[36504]: I1203 22:25:17.719207 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftpf7\" (UniqueName: \"kubernetes.io/projected/99d0cdcb-2d58-4530-a135-0e89f8c77065-kube-api-access-ftpf7\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:17.731072 master-0 kubenswrapper[36504]: W1203 22:25:17.730972 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cb8747f_1989_4028_b62a_5bf9ea57f609.slice/crio-26e5f37107d13dae26d6bb143d727e2399393df0bf5e4ea3e311026466f6d9a9 WatchSource:0}: Error finding container 26e5f37107d13dae26d6bb143d727e2399393df0bf5e4ea3e311026466f6d9a9: Status 404 returned error can't find the container with id 26e5f37107d13dae26d6bb143d727e2399393df0bf5e4ea3e311026466f6d9a9 Dec 03 22:25:17.783382 master-0 kubenswrapper[36504]: I1203 22:25:17.783124 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hlr94"] Dec 03 22:25:17.813783 master-0 kubenswrapper[36504]: W1203 22:25:17.813629 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22075cd6_19d1_4b03_8517_a9d678a23e23.slice/crio-c7388b6273997a30edbba35d37275868899dd5c58e29ca685ab0196769a381c6 WatchSource:0}: Error finding container c7388b6273997a30edbba35d37275868899dd5c58e29ca685ab0196769a381c6: Status 404 returned error can't find the container with id c7388b6273997a30edbba35d37275868899dd5c58e29ca685ab0196769a381c6 Dec 03 22:25:17.864636 master-0 kubenswrapper[36504]: I1203 22:25:17.864571 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-config" (OuterVolumeSpecName: "config") pod "99d0cdcb-2d58-4530-a135-0e89f8c77065" (UID: "99d0cdcb-2d58-4530-a135-0e89f8c77065"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:17.865930 master-0 kubenswrapper[36504]: I1203 22:25:17.865734 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99d0cdcb-2d58-4530-a135-0e89f8c77065" (UID: "99d0cdcb-2d58-4530-a135-0e89f8c77065"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:17.907356 master-0 kubenswrapper[36504]: I1203 22:25:17.907284 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-dataplane-os-edpm-s7gzj"] Dec 03 22:25:17.929340 master-0 kubenswrapper[36504]: W1203 22:25:17.929254 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4c57b50_c8ce_423b_9d25_b659e26c03f8.slice/crio-4cc2a7218ef4d430152d156a688ad6b6df852634d54176ccb20094d11162927c WatchSource:0}: Error finding container 4cc2a7218ef4d430152d156a688ad6b6df852634d54176ccb20094d11162927c: Status 404 returned error can't find the container with id 4cc2a7218ef4d430152d156a688ad6b6df852634d54176ccb20094d11162927c Dec 03 22:25:17.933273 master-0 kubenswrapper[36504]: I1203 22:25:17.933216 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:17.933363 master-0 kubenswrapper[36504]: I1203 22:25:17.933280 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99d0cdcb-2d58-4530-a135-0e89f8c77065-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:18.435255 master-0 kubenswrapper[36504]: I1203 22:25:18.435177 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerStarted","Data":"da59aad6ab606c854f6b9ee71b211b268f994002d51927273d1f73b261fb46de"} Dec 03 22:25:18.438417 master-0 kubenswrapper[36504]: I1203 22:25:18.438354 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v4x4d" event={"ID":"3b12b6b8-6e84-4e56-b403-d4d31d309852","Type":"ContainerStarted","Data":"3c4552f29ecea698a744372808b816cad64f4c05790364bc740c3ca6a59ee260"} Dec 03 22:25:18.441082 master-0 kubenswrapper[36504]: I1203 22:25:18.441033 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hghcd" event={"ID":"79800ee3-07d8-43e7-9263-15e8cdeef26d","Type":"ContainerStarted","Data":"f73b6b3ecae967d9bb29ee494c0c19b97d70da3f1232551294416b1005658838"} Dec 03 22:25:18.445786 master-0 kubenswrapper[36504]: I1203 22:25:18.445678 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-external-api-0" event={"ID":"dc35c451-c612-46b7-8328-9c8bfa7891e7","Type":"ContainerStarted","Data":"1134b64cbd728364a1ecbcc5af5ff8ad9874f65a096416a5ff243337bacd7f54"} Dec 03 22:25:18.447891 master-0 kubenswrapper[36504]: I1203 22:25:18.447703 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-internal-api-0" event={"ID":"6cb8747f-1989-4028-b62a-5bf9ea57f609","Type":"ContainerStarted","Data":"26e5f37107d13dae26d6bb143d727e2399393df0bf5e4ea3e311026466f6d9a9"} Dec 03 22:25:18.451594 master-0 kubenswrapper[36504]: I1203 22:25:18.451546 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" event={"ID":"e4c57b50-c8ce-423b-9d25-b659e26c03f8","Type":"ContainerStarted","Data":"4cc2a7218ef4d430152d156a688ad6b6df852634d54176ccb20094d11162927c"} Dec 03 22:25:18.457053 master-0 kubenswrapper[36504]: I1203 22:25:18.457004 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlr94" 
event={"ID":"22075cd6-19d1-4b03-8517-a9d678a23e23","Type":"ContainerStarted","Data":"0b0691b47f7c7349fc28ccbb6511cc23b35ecb7ebcbd369f30bb089fed3c7142"} Dec 03 22:25:18.457053 master-0 kubenswrapper[36504]: I1203 22:25:18.457053 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlr94" event={"ID":"22075cd6-19d1-4b03-8517-a9d678a23e23","Type":"ContainerStarted","Data":"c7388b6273997a30edbba35d37275868899dd5c58e29ca685ab0196769a381c6"} Dec 03 22:25:18.464409 master-0 kubenswrapper[36504]: I1203 22:25:18.462317 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n4ht5" Dec 03 22:25:18.464409 master-0 kubenswrapper[36504]: I1203 22:25:18.462429 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8lw82" event={"ID":"486e7352-51c4-4e75-9ff9-ead9eb721c74","Type":"ContainerStarted","Data":"ed22f8f052b7eb8696fb7d126014ca7ee249f63bbee1339902a7792b160ced4b"} Dec 03 22:25:18.467116 master-0 kubenswrapper[36504]: I1203 22:25:18.466040 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-v4x4d" podStartSLOduration=4.395077679 podStartE2EDuration="31.466013514s" podCreationTimestamp="2025-12-03 22:24:47 +0000 UTC" firstStartedPulling="2025-12-03 22:24:49.713015673 +0000 UTC m=+854.932787680" lastFinishedPulling="2025-12-03 22:25:16.783951508 +0000 UTC m=+882.003723515" observedRunningTime="2025-12-03 22:25:18.461839074 +0000 UTC m=+883.681611101" watchObservedRunningTime="2025-12-03 22:25:18.466013514 +0000 UTC m=+883.685785521" Dec 03 22:25:18.509848 master-0 kubenswrapper[36504]: I1203 22:25:18.509529 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-hghcd" podStartSLOduration=4.1395748900000005 podStartE2EDuration="32.509495712s" podCreationTimestamp="2025-12-03 22:24:46 +0000 UTC" firstStartedPulling="2025-12-03 22:24:48.89553775 +0000 UTC m=+854.115309757" lastFinishedPulling="2025-12-03 22:25:17.265458572 +0000 UTC m=+882.485230579" observedRunningTime="2025-12-03 22:25:18.495268334 +0000 UTC m=+883.715040361" watchObservedRunningTime="2025-12-03 22:25:18.509495712 +0000 UTC m=+883.729267739" Dec 03 22:25:18.537523 master-0 kubenswrapper[36504]: I1203 22:25:18.535990 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hlr94" podStartSLOduration=21.535944552 podStartE2EDuration="21.535944552s" podCreationTimestamp="2025-12-03 22:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:18.518398902 +0000 UTC m=+883.738170909" watchObservedRunningTime="2025-12-03 22:25:18.535944552 +0000 UTC m=+883.755716559" Dec 03 22:25:18.604917 master-0 kubenswrapper[36504]: I1203 22:25:18.604452 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8lw82" podStartSLOduration=4.094590206 podStartE2EDuration="31.604426465s" podCreationTimestamp="2025-12-03 22:24:47 +0000 UTC" firstStartedPulling="2025-12-03 22:24:49.74440927 +0000 UTC m=+854.964181277" lastFinishedPulling="2025-12-03 22:25:17.254245529 +0000 UTC m=+882.474017536" observedRunningTime="2025-12-03 22:25:18.545915156 +0000 UTC m=+883.765687183" watchObservedRunningTime="2025-12-03 22:25:18.604426465 +0000 UTC m=+883.824198472" Dec 03 22:25:19.042932 master-0 kubenswrapper[36504]: I1203 22:25:19.023833 36504 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b7f767947-jf44q"] Dec 03 22:25:19.045190 master-0 kubenswrapper[36504]: E1203 22:25:19.035666 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="init" Dec 03 22:25:19.045190 master-0 kubenswrapper[36504]: I1203 22:25:19.045126 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="init" Dec 03 22:25:19.061102 master-0 kubenswrapper[36504]: E1203 22:25:19.059850 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99d0cdcb-2d58-4530-a135-0e89f8c77065" containerName="neutron-db-sync" Dec 03 22:25:19.061102 master-0 kubenswrapper[36504]: I1203 22:25:19.059926 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="99d0cdcb-2d58-4530-a135-0e89f8c77065" containerName="neutron-db-sync" Dec 03 22:25:19.061102 master-0 kubenswrapper[36504]: E1203 22:25:19.059955 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="dnsmasq-dns" Dec 03 22:25:19.061102 master-0 kubenswrapper[36504]: I1203 22:25:19.060160 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="dnsmasq-dns" Dec 03 22:25:19.063353 master-0 kubenswrapper[36504]: I1203 22:25:19.061920 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="99d0cdcb-2d58-4530-a135-0e89f8c77065" containerName="neutron-db-sync" Dec 03 22:25:19.063353 master-0 kubenswrapper[36504]: I1203 22:25:19.062047 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="dnsmasq-dns" Dec 03 22:25:19.065458 master-0 kubenswrapper[36504]: I1203 22:25:19.065402 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.185832 master-0 kubenswrapper[36504]: I1203 22:25:19.180964 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-config\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.185832 master-0 kubenswrapper[36504]: I1203 22:25:19.181064 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-nb\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.185832 master-0 kubenswrapper[36504]: I1203 22:25:19.181110 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-edpm\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.185832 master-0 kubenswrapper[36504]: I1203 22:25:19.181147 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rflp\" (UniqueName: \"kubernetes.io/projected/4214a09f-9cb8-4d17-9a41-369838f0eedc-kube-api-access-7rflp\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.185832 master-0 kubenswrapper[36504]: I1203 22:25:19.181244 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-svc\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.185832 master-0 kubenswrapper[36504]: I1203 22:25:19.181294 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-sb\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.185832 master-0 kubenswrapper[36504]: I1203 22:25:19.181806 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-swift-storage-0\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.195850 master-0 kubenswrapper[36504]: I1203 22:25:19.195572 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b7f767947-jf44q"] Dec 03 22:25:19.244134 master-0 kubenswrapper[36504]: I1203 22:25:19.244035 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-64488669cb-b8cqj"] Dec 03 22:25:19.247570 master-0 kubenswrapper[36504]: I1203 22:25:19.247489 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.251645 master-0 kubenswrapper[36504]: I1203 22:25:19.251061 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 03 22:25:19.265278 master-0 kubenswrapper[36504]: I1203 22:25:19.265209 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64488669cb-b8cqj"] Dec 03 22:25:19.275838 master-0 kubenswrapper[36504]: I1203 22:25:19.275760 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 22:25:19.281364 master-0 kubenswrapper[36504]: I1203 22:25:19.281300 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 22:25:19.285225 master-0 kubenswrapper[36504]: I1203 22:25:19.284410 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-config\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.285365 master-0 kubenswrapper[36504]: I1203 22:25:19.285296 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-nb\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.285885 master-0 kubenswrapper[36504]: I1203 22:25:19.285826 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-edpm\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.285885 master-0 kubenswrapper[36504]: I1203 22:25:19.285865 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rflp\" (UniqueName: \"kubernetes.io/projected/4214a09f-9cb8-4d17-9a41-369838f0eedc-kube-api-access-7rflp\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.287320 master-0 kubenswrapper[36504]: I1203 22:25:19.285959 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-svc\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.287320 master-0 kubenswrapper[36504]: I1203 22:25:19.286008 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-sb\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.287320 master-0 kubenswrapper[36504]: I1203 22:25:19.286206 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-nb\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 
22:25:19.287320 master-0 kubenswrapper[36504]: I1203 22:25:19.285524 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-config\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.287320 master-0 kubenswrapper[36504]: I1203 22:25:19.286680 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-swift-storage-0\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.287320 master-0 kubenswrapper[36504]: I1203 22:25:19.286894 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-edpm\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.288096 master-0 kubenswrapper[36504]: I1203 22:25:19.287917 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-sb\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.293977 master-0 kubenswrapper[36504]: I1203 22:25:19.292035 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-swift-storage-0\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.302945 master-0 kubenswrapper[36504]: I1203 22:25:19.302884 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-svc\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.311192 master-0 kubenswrapper[36504]: I1203 22:25:19.311137 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rflp\" (UniqueName: \"kubernetes.io/projected/4214a09f-9cb8-4d17-9a41-369838f0eedc-kube-api-access-7rflp\") pod \"dnsmasq-dns-b7f767947-jf44q\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.420801 master-0 kubenswrapper[36504]: I1203 22:25:19.403842 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjcff\" (UniqueName: \"kubernetes.io/projected/2bf27d66-22d6-4ae2-8b10-41eba3e48294-kube-api-access-bjcff\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.420801 master-0 kubenswrapper[36504]: I1203 22:25:19.403978 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-ovndb-tls-certs\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " 
pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.420801 master-0 kubenswrapper[36504]: I1203 22:25:19.404098 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-combined-ca-bundle\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.420801 master-0 kubenswrapper[36504]: I1203 22:25:19.404235 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-httpd-config\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.420801 master-0 kubenswrapper[36504]: I1203 22:25:19.404292 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-config\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.451699 master-0 kubenswrapper[36504]: I1203 22:25:19.448201 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:19.509481 master-0 kubenswrapper[36504]: I1203 22:25:19.507431 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjcff\" (UniqueName: \"kubernetes.io/projected/2bf27d66-22d6-4ae2-8b10-41eba3e48294-kube-api-access-bjcff\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.509481 master-0 kubenswrapper[36504]: I1203 22:25:19.507518 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-ovndb-tls-certs\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.509481 master-0 kubenswrapper[36504]: I1203 22:25:19.507604 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-combined-ca-bundle\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.509481 master-0 kubenswrapper[36504]: I1203 22:25:19.507712 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-httpd-config\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.509481 master-0 kubenswrapper[36504]: I1203 22:25:19.507759 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-config\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.530800 master-0 kubenswrapper[36504]: I1203 22:25:19.527808 36504 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-ovndb-tls-certs\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.530800 master-0 kubenswrapper[36504]: I1203 22:25:19.528336 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-config\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.530800 master-0 kubenswrapper[36504]: I1203 22:25:19.530727 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-combined-ca-bundle\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.538794 master-0 kubenswrapper[36504]: I1203 22:25:19.537484 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ff645d44c-2ndhp" podUID="c33cc971-6403-48df-9951-a01a2e1c92e1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.204:5353: i/o timeout" Dec 03 22:25:19.547892 master-0 kubenswrapper[36504]: I1203 22:25:19.543657 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-httpd-config\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.571798 master-0 kubenswrapper[36504]: I1203 22:25:19.564369 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjcff\" (UniqueName: \"kubernetes.io/projected/2bf27d66-22d6-4ae2-8b10-41eba3e48294-kube-api-access-bjcff\") pod \"neutron-64488669cb-b8cqj\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.601804 master-0 kubenswrapper[36504]: I1203 22:25:19.586473 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-external-api-0" event={"ID":"dc35c451-c612-46b7-8328-9c8bfa7891e7","Type":"ContainerStarted","Data":"3a0cd1e926d33bb474ede50ac54ad7dac595519714f4608d71bc2d26bdc7afa9"} Dec 03 22:25:19.617798 master-0 kubenswrapper[36504]: I1203 22:25:19.612185 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-db-sync-wq9c6" event={"ID":"c4f03542-de5f-457c-8843-01ea5d2febce","Type":"ContainerStarted","Data":"33176e2d1fb664c8b39f0ac856d7f6bdbdc4e6ffadca38e31d0edd73a918f866"} Dec 03 22:25:19.701800 master-0 kubenswrapper[36504]: I1203 22:25:19.681898 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-internal-api-0" event={"ID":"6cb8747f-1989-4028-b62a-5bf9ea57f609","Type":"ContainerStarted","Data":"c28adb8871b8d1df40a0ebc7d67cde64a957e8e8087902237524a1d002d90584"} Dec 03 22:25:19.701800 master-0 kubenswrapper[36504]: I1203 22:25:19.697896 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-baebb-default-external-api-0" podStartSLOduration=29.697859261 podStartE2EDuration="29.697859261s" podCreationTimestamp="2025-12-03 22:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 22:25:19.638318839 +0000 UTC m=+884.858090846" watchObservedRunningTime="2025-12-03 22:25:19.697859261 +0000 UTC m=+884.917631258" Dec 03 22:25:19.701800 master-0 kubenswrapper[36504]: I1203 22:25:19.699588 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:19.747074 master-0 kubenswrapper[36504]: I1203 22:25:19.741603 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-baebb-db-sync-wq9c6" podStartSLOduration=5.811565261 podStartE2EDuration="33.741552745s" podCreationTimestamp="2025-12-03 22:24:46 +0000 UTC" firstStartedPulling="2025-12-03 22:24:49.391192449 +0000 UTC m=+854.610964456" lastFinishedPulling="2025-12-03 22:25:17.321179933 +0000 UTC m=+882.540951940" observedRunningTime="2025-12-03 22:25:19.709166836 +0000 UTC m=+884.928938843" watchObservedRunningTime="2025-12-03 22:25:19.741552745 +0000 UTC m=+884.961324752" Dec 03 22:25:20.443958 master-0 kubenswrapper[36504]: I1203 22:25:20.439843 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b7f767947-jf44q"] Dec 03 22:25:20.694401 master-0 kubenswrapper[36504]: I1203 22:25:20.694274 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-internal-api-0" event={"ID":"6cb8747f-1989-4028-b62a-5bf9ea57f609","Type":"ContainerStarted","Data":"41577981d3efce579ffaefd5b26dba879a677a4785d74d6355a99e12e822d046"} Dec 03 22:25:20.791874 master-0 kubenswrapper[36504]: I1203 22:25:20.791721 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-baebb-default-internal-api-0" podStartSLOduration=30.791685629 podStartE2EDuration="30.791685629s" podCreationTimestamp="2025-12-03 22:24:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:20.766918571 +0000 UTC m=+885.986690598" watchObservedRunningTime="2025-12-03 22:25:20.791685629 +0000 UTC m=+886.011457626" Dec 03 22:25:20.820664 master-0 kubenswrapper[36504]: I1203 22:25:20.820587 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-64488669cb-b8cqj"] Dec 03 22:25:21.107230 master-0 kubenswrapper[36504]: W1203 22:25:21.107091 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bf27d66_22d6_4ae2_8b10_41eba3e48294.slice/crio-e217d04d35e47042a150331523c569b66b183fb96f40b9d4bca7018443d2d386 WatchSource:0}: Error finding container e217d04d35e47042a150331523c569b66b183fb96f40b9d4bca7018443d2d386: Status 404 returned error can't find the container with id e217d04d35e47042a150331523c569b66b183fb96f40b9d4bca7018443d2d386 Dec 03 22:25:21.719095 master-0 kubenswrapper[36504]: I1203 22:25:21.719002 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7f767947-jf44q" event={"ID":"4214a09f-9cb8-4d17-9a41-369838f0eedc","Type":"ContainerStarted","Data":"e0dc1317629462445b8dc30902570d0d46ced1d896ed967cd3b8f1628525eb34"} Dec 03 22:25:21.725594 master-0 kubenswrapper[36504]: I1203 22:25:21.724576 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64488669cb-b8cqj" event={"ID":"2bf27d66-22d6-4ae2-8b10-41eba3e48294","Type":"ContainerStarted","Data":"e217d04d35e47042a150331523c569b66b183fb96f40b9d4bca7018443d2d386"} Dec 03 22:25:22.774453 master-0 kubenswrapper[36504]: I1203 22:25:22.774357 
36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerStarted","Data":"ccebe6789c8d277e28fc5b0704007c0ded605cc5b201be05d197ca94a05e11a5"} Dec 03 22:25:22.783431 master-0 kubenswrapper[36504]: I1203 22:25:22.783335 36504 generic.go:334] "Generic (PLEG): container finished" podID="4214a09f-9cb8-4d17-9a41-369838f0eedc" containerID="8e1cbbcd94b5bd5b44674846b4116d0c3e8c98bd508594f298ee41cbbd08e75b" exitCode=0 Dec 03 22:25:22.783976 master-0 kubenswrapper[36504]: I1203 22:25:22.783934 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7f767947-jf44q" event={"ID":"4214a09f-9cb8-4d17-9a41-369838f0eedc","Type":"ContainerDied","Data":"8e1cbbcd94b5bd5b44674846b4116d0c3e8c98bd508594f298ee41cbbd08e75b"} Dec 03 22:25:22.797178 master-0 kubenswrapper[36504]: I1203 22:25:22.795980 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64488669cb-b8cqj" event={"ID":"2bf27d66-22d6-4ae2-8b10-41eba3e48294","Type":"ContainerStarted","Data":"92b857429875a99f53e495dd31dfab68a737dd9585a62fb08ccbd9d6dff299e2"} Dec 03 22:25:22.828668 master-0 kubenswrapper[36504]: I1203 22:25:22.828565 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56f6c98b97-zjmbp"] Dec 03 22:25:22.844394 master-0 kubenswrapper[36504]: I1203 22:25:22.843868 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:22.852631 master-0 kubenswrapper[36504]: I1203 22:25:22.851040 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 03 22:25:22.852631 master-0 kubenswrapper[36504]: I1203 22:25:22.851288 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 03 22:25:22.909812 master-0 kubenswrapper[36504]: I1203 22:25:22.906848 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56f6c98b97-zjmbp"] Dec 03 22:25:22.942692 master-0 kubenswrapper[36504]: I1203 22:25:22.942594 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-public-tls-certs\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:22.942692 master-0 kubenswrapper[36504]: I1203 22:25:22.942695 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-httpd-config\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:22.943073 master-0 kubenswrapper[36504]: I1203 22:25:22.942724 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25cc6\" (UniqueName: \"kubernetes.io/projected/32f345c1-f0c4-48a5-b3ab-da35ac924812-kube-api-access-25cc6\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:22.943073 master-0 kubenswrapper[36504]: I1203 22:25:22.942763 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-config\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:22.943073 master-0 kubenswrapper[36504]: I1203 22:25:22.942909 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-combined-ca-bundle\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:22.943073 master-0 kubenswrapper[36504]: I1203 22:25:22.942955 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-internal-tls-certs\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:22.943210 master-0 kubenswrapper[36504]: I1203 22:25:22.943072 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-ovndb-tls-certs\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.035183 master-0 kubenswrapper[36504]: I1203 22:25:23.031795 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:25:23.035183 master-0 kubenswrapper[36504]: I1203 22:25:23.031884 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:25:23.035183 master-0 kubenswrapper[36504]: I1203 22:25:23.031899 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:25:23.035183 master-0 kubenswrapper[36504]: I1203 22:25:23.031910 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:25:23.046698 master-0 kubenswrapper[36504]: I1203 22:25:23.046608 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-public-tls-certs\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.046963 master-0 kubenswrapper[36504]: I1203 22:25:23.046763 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-httpd-config\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.046963 master-0 kubenswrapper[36504]: I1203 22:25:23.046864 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25cc6\" (UniqueName: \"kubernetes.io/projected/32f345c1-f0c4-48a5-b3ab-da35ac924812-kube-api-access-25cc6\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.046963 master-0 kubenswrapper[36504]: I1203 22:25:23.046945 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-config\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.047808 master-0 kubenswrapper[36504]: I1203 22:25:23.047132 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-combined-ca-bundle\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.047808 master-0 kubenswrapper[36504]: I1203 22:25:23.047254 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-internal-tls-certs\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.047808 master-0 kubenswrapper[36504]: I1203 22:25:23.047428 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-ovndb-tls-certs\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.056853 master-0 kubenswrapper[36504]: I1203 22:25:23.056794 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-config\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.060739 master-0 kubenswrapper[36504]: I1203 22:25:23.060677 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-public-tls-certs\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.061021 master-0 kubenswrapper[36504]: I1203 22:25:23.060871 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-httpd-config\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.062537 master-0 kubenswrapper[36504]: I1203 22:25:23.062503 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-internal-tls-certs\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.064840 master-0 kubenswrapper[36504]: I1203 22:25:23.064764 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-ovndb-tls-certs\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.072256 master-0 kubenswrapper[36504]: I1203 22:25:23.072200 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/32f345c1-f0c4-48a5-b3ab-da35ac924812-combined-ca-bundle\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.077231 master-0 kubenswrapper[36504]: I1203 22:25:23.077183 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25cc6\" (UniqueName: \"kubernetes.io/projected/32f345c1-f0c4-48a5-b3ab-da35ac924812-kube-api-access-25cc6\") pod \"neutron-56f6c98b97-zjmbp\" (UID: \"32f345c1-f0c4-48a5-b3ab-da35ac924812\") " pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.089109 master-0 kubenswrapper[36504]: I1203 22:25:23.089038 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:25:23.118578 master-0 kubenswrapper[36504]: I1203 22:25:23.118429 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:25:23.180852 master-0 kubenswrapper[36504]: I1203 22:25:23.180789 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:23.841599 master-0 kubenswrapper[36504]: I1203 22:25:23.841174 36504 generic.go:334] "Generic (PLEG): container finished" podID="3b12b6b8-6e84-4e56-b403-d4d31d309852" containerID="3c4552f29ecea698a744372808b816cad64f4c05790364bc740c3ca6a59ee260" exitCode=0 Dec 03 22:25:23.841599 master-0 kubenswrapper[36504]: I1203 22:25:23.841380 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v4x4d" event={"ID":"3b12b6b8-6e84-4e56-b403-d4d31d309852","Type":"ContainerDied","Data":"3c4552f29ecea698a744372808b816cad64f4c05790364bc740c3ca6a59ee260"} Dec 03 22:25:23.845222 master-0 kubenswrapper[36504]: I1203 22:25:23.845141 36504 generic.go:334] "Generic (PLEG): container finished" podID="22075cd6-19d1-4b03-8517-a9d678a23e23" containerID="0b0691b47f7c7349fc28ccbb6511cc23b35ecb7ebcbd369f30bb089fed3c7142" exitCode=0 Dec 03 22:25:23.845345 master-0 kubenswrapper[36504]: I1203 22:25:23.845246 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlr94" event={"ID":"22075cd6-19d1-4b03-8517-a9d678a23e23","Type":"ContainerDied","Data":"0b0691b47f7c7349fc28ccbb6511cc23b35ecb7ebcbd369f30bb089fed3c7142"} Dec 03 22:25:24.895609 master-0 kubenswrapper[36504]: I1203 22:25:24.895499 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7f767947-jf44q" event={"ID":"4214a09f-9cb8-4d17-9a41-369838f0eedc","Type":"ContainerStarted","Data":"320c354237b8f5025d220d84753f9b6d65c75e6860e9b6ff615aed0d058e959d"} Dec 03 22:25:24.896946 master-0 kubenswrapper[36504]: I1203 22:25:24.895961 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:24.911712 master-0 kubenswrapper[36504]: I1203 22:25:24.910599 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64488669cb-b8cqj" event={"ID":"2bf27d66-22d6-4ae2-8b10-41eba3e48294","Type":"ContainerStarted","Data":"fe175fa8193f4ae74569fd34bf60c3a220d9fe564ee1c661e695748d46ef3a36"} Dec 03 22:25:24.911712 master-0 kubenswrapper[36504]: I1203 22:25:24.910834 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:24.953908 master-0 kubenswrapper[36504]: I1203 22:25:24.953727 36504 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-56f6c98b97-zjmbp"] Dec 03 22:25:25.017827 master-0 kubenswrapper[36504]: I1203 22:25:25.013492 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b7f767947-jf44q" podStartSLOduration=7.013457667 podStartE2EDuration="7.013457667s" podCreationTimestamp="2025-12-03 22:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:24.995901056 +0000 UTC m=+890.215673063" watchObservedRunningTime="2025-12-03 22:25:25.013457667 +0000 UTC m=+890.233229674" Dec 03 22:25:25.043535 master-0 kubenswrapper[36504]: I1203 22:25:25.043366 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-64488669cb-b8cqj" podStartSLOduration=6.043322046 podStartE2EDuration="6.043322046s" podCreationTimestamp="2025-12-03 22:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:25.03294388 +0000 UTC m=+890.252715887" watchObservedRunningTime="2025-12-03 22:25:25.043322046 +0000 UTC m=+890.263094053" Dec 03 22:25:25.065690 master-0 kubenswrapper[36504]: I1203 22:25:25.065571 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:25:25.065690 master-0 kubenswrapper[36504]: I1203 22:25:25.065676 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:25:25.072029 master-0 kubenswrapper[36504]: I1203 22:25:25.071675 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:25:25.072029 master-0 kubenswrapper[36504]: I1203 22:25:25.071805 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:25:25.196864 master-0 kubenswrapper[36504]: I1203 22:25:25.193935 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:25:25.196864 master-0 kubenswrapper[36504]: I1203 22:25:25.194044 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:25:25.507691 master-0 kubenswrapper[36504]: I1203 22:25:25.507138 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-v4x4d" Dec 03 22:25:25.637246 master-0 kubenswrapper[36504]: I1203 22:25:25.637062 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:25:25.641369 master-0 kubenswrapper[36504]: I1203 22:25:25.641273 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b12b6b8-6e84-4e56-b403-d4d31d309852-logs\") pod \"3b12b6b8-6e84-4e56-b403-d4d31d309852\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " Dec 03 22:25:25.643009 master-0 kubenswrapper[36504]: I1203 22:25:25.642634 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-scripts\") pod \"3b12b6b8-6e84-4e56-b403-d4d31d309852\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " Dec 03 22:25:25.643009 master-0 kubenswrapper[36504]: I1203 22:25:25.642821 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-combined-ca-bundle\") pod \"3b12b6b8-6e84-4e56-b403-d4d31d309852\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " Dec 03 22:25:25.643435 master-0 kubenswrapper[36504]: I1203 22:25:25.643196 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rth2j\" (UniqueName: \"kubernetes.io/projected/3b12b6b8-6e84-4e56-b403-d4d31d309852-kube-api-access-rth2j\") pod \"3b12b6b8-6e84-4e56-b403-d4d31d309852\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " Dec 03 22:25:25.643435 master-0 kubenswrapper[36504]: I1203 22:25:25.643333 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-config-data\") pod \"3b12b6b8-6e84-4e56-b403-d4d31d309852\" (UID: \"3b12b6b8-6e84-4e56-b403-d4d31d309852\") " Dec 03 22:25:25.648948 master-0 kubenswrapper[36504]: I1203 22:25:25.648838 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b12b6b8-6e84-4e56-b403-d4d31d309852-logs" (OuterVolumeSpecName: "logs") pod "3b12b6b8-6e84-4e56-b403-d4d31d309852" (UID: "3b12b6b8-6e84-4e56-b403-d4d31d309852"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:25:25.651403 master-0 kubenswrapper[36504]: I1203 22:25:25.651286 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b12b6b8-6e84-4e56-b403-d4d31d309852-kube-api-access-rth2j" (OuterVolumeSpecName: "kube-api-access-rth2j") pod "3b12b6b8-6e84-4e56-b403-d4d31d309852" (UID: "3b12b6b8-6e84-4e56-b403-d4d31d309852"). InnerVolumeSpecName "kube-api-access-rth2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:25.655243 master-0 kubenswrapper[36504]: I1203 22:25:25.655079 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-scripts" (OuterVolumeSpecName: "scripts") pod "3b12b6b8-6e84-4e56-b403-d4d31d309852" (UID: "3b12b6b8-6e84-4e56-b403-d4d31d309852"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:25.686285 master-0 kubenswrapper[36504]: I1203 22:25:25.686074 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-config-data" (OuterVolumeSpecName: "config-data") pod "3b12b6b8-6e84-4e56-b403-d4d31d309852" (UID: "3b12b6b8-6e84-4e56-b403-d4d31d309852"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:25.727016 master-0 kubenswrapper[36504]: I1203 22:25:25.713941 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b12b6b8-6e84-4e56-b403-d4d31d309852" (UID: "3b12b6b8-6e84-4e56-b403-d4d31d309852"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:25.746475 master-0 kubenswrapper[36504]: I1203 22:25:25.746402 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-scripts\") pod \"22075cd6-19d1-4b03-8517-a9d678a23e23\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " Dec 03 22:25:25.746973 master-0 kubenswrapper[36504]: I1203 22:25:25.746616 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-credential-keys\") pod \"22075cd6-19d1-4b03-8517-a9d678a23e23\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " Dec 03 22:25:25.746973 master-0 kubenswrapper[36504]: I1203 22:25:25.746936 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-fernet-keys\") pod \"22075cd6-19d1-4b03-8517-a9d678a23e23\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " Dec 03 22:25:25.747055 master-0 kubenswrapper[36504]: I1203 22:25:25.747011 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5fcq\" (UniqueName: \"kubernetes.io/projected/22075cd6-19d1-4b03-8517-a9d678a23e23-kube-api-access-h5fcq\") pod \"22075cd6-19d1-4b03-8517-a9d678a23e23\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " Dec 03 22:25:25.747141 master-0 kubenswrapper[36504]: I1203 22:25:25.747109 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-combined-ca-bundle\") pod \"22075cd6-19d1-4b03-8517-a9d678a23e23\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " Dec 03 22:25:25.747245 master-0 kubenswrapper[36504]: I1203 22:25:25.747215 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-config-data\") pod \"22075cd6-19d1-4b03-8517-a9d678a23e23\" (UID: \"22075cd6-19d1-4b03-8517-a9d678a23e23\") " Dec 03 22:25:25.759097 master-0 kubenswrapper[36504]: I1203 22:25:25.758991 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "22075cd6-19d1-4b03-8517-a9d678a23e23" (UID: "22075cd6-19d1-4b03-8517-a9d678a23e23"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:25.759097 master-0 kubenswrapper[36504]: I1203 22:25:25.759052 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-scripts" (OuterVolumeSpecName: "scripts") pod "22075cd6-19d1-4b03-8517-a9d678a23e23" (UID: "22075cd6-19d1-4b03-8517-a9d678a23e23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:25.760874 master-0 kubenswrapper[36504]: I1203 22:25:25.760837 36504 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-fernet-keys\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.760874 master-0 kubenswrapper[36504]: I1203 22:25:25.760874 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b12b6b8-6e84-4e56-b403-d4d31d309852-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.760984 master-0 kubenswrapper[36504]: I1203 22:25:25.760888 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.760984 master-0 kubenswrapper[36504]: I1203 22:25:25.760900 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.760984 master-0 kubenswrapper[36504]: I1203 22:25:25.760914 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.760984 master-0 kubenswrapper[36504]: I1203 22:25:25.760931 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rth2j\" (UniqueName: \"kubernetes.io/projected/3b12b6b8-6e84-4e56-b403-d4d31d309852-kube-api-access-rth2j\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.760984 master-0 kubenswrapper[36504]: I1203 22:25:25.760945 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b12b6b8-6e84-4e56-b403-d4d31d309852-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.762019 master-0 kubenswrapper[36504]: I1203 22:25:25.761960 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22075cd6-19d1-4b03-8517-a9d678a23e23-kube-api-access-h5fcq" (OuterVolumeSpecName: "kube-api-access-h5fcq") pod "22075cd6-19d1-4b03-8517-a9d678a23e23" (UID: "22075cd6-19d1-4b03-8517-a9d678a23e23"). InnerVolumeSpecName "kube-api-access-h5fcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:25.762911 master-0 kubenswrapper[36504]: I1203 22:25:25.762870 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "22075cd6-19d1-4b03-8517-a9d678a23e23" (UID: "22075cd6-19d1-4b03-8517-a9d678a23e23"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:25.792950 master-0 kubenswrapper[36504]: I1203 22:25:25.792885 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22075cd6-19d1-4b03-8517-a9d678a23e23" (UID: "22075cd6-19d1-4b03-8517-a9d678a23e23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:25.806874 master-0 kubenswrapper[36504]: I1203 22:25:25.806794 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-config-data" (OuterVolumeSpecName: "config-data") pod "22075cd6-19d1-4b03-8517-a9d678a23e23" (UID: "22075cd6-19d1-4b03-8517-a9d678a23e23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:25.864381 master-0 kubenswrapper[36504]: I1203 22:25:25.864327 36504 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-credential-keys\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.865505 master-0 kubenswrapper[36504]: I1203 22:25:25.865438 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5fcq\" (UniqueName: \"kubernetes.io/projected/22075cd6-19d1-4b03-8517-a9d678a23e23-kube-api-access-h5fcq\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.865505 master-0 kubenswrapper[36504]: I1203 22:25:25.865463 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.865505 master-0 kubenswrapper[36504]: I1203 22:25:25.865474 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22075cd6-19d1-4b03-8517-a9d678a23e23-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:25.947553 master-0 kubenswrapper[36504]: I1203 22:25:25.947409 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-v4x4d" Dec 03 22:25:25.947553 master-0 kubenswrapper[36504]: I1203 22:25:25.947507 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-v4x4d" event={"ID":"3b12b6b8-6e84-4e56-b403-d4d31d309852","Type":"ContainerDied","Data":"68845500dd34cb983cb332a073c0df96de1ea58e0dec7d8a03145e09aa990a9c"} Dec 03 22:25:25.948428 master-0 kubenswrapper[36504]: I1203 22:25:25.947578 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68845500dd34cb983cb332a073c0df96de1ea58e0dec7d8a03145e09aa990a9c" Dec 03 22:25:25.953562 master-0 kubenswrapper[36504]: I1203 22:25:25.953501 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f6c98b97-zjmbp" event={"ID":"32f345c1-f0c4-48a5-b3ab-da35ac924812","Type":"ContainerStarted","Data":"838296320b1ede918bb7b665addfd715cd06b506c4069667dc3de1cf16ae6063"} Dec 03 22:25:25.953799 master-0 kubenswrapper[36504]: I1203 22:25:25.953715 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f6c98b97-zjmbp" event={"ID":"32f345c1-f0c4-48a5-b3ab-da35ac924812","Type":"ContainerStarted","Data":"a15f8ca4293cd97fdaf27a255a62262c5358fedd0f009e260d14b41d31ca6cec"} Dec 03 22:25:25.961953 master-0 kubenswrapper[36504]: I1203 22:25:25.959828 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hlr94" event={"ID":"22075cd6-19d1-4b03-8517-a9d678a23e23","Type":"ContainerDied","Data":"c7388b6273997a30edbba35d37275868899dd5c58e29ca685ab0196769a381c6"} Dec 03 22:25:25.961953 master-0 kubenswrapper[36504]: I1203 22:25:25.959936 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7388b6273997a30edbba35d37275868899dd5c58e29ca685ab0196769a381c6" Dec 03 22:25:25.961953 master-0 kubenswrapper[36504]: I1203 22:25:25.960069 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hlr94" Dec 03 22:25:26.409824 master-0 kubenswrapper[36504]: I1203 22:25:26.409697 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:25:26.410148 master-0 kubenswrapper[36504]: I1203 22:25:26.409912 36504 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:25:26.486114 master-0 kubenswrapper[36504]: I1203 22:25:26.486050 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:25:26.998026 master-0 kubenswrapper[36504]: I1203 22:25:26.997952 36504 generic.go:334] "Generic (PLEG): container finished" podID="486e7352-51c4-4e75-9ff9-ead9eb721c74" containerID="ed22f8f052b7eb8696fb7d126014ca7ee249f63bbee1339902a7792b160ced4b" exitCode=0 Dec 03 22:25:26.998736 master-0 kubenswrapper[36504]: I1203 22:25:26.998049 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8lw82" event={"ID":"486e7352-51c4-4e75-9ff9-ead9eb721c74","Type":"ContainerDied","Data":"ed22f8f052b7eb8696fb7d126014ca7ee249f63bbee1339902a7792b160ced4b"} Dec 03 22:25:28.412872 master-0 kubenswrapper[36504]: I1203 22:25:28.412793 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:25:28.414029 master-0 kubenswrapper[36504]: I1203 22:25:28.413969 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:25:29.451661 master-0 kubenswrapper[36504]: I1203 22:25:29.450698 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:29.685704 master-0 kubenswrapper[36504]: I1203 22:25:29.685603 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6bf95cb78d-jt2z5"] Dec 03 22:25:29.686642 master-0 kubenswrapper[36504]: E1203 22:25:29.686487 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b12b6b8-6e84-4e56-b403-d4d31d309852" containerName="placement-db-sync" Dec 03 22:25:29.686642 master-0 kubenswrapper[36504]: I1203 22:25:29.686521 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b12b6b8-6e84-4e56-b403-d4d31d309852" containerName="placement-db-sync" Dec 03 22:25:29.688576 master-0 kubenswrapper[36504]: E1203 22:25:29.687017 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22075cd6-19d1-4b03-8517-a9d678a23e23" containerName="keystone-bootstrap" Dec 03 22:25:29.688576 master-0 kubenswrapper[36504]: I1203 22:25:29.687042 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="22075cd6-19d1-4b03-8517-a9d678a23e23" containerName="keystone-bootstrap" Dec 03 22:25:29.688576 master-0 kubenswrapper[36504]: I1203 22:25:29.688030 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="22075cd6-19d1-4b03-8517-a9d678a23e23" containerName="keystone-bootstrap" Dec 03 22:25:29.688576 master-0 kubenswrapper[36504]: I1203 22:25:29.688081 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b12b6b8-6e84-4e56-b403-d4d31d309852" containerName="placement-db-sync" Dec 03 22:25:29.740891 master-0 kubenswrapper[36504]: I1203 22:25:29.740658 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7864d79866-k7zqf"] Dec 03 22:25:29.753002 master-0 kubenswrapper[36504]: I1203 22:25:29.751162 36504 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.790075 master-0 kubenswrapper[36504]: I1203 22:25:29.788523 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 22:25:29.790075 master-0 kubenswrapper[36504]: I1203 22:25:29.788808 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 22:25:29.790075 master-0 kubenswrapper[36504]: I1203 22:25:29.789211 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 22:25:29.790075 master-0 kubenswrapper[36504]: I1203 22:25:29.789532 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 22:25:29.793606 master-0 kubenswrapper[36504]: I1203 22:25:29.790740 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6bf95cb78d-jt2z5"] Dec 03 22:25:29.793606 master-0 kubenswrapper[36504]: I1203 22:25:29.790970 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.815809 master-0 kubenswrapper[36504]: I1203 22:25:29.797812 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 22:25:29.815809 master-0 kubenswrapper[36504]: I1203 22:25:29.798197 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 22:25:29.815809 master-0 kubenswrapper[36504]: I1203 22:25:29.798372 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 22:25:29.823796 master-0 kubenswrapper[36504]: I1203 22:25:29.819194 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 22:25:29.823796 master-0 kubenswrapper[36504]: I1203 22:25:29.819592 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.868649 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhhl\" (UniqueName: \"kubernetes.io/projected/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-kube-api-access-9hhhl\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.868819 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-config-data\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.868945 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-internal-tls-certs\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.869002 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-credential-keys\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.869145 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-combined-ca-bundle\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.869179 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-logs\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.869303 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-combined-ca-bundle\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.869332 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-public-tls-certs\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.869384 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm25c\" (UniqueName: \"kubernetes.io/projected/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-kube-api-access-cm25c\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.869413 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-config-data\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.869493 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7864d79866-k7zqf"] Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.869632 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-scripts\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.871907 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-public-tls-certs\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.872176 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-internal-tls-certs\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.872262 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-fernet-keys\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.875620 master-0 kubenswrapper[36504]: I1203 22:25:29.872345 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-scripts\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.983745 master-0 kubenswrapper[36504]: I1203 22:25:29.983315 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-scripts\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.983745 master-0 kubenswrapper[36504]: I1203 22:25:29.983421 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-public-tls-certs\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.983745 master-0 kubenswrapper[36504]: I1203 22:25:29.983609 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-internal-tls-certs\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:29.983745 master-0 kubenswrapper[36504]: I1203 22:25:29.983701 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-fernet-keys\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:29.985543 master-0 kubenswrapper[36504]: I1203 22:25:29.983811 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-scripts\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.007138 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9hhhl\" (UniqueName: \"kubernetes.io/projected/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-kube-api-access-9hhhl\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.007322 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-config-data\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.007466 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-internal-tls-certs\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.007532 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-credential-keys\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.007750 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-combined-ca-bundle\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.007810 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-logs\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.007984 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-combined-ca-bundle\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.008015 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-public-tls-certs\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.008067 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm25c\" (UniqueName: \"kubernetes.io/projected/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-kube-api-access-cm25c\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.013599 master-0 kubenswrapper[36504]: I1203 22:25:30.008100 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-config-data\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.015308 master-0 kubenswrapper[36504]: I1203 22:25:30.014822 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-logs\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.025388 master-0 kubenswrapper[36504]: I1203 22:25:30.024821 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-credential-keys\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.031229 master-0 kubenswrapper[36504]: I1203 22:25:30.029364 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-config-data\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.057234 master-0 kubenswrapper[36504]: I1203 22:25:30.046114 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-public-tls-certs\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.057234 master-0 kubenswrapper[36504]: I1203 22:25:30.046682 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhhl\" (UniqueName: \"kubernetes.io/projected/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-kube-api-access-9hhhl\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.057234 master-0 kubenswrapper[36504]: I1203 22:25:30.052934 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-internal-tls-certs\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.073831 master-0 kubenswrapper[36504]: I1203 22:25:30.073417 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-combined-ca-bundle\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.075070 master-0 kubenswrapper[36504]: I1203 22:25:30.075029 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-combined-ca-bundle\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.075736 master-0 kubenswrapper[36504]: I1203 22:25:30.075698 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-config-data\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.076451 master-0 kubenswrapper[36504]: I1203 22:25:30.076407 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-internal-tls-certs\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.076975 master-0 kubenswrapper[36504]: I1203 22:25:30.076938 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-public-tls-certs\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.083416 master-0 kubenswrapper[36504]: I1203 22:25:30.081270 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-scripts\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.083416 master-0 kubenswrapper[36504]: I1203 22:25:30.081904 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c4fd6ea-11cf-4738-9abb-c9c604e217bb-fernet-keys\") pod \"keystone-6bf95cb78d-jt2z5\" (UID: \"2c4fd6ea-11cf-4738-9abb-c9c604e217bb\") " pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.091063 master-0 kubenswrapper[36504]: I1203 22:25:30.090714 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm25c\" (UniqueName: \"kubernetes.io/projected/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-kube-api-access-cm25c\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.129797 master-0 kubenswrapper[36504]: I1203 22:25:30.122574 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23fa164c-aba3-4c36-bdd4-7f952b7d3a24-scripts\") pod \"placement-7864d79866-k7zqf\" (UID: \"23fa164c-aba3-4c36-bdd4-7f952b7d3a24\") " pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.150844 master-0 kubenswrapper[36504]: I1203 22:25:30.150733 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:30.163450 master-0 kubenswrapper[36504]: I1203 22:25:30.162570 36504 generic.go:334] "Generic (PLEG): container finished" podID="79800ee3-07d8-43e7-9263-15e8cdeef26d" containerID="f73b6b3ecae967d9bb29ee494c0c19b97d70da3f1232551294416b1005658838" exitCode=0 Dec 03 22:25:30.163450 master-0 kubenswrapper[36504]: I1203 22:25:30.162641 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hghcd" event={"ID":"79800ee3-07d8-43e7-9263-15e8cdeef26d","Type":"ContainerDied","Data":"f73b6b3ecae967d9bb29ee494c0c19b97d70da3f1232551294416b1005658838"} Dec 03 22:25:30.270354 master-0 kubenswrapper[36504]: I1203 22:25:30.269644 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:30.280367 master-0 kubenswrapper[36504]: I1203 22:25:30.278599 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848b5d5dd5-k7qdq"] Dec 03 22:25:30.280367 master-0 kubenswrapper[36504]: I1203 22:25:30.279036 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" podUID="c52e6803-ca6e-4687-a71c-3029e0cb2253" containerName="dnsmasq-dns" containerID="cri-o://9d300920b20d02c109bc28af15be4798547015038d64adc4899c559e8b4aa8b4" gracePeriod=10 Dec 03 22:25:31.186764 master-0 kubenswrapper[36504]: I1203 22:25:31.186610 36504 generic.go:334] "Generic (PLEG): container finished" podID="c52e6803-ca6e-4687-a71c-3029e0cb2253" containerID="9d300920b20d02c109bc28af15be4798547015038d64adc4899c559e8b4aa8b4" exitCode=0 Dec 03 22:25:31.188072 master-0 kubenswrapper[36504]: I1203 22:25:31.188043 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" event={"ID":"c52e6803-ca6e-4687-a71c-3029e0cb2253","Type":"ContainerDied","Data":"9d300920b20d02c109bc28af15be4798547015038d64adc4899c559e8b4aa8b4"} Dec 03 22:25:32.205973 master-0 kubenswrapper[36504]: I1203 22:25:32.205867 36504 generic.go:334] "Generic (PLEG): container finished" podID="c4f03542-de5f-457c-8843-01ea5d2febce" containerID="33176e2d1fb664c8b39f0ac856d7f6bdbdc4e6ffadca38e31d0edd73a918f866" exitCode=0 Dec 03 22:25:32.206680 master-0 kubenswrapper[36504]: I1203 22:25:32.205966 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-db-sync-wq9c6" event={"ID":"c4f03542-de5f-457c-8843-01ea5d2febce","Type":"ContainerDied","Data":"33176e2d1fb664c8b39f0ac856d7f6bdbdc4e6ffadca38e31d0edd73a918f866"} Dec 03 22:25:33.643301 master-0 kubenswrapper[36504]: I1203 22:25:33.643165 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" podUID="c52e6803-ca6e-4687-a71c-3029e0cb2253" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.227:5353: connect: connection refused" Dec 03 22:25:35.079448 master-0 kubenswrapper[36504]: I1203 22:25:35.079380 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8lw82" Dec 03 22:25:35.093980 master-0 kubenswrapper[36504]: I1203 22:25:35.093827 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:25:35.108018 master-0 kubenswrapper[36504]: I1203 22:25:35.107831 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-hghcd" Dec 03 22:25:35.202412 master-0 kubenswrapper[36504]: I1203 22:25:35.202024 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-combined-ca-bundle\") pod \"c4f03542-de5f-457c-8843-01ea5d2febce\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " Dec 03 22:25:35.202412 master-0 kubenswrapper[36504]: I1203 22:25:35.202339 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-config-data\") pod \"79800ee3-07d8-43e7-9263-15e8cdeef26d\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " Dec 03 22:25:35.202761 master-0 kubenswrapper[36504]: I1203 22:25:35.202433 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jqwl\" (UniqueName: \"kubernetes.io/projected/79800ee3-07d8-43e7-9263-15e8cdeef26d-kube-api-access-7jqwl\") pod \"79800ee3-07d8-43e7-9263-15e8cdeef26d\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " Dec 03 22:25:35.202761 master-0 kubenswrapper[36504]: I1203 22:25:35.202631 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9622j\" (UniqueName: \"kubernetes.io/projected/c4f03542-de5f-457c-8843-01ea5d2febce-kube-api-access-9622j\") pod \"c4f03542-de5f-457c-8843-01ea5d2febce\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " Dec 03 22:25:35.202854 master-0 kubenswrapper[36504]: I1203 22:25:35.202762 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmb28\" (UniqueName: \"kubernetes.io/projected/486e7352-51c4-4e75-9ff9-ead9eb721c74-kube-api-access-jmb28\") pod \"486e7352-51c4-4e75-9ff9-ead9eb721c74\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " Dec 03 22:25:35.202894 master-0 kubenswrapper[36504]: I1203 22:25:35.202858 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-combined-ca-bundle\") pod \"79800ee3-07d8-43e7-9263-15e8cdeef26d\" (UID: \"79800ee3-07d8-43e7-9263-15e8cdeef26d\") " Dec 03 22:25:35.202894 master-0 kubenswrapper[36504]: I1203 22:25:35.202888 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-scripts\") pod \"c4f03542-de5f-457c-8843-01ea5d2febce\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " Dec 03 22:25:35.204318 master-0 kubenswrapper[36504]: I1203 22:25:35.203038 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-db-sync-config-data\") pod \"486e7352-51c4-4e75-9ff9-ead9eb721c74\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " Dec 03 22:25:35.204318 master-0 kubenswrapper[36504]: I1203 22:25:35.203118 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4f03542-de5f-457c-8843-01ea5d2febce-etc-machine-id\") pod \"c4f03542-de5f-457c-8843-01ea5d2febce\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " Dec 03 22:25:35.204318 master-0 kubenswrapper[36504]: I1203 22:25:35.203161 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-combined-ca-bundle\") pod \"486e7352-51c4-4e75-9ff9-ead9eb721c74\" (UID: \"486e7352-51c4-4e75-9ff9-ead9eb721c74\") " Dec 03 22:25:35.204318 master-0 kubenswrapper[36504]: I1203 22:25:35.203263 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-config-data\") pod \"c4f03542-de5f-457c-8843-01ea5d2febce\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " Dec 03 22:25:35.204318 master-0 kubenswrapper[36504]: I1203 22:25:35.203322 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-db-sync-config-data\") pod \"c4f03542-de5f-457c-8843-01ea5d2febce\" (UID: \"c4f03542-de5f-457c-8843-01ea5d2febce\") " Dec 03 22:25:35.204318 master-0 kubenswrapper[36504]: I1203 22:25:35.203587 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c4f03542-de5f-457c-8843-01ea5d2febce-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c4f03542-de5f-457c-8843-01ea5d2febce" (UID: "c4f03542-de5f-457c-8843-01ea5d2febce"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:25:35.210317 master-0 kubenswrapper[36504]: I1203 22:25:35.205142 36504 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c4f03542-de5f-457c-8843-01ea5d2febce-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.210317 master-0 kubenswrapper[36504]: I1203 22:25:35.208539 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f03542-de5f-457c-8843-01ea5d2febce-kube-api-access-9622j" (OuterVolumeSpecName: "kube-api-access-9622j") pod "c4f03542-de5f-457c-8843-01ea5d2febce" (UID: "c4f03542-de5f-457c-8843-01ea5d2febce"). InnerVolumeSpecName "kube-api-access-9622j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:35.210317 master-0 kubenswrapper[36504]: I1203 22:25:35.209665 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-scripts" (OuterVolumeSpecName: "scripts") pod "c4f03542-de5f-457c-8843-01ea5d2febce" (UID: "c4f03542-de5f-457c-8843-01ea5d2febce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:35.216949 master-0 kubenswrapper[36504]: I1203 22:25:35.216319 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "486e7352-51c4-4e75-9ff9-ead9eb721c74" (UID: "486e7352-51c4-4e75-9ff9-ead9eb721c74"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:35.226937 master-0 kubenswrapper[36504]: I1203 22:25:35.226731 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c4f03542-de5f-457c-8843-01ea5d2febce" (UID: "c4f03542-de5f-457c-8843-01ea5d2febce"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:35.236228 master-0 kubenswrapper[36504]: I1203 22:25:35.236090 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79800ee3-07d8-43e7-9263-15e8cdeef26d-kube-api-access-7jqwl" (OuterVolumeSpecName: "kube-api-access-7jqwl") pod "79800ee3-07d8-43e7-9263-15e8cdeef26d" (UID: "79800ee3-07d8-43e7-9263-15e8cdeef26d"). InnerVolumeSpecName "kube-api-access-7jqwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:35.264647 master-0 kubenswrapper[36504]: I1203 22:25:35.256370 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486e7352-51c4-4e75-9ff9-ead9eb721c74-kube-api-access-jmb28" (OuterVolumeSpecName: "kube-api-access-jmb28") pod "486e7352-51c4-4e75-9ff9-ead9eb721c74" (UID: "486e7352-51c4-4e75-9ff9-ead9eb721c74"). InnerVolumeSpecName "kube-api-access-jmb28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:35.270855 master-0 kubenswrapper[36504]: I1203 22:25:35.268277 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-db-sync-wq9c6" event={"ID":"c4f03542-de5f-457c-8843-01ea5d2febce","Type":"ContainerDied","Data":"81fd3294112ed9b71d0e6bd3037f08801b2717de17a5a7de2d203880301da72f"} Dec 03 22:25:35.270855 master-0 kubenswrapper[36504]: I1203 22:25:35.268337 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81fd3294112ed9b71d0e6bd3037f08801b2717de17a5a7de2d203880301da72f" Dec 03 22:25:35.270855 master-0 kubenswrapper[36504]: I1203 22:25:35.268414 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-db-sync-wq9c6" Dec 03 22:25:35.273979 master-0 kubenswrapper[36504]: I1203 22:25:35.273930 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8lw82" event={"ID":"486e7352-51c4-4e75-9ff9-ead9eb721c74","Type":"ContainerDied","Data":"978129855c4971acdba6cce10364a4534166a91b4f69c59af280a65210ee8c56"} Dec 03 22:25:35.274084 master-0 kubenswrapper[36504]: I1203 22:25:35.273979 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="978129855c4971acdba6cce10364a4534166a91b4f69c59af280a65210ee8c56" Dec 03 22:25:35.274084 master-0 kubenswrapper[36504]: I1203 22:25:35.273984 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8lw82" Dec 03 22:25:35.276371 master-0 kubenswrapper[36504]: I1203 22:25:35.276155 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79800ee3-07d8-43e7-9263-15e8cdeef26d" (UID: "79800ee3-07d8-43e7-9263-15e8cdeef26d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:35.279594 master-0 kubenswrapper[36504]: I1203 22:25:35.279056 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-hghcd" event={"ID":"79800ee3-07d8-43e7-9263-15e8cdeef26d","Type":"ContainerDied","Data":"b1b54fb4feb60c0f8bfe2f4add76f9fa15c437460cd7532f88265a9ed7c177dc"} Dec 03 22:25:35.279594 master-0 kubenswrapper[36504]: I1203 22:25:35.279094 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1b54fb4feb60c0f8bfe2f4add76f9fa15c437460cd7532f88265a9ed7c177dc" Dec 03 22:25:35.279594 master-0 kubenswrapper[36504]: I1203 22:25:35.279207 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-hghcd" Dec 03 22:25:35.305021 master-0 kubenswrapper[36504]: I1203 22:25:35.304939 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "486e7352-51c4-4e75-9ff9-ead9eb721c74" (UID: "486e7352-51c4-4e75-9ff9-ead9eb721c74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:35.308650 master-0 kubenswrapper[36504]: I1203 22:25:35.308577 36504 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.308650 master-0 kubenswrapper[36504]: I1203 22:25:35.308643 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jqwl\" (UniqueName: \"kubernetes.io/projected/79800ee3-07d8-43e7-9263-15e8cdeef26d-kube-api-access-7jqwl\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.308832 master-0 kubenswrapper[36504]: I1203 22:25:35.308660 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9622j\" (UniqueName: \"kubernetes.io/projected/c4f03542-de5f-457c-8843-01ea5d2febce-kube-api-access-9622j\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.308832 master-0 kubenswrapper[36504]: I1203 22:25:35.308699 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmb28\" (UniqueName: \"kubernetes.io/projected/486e7352-51c4-4e75-9ff9-ead9eb721c74-kube-api-access-jmb28\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.308832 master-0 kubenswrapper[36504]: I1203 22:25:35.308712 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.308832 master-0 kubenswrapper[36504]: I1203 22:25:35.308726 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.308832 master-0 kubenswrapper[36504]: I1203 22:25:35.308739 36504 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.308832 master-0 kubenswrapper[36504]: I1203 22:25:35.308751 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/486e7352-51c4-4e75-9ff9-ead9eb721c74-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.318053 master-0 kubenswrapper[36504]: I1203 22:25:35.317980 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4f03542-de5f-457c-8843-01ea5d2febce" (UID: "c4f03542-de5f-457c-8843-01ea5d2febce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:35.327583 master-0 kubenswrapper[36504]: I1203 22:25:35.327497 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-config-data" (OuterVolumeSpecName: "config-data") pod "79800ee3-07d8-43e7-9263-15e8cdeef26d" (UID: "79800ee3-07d8-43e7-9263-15e8cdeef26d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:35.328979 master-0 kubenswrapper[36504]: I1203 22:25:35.328941 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-config-data" (OuterVolumeSpecName: "config-data") pod "c4f03542-de5f-457c-8843-01ea5d2febce" (UID: "c4f03542-de5f-457c-8843-01ea5d2febce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:35.411698 master-0 kubenswrapper[36504]: I1203 22:25:35.411623 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.411698 master-0 kubenswrapper[36504]: I1203 22:25:35.411697 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4f03542-de5f-457c-8843-01ea5d2febce-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.411698 master-0 kubenswrapper[36504]: I1203 22:25:35.411712 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79800ee3-07d8-43e7-9263-15e8cdeef26d-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:35.723858 master-0 kubenswrapper[36504]: E1203 22:25:35.723789 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:25:37.639411 master-0 kubenswrapper[36504]: I1203 22:25:37.639271 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:25:37.697424 master-0 kubenswrapper[36504]: I1203 22:25:37.697348 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-nb\") pod \"c52e6803-ca6e-4687-a71c-3029e0cb2253\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " Dec 03 22:25:37.697424 master-0 kubenswrapper[36504]: I1203 22:25:37.697432 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-config\") pod \"c52e6803-ca6e-4687-a71c-3029e0cb2253\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " Dec 03 22:25:37.697727 master-0 kubenswrapper[36504]: I1203 22:25:37.697525 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-sb\") pod \"c52e6803-ca6e-4687-a71c-3029e0cb2253\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " Dec 03 22:25:37.697727 master-0 kubenswrapper[36504]: I1203 22:25:37.697599 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-edpm\") pod \"c52e6803-ca6e-4687-a71c-3029e0cb2253\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " Dec 03 22:25:37.697816 master-0 kubenswrapper[36504]: I1203 22:25:37.697749 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-svc\") pod \"c52e6803-ca6e-4687-a71c-3029e0cb2253\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " Dec 03 22:25:37.697953 master-0 kubenswrapper[36504]: I1203 22:25:37.697924 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-swift-storage-0\") pod \"c52e6803-ca6e-4687-a71c-3029e0cb2253\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " Dec 03 22:25:37.698070 master-0 kubenswrapper[36504]: I1203 22:25:37.698040 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwcpw\" (UniqueName: \"kubernetes.io/projected/c52e6803-ca6e-4687-a71c-3029e0cb2253-kube-api-access-kwcpw\") pod \"c52e6803-ca6e-4687-a71c-3029e0cb2253\" (UID: \"c52e6803-ca6e-4687-a71c-3029e0cb2253\") " Dec 03 22:25:37.771981 master-0 kubenswrapper[36504]: I1203 22:25:37.771936 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.849228 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65c678856f-xzvss"] Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: E1203 22:25:37.850108 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486e7352-51c4-4e75-9ff9-ead9eb721c74" containerName="barbican-db-sync" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.850130 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="486e7352-51c4-4e75-9ff9-ead9eb721c74" containerName="barbican-db-sync" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: E1203 22:25:37.850168 36504 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c52e6803-ca6e-4687-a71c-3029e0cb2253" containerName="dnsmasq-dns" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.850176 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52e6803-ca6e-4687-a71c-3029e0cb2253" containerName="dnsmasq-dns" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: E1203 22:25:37.850207 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f03542-de5f-457c-8843-01ea5d2febce" containerName="cinder-baebb-db-sync" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.850214 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f03542-de5f-457c-8843-01ea5d2febce" containerName="cinder-baebb-db-sync" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: E1203 22:25:37.850256 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c52e6803-ca6e-4687-a71c-3029e0cb2253" containerName="init" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.850265 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52e6803-ca6e-4687-a71c-3029e0cb2253" containerName="init" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: E1203 22:25:37.850294 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79800ee3-07d8-43e7-9263-15e8cdeef26d" containerName="heat-db-sync" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.850302 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="79800ee3-07d8-43e7-9263-15e8cdeef26d" containerName="heat-db-sync" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.850593 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="486e7352-51c4-4e75-9ff9-ead9eb721c74" containerName="barbican-db-sync" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.850656 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="79800ee3-07d8-43e7-9263-15e8cdeef26d" containerName="heat-db-sync" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.850694 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4f03542-de5f-457c-8843-01ea5d2febce" containerName="cinder-baebb-db-sync" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.850713 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="c52e6803-ca6e-4687-a71c-3029e0cb2253" containerName="dnsmasq-dns" Dec 03 22:25:37.854724 master-0 kubenswrapper[36504]: I1203 22:25:37.852878 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:37.862257 master-0 kubenswrapper[36504]: I1203 22:25:37.861345 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 22:25:37.862257 master-0 kubenswrapper[36504]: I1203 22:25:37.862165 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 22:25:37.877575 master-0 kubenswrapper[36504]: I1203 22:25:37.877147 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52e6803-ca6e-4687-a71c-3029e0cb2253-kube-api-access-kwcpw" (OuterVolumeSpecName: "kube-api-access-kwcpw") pod "c52e6803-ca6e-4687-a71c-3029e0cb2253" (UID: "c52e6803-ca6e-4687-a71c-3029e0cb2253"). InnerVolumeSpecName "kube-api-access-kwcpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:37.892553 master-0 kubenswrapper[36504]: I1203 22:25:37.891688 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-86689bd958-gshr9"] Dec 03 22:25:37.895342 master-0 kubenswrapper[36504]: I1203 22:25:37.895284 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:37.902735 master-0 kubenswrapper[36504]: I1203 22:25:37.902284 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 22:25:37.905927 master-0 kubenswrapper[36504]: I1203 22:25:37.905856 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwcpw\" (UniqueName: \"kubernetes.io/projected/c52e6803-ca6e-4687-a71c-3029e0cb2253-kube-api-access-kwcpw\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:37.938409 master-0 kubenswrapper[36504]: I1203 22:25:37.937573 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65c678856f-xzvss"] Dec 03 22:25:37.939445 master-0 kubenswrapper[36504]: I1203 22:25:37.939367 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-config" (OuterVolumeSpecName: "config") pod "c52e6803-ca6e-4687-a71c-3029e0cb2253" (UID: "c52e6803-ca6e-4687-a71c-3029e0cb2253"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:37.975856 master-0 kubenswrapper[36504]: I1203 22:25:37.975729 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c52e6803-ca6e-4687-a71c-3029e0cb2253" (UID: "c52e6803-ca6e-4687-a71c-3029e0cb2253"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:37.983946 master-0 kubenswrapper[36504]: I1203 22:25:37.976505 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-86689bd958-gshr9"] Dec 03 22:25:37.983946 master-0 kubenswrapper[36504]: I1203 22:25:37.980540 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c52e6803-ca6e-4687-a71c-3029e0cb2253" (UID: "c52e6803-ca6e-4687-a71c-3029e0cb2253"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:38.000366 master-0 kubenswrapper[36504]: I1203 22:25:38.000223 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-edpm" (OuterVolumeSpecName: "edpm") pod "c52e6803-ca6e-4687-a71c-3029e0cb2253" (UID: "c52e6803-ca6e-4687-a71c-3029e0cb2253"). InnerVolumeSpecName "edpm". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:38.012279 master-0 kubenswrapper[36504]: I1203 22:25:38.012182 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd552\" (UniqueName: \"kubernetes.io/projected/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-kube-api-access-rd552\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.012279 master-0 kubenswrapper[36504]: I1203 22:25:38.012250 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-combined-ca-bundle\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.012875 master-0 kubenswrapper[36504]: I1203 22:25:38.012302 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpjg\" (UniqueName: \"kubernetes.io/projected/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-kube-api-access-ljpjg\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.012875 master-0 kubenswrapper[36504]: I1203 22:25:38.012346 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-config-data-custom\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.012875 master-0 kubenswrapper[36504]: I1203 22:25:38.012395 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-logs\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.012875 master-0 kubenswrapper[36504]: I1203 22:25:38.012416 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-config-data\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.012875 master-0 kubenswrapper[36504]: I1203 22:25:38.012457 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-combined-ca-bundle\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.012875 master-0 kubenswrapper[36504]: I1203 22:25:38.012556 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-config-data-custom\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " 
pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.012875 master-0 kubenswrapper[36504]: I1203 22:25:38.012606 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-logs\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.013169 master-0 kubenswrapper[36504]: I1203 22:25:38.013105 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-config-data\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.013605 master-0 kubenswrapper[36504]: I1203 22:25:38.013566 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:38.013605 master-0 kubenswrapper[36504]: I1203 22:25:38.013598 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:38.013720 master-0 kubenswrapper[36504]: I1203 22:25:38.013614 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:38.013720 master-0 kubenswrapper[36504]: I1203 22:25:38.013633 36504 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:38.073057 master-0 kubenswrapper[36504]: I1203 22:25:38.073012 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d798fdcb9-r9nhg"] Dec 03 22:25:38.094578 master-0 kubenswrapper[36504]: I1203 22:25:38.087234 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c52e6803-ca6e-4687-a71c-3029e0cb2253" (UID: "c52e6803-ca6e-4687-a71c-3029e0cb2253"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:38.097884 master-0 kubenswrapper[36504]: I1203 22:25:38.097833 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.123832 master-0 kubenswrapper[36504]: I1203 22:25:38.121318 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.123832 master-0 kubenswrapper[36504]: I1203 22:25:38.121475 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-config-data-custom\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.123832 master-0 kubenswrapper[36504]: I1203 22:25:38.121499 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-edpm\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.123832 master-0 kubenswrapper[36504]: I1203 22:25:38.121575 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.123832 master-0 kubenswrapper[36504]: I1203 22:25:38.121598 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-logs\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.123832 master-0 kubenswrapper[36504]: I1203 22:25:38.121643 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-config\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.136081 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-config-data\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.136138 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-logs\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.136196 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-svc\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.136346 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dn75\" (UniqueName: \"kubernetes.io/projected/158ee096-45b2-4588-9b52-8658e75aaac8-kube-api-access-8dn75\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.136397 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd552\" (UniqueName: \"kubernetes.io/projected/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-kube-api-access-rd552\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.136471 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-combined-ca-bundle\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.136554 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-swift-storage-0\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.136707 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpjg\" (UniqueName: \"kubernetes.io/projected/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-kube-api-access-ljpjg\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.136845 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-config-data-custom\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.137008 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-logs\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.137189 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-config-data\") pod 
\"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.137256 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-combined-ca-bundle\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.137533 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:38.142950 master-0 kubenswrapper[36504]: I1203 22:25:38.141719 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-logs\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.150386 master-0 kubenswrapper[36504]: I1203 22:25:38.150312 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-config-data-custom\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.157981 master-0 kubenswrapper[36504]: I1203 22:25:38.157802 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-config-data\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.164795 master-0 kubenswrapper[36504]: I1203 22:25:38.163112 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-combined-ca-bundle\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.165591 master-0 kubenswrapper[36504]: I1203 22:25:38.165133 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd552\" (UniqueName: \"kubernetes.io/projected/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-kube-api-access-rd552\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.167346 master-0 kubenswrapper[36504]: I1203 22:25:38.167284 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpjg\" (UniqueName: \"kubernetes.io/projected/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-kube-api-access-ljpjg\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.173275 master-0 kubenswrapper[36504]: I1203 22:25:38.173134 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c52e6803-ca6e-4687-a71c-3029e0cb2253" (UID: "c52e6803-ca6e-4687-a71c-3029e0cb2253"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:38.176341 master-0 kubenswrapper[36504]: I1203 22:25:38.176283 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d798fdcb9-r9nhg"] Dec 03 22:25:38.176442 master-0 kubenswrapper[36504]: I1203 22:25:38.176408 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-combined-ca-bundle\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.191427 master-0 kubenswrapper[36504]: I1203 22:25:38.191380 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef686a34-c2bd-4a3c-9e9b-39b4318afbb9-config-data\") pod \"barbican-keystone-listener-86689bd958-gshr9\" (UID: \"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9\") " pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.193152 master-0 kubenswrapper[36504]: I1203 22:25:38.193101 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f0fd30e-728d-4479-a6ed-70fd35fc92ee-config-data-custom\") pod \"barbican-worker-65c678856f-xzvss\" (UID: \"3f0fd30e-728d-4479-a6ed-70fd35fc92ee\") " pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.193246 master-0 kubenswrapper[36504]: I1203 22:25:38.193220 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7997d6f544-ljwhp"] Dec 03 22:25:38.196745 master-0 kubenswrapper[36504]: I1203 22:25:38.196667 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.199541 master-0 kubenswrapper[36504]: I1203 22:25:38.199484 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 22:25:38.223923 master-0 kubenswrapper[36504]: I1203 22:25:38.223332 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7997d6f544-ljwhp"] Dec 03 22:25:38.278122 master-0 kubenswrapper[36504]: I1203 22:25:38.278062 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-svc\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.278357 master-0 kubenswrapper[36504]: I1203 22:25:38.278341 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dn75\" (UniqueName: \"kubernetes.io/projected/158ee096-45b2-4588-9b52-8658e75aaac8-kube-api-access-8dn75\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.278521 master-0 kubenswrapper[36504]: I1203 22:25:38.278503 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data-custom\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.278658 master-0 kubenswrapper[36504]: I1203 22:25:38.278640 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-swift-storage-0\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.279030 master-0 kubenswrapper[36504]: I1203 22:25:38.279011 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.279240 master-0 kubenswrapper[36504]: I1203 22:25:38.279218 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1293e094-4f1e-4c59-8050-7eeb11695c40-logs\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.279375 master-0 kubenswrapper[36504]: I1203 22:25:38.279358 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-edpm\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.279550 master-0 kubenswrapper[36504]: I1203 22:25:38.279536 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.279654 master-0 kubenswrapper[36504]: I1203 22:25:38.279642 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-config\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.279750 master-0 kubenswrapper[36504]: I1203 22:25:38.279738 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-combined-ca-bundle\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.280237 master-0 kubenswrapper[36504]: I1203 22:25:38.280159 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsk4l\" (UniqueName: \"kubernetes.io/projected/1293e094-4f1e-4c59-8050-7eeb11695c40-kube-api-access-vsk4l\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.280385 master-0 kubenswrapper[36504]: I1203 22:25:38.280348 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.281045 master-0 kubenswrapper[36504]: I1203 22:25:38.281014 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c52e6803-ca6e-4687-a71c-3029e0cb2253-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:38.283795 master-0 kubenswrapper[36504]: I1203 22:25:38.281872 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-svc\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.288405 master-0 kubenswrapper[36504]: I1203 22:25:38.288362 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-edpm\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.307492 master-0 kubenswrapper[36504]: I1203 22:25:38.305913 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.316984 master-0 kubenswrapper[36504]: I1203 22:25:38.316936 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-config\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: 
\"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.317447 master-0 kubenswrapper[36504]: I1203 22:25:38.317389 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-swift-storage-0\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.323861 master-0 kubenswrapper[36504]: I1203 22:25:38.291824 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.349485 master-0 kubenswrapper[36504]: I1203 22:25:38.349422 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65c678856f-xzvss" Dec 03 22:25:38.352402 master-0 kubenswrapper[36504]: I1203 22:25:38.352377 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dn75\" (UniqueName: \"kubernetes.io/projected/158ee096-45b2-4588-9b52-8658e75aaac8-kube-api-access-8dn75\") pod \"dnsmasq-dns-5d798fdcb9-r9nhg\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.385067 master-0 kubenswrapper[36504]: I1203 22:25:38.384993 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-86689bd958-gshr9" Dec 03 22:25:38.389636 master-0 kubenswrapper[36504]: I1203 22:25:38.389582 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.390000 master-0 kubenswrapper[36504]: I1203 22:25:38.389928 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data-custom\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.390532 master-0 kubenswrapper[36504]: I1203 22:25:38.390488 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1293e094-4f1e-4c59-8050-7eeb11695c40-logs\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.391174 master-0 kubenswrapper[36504]: I1203 22:25:38.391116 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-combined-ca-bundle\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.391907 master-0 kubenswrapper[36504]: I1203 22:25:38.391474 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsk4l\" (UniqueName: 
\"kubernetes.io/projected/1293e094-4f1e-4c59-8050-7eeb11695c40-kube-api-access-vsk4l\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.392476 master-0 kubenswrapper[36504]: I1203 22:25:38.392418 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1293e094-4f1e-4c59-8050-7eeb11695c40-logs\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.396387 master-0 kubenswrapper[36504]: I1203 22:25:38.396348 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-combined-ca-bundle\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.401971 master-0 kubenswrapper[36504]: I1203 22:25:38.401745 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.402089 master-0 kubenswrapper[36504]: I1203 22:25:38.402003 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" Dec 03 22:25:38.402193 master-0 kubenswrapper[36504]: I1203 22:25:38.401751 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848b5d5dd5-k7qdq" event={"ID":"c52e6803-ca6e-4687-a71c-3029e0cb2253","Type":"ContainerDied","Data":"68ef88e1870f4fa944edf9ee7e2c26aceba0ba41c4cc26b3302c2a24909a709f"} Dec 03 22:25:38.402344 master-0 kubenswrapper[36504]: I1203 22:25:38.402308 36504 scope.go:117] "RemoveContainer" containerID="9d300920b20d02c109bc28af15be4798547015038d64adc4899c559e8b4aa8b4" Dec 03 22:25:38.404054 master-0 kubenswrapper[36504]: I1203 22:25:38.403798 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data-custom\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.419737 master-0 kubenswrapper[36504]: I1203 22:25:38.419671 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56f6c98b97-zjmbp" event={"ID":"32f345c1-f0c4-48a5-b3ab-da35ac924812","Type":"ContainerStarted","Data":"ad885610921fb0ef6fbb8c127429151204d5cc86445246b08941432975c0dae4"} Dec 03 22:25:38.420011 master-0 kubenswrapper[36504]: I1203 22:25:38.419847 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsk4l\" (UniqueName: \"kubernetes.io/projected/1293e094-4f1e-4c59-8050-7eeb11695c40-kube-api-access-vsk4l\") pod \"barbican-api-7997d6f544-ljwhp\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.420148 master-0 kubenswrapper[36504]: I1203 22:25:38.420028 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:38.434755 master-0 kubenswrapper[36504]: I1203 22:25:38.434702 36504 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:38.523818 master-0 kubenswrapper[36504]: I1203 22:25:38.517702 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56f6c98b97-zjmbp" podStartSLOduration=16.517676909 podStartE2EDuration="16.517676909s" podCreationTimestamp="2025-12-03 22:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:38.449088352 +0000 UTC m=+903.668860359" watchObservedRunningTime="2025-12-03 22:25:38.517676909 +0000 UTC m=+903.737448916" Dec 03 22:25:38.523818 master-0 kubenswrapper[36504]: I1203 22:25:38.523408 36504 scope.go:117] "RemoveContainer" containerID="ef067c318d9cb82534ae69782df516e6360c57080d949cfd3845fbf5b90718af" Dec 03 22:25:38.562183 master-0 kubenswrapper[36504]: I1203 22:25:38.562141 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848b5d5dd5-k7qdq"] Dec 03 22:25:38.581281 master-0 kubenswrapper[36504]: I1203 22:25:38.577834 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848b5d5dd5-k7qdq"] Dec 03 22:25:38.622599 master-0 kubenswrapper[36504]: I1203 22:25:38.615640 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:38.833025 master-0 kubenswrapper[36504]: I1203 22:25:38.832935 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7864d79866-k7zqf"] Dec 03 22:25:38.923348 master-0 kubenswrapper[36504]: I1203 22:25:38.921298 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-baebb-scheduler-0"] Dec 03 22:25:38.928139 master-0 kubenswrapper[36504]: I1203 22:25:38.928075 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:38.932278 master-0 kubenswrapper[36504]: I1203 22:25:38.932239 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-scheduler-config-data" Dec 03 22:25:38.932489 master-0 kubenswrapper[36504]: I1203 22:25:38.932463 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-config-data" Dec 03 22:25:38.932615 master-0 kubenswrapper[36504]: I1203 22:25:38.932596 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-scripts" Dec 03 22:25:38.965922 master-0 kubenswrapper[36504]: I1203 22:25:38.965493 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-baebb-volume-lvm-iscsi-0"] Dec 03 22:25:38.981788 master-0 kubenswrapper[36504]: I1203 22:25:38.979527 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:38.993282 master-0 kubenswrapper[36504]: I1203 22:25:38.992198 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-volume-lvm-iscsi-config-data" Dec 03 22:25:38.997736 master-0 kubenswrapper[36504]: I1203 22:25:38.997329 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-scheduler-0"] Dec 03 22:25:39.026504 master-0 kubenswrapper[36504]: I1203 22:25:39.026427 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-volume-lvm-iscsi-0"] Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.053469 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-run\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.053553 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data-custom\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.053586 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-sys\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.053649 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.053676 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-brick\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.053736 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pr2h\" (UniqueName: \"kubernetes.io/projected/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-kube-api-access-2pr2h\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.053827 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-iscsi\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" 
Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.053922 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data-custom\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054026 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-scripts\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054103 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-scripts\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054232 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-combined-ca-bundle\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054251 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrwtv\" (UniqueName: \"kubernetes.io/projected/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-kube-api-access-vrwtv\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054327 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054352 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-dev\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054410 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-lib-modules\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054439 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-etc-machine-id\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054487 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-combined-ca-bundle\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054517 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-lib-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054595 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054676 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-machine-id\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.056849 master-0 kubenswrapper[36504]: I1203 22:25:39.054698 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-nvme\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.085250 master-0 kubenswrapper[36504]: I1203 22:25:39.085101 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6bf95cb78d-jt2z5"] Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170134 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-scripts\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170301 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-scripts\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170502 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-combined-ca-bundle\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170535 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrwtv\" (UniqueName: \"kubernetes.io/projected/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-kube-api-access-vrwtv\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170655 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170689 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-dev\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170762 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-lib-modules\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170821 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-etc-machine-id\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170880 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-combined-ca-bundle\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.170913 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-lib-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.171009 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.171107 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-machine-id\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.171627 master-0 kubenswrapper[36504]: I1203 22:25:39.171121 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-nvme\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.175191 master-0 kubenswrapper[36504]: I1203 22:25:39.174563 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-run\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.175191 master-0 kubenswrapper[36504]: I1203 22:25:39.174618 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data-custom\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.175191 master-0 kubenswrapper[36504]: I1203 22:25:39.174642 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-sys\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.175191 master-0 kubenswrapper[36504]: I1203 22:25:39.174700 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.175191 master-0 kubenswrapper[36504]: I1203 22:25:39.174778 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-brick\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.175191 master-0 kubenswrapper[36504]: I1203 22:25:39.174860 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pr2h\" (UniqueName: \"kubernetes.io/projected/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-kube-api-access-2pr2h\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.175191 master-0 kubenswrapper[36504]: I1203 22:25:39.175177 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-iscsi\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.175602 master-0 kubenswrapper[36504]: I1203 
22:25:39.175309 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data-custom\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.175812 master-0 kubenswrapper[36504]: I1203 22:25:39.175753 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-iscsi\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.175887 master-0 kubenswrapper[36504]: I1203 22:25:39.175845 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-lib-modules\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.179580 master-0 kubenswrapper[36504]: I1203 22:25:39.179484 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-etc-machine-id\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.180195 master-0 kubenswrapper[36504]: I1203 22:25:39.180143 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-scripts\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.180356 master-0 kubenswrapper[36504]: I1203 22:25:39.180215 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-dev\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.182008 master-0 kubenswrapper[36504]: I1203 22:25:39.181382 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-run\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.182891 master-0 kubenswrapper[36504]: I1203 22:25:39.182855 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-machine-id\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.182991 master-0 kubenswrapper[36504]: I1203 22:25:39.182969 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-nvme\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.183041 master-0 kubenswrapper[36504]: I1203 22:25:39.182994 36504 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-lib-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.183195 master-0 kubenswrapper[36504]: I1203 22:25:39.183157 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.183928 master-0 kubenswrapper[36504]: I1203 22:25:39.183812 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-brick\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.184030 master-0 kubenswrapper[36504]: I1203 22:25:39.183968 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-sys\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.184314 master-0 kubenswrapper[36504]: I1203 22:25:39.184283 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data-custom\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.194420 master-0 kubenswrapper[36504]: I1203 22:25:39.194237 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data-custom\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.194420 master-0 kubenswrapper[36504]: I1203 22:25:39.194230 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.197091 master-0 kubenswrapper[36504]: I1203 22:25:39.197062 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-combined-ca-bundle\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.197246 master-0 kubenswrapper[36504]: I1203 22:25:39.197149 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.197805 master-0 kubenswrapper[36504]: I1203 22:25:39.197711 36504 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-combined-ca-bundle\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.208469 master-0 kubenswrapper[36504]: I1203 22:25:39.208408 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrwtv\" (UniqueName: \"kubernetes.io/projected/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-kube-api-access-vrwtv\") pod \"cinder-baebb-scheduler-0\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.232819 master-0 kubenswrapper[36504]: I1203 22:25:39.232720 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-scripts\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.237475 master-0 kubenswrapper[36504]: I1203 22:25:39.237393 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c52e6803-ca6e-4687-a71c-3029e0cb2253" path="/var/lib/kubelet/pods/c52e6803-ca6e-4687-a71c-3029e0cb2253/volumes" Dec 03 22:25:39.238650 master-0 kubenswrapper[36504]: I1203 22:25:39.238619 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d798fdcb9-r9nhg"] Dec 03 22:25:39.285761 master-0 kubenswrapper[36504]: I1203 22:25:39.282889 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-baebb-backup-0"] Dec 03 22:25:39.309816 master-0 kubenswrapper[36504]: I1203 22:25:39.309740 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-backup-0"] Dec 03 22:25:39.317631 master-0 kubenswrapper[36504]: I1203 22:25:39.309890 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.317631 master-0 kubenswrapper[36504]: I1203 22:25:39.315436 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-backup-config-data" Dec 03 22:25:39.319598 master-0 kubenswrapper[36504]: I1203 22:25:39.318894 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pr2h\" (UniqueName: \"kubernetes.io/projected/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-kube-api-access-2pr2h\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.330245 master-0 kubenswrapper[36504]: I1203 22:25:39.330138 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5759b6fcdf-jh9hb"] Dec 03 22:25:39.344677 master-0 kubenswrapper[36504]: I1203 22:25:39.336613 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.359862 master-0 kubenswrapper[36504]: W1203 22:25:39.352891 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f0fd30e_728d_4479_a6ed_70fd35fc92ee.slice/crio-f18f16a343813311eedc2bf2e0cb21ff8d859a48712908a8d16c7766e82ee5f7 WatchSource:0}: Error finding container f18f16a343813311eedc2bf2e0cb21ff8d859a48712908a8d16c7766e82ee5f7: Status 404 returned error can't find the container with id f18f16a343813311eedc2bf2e0cb21ff8d859a48712908a8d16c7766e82ee5f7 Dec 03 22:25:39.394795 master-0 kubenswrapper[36504]: I1203 22:25:39.394546 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qv82\" (UniqueName: \"kubernetes.io/projected/23860378-b004-4ef4-9a8b-847fd6277fd8-kube-api-access-6qv82\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.394795 master-0 kubenswrapper[36504]: I1203 22:25:39.394691 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-swift-storage-0\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.394795 master-0 kubenswrapper[36504]: I1203 22:25:39.394788 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6xbn\" (UniqueName: \"kubernetes.io/projected/d7bf8267-d5ba-4e78-b1aa-d73517d74730-kube-api-access-d6xbn\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.394997 master-0 kubenswrapper[36504]: I1203 22:25:39.394870 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-iscsi\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.395044 master-0 kubenswrapper[36504]: I1203 22:25:39.395008 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-machine-id\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.395209 master-0 kubenswrapper[36504]: I1203 22:25:39.395150 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data-custom\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.395251 master-0 kubenswrapper[36504]: I1203 22:25:39.395230 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 
22:25:39.395418 master-0 kubenswrapper[36504]: I1203 22:25:39.395363 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-lib-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.396529 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-lib-modules\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.396565 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-config\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.396803 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-dev\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.396901 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-sb\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.397122 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-edpm\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.397202 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-brick\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.397305 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-sys\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.397332 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-run\") pod \"cinder-baebb-backup-0\" (UID: 
\"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.397379 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.397435 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-svc\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.397456 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-nvme\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.398733 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-scripts\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.398849 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-combined-ca-bundle\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.400126 master-0 kubenswrapper[36504]: I1203 22:25:39.398883 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-nb\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.405732 master-0 kubenswrapper[36504]: I1203 22:25:39.405653 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:39.453963 master-0 kubenswrapper[36504]: I1203 22:25:39.450637 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:39.463213 master-0 kubenswrapper[36504]: I1203 22:25:39.463148 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86689bd958-gshr9" event={"ID":"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9","Type":"ContainerStarted","Data":"1b3b781137bc8c1d4cdc2ca935068a153975afa45ea25c7bbd60671fbc69c417"} Dec 03 22:25:39.466557 master-0 kubenswrapper[36504]: I1203 22:25:39.466485 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7864d79866-k7zqf" event={"ID":"23fa164c-aba3-4c36-bdd4-7f952b7d3a24","Type":"ContainerStarted","Data":"9ddc0f078005460abf2119c262fce33272420d7f4a68c96dd3bd721bc2fbefb9"} Dec 03 22:25:39.490822 master-0 kubenswrapper[36504]: I1203 22:25:39.490534 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5759b6fcdf-jh9hb"] Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.513699 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.513759 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data-custom\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.513802 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-lib-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.513852 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-lib-modules\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.513879 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-config\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.513896 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-dev\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.513932 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-sb\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: 
\"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.513951 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-edpm\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.513976 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-brick\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514016 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-sys\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514031 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-run\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514051 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514077 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-svc\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514093 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-nvme\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514395 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-scripts\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514474 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-combined-ca-bundle\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 
kubenswrapper[36504]: I1203 22:25:39.514499 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-nb\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514526 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qv82\" (UniqueName: \"kubernetes.io/projected/23860378-b004-4ef4-9a8b-847fd6277fd8-kube-api-access-6qv82\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514590 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-swift-storage-0\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.515251 master-0 kubenswrapper[36504]: I1203 22:25:39.514618 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6xbn\" (UniqueName: \"kubernetes.io/projected/d7bf8267-d5ba-4e78-b1aa-d73517d74730-kube-api-access-d6xbn\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.517000 master-0 kubenswrapper[36504]: I1203 22:25:39.516673 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-iscsi\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.517000 master-0 kubenswrapper[36504]: I1203 22:25:39.516785 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-machine-id\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.517000 master-0 kubenswrapper[36504]: I1203 22:25:39.516910 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-machine-id\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.517425 master-0 kubenswrapper[36504]: I1203 22:25:39.517357 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-run\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.522074 master-0 kubenswrapper[36504]: I1203 22:25:39.522040 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data-custom\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.522326 master-0 kubenswrapper[36504]: 
I1203 22:25:39.522312 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-lib-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.522524 master-0 kubenswrapper[36504]: I1203 22:25:39.522416 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-lib-modules\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.523619 master-0 kubenswrapper[36504]: I1203 22:25:39.523596 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-config\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.523748 master-0 kubenswrapper[36504]: I1203 22:25:39.523733 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-dev\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.527364 master-0 kubenswrapper[36504]: I1203 22:25:39.524404 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-brick\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.527364 master-0 kubenswrapper[36504]: I1203 22:25:39.524431 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-sb\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.529695 master-0 kubenswrapper[36504]: I1203 22:25:39.529634 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-swift-storage-0\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.529822 master-0 kubenswrapper[36504]: I1203 22:25:39.529683 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-sys\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.530397 master-0 kubenswrapper[36504]: I1203 22:25:39.530369 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-nb\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.531356 master-0 kubenswrapper[36504]: I1203 22:25:39.531331 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-svc\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.533034 master-0 kubenswrapper[36504]: I1203 22:25:39.532971 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-iscsi\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.533199 master-0 kubenswrapper[36504]: I1203 22:25:39.533162 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.533419 master-0 kubenswrapper[36504]: I1203 22:25:39.533375 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-nvme\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.533968 master-0 kubenswrapper[36504]: I1203 22:25:39.533927 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-edpm\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.534385 master-0 kubenswrapper[36504]: I1203 22:25:39.534355 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65c678856f-xzvss"] Dec 03 22:25:39.535663 master-0 kubenswrapper[36504]: I1203 22:25:39.535621 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6bf95cb78d-jt2z5" event={"ID":"2c4fd6ea-11cf-4738-9abb-c9c604e217bb","Type":"ContainerStarted","Data":"c939231a3605b653bc6ebaa0ec479ebe51484c21b97d9e5230c5214a554824f4"} Dec 03 22:25:39.536096 master-0 kubenswrapper[36504]: I1203 22:25:39.535890 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-combined-ca-bundle\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.537524 master-0 kubenswrapper[36504]: I1203 22:25:39.537491 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.558799 master-0 kubenswrapper[36504]: I1203 22:25:39.552356 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-scripts\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.558799 master-0 kubenswrapper[36504]: I1203 22:25:39.554016 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6xbn\" (UniqueName: 
\"kubernetes.io/projected/d7bf8267-d5ba-4e78-b1aa-d73517d74730-kube-api-access-d6xbn\") pod \"dnsmasq-dns-5759b6fcdf-jh9hb\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.560222 master-0 kubenswrapper[36504]: I1203 22:25:39.559816 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qv82\" (UniqueName: \"kubernetes.io/projected/23860378-b004-4ef4-9a8b-847fd6277fd8-kube-api-access-6qv82\") pod \"cinder-baebb-backup-0\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.579109 master-0 kubenswrapper[36504]: I1203 22:25:39.578810 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-86689bd958-gshr9"] Dec 03 22:25:39.617831 master-0 kubenswrapper[36504]: I1203 22:25:39.616352 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" event={"ID":"e4c57b50-c8ce-423b-9d25-b659e26c03f8","Type":"ContainerStarted","Data":"1bb45df3e8027d6c5023023017b2646aeff498f99c7be70e6e07bdb8168e17de"} Dec 03 22:25:39.703655 master-0 kubenswrapper[36504]: I1203 22:25:39.702696 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:39.703655 master-0 kubenswrapper[36504]: I1203 22:25:39.702826 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-baebb-api-0"] Dec 03 22:25:39.708468 master-0 kubenswrapper[36504]: I1203 22:25:39.707924 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.712243 master-0 kubenswrapper[36504]: I1203 22:25:39.711989 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-api-config-data" Dec 03 22:25:39.743645 master-0 kubenswrapper[36504]: I1203 22:25:39.742755 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerStarted","Data":"d4bed8f56fdb2f06515925b352a7197ebcca8f90a54b11d9abc7ac47d81f8ee4"} Dec 03 22:25:39.751567 master-0 kubenswrapper[36504]: I1203 22:25:39.751338 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65c678856f-xzvss" event={"ID":"3f0fd30e-728d-4479-a6ed-70fd35fc92ee","Type":"ContainerStarted","Data":"f18f16a343813311eedc2bf2e0cb21ff8d859a48712908a8d16c7766e82ee5f7"} Dec 03 22:25:39.756409 master-0 kubenswrapper[36504]: I1203 22:25:39.756341 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-api-0"] Dec 03 22:25:39.794266 master-0 kubenswrapper[36504]: I1203 22:25:39.791203 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:39.831316 master-0 kubenswrapper[36504]: I1203 22:25:39.831246 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-scripts\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.831617 master-0 kubenswrapper[36504]: I1203 22:25:39.831435 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdkh7\" (UniqueName: \"kubernetes.io/projected/714a0435-27a3-456a-a077-68d416b5c13b-kube-api-access-tdkh7\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.831617 master-0 kubenswrapper[36504]: I1203 22:25:39.831500 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/714a0435-27a3-456a-a077-68d416b5c13b-logs\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.831617 master-0 kubenswrapper[36504]: I1203 22:25:39.831576 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/714a0435-27a3-456a-a077-68d416b5c13b-etc-machine-id\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.831617 master-0 kubenswrapper[36504]: I1203 22:25:39.831613 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.832055 master-0 kubenswrapper[36504]: I1203 22:25:39.831630 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data-custom\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.832055 master-0 kubenswrapper[36504]: I1203 22:25:39.831679 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-combined-ca-bundle\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.846068 master-0 kubenswrapper[36504]: I1203 22:25:39.844432 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d798fdcb9-r9nhg"] Dec 03 22:25:39.862193 master-0 kubenswrapper[36504]: I1203 22:25:39.846703 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" podStartSLOduration=9.02428532 podStartE2EDuration="28.846688659s" podCreationTimestamp="2025-12-03 22:25:11 +0000 UTC" firstStartedPulling="2025-12-03 22:25:17.936636826 +0000 UTC m=+883.156408833" lastFinishedPulling="2025-12-03 22:25:37.759040165 +0000 UTC m=+902.978812172" observedRunningTime="2025-12-03 
22:25:39.644136012 +0000 UTC m=+904.863908019" watchObservedRunningTime="2025-12-03 22:25:39.846688659 +0000 UTC m=+905.066460666" Dec 03 22:25:39.936461 master-0 kubenswrapper[36504]: I1203 22:25:39.936365 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.936687 master-0 kubenswrapper[36504]: I1203 22:25:39.936476 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data-custom\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.936687 master-0 kubenswrapper[36504]: I1203 22:25:39.936540 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-combined-ca-bundle\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.936687 master-0 kubenswrapper[36504]: I1203 22:25:39.936653 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-scripts\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.936872 master-0 kubenswrapper[36504]: I1203 22:25:39.936763 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdkh7\" (UniqueName: \"kubernetes.io/projected/714a0435-27a3-456a-a077-68d416b5c13b-kube-api-access-tdkh7\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.936872 master-0 kubenswrapper[36504]: I1203 22:25:39.936860 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/714a0435-27a3-456a-a077-68d416b5c13b-logs\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.936945 master-0 kubenswrapper[36504]: I1203 22:25:39.936934 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/714a0435-27a3-456a-a077-68d416b5c13b-etc-machine-id\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.937094 master-0 kubenswrapper[36504]: I1203 22:25:39.937066 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/714a0435-27a3-456a-a077-68d416b5c13b-etc-machine-id\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.938218 master-0 kubenswrapper[36504]: I1203 22:25:39.938188 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/714a0435-27a3-456a-a077-68d416b5c13b-logs\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.939853 master-0 
kubenswrapper[36504]: I1203 22:25:39.939818 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7997d6f544-ljwhp"] Dec 03 22:25:39.941683 master-0 kubenswrapper[36504]: I1203 22:25:39.941641 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data-custom\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.941879 master-0 kubenswrapper[36504]: I1203 22:25:39.941849 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-combined-ca-bundle\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.942042 master-0 kubenswrapper[36504]: I1203 22:25:39.941849 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:39.942365 master-0 kubenswrapper[36504]: I1203 22:25:39.942340 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-scripts\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:40.081762 master-0 kubenswrapper[36504]: I1203 22:25:40.081690 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdkh7\" (UniqueName: \"kubernetes.io/projected/714a0435-27a3-456a-a077-68d416b5c13b-kube-api-access-tdkh7\") pod \"cinder-baebb-api-0\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:40.173947 master-0 kubenswrapper[36504]: I1203 22:25:40.173797 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-baebb-api-0" Dec 03 22:25:40.400426 master-0 kubenswrapper[36504]: I1203 22:25:40.400192 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-volume-lvm-iscsi-0"] Dec 03 22:25:40.436998 master-0 kubenswrapper[36504]: I1203 22:25:40.436885 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-scheduler-0"] Dec 03 22:25:40.480482 master-0 kubenswrapper[36504]: W1203 22:25:40.480409 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2f5cca7_d1b3_4572_859e_e77f1f4055ae.slice/crio-a8b586c8dbc5c8b590a9f361c39ac6759f1c5d32ba63bafb55bb749cb3d438e0 WatchSource:0}: Error finding container a8b586c8dbc5c8b590a9f361c39ac6759f1c5d32ba63bafb55bb749cb3d438e0: Status 404 returned error can't find the container with id a8b586c8dbc5c8b590a9f361c39ac6759f1c5d32ba63bafb55bb749cb3d438e0 Dec 03 22:25:40.777071 master-0 kubenswrapper[36504]: I1203 22:25:40.774599 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7997d6f544-ljwhp" event={"ID":"1293e094-4f1e-4c59-8050-7eeb11695c40","Type":"ContainerStarted","Data":"33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc"} Dec 03 22:25:40.777071 master-0 kubenswrapper[36504]: I1203 22:25:40.774670 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7997d6f544-ljwhp" event={"ID":"1293e094-4f1e-4c59-8050-7eeb11695c40","Type":"ContainerStarted","Data":"f2d1b8b582c171b83578d67685e800fe9005b2e52d696f8a8225e597cd69883d"} Dec 03 22:25:40.780869 master-0 kubenswrapper[36504]: I1203 22:25:40.780806 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-scheduler-0" event={"ID":"c2f5cca7-d1b3-4572-859e-e77f1f4055ae","Type":"ContainerStarted","Data":"a8b586c8dbc5c8b590a9f361c39ac6759f1c5d32ba63bafb55bb749cb3d438e0"} Dec 03 22:25:40.786602 master-0 kubenswrapper[36504]: I1203 22:25:40.786462 36504 generic.go:334] "Generic (PLEG): container finished" podID="158ee096-45b2-4588-9b52-8658e75aaac8" containerID="2fe29d5266fdc23c93fab90fc41def15aeb55d3f1c28b5a77d51fb436ec4604b" exitCode=0 Dec 03 22:25:40.786898 master-0 kubenswrapper[36504]: I1203 22:25:40.786674 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" event={"ID":"158ee096-45b2-4588-9b52-8658e75aaac8","Type":"ContainerDied","Data":"2fe29d5266fdc23c93fab90fc41def15aeb55d3f1c28b5a77d51fb436ec4604b"} Dec 03 22:25:40.786898 master-0 kubenswrapper[36504]: I1203 22:25:40.786724 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" event={"ID":"158ee096-45b2-4588-9b52-8658e75aaac8","Type":"ContainerStarted","Data":"2665a983b61c17fdd04ab39947fb0473f2dcb5af3c1de47a9dbf296425966206"} Dec 03 22:25:40.800462 master-0 kubenswrapper[36504]: I1203 22:25:40.800274 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7864d79866-k7zqf" event={"ID":"23fa164c-aba3-4c36-bdd4-7f952b7d3a24","Type":"ContainerStarted","Data":"a06b232bec7178dc6a10be16584c76afac4cf8d7c1622c0187e8c70ee227b2c2"} Dec 03 22:25:40.811292 master-0 kubenswrapper[36504]: I1203 22:25:40.811132 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" event={"ID":"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9","Type":"ContainerStarted","Data":"7bc64eda8806bf2918604bcdfe66e50baa76f25955032db3a5388426ddad5db3"} Dec 03 
22:25:40.848112 master-0 kubenswrapper[36504]: I1203 22:25:40.847136 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6bf95cb78d-jt2z5" event={"ID":"2c4fd6ea-11cf-4738-9abb-c9c604e217bb","Type":"ContainerStarted","Data":"be13b0464b360300aea2a665346938e7bf45f51f3a4d9f7f724948ffb8158157"} Dec 03 22:25:40.848112 master-0 kubenswrapper[36504]: I1203 22:25:40.847203 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:25:40.864699 master-0 kubenswrapper[36504]: I1203 22:25:40.864588 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-backup-0"] Dec 03 22:25:40.896389 master-0 kubenswrapper[36504]: I1203 22:25:40.893188 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5759b6fcdf-jh9hb"] Dec 03 22:25:41.388655 master-0 kubenswrapper[36504]: I1203 22:25:41.388599 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:41.625661 master-0 kubenswrapper[36504]: I1203 22:25:41.621657 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6bf95cb78d-jt2z5" podStartSLOduration=13.621627154 podStartE2EDuration="13.621627154s" podCreationTimestamp="2025-12-03 22:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:40.924356689 +0000 UTC m=+906.144128716" watchObservedRunningTime="2025-12-03 22:25:41.621627154 +0000 UTC m=+906.841399161" Dec 03 22:25:41.651350 master-0 kubenswrapper[36504]: I1203 22:25:41.650755 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-svc\") pod \"158ee096-45b2-4588-9b52-8658e75aaac8\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " Dec 03 22:25:41.651350 master-0 kubenswrapper[36504]: I1203 22:25:41.651198 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-nb\") pod \"158ee096-45b2-4588-9b52-8658e75aaac8\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " Dec 03 22:25:41.651350 master-0 kubenswrapper[36504]: I1203 22:25:41.651351 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dn75\" (UniqueName: \"kubernetes.io/projected/158ee096-45b2-4588-9b52-8658e75aaac8-kube-api-access-8dn75\") pod \"158ee096-45b2-4588-9b52-8658e75aaac8\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " Dec 03 22:25:41.651350 master-0 kubenswrapper[36504]: I1203 22:25:41.651383 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-sb\") pod \"158ee096-45b2-4588-9b52-8658e75aaac8\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " Dec 03 22:25:41.651985 master-0 kubenswrapper[36504]: I1203 22:25:41.651436 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-edpm\") pod \"158ee096-45b2-4588-9b52-8658e75aaac8\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " Dec 03 22:25:41.651985 master-0 kubenswrapper[36504]: I1203 22:25:41.651609 36504 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-config\") pod \"158ee096-45b2-4588-9b52-8658e75aaac8\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " Dec 03 22:25:41.651985 master-0 kubenswrapper[36504]: I1203 22:25:41.651695 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-swift-storage-0\") pod \"158ee096-45b2-4588-9b52-8658e75aaac8\" (UID: \"158ee096-45b2-4588-9b52-8658e75aaac8\") " Dec 03 22:25:41.667796 master-0 kubenswrapper[36504]: I1203 22:25:41.658931 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-api-0"] Dec 03 22:25:41.683958 master-0 kubenswrapper[36504]: I1203 22:25:41.676075 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158ee096-45b2-4588-9b52-8658e75aaac8-kube-api-access-8dn75" (OuterVolumeSpecName: "kube-api-access-8dn75") pod "158ee096-45b2-4588-9b52-8658e75aaac8" (UID: "158ee096-45b2-4588-9b52-8658e75aaac8"). InnerVolumeSpecName "kube-api-access-8dn75". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:41.697917 master-0 kubenswrapper[36504]: I1203 22:25:41.697793 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "158ee096-45b2-4588-9b52-8658e75aaac8" (UID: "158ee096-45b2-4588-9b52-8658e75aaac8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:41.716174 master-0 kubenswrapper[36504]: I1203 22:25:41.716056 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-edpm" (OuterVolumeSpecName: "edpm") pod "158ee096-45b2-4588-9b52-8658e75aaac8" (UID: "158ee096-45b2-4588-9b52-8658e75aaac8"). InnerVolumeSpecName "edpm". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:41.716966 master-0 kubenswrapper[36504]: I1203 22:25:41.716916 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-config" (OuterVolumeSpecName: "config") pod "158ee096-45b2-4588-9b52-8658e75aaac8" (UID: "158ee096-45b2-4588-9b52-8658e75aaac8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:41.723670 master-0 kubenswrapper[36504]: I1203 22:25:41.721726 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "158ee096-45b2-4588-9b52-8658e75aaac8" (UID: "158ee096-45b2-4588-9b52-8658e75aaac8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:41.740018 master-0 kubenswrapper[36504]: I1203 22:25:41.739882 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "158ee096-45b2-4588-9b52-8658e75aaac8" (UID: "158ee096-45b2-4588-9b52-8658e75aaac8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:41.747910 master-0 kubenswrapper[36504]: I1203 22:25:41.747833 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "158ee096-45b2-4588-9b52-8658e75aaac8" (UID: "158ee096-45b2-4588-9b52-8658e75aaac8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:41.757029 master-0 kubenswrapper[36504]: I1203 22:25:41.756816 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:41.757029 master-0 kubenswrapper[36504]: I1203 22:25:41.756877 36504 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:41.757029 master-0 kubenswrapper[36504]: I1203 22:25:41.756889 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:41.757029 master-0 kubenswrapper[36504]: I1203 22:25:41.756917 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:41.757029 master-0 kubenswrapper[36504]: I1203 22:25:41.756931 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dn75\" (UniqueName: \"kubernetes.io/projected/158ee096-45b2-4588-9b52-8658e75aaac8-kube-api-access-8dn75\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:41.757029 master-0 kubenswrapper[36504]: I1203 22:25:41.756940 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:41.757029 master-0 kubenswrapper[36504]: I1203 22:25:41.756948 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/158ee096-45b2-4588-9b52-8658e75aaac8-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:41.873569 master-0 kubenswrapper[36504]: I1203 22:25:41.873303 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-backup-0" event={"ID":"23860378-b004-4ef4-9a8b-847fd6277fd8","Type":"ContainerStarted","Data":"ba5a5c91ed9eb8a2d6d4e2f7102d983f17bb3d36300e51fb997e4e0bb9d6bd5f"} Dec 03 22:25:41.912842 master-0 kubenswrapper[36504]: I1203 22:25:41.901190 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-api-0" event={"ID":"714a0435-27a3-456a-a077-68d416b5c13b","Type":"ContainerStarted","Data":"546d0218c498ddd80089585eebe0e4ce008275746da2b4fed47a0830797acf93"} Dec 03 22:25:41.941404 master-0 kubenswrapper[36504]: I1203 22:25:41.941288 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" event={"ID":"158ee096-45b2-4588-9b52-8658e75aaac8","Type":"ContainerDied","Data":"2665a983b61c17fdd04ab39947fb0473f2dcb5af3c1de47a9dbf296425966206"} Dec 03 22:25:41.941404 master-0 kubenswrapper[36504]: I1203 22:25:41.941397 
36504 scope.go:117] "RemoveContainer" containerID="2fe29d5266fdc23c93fab90fc41def15aeb55d3f1c28b5a77d51fb436ec4604b" Dec 03 22:25:41.942586 master-0 kubenswrapper[36504]: I1203 22:25:41.942215 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d798fdcb9-r9nhg" Dec 03 22:25:41.971266 master-0 kubenswrapper[36504]: I1203 22:25:41.971134 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7864d79866-k7zqf" event={"ID":"23fa164c-aba3-4c36-bdd4-7f952b7d3a24","Type":"ContainerStarted","Data":"c4fb9b52c7c822b1c1ca1822262c82bc4ce5513a06f5812c168bf1df9c6fba13"} Dec 03 22:25:41.971266 master-0 kubenswrapper[36504]: I1203 22:25:41.971278 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:41.971811 master-0 kubenswrapper[36504]: I1203 22:25:41.971749 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:41.988280 master-0 kubenswrapper[36504]: I1203 22:25:41.988202 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" event={"ID":"d7bf8267-d5ba-4e78-b1aa-d73517d74730","Type":"ContainerStarted","Data":"14c9e6117142ac738f00be96830ac4be84aafafa5ba4ee95ab41ba1647abae62"} Dec 03 22:25:41.988593 master-0 kubenswrapper[36504]: I1203 22:25:41.988315 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" event={"ID":"d7bf8267-d5ba-4e78-b1aa-d73517d74730","Type":"ContainerStarted","Data":"734107ca3c92d21497bb97dec6aea2fb61825d02af3113cf7ab38a6c14b820a7"} Dec 03 22:25:41.997134 master-0 kubenswrapper[36504]: I1203 22:25:41.997061 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7997d6f544-ljwhp" event={"ID":"1293e094-4f1e-4c59-8050-7eeb11695c40","Type":"ContainerStarted","Data":"e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a"} Dec 03 22:25:41.997853 master-0 kubenswrapper[36504]: I1203 22:25:41.997806 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:42.015494 master-0 kubenswrapper[36504]: I1203 22:25:42.015337 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7864d79866-k7zqf" podStartSLOduration=14.015291666 podStartE2EDuration="14.015291666s" podCreationTimestamp="2025-12-03 22:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:42.012523589 +0000 UTC m=+907.232295616" watchObservedRunningTime="2025-12-03 22:25:42.015291666 +0000 UTC m=+907.235063673" Dec 03 22:25:42.171069 master-0 kubenswrapper[36504]: I1203 22:25:42.170622 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7997d6f544-ljwhp" podStartSLOduration=4.170595398 podStartE2EDuration="4.170595398s" podCreationTimestamp="2025-12-03 22:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:42.123347263 +0000 UTC m=+907.343119290" watchObservedRunningTime="2025-12-03 22:25:42.170595398 +0000 UTC m=+907.390367405" Dec 03 22:25:42.312958 master-0 kubenswrapper[36504]: I1203 22:25:42.312887 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d798fdcb9-r9nhg"] Dec 
03 22:25:42.342855 master-0 kubenswrapper[36504]: I1203 22:25:42.341340 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d798fdcb9-r9nhg"] Dec 03 22:25:43.018933 master-0 kubenswrapper[36504]: I1203 22:25:43.018734 36504 generic.go:334] "Generic (PLEG): container finished" podID="d7bf8267-d5ba-4e78-b1aa-d73517d74730" containerID="14c9e6117142ac738f00be96830ac4be84aafafa5ba4ee95ab41ba1647abae62" exitCode=0 Dec 03 22:25:43.018933 master-0 kubenswrapper[36504]: I1203 22:25:43.018812 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" event={"ID":"d7bf8267-d5ba-4e78-b1aa-d73517d74730","Type":"ContainerDied","Data":"14c9e6117142ac738f00be96830ac4be84aafafa5ba4ee95ab41ba1647abae62"} Dec 03 22:25:43.024163 master-0 kubenswrapper[36504]: I1203 22:25:43.024098 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-api-0" event={"ID":"714a0435-27a3-456a-a077-68d416b5c13b","Type":"ContainerStarted","Data":"b9b2e21bd1bd504aa0b1694fc7751c5a55b3c72b0be0d9037ba726bdff600006"} Dec 03 22:25:43.024530 master-0 kubenswrapper[36504]: I1203 22:25:43.024505 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:43.118716 master-0 kubenswrapper[36504]: I1203 22:25:43.118623 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158ee096-45b2-4588-9b52-8658e75aaac8" path="/var/lib/kubelet/pods/158ee096-45b2-4588-9b52-8658e75aaac8/volumes" Dec 03 22:25:43.708666 master-0 kubenswrapper[36504]: I1203 22:25:43.708593 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-baebb-api-0"] Dec 03 22:25:45.169570 master-0 kubenswrapper[36504]: I1203 22:25:45.169338 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:45.169570 master-0 kubenswrapper[36504]: I1203 22:25:45.169409 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" event={"ID":"d7bf8267-d5ba-4e78-b1aa-d73517d74730","Type":"ContainerStarted","Data":"c205221b927124c9b9a8263ebb568182ce030dfce8c7d6b4a27fa1fdafc43a70"} Dec 03 22:25:45.169570 master-0 kubenswrapper[36504]: I1203 22:25:45.169438 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65c678856f-xzvss" event={"ID":"3f0fd30e-728d-4479-a6ed-70fd35fc92ee","Type":"ContainerStarted","Data":"3e5b313b4300a81d657aaf81caaac86c1207dd525ae30de6fc105daa7185cb57"} Dec 03 22:25:45.169570 master-0 kubenswrapper[36504]: I1203 22:25:45.169459 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86689bd958-gshr9" event={"ID":"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9","Type":"ContainerStarted","Data":"81e686cb7f2ec0399226d901b850df563453ade355664f12f8987780c1d02e98"} Dec 03 22:25:45.169570 master-0 kubenswrapper[36504]: I1203 22:25:45.169479 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" event={"ID":"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9","Type":"ContainerStarted","Data":"046a26ca3f61540f96df2e42fa7563b7fb36949f5dffe5ecbdbe48f047f7a926"} Dec 03 22:25:45.409783 master-0 kubenswrapper[36504]: I1203 22:25:45.408196 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" podStartSLOduration=6.408139442 podStartE2EDuration="6.408139442s" podCreationTimestamp="2025-12-03 22:25:39 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:45.385380527 +0000 UTC m=+910.605152534" watchObservedRunningTime="2025-12-03 22:25:45.408139442 +0000 UTC m=+910.627911449" Dec 03 22:25:45.691682 master-0 kubenswrapper[36504]: I1203 22:25:45.691480 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:25:46.185021 master-0 kubenswrapper[36504]: I1203 22:25:46.184682 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-api-0" event={"ID":"714a0435-27a3-456a-a077-68d416b5c13b","Type":"ContainerStarted","Data":"37f6fafea42a3e5c4b7cee66cae7793fad02da0ce672ab6460e1735aa0125412"} Dec 03 22:25:46.185900 master-0 kubenswrapper[36504]: I1203 22:25:46.185580 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-baebb-api-0" podUID="714a0435-27a3-456a-a077-68d416b5c13b" containerName="cinder-baebb-api-log" containerID="cri-o://b9b2e21bd1bd504aa0b1694fc7751c5a55b3c72b0be0d9037ba726bdff600006" gracePeriod=30 Dec 03 22:25:46.185900 master-0 kubenswrapper[36504]: I1203 22:25:46.185622 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-baebb-api-0" Dec 03 22:25:46.185900 master-0 kubenswrapper[36504]: I1203 22:25:46.185721 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-baebb-api-0" podUID="714a0435-27a3-456a-a077-68d416b5c13b" containerName="cinder-api" containerID="cri-o://37f6fafea42a3e5c4b7cee66cae7793fad02da0ce672ab6460e1735aa0125412" gracePeriod=30 Dec 03 22:25:46.188922 master-0 kubenswrapper[36504]: I1203 22:25:46.188812 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-scheduler-0" event={"ID":"c2f5cca7-d1b3-4572-859e-e77f1f4055ae","Type":"ContainerStarted","Data":"980eabc1cd233ce4cb2f3e940e5ee016977ff58fffe1e9d3c0848840a2d4345b"} Dec 03 22:25:46.194708 master-0 kubenswrapper[36504]: I1203 22:25:46.193376 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-backup-0" event={"ID":"23860378-b004-4ef4-9a8b-847fd6277fd8","Type":"ContainerStarted","Data":"75e9521003bd0fd877234e008e36945cca4473dbcbdbf6d9cd59ca7b9a4cd952"} Dec 03 22:25:46.296389 master-0 kubenswrapper[36504]: I1203 22:25:46.294690 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-baebb-api-0" podStartSLOduration=7.294665295 podStartE2EDuration="7.294665295s" podCreationTimestamp="2025-12-03 22:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:46.292961841 +0000 UTC m=+911.512733858" watchObservedRunningTime="2025-12-03 22:25:46.294665295 +0000 UTC m=+911.514437302" Dec 03 22:25:46.950807 master-0 kubenswrapper[36504]: I1203 22:25:46.950106 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56565586f6-z78xs"] Dec 03 22:25:46.951149 master-0 kubenswrapper[36504]: E1203 22:25:46.951025 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158ee096-45b2-4588-9b52-8658e75aaac8" containerName="init" Dec 03 22:25:46.951149 master-0 kubenswrapper[36504]: I1203 22:25:46.951048 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="158ee096-45b2-4588-9b52-8658e75aaac8" containerName="init" Dec 03 22:25:46.953229 master-0 kubenswrapper[36504]: 
I1203 22:25:46.951455 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="158ee096-45b2-4588-9b52-8658e75aaac8" containerName="init" Dec 03 22:25:46.953552 master-0 kubenswrapper[36504]: I1203 22:25:46.953504 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:46.959689 master-0 kubenswrapper[36504]: I1203 22:25:46.959102 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 22:25:46.990918 master-0 kubenswrapper[36504]: I1203 22:25:46.990823 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56565586f6-z78xs"] Dec 03 22:25:46.992710 master-0 kubenswrapper[36504]: I1203 22:25:46.992513 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 22:25:47.087586 master-0 kubenswrapper[36504]: I1203 22:25:47.087502 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-public-tls-certs\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.087586 master-0 kubenswrapper[36504]: I1203 22:25:47.087593 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-logs\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.088022 master-0 kubenswrapper[36504]: I1203 22:25:47.087736 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-internal-tls-certs\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.088022 master-0 kubenswrapper[36504]: I1203 22:25:47.087895 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-config-data-custom\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.088022 master-0 kubenswrapper[36504]: I1203 22:25:47.087919 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-config-data\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.088022 master-0 kubenswrapper[36504]: I1203 22:25:47.087951 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72mzn\" (UniqueName: \"kubernetes.io/projected/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-kube-api-access-72mzn\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.088022 master-0 kubenswrapper[36504]: I1203 22:25:47.087993 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-combined-ca-bundle\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.192040 master-0 kubenswrapper[36504]: I1203 22:25:47.191980 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-internal-tls-certs\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.192940 master-0 kubenswrapper[36504]: I1203 22:25:47.192913 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-config-data-custom\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.193015 master-0 kubenswrapper[36504]: I1203 22:25:47.192969 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-config-data\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.193056 master-0 kubenswrapper[36504]: I1203 22:25:47.193030 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72mzn\" (UniqueName: \"kubernetes.io/projected/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-kube-api-access-72mzn\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.193896 master-0 kubenswrapper[36504]: I1203 22:25:47.193105 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-combined-ca-bundle\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.194081 master-0 kubenswrapper[36504]: I1203 22:25:47.194042 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-public-tls-certs\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.194135 master-0 kubenswrapper[36504]: I1203 22:25:47.194094 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-logs\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.196897 master-0 kubenswrapper[36504]: I1203 22:25:47.196858 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-logs\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 
22:25:47.197432 master-0 kubenswrapper[36504]: I1203 22:25:47.197383 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-config-data-custom\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.198111 master-0 kubenswrapper[36504]: I1203 22:25:47.198059 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-internal-tls-certs\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.199511 master-0 kubenswrapper[36504]: I1203 22:25:47.199474 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-config-data\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.202518 master-0 kubenswrapper[36504]: I1203 22:25:47.202462 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-combined-ca-bundle\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.208509 master-0 kubenswrapper[36504]: I1203 22:25:47.208434 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-public-tls-certs\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.215649 master-0 kubenswrapper[36504]: I1203 22:25:47.215571 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72mzn\" (UniqueName: \"kubernetes.io/projected/edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad-kube-api-access-72mzn\") pod \"barbican-api-56565586f6-z78xs\" (UID: \"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad\") " pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:47.229246 master-0 kubenswrapper[36504]: I1203 22:25:47.229188 36504 generic.go:334] "Generic (PLEG): container finished" podID="714a0435-27a3-456a-a077-68d416b5c13b" containerID="37f6fafea42a3e5c4b7cee66cae7793fad02da0ce672ab6460e1735aa0125412" exitCode=0 Dec 03 22:25:47.229246 master-0 kubenswrapper[36504]: I1203 22:25:47.229238 36504 generic.go:334] "Generic (PLEG): container finished" podID="714a0435-27a3-456a-a077-68d416b5c13b" containerID="b9b2e21bd1bd504aa0b1694fc7751c5a55b3c72b0be0d9037ba726bdff600006" exitCode=143 Dec 03 22:25:47.229457 master-0 kubenswrapper[36504]: I1203 22:25:47.229276 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-api-0" event={"ID":"714a0435-27a3-456a-a077-68d416b5c13b","Type":"ContainerDied","Data":"37f6fafea42a3e5c4b7cee66cae7793fad02da0ce672ab6460e1735aa0125412"} Dec 03 22:25:47.230699 master-0 kubenswrapper[36504]: I1203 22:25:47.230669 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-api-0" 
event={"ID":"714a0435-27a3-456a-a077-68d416b5c13b","Type":"ContainerDied","Data":"b9b2e21bd1bd504aa0b1694fc7751c5a55b3c72b0be0d9037ba726bdff600006"} Dec 03 22:25:47.331710 master-0 kubenswrapper[36504]: I1203 22:25:47.330984 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:49.717118 master-0 kubenswrapper[36504]: I1203 22:25:49.717027 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:49.793996 master-0 kubenswrapper[36504]: I1203 22:25:49.793920 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:25:49.996868 master-0 kubenswrapper[36504]: I1203 22:25:49.996746 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b7f767947-jf44q"] Dec 03 22:25:49.997298 master-0 kubenswrapper[36504]: I1203 22:25:49.997213 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b7f767947-jf44q" podUID="4214a09f-9cb8-4d17-9a41-369838f0eedc" containerName="dnsmasq-dns" containerID="cri-o://320c354237b8f5025d220d84753f9b6d65c75e6860e9b6ff615aed0d058e959d" gracePeriod=10 Dec 03 22:25:50.183598 master-0 kubenswrapper[36504]: I1203 22:25:50.183472 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-baebb-api-0" podUID="714a0435-27a3-456a-a077-68d416b5c13b" containerName="cinder-api" probeResult="failure" output="Get \"http://10.128.0.245:8776/healthcheck\": dial tcp 10.128.0.245:8776: connect: connection refused" Dec 03 22:25:50.258532 master-0 kubenswrapper[36504]: I1203 22:25:50.258346 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:50.266538 master-0 kubenswrapper[36504]: I1203 22:25:50.266476 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:25:50.344895 master-0 kubenswrapper[36504]: I1203 22:25:50.343509 36504 generic.go:334] "Generic (PLEG): container finished" podID="4214a09f-9cb8-4d17-9a41-369838f0eedc" containerID="320c354237b8f5025d220d84753f9b6d65c75e6860e9b6ff615aed0d058e959d" exitCode=0 Dec 03 22:25:50.344895 master-0 kubenswrapper[36504]: I1203 22:25:50.343609 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7f767947-jf44q" event={"ID":"4214a09f-9cb8-4d17-9a41-369838f0eedc","Type":"ContainerDied","Data":"320c354237b8f5025d220d84753f9b6d65c75e6860e9b6ff615aed0d058e959d"} Dec 03 22:25:52.407071 master-0 kubenswrapper[36504]: I1203 22:25:52.406967 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-86689bd958-gshr9" event={"ID":"ef686a34-c2bd-4a3c-9e9b-39b4318afbb9","Type":"ContainerStarted","Data":"019c409f6536b28d7a5d6161579e6d1801b654656cc67006009d35b08d6a02d7"} Dec 03 22:25:52.463403 master-0 kubenswrapper[36504]: I1203 22:25:52.463064 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-86689bd958-gshr9" podStartSLOduration=11.186590597 podStartE2EDuration="15.463032924s" podCreationTimestamp="2025-12-03 22:25:37 +0000 UTC" firstStartedPulling="2025-12-03 22:25:39.456541746 +0000 UTC m=+904.676313743" lastFinishedPulling="2025-12-03 22:25:43.732984063 +0000 UTC m=+908.952756070" observedRunningTime="2025-12-03 22:25:52.446947238 +0000 UTC 
m=+917.666719255" watchObservedRunningTime="2025-12-03 22:25:52.463032924 +0000 UTC m=+917.682804931" Dec 03 22:25:53.072011 master-0 kubenswrapper[36504]: I1203 22:25:53.071929 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.101426 master-0 kubenswrapper[36504]: I1203 22:25:53.101362 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:53.237226 master-0 kubenswrapper[36504]: I1203 22:25:53.236271 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56f6c98b97-zjmbp" Dec 03 22:25:53.241616 master-0 kubenswrapper[36504]: I1203 22:25:53.241557 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-config\") pod \"4214a09f-9cb8-4d17-9a41-369838f0eedc\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " Dec 03 22:25:53.241900 master-0 kubenswrapper[36504]: I1203 22:25:53.241808 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data-custom\") pod \"714a0435-27a3-456a-a077-68d416b5c13b\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " Dec 03 22:25:53.241900 master-0 kubenswrapper[36504]: I1203 22:25:53.241855 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-nb\") pod \"4214a09f-9cb8-4d17-9a41-369838f0eedc\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " Dec 03 22:25:53.241988 master-0 kubenswrapper[36504]: I1203 22:25:53.241948 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-svc\") pod \"4214a09f-9cb8-4d17-9a41-369838f0eedc\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " Dec 03 22:25:53.242031 master-0 kubenswrapper[36504]: I1203 22:25:53.242003 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data\") pod \"714a0435-27a3-456a-a077-68d416b5c13b\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " Dec 03 22:25:53.242107 master-0 kubenswrapper[36504]: I1203 22:25:53.242089 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdkh7\" (UniqueName: \"kubernetes.io/projected/714a0435-27a3-456a-a077-68d416b5c13b-kube-api-access-tdkh7\") pod \"714a0435-27a3-456a-a077-68d416b5c13b\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " Dec 03 22:25:53.242158 master-0 kubenswrapper[36504]: I1203 22:25:53.242140 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/714a0435-27a3-456a-a077-68d416b5c13b-etc-machine-id\") pod \"714a0435-27a3-456a-a077-68d416b5c13b\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " Dec 03 22:25:53.242198 master-0 kubenswrapper[36504]: I1203 22:25:53.242172 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-sb\") pod 
\"4214a09f-9cb8-4d17-9a41-369838f0eedc\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " Dec 03 22:25:53.242198 master-0 kubenswrapper[36504]: I1203 22:25:53.242191 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-combined-ca-bundle\") pod \"714a0435-27a3-456a-a077-68d416b5c13b\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " Dec 03 22:25:53.242268 master-0 kubenswrapper[36504]: I1203 22:25:53.242220 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-edpm\") pod \"4214a09f-9cb8-4d17-9a41-369838f0eedc\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " Dec 03 22:25:53.242347 master-0 kubenswrapper[36504]: I1203 22:25:53.242327 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rflp\" (UniqueName: \"kubernetes.io/projected/4214a09f-9cb8-4d17-9a41-369838f0eedc-kube-api-access-7rflp\") pod \"4214a09f-9cb8-4d17-9a41-369838f0eedc\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " Dec 03 22:25:53.242512 master-0 kubenswrapper[36504]: I1203 22:25:53.242484 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-scripts\") pod \"714a0435-27a3-456a-a077-68d416b5c13b\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " Dec 03 22:25:53.242558 master-0 kubenswrapper[36504]: I1203 22:25:53.242523 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/714a0435-27a3-456a-a077-68d416b5c13b-logs\") pod \"714a0435-27a3-456a-a077-68d416b5c13b\" (UID: \"714a0435-27a3-456a-a077-68d416b5c13b\") " Dec 03 22:25:53.242612 master-0 kubenswrapper[36504]: I1203 22:25:53.242581 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-swift-storage-0\") pod \"4214a09f-9cb8-4d17-9a41-369838f0eedc\" (UID: \"4214a09f-9cb8-4d17-9a41-369838f0eedc\") " Dec 03 22:25:53.246143 master-0 kubenswrapper[36504]: I1203 22:25:53.246106 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/714a0435-27a3-456a-a077-68d416b5c13b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "714a0435-27a3-456a-a077-68d416b5c13b" (UID: "714a0435-27a3-456a-a077-68d416b5c13b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:25:53.247994 master-0 kubenswrapper[36504]: I1203 22:25:53.247952 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714a0435-27a3-456a-a077-68d416b5c13b-kube-api-access-tdkh7" (OuterVolumeSpecName: "kube-api-access-tdkh7") pod "714a0435-27a3-456a-a077-68d416b5c13b" (UID: "714a0435-27a3-456a-a077-68d416b5c13b"). InnerVolumeSpecName "kube-api-access-tdkh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:53.249454 master-0 kubenswrapper[36504]: I1203 22:25:53.249350 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdkh7\" (UniqueName: \"kubernetes.io/projected/714a0435-27a3-456a-a077-68d416b5c13b-kube-api-access-tdkh7\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.249454 master-0 kubenswrapper[36504]: I1203 22:25:53.249410 36504 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/714a0435-27a3-456a-a077-68d416b5c13b-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.249454 master-0 kubenswrapper[36504]: I1203 22:25:53.249350 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714a0435-27a3-456a-a077-68d416b5c13b-logs" (OuterVolumeSpecName: "logs") pod "714a0435-27a3-456a-a077-68d416b5c13b" (UID: "714a0435-27a3-456a-a077-68d416b5c13b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:25:53.266816 master-0 kubenswrapper[36504]: I1203 22:25:53.266048 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-scripts" (OuterVolumeSpecName: "scripts") pod "714a0435-27a3-456a-a077-68d416b5c13b" (UID: "714a0435-27a3-456a-a077-68d416b5c13b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:53.266816 master-0 kubenswrapper[36504]: I1203 22:25:53.266689 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "714a0435-27a3-456a-a077-68d416b5c13b" (UID: "714a0435-27a3-456a-a077-68d416b5c13b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:53.274871 master-0 kubenswrapper[36504]: I1203 22:25:53.274636 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4214a09f-9cb8-4d17-9a41-369838f0eedc-kube-api-access-7rflp" (OuterVolumeSpecName: "kube-api-access-7rflp") pod "4214a09f-9cb8-4d17-9a41-369838f0eedc" (UID: "4214a09f-9cb8-4d17-9a41-369838f0eedc"). InnerVolumeSpecName "kube-api-access-7rflp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:53.348834 master-0 kubenswrapper[36504]: I1203 22:25:53.348674 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-edpm" (OuterVolumeSpecName: "edpm") pod "4214a09f-9cb8-4d17-9a41-369838f0eedc" (UID: "4214a09f-9cb8-4d17-9a41-369838f0eedc"). InnerVolumeSpecName "edpm". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:53.352599 master-0 kubenswrapper[36504]: I1203 22:25:53.352496 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.352599 master-0 kubenswrapper[36504]: I1203 22:25:53.352580 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.352599 master-0 kubenswrapper[36504]: I1203 22:25:53.352598 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rflp\" (UniqueName: \"kubernetes.io/projected/4214a09f-9cb8-4d17-9a41-369838f0eedc-kube-api-access-7rflp\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.352599 master-0 kubenswrapper[36504]: I1203 22:25:53.352612 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.352979 master-0 kubenswrapper[36504]: I1203 22:25:53.352622 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/714a0435-27a3-456a-a077-68d416b5c13b-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.435108 master-0 kubenswrapper[36504]: I1203 22:25:53.420049 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data" (OuterVolumeSpecName: "config-data") pod "714a0435-27a3-456a-a077-68d416b5c13b" (UID: "714a0435-27a3-456a-a077-68d416b5c13b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:53.455717 master-0 kubenswrapper[36504]: I1203 22:25:53.441390 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56565586f6-z78xs"] Dec 03 22:25:53.469435 master-0 kubenswrapper[36504]: I1203 22:25:53.466556 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.487398 master-0 kubenswrapper[36504]: I1203 22:25:53.480337 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "714a0435-27a3-456a-a077-68d416b5c13b" (UID: "714a0435-27a3-456a-a077-68d416b5c13b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:53.494415 master-0 kubenswrapper[36504]: I1203 22:25:53.493268 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.501742 master-0 kubenswrapper[36504]: I1203 22:25:53.492237 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-api-0" event={"ID":"714a0435-27a3-456a-a077-68d416b5c13b","Type":"ContainerDied","Data":"546d0218c498ddd80089585eebe0e4ce008275746da2b4fed47a0830797acf93"} Dec 03 22:25:53.502136 master-0 kubenswrapper[36504]: I1203 22:25:53.501748 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64488669cb-b8cqj"] Dec 03 22:25:53.502136 master-0 kubenswrapper[36504]: I1203 22:25:53.501814 36504 scope.go:117] "RemoveContainer" containerID="37f6fafea42a3e5c4b7cee66cae7793fad02da0ce672ab6460e1735aa0125412" Dec 03 22:25:53.502433 master-0 kubenswrapper[36504]: I1203 22:25:53.502392 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64488669cb-b8cqj" podUID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerName="neutron-api" containerID="cri-o://92b857429875a99f53e495dd31dfab68a737dd9585a62fb08ccbd9d6dff299e2" gracePeriod=30 Dec 03 22:25:53.502599 master-0 kubenswrapper[36504]: I1203 22:25:53.502495 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-64488669cb-b8cqj" podUID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerName="neutron-httpd" containerID="cri-o://fe175fa8193f4ae74569fd34bf60c3a220d9fe564ee1c661e695748d46ef3a36" gracePeriod=30 Dec 03 22:25:53.508926 master-0 kubenswrapper[36504]: I1203 22:25:53.508885 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b7f767947-jf44q" event={"ID":"4214a09f-9cb8-4d17-9a41-369838f0eedc","Type":"ContainerDied","Data":"e0dc1317629462445b8dc30902570d0d46ced1d896ed967cd3b8f1628525eb34"} Dec 03 22:25:53.509029 master-0 kubenswrapper[36504]: I1203 22:25:53.509011 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b7f767947-jf44q" Dec 03 22:25:53.531086 master-0 kubenswrapper[36504]: W1203 22:25:53.529091 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedf6204b_c1ee_4d11_8ab6_0e7d58ab11ad.slice/crio-522cf6f8e65c9deb6289e07ee21ebdd9da64e77bc60722b1f1cd6defb89e1e9e WatchSource:0}: Error finding container 522cf6f8e65c9deb6289e07ee21ebdd9da64e77bc60722b1f1cd6defb89e1e9e: Status 404 returned error can't find the container with id 522cf6f8e65c9deb6289e07ee21ebdd9da64e77bc60722b1f1cd6defb89e1e9e Dec 03 22:25:53.532375 master-0 kubenswrapper[36504]: I1203 22:25:53.531951 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65c678856f-xzvss" event={"ID":"3f0fd30e-728d-4479-a6ed-70fd35fc92ee","Type":"ContainerStarted","Data":"fae283b5b3a4387257593f8bb306937324d6dac7846c5b4c79af4a51cdeec8ed"} Dec 03 22:25:53.568366 master-0 kubenswrapper[36504]: I1203 22:25:53.568315 36504 scope.go:117] "RemoveContainer" containerID="b9b2e21bd1bd504aa0b1694fc7751c5a55b3c72b0be0d9037ba726bdff600006" Dec 03 22:25:53.573764 master-0 kubenswrapper[36504]: I1203 22:25:53.573697 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/714a0435-27a3-456a-a077-68d416b5c13b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.595150 master-0 kubenswrapper[36504]: I1203 22:25:53.593703 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4214a09f-9cb8-4d17-9a41-369838f0eedc" (UID: "4214a09f-9cb8-4d17-9a41-369838f0eedc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:53.615939 master-0 kubenswrapper[36504]: I1203 22:25:53.606736 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4214a09f-9cb8-4d17-9a41-369838f0eedc" (UID: "4214a09f-9cb8-4d17-9a41-369838f0eedc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:53.663038 master-0 kubenswrapper[36504]: I1203 22:25:53.651250 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65c678856f-xzvss" podStartSLOduration=12.166447614 podStartE2EDuration="16.651224828s" podCreationTimestamp="2025-12-03 22:25:37 +0000 UTC" firstStartedPulling="2025-12-03 22:25:39.368166889 +0000 UTC m=+904.587938886" lastFinishedPulling="2025-12-03 22:25:43.852944093 +0000 UTC m=+909.072716100" observedRunningTime="2025-12-03 22:25:53.587717222 +0000 UTC m=+918.807489249" watchObservedRunningTime="2025-12-03 22:25:53.651224828 +0000 UTC m=+918.870996835" Dec 03 22:25:53.692700 master-0 kubenswrapper[36504]: I1203 22:25:53.692624 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.692700 master-0 kubenswrapper[36504]: I1203 22:25:53.692684 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:53.702781 master-0 kubenswrapper[36504]: I1203 22:25:53.702686 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-baebb-api-0"] Dec 03 22:25:53.721571 master-0 kubenswrapper[36504]: I1203 22:25:53.721439 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-baebb-api-0"] Dec 03 22:25:53.794305 master-0 kubenswrapper[36504]: I1203 22:25:53.792577 36504 scope.go:117] "RemoveContainer" containerID="320c354237b8f5025d220d84753f9b6d65c75e6860e9b6ff615aed0d058e959d" Dec 03 22:25:53.851322 master-0 kubenswrapper[36504]: I1203 22:25:53.851236 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-baebb-api-0"] Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: E1203 22:25:53.852124 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4214a09f-9cb8-4d17-9a41-369838f0eedc" containerName="init" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: I1203 22:25:53.852153 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="4214a09f-9cb8-4d17-9a41-369838f0eedc" containerName="init" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: E1203 22:25:53.852171 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714a0435-27a3-456a-a077-68d416b5c13b" containerName="cinder-api" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: I1203 22:25:53.852182 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="714a0435-27a3-456a-a077-68d416b5c13b" containerName="cinder-api" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: E1203 22:25:53.852224 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714a0435-27a3-456a-a077-68d416b5c13b" containerName="cinder-baebb-api-log" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: I1203 22:25:53.852234 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="714a0435-27a3-456a-a077-68d416b5c13b" containerName="cinder-baebb-api-log" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: E1203 22:25:53.852277 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4214a09f-9cb8-4d17-9a41-369838f0eedc" containerName="dnsmasq-dns" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: I1203 22:25:53.852285 36504 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4214a09f-9cb8-4d17-9a41-369838f0eedc" containerName="dnsmasq-dns" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: I1203 22:25:53.852701 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="4214a09f-9cb8-4d17-9a41-369838f0eedc" containerName="dnsmasq-dns" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: I1203 22:25:53.852755 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="714a0435-27a3-456a-a077-68d416b5c13b" containerName="cinder-baebb-api-log" Dec 03 22:25:53.853040 master-0 kubenswrapper[36504]: I1203 22:25:53.852807 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="714a0435-27a3-456a-a077-68d416b5c13b" containerName="cinder-api" Dec 03 22:25:53.855102 master-0 kubenswrapper[36504]: I1203 22:25:53.855032 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.859654 master-0 kubenswrapper[36504]: I1203 22:25:53.859470 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 03 22:25:53.859983 master-0 kubenswrapper[36504]: I1203 22:25:53.859919 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-api-config-data" Dec 03 22:25:53.862364 master-0 kubenswrapper[36504]: I1203 22:25:53.861634 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 03 22:25:53.898820 master-0 kubenswrapper[36504]: I1203 22:25:53.898601 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-combined-ca-bundle\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.899347 master-0 kubenswrapper[36504]: I1203 22:25:53.899320 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlshk\" (UniqueName: \"kubernetes.io/projected/51d0278a-4441-4eb7-b71e-a5a067fc0f76-kube-api-access-nlshk\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.899517 master-0 kubenswrapper[36504]: I1203 22:25:53.899497 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-scripts\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.899834 master-0 kubenswrapper[36504]: I1203 22:25:53.899734 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-config-data-custom\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.900017 master-0 kubenswrapper[36504]: I1203 22:25:53.899946 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51d0278a-4441-4eb7-b71e-a5a067fc0f76-logs\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.900291 master-0 kubenswrapper[36504]: I1203 22:25:53.900257 36504 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d0278a-4441-4eb7-b71e-a5a067fc0f76-etc-machine-id\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.900356 master-0 kubenswrapper[36504]: I1203 22:25:53.900303 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-config-data\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.900837 master-0 kubenswrapper[36504]: I1203 22:25:53.900793 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-internal-tls-certs\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.900910 master-0 kubenswrapper[36504]: I1203 22:25:53.900876 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-public-tls-certs\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:53.977803 master-0 kubenswrapper[36504]: I1203 22:25:53.954645 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-api-0"] Dec 03 22:25:53.977803 master-0 kubenswrapper[36504]: I1203 22:25:53.954902 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4214a09f-9cb8-4d17-9a41-369838f0eedc" (UID: "4214a09f-9cb8-4d17-9a41-369838f0eedc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:54.026033 master-0 kubenswrapper[36504]: I1203 22:25:54.025848 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-config-data-custom\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.026033 master-0 kubenswrapper[36504]: I1203 22:25:54.026028 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51d0278a-4441-4eb7-b71e-a5a067fc0f76-logs\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.026525 master-0 kubenswrapper[36504]: I1203 22:25:54.026204 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-config-data\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.026525 master-0 kubenswrapper[36504]: I1203 22:25:54.026251 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d0278a-4441-4eb7-b71e-a5a067fc0f76-etc-machine-id\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.026724 master-0 kubenswrapper[36504]: I1203 22:25:54.026687 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-internal-tls-certs\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.026815 master-0 kubenswrapper[36504]: I1203 22:25:54.026742 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-public-tls-certs\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.026913 master-0 kubenswrapper[36504]: I1203 22:25:54.026887 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-combined-ca-bundle\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.026987 master-0 kubenswrapper[36504]: I1203 22:25:54.026970 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlshk\" (UniqueName: \"kubernetes.io/projected/51d0278a-4441-4eb7-b71e-a5a067fc0f76-kube-api-access-nlshk\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.027061 master-0 kubenswrapper[36504]: I1203 22:25:54.027020 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-scripts\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.027761 master-0 kubenswrapper[36504]: I1203 
22:25:54.027334 36504 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:54.046337 master-0 kubenswrapper[36504]: I1203 22:25:54.036958 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51d0278a-4441-4eb7-b71e-a5a067fc0f76-etc-machine-id\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.046337 master-0 kubenswrapper[36504]: I1203 22:25:54.037433 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51d0278a-4441-4eb7-b71e-a5a067fc0f76-logs\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.056710 master-0 kubenswrapper[36504]: I1203 22:25:54.056630 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-scripts\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.069701 master-0 kubenswrapper[36504]: I1203 22:25:54.069637 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-public-tls-certs\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.070031 master-0 kubenswrapper[36504]: I1203 22:25:54.069641 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlshk\" (UniqueName: \"kubernetes.io/projected/51d0278a-4441-4eb7-b71e-a5a067fc0f76-kube-api-access-nlshk\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.073279 master-0 kubenswrapper[36504]: I1203 22:25:54.073210 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-config-data-custom\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.076228 master-0 kubenswrapper[36504]: I1203 22:25:54.076170 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-config-data\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.084690 master-0 kubenswrapper[36504]: I1203 22:25:54.084613 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-internal-tls-certs\") pod \"cinder-baebb-api-0\" (UID: \"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.088688 master-0 kubenswrapper[36504]: I1203 22:25:54.088630 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51d0278a-4441-4eb7-b71e-a5a067fc0f76-combined-ca-bundle\") pod \"cinder-baebb-api-0\" (UID: 
\"51d0278a-4441-4eb7-b71e-a5a067fc0f76\") " pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.155456 master-0 kubenswrapper[36504]: I1203 22:25:54.155377 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-config" (OuterVolumeSpecName: "config") pod "4214a09f-9cb8-4d17-9a41-369838f0eedc" (UID: "4214a09f-9cb8-4d17-9a41-369838f0eedc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:54.160999 master-0 kubenswrapper[36504]: I1203 22:25:54.160647 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4214a09f-9cb8-4d17-9a41-369838f0eedc" (UID: "4214a09f-9cb8-4d17-9a41-369838f0eedc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:25:54.180898 master-0 kubenswrapper[36504]: E1203 22:25:54.167214 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bf27d66_22d6_4ae2_8b10_41eba3e48294.slice/crio-conmon-fe175fa8193f4ae74569fd34bf60c3a220d9fe564ee1c661e695748d46ef3a36.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:25:54.233573 master-0 kubenswrapper[36504]: I1203 22:25:54.233278 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:54.233573 master-0 kubenswrapper[36504]: I1203 22:25:54.233345 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4214a09f-9cb8-4d17-9a41-369838f0eedc-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:54.370922 master-0 kubenswrapper[36504]: I1203 22:25:54.370878 36504 scope.go:117] "RemoveContainer" containerID="8e1cbbcd94b5bd5b44674846b4116d0c3e8c98bd508594f298ee41cbbd08e75b" Dec 03 22:25:54.559799 master-0 kubenswrapper[36504]: I1203 22:25:54.558591 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-baebb-api-0" Dec 03 22:25:54.563868 master-0 kubenswrapper[36504]: I1203 22:25:54.562069 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-backup-0" event={"ID":"23860378-b004-4ef4-9a8b-847fd6277fd8","Type":"ContainerStarted","Data":"1b3a9d23e30459db4271a30e0bcde374db3a3e8413db043d91601740ebe1418d"} Dec 03 22:25:54.574030 master-0 kubenswrapper[36504]: I1203 22:25:54.572703 36504 generic.go:334] "Generic (PLEG): container finished" podID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerID="fe175fa8193f4ae74569fd34bf60c3a220d9fe564ee1c661e695748d46ef3a36" exitCode=0 Dec 03 22:25:54.577395 master-0 kubenswrapper[36504]: I1203 22:25:54.574898 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64488669cb-b8cqj" event={"ID":"2bf27d66-22d6-4ae2-8b10-41eba3e48294","Type":"ContainerDied","Data":"fe175fa8193f4ae74569fd34bf60c3a220d9fe564ee1c661e695748d46ef3a36"} Dec 03 22:25:54.592237 master-0 kubenswrapper[36504]: I1203 22:25:54.591092 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56565586f6-z78xs" event={"ID":"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad","Type":"ContainerStarted","Data":"522cf6f8e65c9deb6289e07ee21ebdd9da64e77bc60722b1f1cd6defb89e1e9e"} Dec 03 22:25:54.597127 master-0 kubenswrapper[36504]: I1203 22:25:54.597051 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" event={"ID":"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9","Type":"ContainerStarted","Data":"aaf9860bf25290f04e05b9c2785e9cbcdd96be5128e0defad975e07c649cabc3"} Dec 03 22:25:54.653166 master-0 kubenswrapper[36504]: I1203 22:25:54.652823 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b7f767947-jf44q"] Dec 03 22:25:54.679186 master-0 kubenswrapper[36504]: I1203 22:25:54.679083 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b7f767947-jf44q"] Dec 03 22:25:54.704924 master-0 kubenswrapper[36504]: I1203 22:25:54.704571 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-baebb-backup-0" podStartSLOduration=13.78743272 podStartE2EDuration="16.704532633s" podCreationTimestamp="2025-12-03 22:25:38 +0000 UTC" firstStartedPulling="2025-12-03 22:25:40.95015157 +0000 UTC m=+906.169923577" lastFinishedPulling="2025-12-03 22:25:43.867251483 +0000 UTC m=+909.087023490" observedRunningTime="2025-12-03 22:25:54.698457542 +0000 UTC m=+919.918229569" watchObservedRunningTime="2025-12-03 22:25:54.704532633 +0000 UTC m=+919.924304640" Dec 03 22:25:54.711340 master-0 kubenswrapper[36504]: I1203 22:25:54.711142 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-baebb-backup-0" Dec 03 22:25:54.721877 master-0 kubenswrapper[36504]: I1203 22:25:54.719234 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-baebb-backup-0" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerName="cinder-backup" probeResult="failure" output="Get \"http://10.128.0.243:8080/\": dial tcp 10.128.0.243:8080: connect: connection refused" Dec 03 22:25:54.763861 master-0 kubenswrapper[36504]: I1203 22:25:54.763708 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" podStartSLOduration=13.502478244 podStartE2EDuration="16.763676402s" podCreationTimestamp="2025-12-03 22:25:38 +0000 UTC" firstStartedPulling="2025-12-03 22:25:40.47162845 +0000 
UTC m=+905.691400457" lastFinishedPulling="2025-12-03 22:25:43.732826608 +0000 UTC m=+908.952598615" observedRunningTime="2025-12-03 22:25:54.75216824 +0000 UTC m=+919.971940247" watchObservedRunningTime="2025-12-03 22:25:54.763676402 +0000 UTC m=+919.983448409" Dec 03 22:25:55.221596 master-0 kubenswrapper[36504]: I1203 22:25:55.216918 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4214a09f-9cb8-4d17-9a41-369838f0eedc" path="/var/lib/kubelet/pods/4214a09f-9cb8-4d17-9a41-369838f0eedc/volumes" Dec 03 22:25:55.221596 master-0 kubenswrapper[36504]: I1203 22:25:55.217809 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="714a0435-27a3-456a-a077-68d416b5c13b" path="/var/lib/kubelet/pods/714a0435-27a3-456a-a077-68d416b5c13b/volumes" Dec 03 22:25:55.307527 master-0 kubenswrapper[36504]: W1203 22:25:55.305581 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51d0278a_4441_4eb7_b71e_a5a067fc0f76.slice/crio-682d6ed44ee9b68b7289d77bda673871d8a3bd89d3a4c90b1ac4f83a5acbb498 WatchSource:0}: Error finding container 682d6ed44ee9b68b7289d77bda673871d8a3bd89d3a4c90b1ac4f83a5acbb498: Status 404 returned error can't find the container with id 682d6ed44ee9b68b7289d77bda673871d8a3bd89d3a4c90b1ac4f83a5acbb498 Dec 03 22:25:55.307527 master-0 kubenswrapper[36504]: I1203 22:25:55.306426 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-api-0"] Dec 03 22:25:55.649274 master-0 kubenswrapper[36504]: I1203 22:25:55.649175 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-api-0" event={"ID":"51d0278a-4441-4eb7-b71e-a5a067fc0f76","Type":"ContainerStarted","Data":"682d6ed44ee9b68b7289d77bda673871d8a3bd89d3a4c90b1ac4f83a5acbb498"} Dec 03 22:25:55.665721 master-0 kubenswrapper[36504]: I1203 22:25:55.665542 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerStarted","Data":"5428d7d878d3dc119ed4c477dd759b3a4823738ebd8dcde8979efbd897e7fbb3"} Dec 03 22:25:55.666022 master-0 kubenswrapper[36504]: I1203 22:25:55.665878 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="ceilometer-central-agent" containerID="cri-o://da59aad6ab606c854f6b9ee71b211b268f994002d51927273d1f73b261fb46de" gracePeriod=30 Dec 03 22:25:55.666440 master-0 kubenswrapper[36504]: I1203 22:25:55.666381 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:25:55.667250 master-0 kubenswrapper[36504]: I1203 22:25:55.667190 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="proxy-httpd" containerID="cri-o://5428d7d878d3dc119ed4c477dd759b3a4823738ebd8dcde8979efbd897e7fbb3" gracePeriod=30 Dec 03 22:25:55.667318 master-0 kubenswrapper[36504]: I1203 22:25:55.667267 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="sg-core" containerID="cri-o://d4bed8f56fdb2f06515925b352a7197ebcca8f90a54b11d9abc7ac47d81f8ee4" gracePeriod=30 Dec 03 22:25:55.667372 master-0 kubenswrapper[36504]: I1203 22:25:55.667321 36504 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="ceilometer-notification-agent" containerID="cri-o://ccebe6789c8d277e28fc5b0704007c0ded605cc5b201be05d197ca94a05e11a5" gracePeriod=30 Dec 03 22:25:55.677089 master-0 kubenswrapper[36504]: I1203 22:25:55.677001 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56565586f6-z78xs" event={"ID":"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad","Type":"ContainerStarted","Data":"3508f41fc8039c896f0ae2922479f1c02e608ab177b3ad5a9de5eebaa2be24cf"} Dec 03 22:25:55.677089 master-0 kubenswrapper[36504]: I1203 22:25:55.677084 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56565586f6-z78xs" event={"ID":"edf6204b-c1ee-4d11-8ab6-0e7d58ab11ad","Type":"ContainerStarted","Data":"7f61e7e67f94a94a63f8acbfd36fde3a317776bdf1645d6bc755044b6e6e7b7e"} Dec 03 22:25:55.678969 master-0 kubenswrapper[36504]: I1203 22:25:55.678920 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:55.679039 master-0 kubenswrapper[36504]: I1203 22:25:55.678978 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:25:55.689581 master-0 kubenswrapper[36504]: I1203 22:25:55.689076 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-scheduler-0" event={"ID":"c2f5cca7-d1b3-4572-859e-e77f1f4055ae","Type":"ContainerStarted","Data":"625c91988e3d7a4c337f61823b2e54ed2afb765ad1bbce57532c4e4c2ca31d03"} Dec 03 22:25:56.068655 master-0 kubenswrapper[36504]: I1203 22:25:56.068513 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.219411138 podStartE2EDuration="1m9.068479661s" podCreationTimestamp="2025-12-03 22:24:47 +0000 UTC" firstStartedPulling="2025-12-03 22:24:49.651471589 +0000 UTC m=+854.871243596" lastFinishedPulling="2025-12-03 22:25:53.500540112 +0000 UTC m=+918.720312119" observedRunningTime="2025-12-03 22:25:55.987425844 +0000 UTC m=+921.207197851" watchObservedRunningTime="2025-12-03 22:25:56.068479661 +0000 UTC m=+921.288251668" Dec 03 22:25:56.184801 master-0 kubenswrapper[36504]: I1203 22:25:56.172499 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56565586f6-z78xs" podStartSLOduration=10.172398298 podStartE2EDuration="10.172398298s" podCreationTimestamp="2025-12-03 22:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:56.154373161 +0000 UTC m=+921.374145168" watchObservedRunningTime="2025-12-03 22:25:56.172398298 +0000 UTC m=+921.392170305" Dec 03 22:25:56.249044 master-0 kubenswrapper[36504]: I1203 22:25:56.248592 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-baebb-scheduler-0" podStartSLOduration=14.873604437000001 podStartE2EDuration="18.24856014s" podCreationTimestamp="2025-12-03 22:25:38 +0000 UTC" firstStartedPulling="2025-12-03 22:25:40.488095758 +0000 UTC m=+905.707867775" lastFinishedPulling="2025-12-03 22:25:43.863051471 +0000 UTC m=+909.082823478" observedRunningTime="2025-12-03 22:25:56.199927753 +0000 UTC m=+921.419699760" watchObservedRunningTime="2025-12-03 22:25:56.24856014 +0000 UTC m=+921.468332147" Dec 03 22:25:56.730748 master-0 kubenswrapper[36504]: I1203 22:25:56.730255 36504 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-baebb-api-0" event={"ID":"51d0278a-4441-4eb7-b71e-a5a067fc0f76","Type":"ContainerStarted","Data":"74faf3a34c218736fd6ae0bd936edc2c309651964a6412a265b2d586eca9f3fa"} Dec 03 22:25:56.749164 master-0 kubenswrapper[36504]: I1203 22:25:56.749094 36504 generic.go:334] "Generic (PLEG): container finished" podID="50ff2a15-1a3d-459a-80fe-440d20765350" containerID="5428d7d878d3dc119ed4c477dd759b3a4823738ebd8dcde8979efbd897e7fbb3" exitCode=0 Dec 03 22:25:56.749501 master-0 kubenswrapper[36504]: I1203 22:25:56.749487 36504 generic.go:334] "Generic (PLEG): container finished" podID="50ff2a15-1a3d-459a-80fe-440d20765350" containerID="d4bed8f56fdb2f06515925b352a7197ebcca8f90a54b11d9abc7ac47d81f8ee4" exitCode=2 Dec 03 22:25:56.749572 master-0 kubenswrapper[36504]: I1203 22:25:56.749561 36504 generic.go:334] "Generic (PLEG): container finished" podID="50ff2a15-1a3d-459a-80fe-440d20765350" containerID="da59aad6ab606c854f6b9ee71b211b268f994002d51927273d1f73b261fb46de" exitCode=0 Dec 03 22:25:56.751090 master-0 kubenswrapper[36504]: I1203 22:25:56.751067 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerDied","Data":"5428d7d878d3dc119ed4c477dd759b3a4823738ebd8dcde8979efbd897e7fbb3"} Dec 03 22:25:56.751203 master-0 kubenswrapper[36504]: I1203 22:25:56.751188 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerDied","Data":"d4bed8f56fdb2f06515925b352a7197ebcca8f90a54b11d9abc7ac47d81f8ee4"} Dec 03 22:25:56.751272 master-0 kubenswrapper[36504]: I1203 22:25:56.751260 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerDied","Data":"da59aad6ab606c854f6b9ee71b211b268f994002d51927273d1f73b261fb46de"} Dec 03 22:25:57.775311 master-0 kubenswrapper[36504]: I1203 22:25:57.775202 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-api-0" event={"ID":"51d0278a-4441-4eb7-b71e-a5a067fc0f76","Type":"ContainerStarted","Data":"3148912de49fe068ecad5e7d20f191f57a0a41e47f07948715ce830c1147927a"} Dec 03 22:25:57.777236 master-0 kubenswrapper[36504]: I1203 22:25:57.777153 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-baebb-api-0" Dec 03 22:25:57.938050 master-0 kubenswrapper[36504]: I1203 22:25:57.937335 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-baebb-api-0" podStartSLOduration=4.93675617 podStartE2EDuration="4.93675617s" podCreationTimestamp="2025-12-03 22:25:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:25:57.933674463 +0000 UTC m=+923.153446460" watchObservedRunningTime="2025-12-03 22:25:57.93675617 +0000 UTC m=+923.156528177" Dec 03 22:25:58.794369 master-0 kubenswrapper[36504]: I1203 22:25:58.794137 36504 generic.go:334] "Generic (PLEG): container finished" podID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerID="92b857429875a99f53e495dd31dfab68a737dd9585a62fb08ccbd9d6dff299e2" exitCode=0 Dec 03 22:25:58.795147 master-0 kubenswrapper[36504]: I1203 22:25:58.794761 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64488669cb-b8cqj" 
event={"ID":"2bf27d66-22d6-4ae2-8b10-41eba3e48294","Type":"ContainerDied","Data":"92b857429875a99f53e495dd31dfab68a737dd9585a62fb08ccbd9d6dff299e2"} Dec 03 22:25:58.988866 master-0 kubenswrapper[36504]: I1203 22:25:58.988694 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:59.127330 master-0 kubenswrapper[36504]: I1203 22:25:59.127248 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-combined-ca-bundle\") pod \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " Dec 03 22:25:59.127684 master-0 kubenswrapper[36504]: I1203 22:25:59.127506 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-httpd-config\") pod \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " Dec 03 22:25:59.127684 master-0 kubenswrapper[36504]: I1203 22:25:59.127672 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjcff\" (UniqueName: \"kubernetes.io/projected/2bf27d66-22d6-4ae2-8b10-41eba3e48294-kube-api-access-bjcff\") pod \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " Dec 03 22:25:59.128005 master-0 kubenswrapper[36504]: I1203 22:25:59.127977 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-ovndb-tls-certs\") pod \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " Dec 03 22:25:59.128052 master-0 kubenswrapper[36504]: I1203 22:25:59.128016 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-config\") pod \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\" (UID: \"2bf27d66-22d6-4ae2-8b10-41eba3e48294\") " Dec 03 22:25:59.133672 master-0 kubenswrapper[36504]: I1203 22:25:59.133587 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf27d66-22d6-4ae2-8b10-41eba3e48294-kube-api-access-bjcff" (OuterVolumeSpecName: "kube-api-access-bjcff") pod "2bf27d66-22d6-4ae2-8b10-41eba3e48294" (UID: "2bf27d66-22d6-4ae2-8b10-41eba3e48294"). InnerVolumeSpecName "kube-api-access-bjcff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:25:59.146309 master-0 kubenswrapper[36504]: I1203 22:25:59.146233 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2bf27d66-22d6-4ae2-8b10-41eba3e48294" (UID: "2bf27d66-22d6-4ae2-8b10-41eba3e48294"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:59.201053 master-0 kubenswrapper[36504]: I1203 22:25:59.200965 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-config" (OuterVolumeSpecName: "config") pod "2bf27d66-22d6-4ae2-8b10-41eba3e48294" (UID: "2bf27d66-22d6-4ae2-8b10-41eba3e48294"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:59.203719 master-0 kubenswrapper[36504]: I1203 22:25:59.203653 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bf27d66-22d6-4ae2-8b10-41eba3e48294" (UID: "2bf27d66-22d6-4ae2-8b10-41eba3e48294"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:59.233039 master-0 kubenswrapper[36504]: I1203 22:25:59.232882 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:59.233039 master-0 kubenswrapper[36504]: I1203 22:25:59.232936 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:59.233039 master-0 kubenswrapper[36504]: I1203 22:25:59.232948 36504 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-httpd-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:59.233039 master-0 kubenswrapper[36504]: I1203 22:25:59.232961 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjcff\" (UniqueName: \"kubernetes.io/projected/2bf27d66-22d6-4ae2-8b10-41eba3e48294-kube-api-access-bjcff\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:59.240881 master-0 kubenswrapper[36504]: I1203 22:25:59.240676 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2bf27d66-22d6-4ae2-8b10-41eba3e48294" (UID: "2bf27d66-22d6-4ae2-8b10-41eba3e48294"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:25:59.337215 master-0 kubenswrapper[36504]: I1203 22:25:59.337028 36504 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2bf27d66-22d6-4ae2-8b10-41eba3e48294-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:25:59.407820 master-0 kubenswrapper[36504]: I1203 22:25:59.407657 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:59.452891 master-0 kubenswrapper[36504]: I1203 22:25:59.452491 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:59.702311 master-0 kubenswrapper[36504]: I1203 22:25:59.702143 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:25:59.741856 master-0 kubenswrapper[36504]: I1203 22:25:59.741758 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:25:59.809603 master-0 kubenswrapper[36504]: I1203 22:25:59.809501 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-64488669cb-b8cqj" event={"ID":"2bf27d66-22d6-4ae2-8b10-41eba3e48294","Type":"ContainerDied","Data":"e217d04d35e47042a150331523c569b66b183fb96f40b9d4bca7018443d2d386"} Dec 03 22:25:59.809603 master-0 kubenswrapper[36504]: I1203 22:25:59.809580 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-64488669cb-b8cqj" Dec 03 22:25:59.810381 master-0 kubenswrapper[36504]: I1203 22:25:59.809629 36504 scope.go:117] "RemoveContainer" containerID="fe175fa8193f4ae74569fd34bf60c3a220d9fe564ee1c661e695748d46ef3a36" Dec 03 22:25:59.852189 master-0 kubenswrapper[36504]: I1203 22:25:59.852113 36504 scope.go:117] "RemoveContainer" containerID="92b857429875a99f53e495dd31dfab68a737dd9585a62fb08ccbd9d6dff299e2" Dec 03 22:25:59.891344 master-0 kubenswrapper[36504]: I1203 22:25:59.891252 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-baebb-scheduler-0"] Dec 03 22:25:59.987203 master-0 kubenswrapper[36504]: I1203 22:25:59.986948 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-baebb-volume-lvm-iscsi-0"] Dec 03 22:26:00.043798 master-0 kubenswrapper[36504]: I1203 22:26:00.024898 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-64488669cb-b8cqj"] Dec 03 22:26:00.057593 master-0 kubenswrapper[36504]: I1203 22:26:00.055944 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-64488669cb-b8cqj"] Dec 03 22:26:00.088597 master-0 kubenswrapper[36504]: I1203 22:26:00.088507 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:00.240937 master-0 kubenswrapper[36504]: I1203 22:26:00.240699 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-baebb-backup-0"] Dec 03 22:26:00.855796 master-0 kubenswrapper[36504]: I1203 22:26:00.855698 36504 generic.go:334] "Generic (PLEG): container finished" podID="50ff2a15-1a3d-459a-80fe-440d20765350" containerID="ccebe6789c8d277e28fc5b0704007c0ded605cc5b201be05d197ca94a05e11a5" exitCode=0 Dec 03 22:26:00.856688 master-0 kubenswrapper[36504]: I1203 22:26:00.856059 36504 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-baebb-volume-lvm-iscsi-0" podUID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerName="cinder-volume" containerID="cri-o://046a26ca3f61540f96df2e42fa7563b7fb36949f5dffe5ecbdbe48f047f7a926" gracePeriod=30 Dec 03 22:26:00.856688 master-0 kubenswrapper[36504]: I1203 22:26:00.856488 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerDied","Data":"ccebe6789c8d277e28fc5b0704007c0ded605cc5b201be05d197ca94a05e11a5"} Dec 03 22:26:00.856688 master-0 kubenswrapper[36504]: I1203 22:26:00.856679 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-baebb-backup-0" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerName="cinder-backup" containerID="cri-o://75e9521003bd0fd877234e008e36945cca4473dbcbdbf6d9cd59ca7b9a4cd952" gracePeriod=30 Dec 03 22:26:00.856900 master-0 kubenswrapper[36504]: I1203 22:26:00.856870 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-baebb-scheduler-0" podUID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerName="cinder-scheduler" containerID="cri-o://980eabc1cd233ce4cb2f3e940e5ee016977ff58fffe1e9d3c0848840a2d4345b" gracePeriod=30 Dec 03 22:26:00.857504 master-0 kubenswrapper[36504]: I1203 22:26:00.857416 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" podUID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerName="probe" containerID="cri-o://aaf9860bf25290f04e05b9c2785e9cbcdd96be5128e0defad975e07c649cabc3" gracePeriod=30 Dec 03 22:26:00.857504 master-0 kubenswrapper[36504]: I1203 22:26:00.857496 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-baebb-backup-0" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerName="probe" containerID="cri-o://1b3a9d23e30459db4271a30e0bcde374db3a3e8413db043d91601740ebe1418d" gracePeriod=30 Dec 03 22:26:00.857629 master-0 kubenswrapper[36504]: I1203 22:26:00.857551 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-baebb-scheduler-0" podUID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerName="probe" containerID="cri-o://625c91988e3d7a4c337f61823b2e54ed2afb765ad1bbce57532c4e4c2ca31d03" gracePeriod=30 Dec 03 22:26:01.187372 master-0 kubenswrapper[36504]: I1203 22:26:01.187147 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" path="/var/lib/kubelet/pods/2bf27d66-22d6-4ae2-8b10-41eba3e48294/volumes" Dec 03 22:26:01.341065 master-0 kubenswrapper[36504]: I1203 22:26:01.340764 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:01.475226 master-0 kubenswrapper[36504]: I1203 22:26:01.475012 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lfb4\" (UniqueName: \"kubernetes.io/projected/50ff2a15-1a3d-459a-80fe-440d20765350-kube-api-access-5lfb4\") pod \"50ff2a15-1a3d-459a-80fe-440d20765350\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " Dec 03 22:26:01.475226 master-0 kubenswrapper[36504]: I1203 22:26:01.475099 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-sg-core-conf-yaml\") pod \"50ff2a15-1a3d-459a-80fe-440d20765350\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " Dec 03 22:26:01.475584 master-0 kubenswrapper[36504]: I1203 22:26:01.475423 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-run-httpd\") pod \"50ff2a15-1a3d-459a-80fe-440d20765350\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " Dec 03 22:26:01.475584 master-0 kubenswrapper[36504]: I1203 22:26:01.475536 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-scripts\") pod \"50ff2a15-1a3d-459a-80fe-440d20765350\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " Dec 03 22:26:01.475677 master-0 kubenswrapper[36504]: I1203 22:26:01.475618 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-log-httpd\") pod \"50ff2a15-1a3d-459a-80fe-440d20765350\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " Dec 03 22:26:01.475824 master-0 kubenswrapper[36504]: I1203 22:26:01.475795 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-config-data\") pod \"50ff2a15-1a3d-459a-80fe-440d20765350\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " Dec 03 22:26:01.475902 master-0 kubenswrapper[36504]: I1203 22:26:01.475840 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-combined-ca-bundle\") pod \"50ff2a15-1a3d-459a-80fe-440d20765350\" (UID: \"50ff2a15-1a3d-459a-80fe-440d20765350\") " Dec 03 22:26:01.477305 master-0 kubenswrapper[36504]: I1203 22:26:01.476512 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7864d79866-k7zqf" Dec 03 22:26:01.477305 master-0 kubenswrapper[36504]: I1203 22:26:01.476649 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "50ff2a15-1a3d-459a-80fe-440d20765350" (UID: "50ff2a15-1a3d-459a-80fe-440d20765350"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:01.477305 master-0 kubenswrapper[36504]: I1203 22:26:01.476741 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "50ff2a15-1a3d-459a-80fe-440d20765350" (UID: "50ff2a15-1a3d-459a-80fe-440d20765350"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:01.486130 master-0 kubenswrapper[36504]: I1203 22:26:01.486024 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-scripts" (OuterVolumeSpecName: "scripts") pod "50ff2a15-1a3d-459a-80fe-440d20765350" (UID: "50ff2a15-1a3d-459a-80fe-440d20765350"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:01.498043 master-0 kubenswrapper[36504]: I1203 22:26:01.497951 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ff2a15-1a3d-459a-80fe-440d20765350-kube-api-access-5lfb4" (OuterVolumeSpecName: "kube-api-access-5lfb4") pod "50ff2a15-1a3d-459a-80fe-440d20765350" (UID: "50ff2a15-1a3d-459a-80fe-440d20765350"). InnerVolumeSpecName "kube-api-access-5lfb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:01.580444 master-0 kubenswrapper[36504]: I1203 22:26:01.579415 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:01.580444 master-0 kubenswrapper[36504]: I1203 22:26:01.579475 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:01.580444 master-0 kubenswrapper[36504]: I1203 22:26:01.579520 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lfb4\" (UniqueName: \"kubernetes.io/projected/50ff2a15-1a3d-459a-80fe-440d20765350-kube-api-access-5lfb4\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:01.580444 master-0 kubenswrapper[36504]: I1203 22:26:01.579557 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/50ff2a15-1a3d-459a-80fe-440d20765350-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:01.601152 master-0 kubenswrapper[36504]: I1203 22:26:01.601058 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "50ff2a15-1a3d-459a-80fe-440d20765350" (UID: "50ff2a15-1a3d-459a-80fe-440d20765350"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:01.691401 master-0 kubenswrapper[36504]: I1203 22:26:01.690864 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:01.762985 master-0 kubenswrapper[36504]: I1203 22:26:01.762862 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-config-data" (OuterVolumeSpecName: "config-data") pod "50ff2a15-1a3d-459a-80fe-440d20765350" (UID: "50ff2a15-1a3d-459a-80fe-440d20765350"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:01.798058 master-0 kubenswrapper[36504]: I1203 22:26:01.797922 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:01.807106 master-0 kubenswrapper[36504]: I1203 22:26:01.807043 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50ff2a15-1a3d-459a-80fe-440d20765350" (UID: "50ff2a15-1a3d-459a-80fe-440d20765350"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:01.876191 master-0 kubenswrapper[36504]: I1203 22:26:01.876123 36504 generic.go:334] "Generic (PLEG): container finished" podID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerID="625c91988e3d7a4c337f61823b2e54ed2afb765ad1bbce57532c4e4c2ca31d03" exitCode=0 Dec 03 22:26:01.876665 master-0 kubenswrapper[36504]: I1203 22:26:01.876232 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-scheduler-0" event={"ID":"c2f5cca7-d1b3-4572-859e-e77f1f4055ae","Type":"ContainerDied","Data":"625c91988e3d7a4c337f61823b2e54ed2afb765ad1bbce57532c4e4c2ca31d03"} Dec 03 22:26:01.879573 master-0 kubenswrapper[36504]: I1203 22:26:01.879523 36504 generic.go:334] "Generic (PLEG): container finished" podID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerID="aaf9860bf25290f04e05b9c2785e9cbcdd96be5128e0defad975e07c649cabc3" exitCode=0 Dec 03 22:26:01.879573 master-0 kubenswrapper[36504]: I1203 22:26:01.879554 36504 generic.go:334] "Generic (PLEG): container finished" podID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerID="046a26ca3f61540f96df2e42fa7563b7fb36949f5dffe5ecbdbe48f047f7a926" exitCode=0 Dec 03 22:26:01.879673 master-0 kubenswrapper[36504]: I1203 22:26:01.879605 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" event={"ID":"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9","Type":"ContainerDied","Data":"aaf9860bf25290f04e05b9c2785e9cbcdd96be5128e0defad975e07c649cabc3"} Dec 03 22:26:01.879708 master-0 kubenswrapper[36504]: I1203 22:26:01.879686 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" event={"ID":"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9","Type":"ContainerDied","Data":"046a26ca3f61540f96df2e42fa7563b7fb36949f5dffe5ecbdbe48f047f7a926"} Dec 03 22:26:01.913814 master-0 kubenswrapper[36504]: I1203 22:26:01.911375 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"50ff2a15-1a3d-459a-80fe-440d20765350","Type":"ContainerDied","Data":"bbbc5f1def52cbb213f07748935b6e988143707bbe3c4849f4061f99ab5ee4fa"} Dec 03 22:26:01.913814 master-0 kubenswrapper[36504]: I1203 22:26:01.911476 36504 scope.go:117] "RemoveContainer" containerID="5428d7d878d3dc119ed4c477dd759b3a4823738ebd8dcde8979efbd897e7fbb3" Dec 03 22:26:01.913814 master-0 kubenswrapper[36504]: I1203 22:26:01.911812 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:01.922190 master-0 kubenswrapper[36504]: I1203 22:26:01.922104 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50ff2a15-1a3d-459a-80fe-440d20765350-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:01.971567 master-0 kubenswrapper[36504]: I1203 22:26:01.971513 36504 scope.go:117] "RemoveContainer" containerID="d4bed8f56fdb2f06515925b352a7197ebcca8f90a54b11d9abc7ac47d81f8ee4" Dec 03 22:26:02.006053 master-0 kubenswrapper[36504]: E1203 22:26:02.005894 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2f5cca7_d1b3_4572_859e_e77f1f4055ae.slice/crio-625c91988e3d7a4c337f61823b2e54ed2afb765ad1bbce57532c4e4c2ca31d03.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ac2c203_3863_4bdd_b2bb_0f5c0037cee9.slice/crio-conmon-aaf9860bf25290f04e05b9c2785e9cbcdd96be5128e0defad975e07c649cabc3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50ff2a15_1a3d_459a_80fe_440d20765350.slice/crio-bbbc5f1def52cbb213f07748935b6e988143707bbe3c4849f4061f99ab5ee4fa\": RecentStats: unable to find data in memory cache]" Dec 03 22:26:02.007289 master-0 kubenswrapper[36504]: I1203 22:26:02.007187 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:02.032079 master-0 kubenswrapper[36504]: I1203 22:26:02.031012 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.053063 36504 scope.go:117] "RemoveContainer" containerID="ccebe6789c8d277e28fc5b0704007c0ded605cc5b201be05d197ca94a05e11a5" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.055141 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: E1203 22:26:02.056117 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="ceilometer-notification-agent" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056144 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="ceilometer-notification-agent" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: E1203 22:26:02.056179 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="ceilometer-central-agent" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056187 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="ceilometer-central-agent" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: 
E1203 22:26:02.056224 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerName="neutron-httpd" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056233 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerName="neutron-httpd" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: E1203 22:26:02.056282 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="proxy-httpd" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056292 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="proxy-httpd" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: E1203 22:26:02.056316 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="sg-core" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056324 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="sg-core" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: E1203 22:26:02.056335 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerName="neutron-api" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056342 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerName="neutron-api" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056650 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="ceilometer-central-agent" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056687 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="ceilometer-notification-agent" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056710 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="proxy-httpd" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056726 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" containerName="sg-core" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056884 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerName="neutron-httpd" Dec 03 22:26:02.059840 master-0 kubenswrapper[36504]: I1203 22:26:02.056922 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf27d66-22d6-4ae2-8b10-41eba3e48294" containerName="neutron-api" Dec 03 22:26:02.062101 master-0 kubenswrapper[36504]: I1203 22:26:02.061525 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:02.070160 master-0 kubenswrapper[36504]: I1203 22:26:02.067519 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:26:02.076507 master-0 kubenswrapper[36504]: I1203 22:26:02.076450 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:26:02.094945 master-0 kubenswrapper[36504]: I1203 22:26:02.093014 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:02.154122 master-0 kubenswrapper[36504]: I1203 22:26:02.140887 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-scripts\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.154122 master-0 kubenswrapper[36504]: I1203 22:26:02.141036 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.154122 master-0 kubenswrapper[36504]: I1203 22:26:02.141068 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-log-httpd\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.154122 master-0 kubenswrapper[36504]: I1203 22:26:02.141118 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.154122 master-0 kubenswrapper[36504]: I1203 22:26:02.141269 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-config-data\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.154122 master-0 kubenswrapper[36504]: I1203 22:26:02.141411 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-run-httpd\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.154122 master-0 kubenswrapper[36504]: I1203 22:26:02.141495 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klh6k\" (UniqueName: \"kubernetes.io/projected/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-kube-api-access-klh6k\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.224496 master-0 kubenswrapper[36504]: I1203 22:26:02.218392 36504 scope.go:117] "RemoveContainer" containerID="da59aad6ab606c854f6b9ee71b211b268f994002d51927273d1f73b261fb46de" Dec 03 22:26:02.247470 master-0 kubenswrapper[36504]: I1203 22:26:02.244828 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klh6k\" (UniqueName: \"kubernetes.io/projected/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-kube-api-access-klh6k\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.247470 master-0 kubenswrapper[36504]: I1203 22:26:02.244928 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-scripts\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.247470 master-0 kubenswrapper[36504]: I1203 22:26:02.244981 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.247470 master-0 kubenswrapper[36504]: I1203 22:26:02.245002 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-log-httpd\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.247470 master-0 kubenswrapper[36504]: I1203 22:26:02.245033 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.247470 master-0 kubenswrapper[36504]: I1203 22:26:02.245316 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-config-data\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.247470 master-0 kubenswrapper[36504]: I1203 22:26:02.245388 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-run-httpd\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.247470 master-0 kubenswrapper[36504]: I1203 22:26:02.246080 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-run-httpd\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.247470 master-0 kubenswrapper[36504]: I1203 22:26:02.246510 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-log-httpd\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.250818 master-0 kubenswrapper[36504]: I1203 22:26:02.250783 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " 
pod="openstack/ceilometer-0" Dec 03 22:26:02.254593 master-0 kubenswrapper[36504]: I1203 22:26:02.253223 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-config-data\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.254593 master-0 kubenswrapper[36504]: I1203 22:26:02.253287 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-scripts\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.255530 master-0 kubenswrapper[36504]: I1203 22:26:02.255457 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.269904 master-0 kubenswrapper[36504]: I1203 22:26:02.269709 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klh6k\" (UniqueName: \"kubernetes.io/projected/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-kube-api-access-klh6k\") pod \"ceilometer-0\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " pod="openstack/ceilometer-0" Dec 03 22:26:02.372359 master-0 kubenswrapper[36504]: I1203 22:26:02.372294 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:02.475548 master-0 kubenswrapper[36504]: I1203 22:26:02.475463 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:02.579004 master-0 kubenswrapper[36504]: I1203 22:26:02.578938 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-combined-ca-bundle\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.579484 master-0 kubenswrapper[36504]: I1203 22:26:02.579462 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pr2h\" (UniqueName: \"kubernetes.io/projected/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-kube-api-access-2pr2h\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.579673 master-0 kubenswrapper[36504]: I1203 22:26:02.579655 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-cinder\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.579923 master-0 kubenswrapper[36504]: I1203 22:26:02.579824 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.579997 master-0 kubenswrapper[36504]: I1203 22:26:02.579858 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-iscsi\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.580084 master-0 kubenswrapper[36504]: I1203 22:26:02.580059 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-nvme\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.580134 master-0 kubenswrapper[36504]: I1203 22:26:02.580090 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-lib-cinder\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.580186 master-0 kubenswrapper[36504]: I1203 22:26:02.580065 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.580334 master-0 kubenswrapper[36504]: I1203 22:26:02.580202 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.582020 master-0 kubenswrapper[36504]: I1203 22:26:02.580219 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-scripts\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.582020 master-0 kubenswrapper[36504]: I1203 22:26:02.581876 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-sys\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.582020 master-0 kubenswrapper[36504]: I1203 22:26:02.581906 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-dev\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.582020 master-0 kubenswrapper[36504]: I1203 22:26:02.580240 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.582020 master-0 kubenswrapper[36504]: I1203 22:26:02.581966 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-sys" (OuterVolumeSpecName: "sys") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.582020 master-0 kubenswrapper[36504]: I1203 22:26:02.581975 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.582020 master-0 kubenswrapper[36504]: I1203 22:26:02.581939 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-machine-id\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.582281 master-0 kubenswrapper[36504]: I1203 22:26:02.582045 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-dev" (OuterVolumeSpecName: "dev") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.582351 master-0 kubenswrapper[36504]: I1203 22:26:02.582327 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-run\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.582423 master-0 kubenswrapper[36504]: I1203 22:26:02.582400 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-brick\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.582474 master-0 kubenswrapper[36504]: I1203 22:26:02.582443 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data-custom\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.582474 master-0 kubenswrapper[36504]: I1203 22:26:02.582463 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 22:26:02.582536 master-0 kubenswrapper[36504]: I1203 22:26:02.582518 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-lib-modules\") pod \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\" (UID: \"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9\") " Dec 03 
22:26:02.582610 master-0 kubenswrapper[36504]: I1203 22:26:02.582590 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-run" (OuterVolumeSpecName: "run") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.584082 master-0 kubenswrapper[36504]: I1203 22:26:02.584048 36504 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-run\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.584082 master-0 kubenswrapper[36504]: I1203 22:26:02.584074 36504 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.584202 master-0 kubenswrapper[36504]: I1203 22:26:02.584088 36504 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.584202 master-0 kubenswrapper[36504]: I1203 22:26:02.584100 36504 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-nvme\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.584202 master-0 kubenswrapper[36504]: I1203 22:26:02.584112 36504 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.584202 master-0 kubenswrapper[36504]: I1203 22:26:02.584125 36504 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-sys\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.584202 master-0 kubenswrapper[36504]: I1203 22:26:02.584140 36504 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-dev\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.584202 master-0 kubenswrapper[36504]: I1203 22:26:02.584150 36504 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.584202 master-0 kubenswrapper[36504]: I1203 22:26:02.584192 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.584412 master-0 kubenswrapper[36504]: I1203 22:26:02.584220 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:02.585465 master-0 kubenswrapper[36504]: I1203 22:26:02.585429 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-scripts" (OuterVolumeSpecName: "scripts") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:02.587467 master-0 kubenswrapper[36504]: I1203 22:26:02.587422 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:02.588461 master-0 kubenswrapper[36504]: I1203 22:26:02.588390 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-kube-api-access-2pr2h" (OuterVolumeSpecName: "kube-api-access-2pr2h") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "kube-api-access-2pr2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:02.653024 master-0 kubenswrapper[36504]: I1203 22:26:02.652906 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:02.687513 master-0 kubenswrapper[36504]: I1203 22:26:02.687444 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.687513 master-0 kubenswrapper[36504]: I1203 22:26:02.687489 36504 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.687513 master-0 kubenswrapper[36504]: I1203 22:26:02.687503 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.687513 master-0 kubenswrapper[36504]: I1203 22:26:02.687512 36504 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-lib-modules\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.687513 master-0 kubenswrapper[36504]: I1203 22:26:02.687521 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.687513 master-0 kubenswrapper[36504]: I1203 22:26:02.687531 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pr2h\" (UniqueName: \"kubernetes.io/projected/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-kube-api-access-2pr2h\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.794675 master-0 kubenswrapper[36504]: I1203 22:26:02.794611 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data" (OuterVolumeSpecName: "config-data") pod "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" (UID: "6ac2c203-3863-4bdd-b2bb-0f5c0037cee9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:02.845673 master-0 kubenswrapper[36504]: I1203 22:26:02.845492 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6bf95cb78d-jt2z5" Dec 03 22:26:02.897680 master-0 kubenswrapper[36504]: I1203 22:26:02.897591 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:02.966644 master-0 kubenswrapper[36504]: I1203 22:26:02.962911 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-backup-0" event={"ID":"23860378-b004-4ef4-9a8b-847fd6277fd8","Type":"ContainerDied","Data":"1b3a9d23e30459db4271a30e0bcde374db3a3e8413db043d91601740ebe1418d"} Dec 03 22:26:02.966644 master-0 kubenswrapper[36504]: I1203 22:26:02.957276 36504 generic.go:334] "Generic (PLEG): container finished" podID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerID="1b3a9d23e30459db4271a30e0bcde374db3a3e8413db043d91601740ebe1418d" exitCode=0 Dec 03 22:26:03.013829 master-0 kubenswrapper[36504]: I1203 22:26:03.013443 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" event={"ID":"6ac2c203-3863-4bdd-b2bb-0f5c0037cee9","Type":"ContainerDied","Data":"7bc64eda8806bf2918604bcdfe66e50baa76f25955032db3a5388426ddad5db3"} Dec 03 22:26:03.013829 master-0 kubenswrapper[36504]: I1203 22:26:03.013576 36504 scope.go:117] "RemoveContainer" containerID="aaf9860bf25290f04e05b9c2785e9cbcdd96be5128e0defad975e07c649cabc3" Dec 03 22:26:03.013829 master-0 kubenswrapper[36504]: I1203 22:26:03.013755 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.076627 master-0 kubenswrapper[36504]: I1203 22:26:03.076005 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:03.112574 master-0 kubenswrapper[36504]: I1203 22:26:03.112510 36504 scope.go:117] "RemoveContainer" containerID="046a26ca3f61540f96df2e42fa7563b7fb36949f5dffe5ecbdbe48f047f7a926" Dec 03 22:26:03.157869 master-0 kubenswrapper[36504]: W1203 22:26:03.157679 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod089feb3e_dc06_4ffc_b27d_661e6eb95f1a.slice/crio-48759c024e96c437dea4715e25b0b13e01eeefc864e663712b06f57f8952a715 WatchSource:0}: Error finding container 48759c024e96c437dea4715e25b0b13e01eeefc864e663712b06f57f8952a715: Status 404 returned error can't find the container with id 48759c024e96c437dea4715e25b0b13e01eeefc864e663712b06f57f8952a715 Dec 03 22:26:03.193819 master-0 kubenswrapper[36504]: I1203 22:26:03.175624 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ff2a15-1a3d-459a-80fe-440d20765350" path="/var/lib/kubelet/pods/50ff2a15-1a3d-459a-80fe-440d20765350/volumes" Dec 03 22:26:03.193819 master-0 kubenswrapper[36504]: I1203 22:26:03.185820 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-baebb-volume-lvm-iscsi-0"] Dec 03 22:26:03.204655 master-0 kubenswrapper[36504]: I1203 22:26:03.204379 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-baebb-volume-lvm-iscsi-0"] Dec 03 22:26:03.204655 master-0 kubenswrapper[36504]: I1203 22:26:03.204471 36504 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-baebb-volume-lvm-iscsi-0"] Dec 03 22:26:03.205273 master-0 kubenswrapper[36504]: E1203 22:26:03.205231 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerName="probe" Dec 03 22:26:03.205273 master-0 kubenswrapper[36504]: I1203 22:26:03.205256 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerName="probe" Dec 03 22:26:03.205391 master-0 kubenswrapper[36504]: E1203 22:26:03.205313 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerName="cinder-volume" Dec 03 22:26:03.205391 master-0 kubenswrapper[36504]: I1203 22:26:03.205324 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerName="cinder-volume" Dec 03 22:26:03.205736 master-0 kubenswrapper[36504]: I1203 22:26:03.205703 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerName="cinder-volume" Dec 03 22:26:03.205736 master-0 kubenswrapper[36504]: I1203 22:26:03.205725 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" containerName="probe" Dec 03 22:26:03.208459 master-0 kubenswrapper[36504]: I1203 22:26:03.207400 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-volume-lvm-iscsi-0"] Dec 03 22:26:03.208459 master-0 kubenswrapper[36504]: I1203 22:26:03.207537 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.220803 master-0 kubenswrapper[36504]: I1203 22:26:03.220728 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-volume-lvm-iscsi-config-data" Dec 03 22:26:03.311998 master-0 kubenswrapper[36504]: I1203 22:26:03.311920 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh9t5\" (UniqueName: \"kubernetes.io/projected/6f6ab462-046f-4f8a-8250-41ae4d6ace65-kube-api-access-kh9t5\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.311998 master-0 kubenswrapper[36504]: I1203 22:26:03.312010 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-var-lib-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.312687 master-0 kubenswrapper[36504]: I1203 22:26:03.312193 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-run\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.312687 master-0 kubenswrapper[36504]: I1203 22:26:03.312471 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-dev\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " 
pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.312687 master-0 kubenswrapper[36504]: I1203 22:26:03.312515 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-etc-machine-id\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.312687 master-0 kubenswrapper[36504]: I1203 22:26:03.312559 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-combined-ca-bundle\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.312687 master-0 kubenswrapper[36504]: I1203 22:26:03.312662 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-sys\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.312950 master-0 kubenswrapper[36504]: I1203 22:26:03.312701 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-var-locks-brick\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.312950 master-0 kubenswrapper[36504]: I1203 22:26:03.312813 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-etc-nvme\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.312950 master-0 kubenswrapper[36504]: I1203 22:26:03.312846 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-scripts\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.313096 master-0 kubenswrapper[36504]: I1203 22:26:03.313020 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-etc-iscsi\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.313096 master-0 kubenswrapper[36504]: I1203 22:26:03.313075 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-lib-modules\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.313185 master-0 kubenswrapper[36504]: I1203 22:26:03.313117 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-config-data\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.313185 master-0 kubenswrapper[36504]: I1203 22:26:03.313148 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-config-data-custom\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.313318 master-0 kubenswrapper[36504]: I1203 22:26:03.313270 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-var-locks-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416420 master-0 kubenswrapper[36504]: I1203 22:26:03.416313 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-sys\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416420 master-0 kubenswrapper[36504]: I1203 22:26:03.416418 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-var-locks-brick\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416917 master-0 kubenswrapper[36504]: I1203 22:26:03.416488 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-etc-nvme\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416917 master-0 kubenswrapper[36504]: I1203 22:26:03.416510 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-scripts\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416917 master-0 kubenswrapper[36504]: I1203 22:26:03.416523 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-sys\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416917 master-0 kubenswrapper[36504]: I1203 22:26:03.416614 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-etc-iscsi\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 
22:26:03.416917 master-0 kubenswrapper[36504]: I1203 22:26:03.416616 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-var-locks-brick\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416917 master-0 kubenswrapper[36504]: I1203 22:26:03.416616 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-etc-nvme\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416917 master-0 kubenswrapper[36504]: I1203 22:26:03.416558 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-etc-iscsi\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416917 master-0 kubenswrapper[36504]: I1203 22:26:03.416855 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-lib-modules\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.416917 master-0 kubenswrapper[36504]: I1203 22:26:03.416901 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-config-data\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.417198 master-0 kubenswrapper[36504]: I1203 22:26:03.416931 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-config-data-custom\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.417198 master-0 kubenswrapper[36504]: I1203 22:26:03.417149 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-var-locks-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.417288 master-0 kubenswrapper[36504]: I1203 22:26:03.417257 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh9t5\" (UniqueName: \"kubernetes.io/projected/6f6ab462-046f-4f8a-8250-41ae4d6ace65-kube-api-access-kh9t5\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.417324 master-0 kubenswrapper[36504]: I1203 22:26:03.417310 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-var-lib-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: 
\"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.417443 master-0 kubenswrapper[36504]: I1203 22:26:03.417412 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-var-locks-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.417483 master-0 kubenswrapper[36504]: I1203 22:26:03.417432 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-lib-modules\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.417515 master-0 kubenswrapper[36504]: I1203 22:26:03.417497 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-var-lib-cinder\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.417626 master-0 kubenswrapper[36504]: I1203 22:26:03.417585 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-run\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.417673 master-0 kubenswrapper[36504]: I1203 22:26:03.417644 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-run\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.418135 master-0 kubenswrapper[36504]: I1203 22:26:03.417891 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-dev\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.418135 master-0 kubenswrapper[36504]: I1203 22:26:03.417975 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-etc-machine-id\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.418135 master-0 kubenswrapper[36504]: I1203 22:26:03.418005 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-dev\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.418135 master-0 kubenswrapper[36504]: I1203 22:26:03.418072 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-combined-ca-bundle\") pod \"cinder-baebb-volume-lvm-iscsi-0\" 
(UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.418135 master-0 kubenswrapper[36504]: I1203 22:26:03.418114 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6f6ab462-046f-4f8a-8250-41ae4d6ace65-etc-machine-id\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.421429 master-0 kubenswrapper[36504]: I1203 22:26:03.421389 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-scripts\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.422208 master-0 kubenswrapper[36504]: I1203 22:26:03.422154 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-config-data-custom\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.422811 master-0 kubenswrapper[36504]: I1203 22:26:03.422729 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-config-data\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.425523 master-0 kubenswrapper[36504]: I1203 22:26:03.425483 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f6ab462-046f-4f8a-8250-41ae4d6ace65-combined-ca-bundle\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.627800 master-0 kubenswrapper[36504]: I1203 22:26:03.626746 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh9t5\" (UniqueName: \"kubernetes.io/projected/6f6ab462-046f-4f8a-8250-41ae4d6ace65-kube-api-access-kh9t5\") pod \"cinder-baebb-volume-lvm-iscsi-0\" (UID: \"6f6ab462-046f-4f8a-8250-41ae4d6ace65\") " pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:03.850836 master-0 kubenswrapper[36504]: I1203 22:26:03.850241 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:04.060899 master-0 kubenswrapper[36504]: I1203 22:26:04.057097 36504 generic.go:334] "Generic (PLEG): container finished" podID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerID="980eabc1cd233ce4cb2f3e940e5ee016977ff58fffe1e9d3c0848840a2d4345b" exitCode=0 Dec 03 22:26:04.060899 master-0 kubenswrapper[36504]: I1203 22:26:04.057216 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-scheduler-0" event={"ID":"c2f5cca7-d1b3-4572-859e-e77f1f4055ae","Type":"ContainerDied","Data":"980eabc1cd233ce4cb2f3e940e5ee016977ff58fffe1e9d3c0848840a2d4345b"} Dec 03 22:26:04.076615 master-0 kubenswrapper[36504]: I1203 22:26:04.076139 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerStarted","Data":"48759c024e96c437dea4715e25b0b13e01eeefc864e663712b06f57f8952a715"} Dec 03 22:26:04.125127 master-0 kubenswrapper[36504]: I1203 22:26:04.118853 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:26:04.181795 master-0 kubenswrapper[36504]: I1203 22:26:04.178999 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56565586f6-z78xs" Dec 03 22:26:04.319125 master-0 kubenswrapper[36504]: I1203 22:26:04.318725 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7997d6f544-ljwhp"] Dec 03 22:26:04.319520 master-0 kubenswrapper[36504]: I1203 22:26:04.319146 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7997d6f544-ljwhp" podUID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerName="barbican-api-log" containerID="cri-o://33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc" gracePeriod=30 Dec 03 22:26:04.320286 master-0 kubenswrapper[36504]: I1203 22:26:04.320087 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7997d6f544-ljwhp" podUID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerName="barbican-api" containerID="cri-o://e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a" gracePeriod=30 Dec 03 22:26:04.393190 master-0 kubenswrapper[36504]: E1203 22:26:04.392751 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1293e094_4f1e_4c59_8050_7eeb11695c40.slice/crio-33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:26:04.565805 master-0 kubenswrapper[36504]: I1203 22:26:04.565269 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:04.698153 master-0 kubenswrapper[36504]: I1203 22:26:04.697494 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data\") pod \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " Dec 03 22:26:04.698153 master-0 kubenswrapper[36504]: I1203 22:26:04.697655 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-etc-machine-id\") pod \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " Dec 03 22:26:04.719930 master-0 kubenswrapper[36504]: I1203 22:26:04.717599 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c2f5cca7-d1b3-4572-859e-e77f1f4055ae" (UID: "c2f5cca7-d1b3-4572-859e-e77f1f4055ae"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:04.719930 master-0 kubenswrapper[36504]: I1203 22:26:04.697690 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrwtv\" (UniqueName: \"kubernetes.io/projected/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-kube-api-access-vrwtv\") pod \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " Dec 03 22:26:04.720289 master-0 kubenswrapper[36504]: I1203 22:26:04.719972 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-scripts\") pod \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " Dec 03 22:26:04.720289 master-0 kubenswrapper[36504]: I1203 22:26:04.720028 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-combined-ca-bundle\") pod \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " Dec 03 22:26:04.720289 master-0 kubenswrapper[36504]: I1203 22:26:04.720263 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data-custom\") pod \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\" (UID: \"c2f5cca7-d1b3-4572-859e-e77f1f4055ae\") " Dec 03 22:26:04.725287 master-0 kubenswrapper[36504]: I1203 22:26:04.724786 36504 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:04.744318 master-0 kubenswrapper[36504]: I1203 22:26:04.744205 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-kube-api-access-vrwtv" (OuterVolumeSpecName: "kube-api-access-vrwtv") pod "c2f5cca7-d1b3-4572-859e-e77f1f4055ae" (UID: "c2f5cca7-d1b3-4572-859e-e77f1f4055ae"). InnerVolumeSpecName "kube-api-access-vrwtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:04.754174 master-0 kubenswrapper[36504]: I1203 22:26:04.752229 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-scripts" (OuterVolumeSpecName: "scripts") pod "c2f5cca7-d1b3-4572-859e-e77f1f4055ae" (UID: "c2f5cca7-d1b3-4572-859e-e77f1f4055ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:04.773878 master-0 kubenswrapper[36504]: I1203 22:26:04.773381 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c2f5cca7-d1b3-4572-859e-e77f1f4055ae" (UID: "c2f5cca7-d1b3-4572-859e-e77f1f4055ae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:04.834370 master-0 kubenswrapper[36504]: I1203 22:26:04.832753 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-volume-lvm-iscsi-0"] Dec 03 22:26:04.910437 master-0 kubenswrapper[36504]: I1203 22:26:04.910390 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:04.910577 master-0 kubenswrapper[36504]: I1203 22:26:04.910451 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:04.910577 master-0 kubenswrapper[36504]: I1203 22:26:04.910469 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrwtv\" (UniqueName: \"kubernetes.io/projected/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-kube-api-access-vrwtv\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:04.916759 master-0 kubenswrapper[36504]: I1203 22:26:04.916692 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2f5cca7-d1b3-4572-859e-e77f1f4055ae" (UID: "c2f5cca7-d1b3-4572-859e-e77f1f4055ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:05.030320 master-0 kubenswrapper[36504]: I1203 22:26:05.019728 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.038041 master-0 kubenswrapper[36504]: I1203 22:26:05.037959 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data" (OuterVolumeSpecName: "config-data") pod "c2f5cca7-d1b3-4572-859e-e77f1f4055ae" (UID: "c2f5cca7-d1b3-4572-859e-e77f1f4055ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:05.128345 master-0 kubenswrapper[36504]: I1203 22:26:05.127489 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2f5cca7-d1b3-4572-859e-e77f1f4055ae-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.143025 master-0 kubenswrapper[36504]: I1203 22:26:05.142949 36504 generic.go:334] "Generic (PLEG): container finished" podID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerID="33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc" exitCode=143 Dec 03 22:26:05.161120 master-0 kubenswrapper[36504]: I1203 22:26:05.161031 36504 generic.go:334] "Generic (PLEG): container finished" podID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerID="75e9521003bd0fd877234e008e36945cca4473dbcbdbf6d9cd59ca7b9a4cd952" exitCode=0 Dec 03 22:26:05.177173 master-0 kubenswrapper[36504]: I1203 22:26:05.177112 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.178208 master-0 kubenswrapper[36504]: I1203 22:26:05.178148 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac2c203-3863-4bdd-b2bb-0f5c0037cee9" path="/var/lib/kubelet/pods/6ac2c203-3863-4bdd-b2bb-0f5c0037cee9/volumes" Dec 03 22:26:05.179646 master-0 kubenswrapper[36504]: I1203 22:26:05.179602 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-scheduler-0" event={"ID":"c2f5cca7-d1b3-4572-859e-e77f1f4055ae","Type":"ContainerDied","Data":"a8b586c8dbc5c8b590a9f361c39ac6759f1c5d32ba63bafb55bb749cb3d438e0"} Dec 03 22:26:05.179732 master-0 kubenswrapper[36504]: I1203 22:26:05.179661 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7997d6f544-ljwhp" event={"ID":"1293e094-4f1e-4c59-8050-7eeb11695c40","Type":"ContainerDied","Data":"33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc"} Dec 03 22:26:05.179732 master-0 kubenswrapper[36504]: I1203 22:26:05.179687 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-backup-0" event={"ID":"23860378-b004-4ef4-9a8b-847fd6277fd8","Type":"ContainerDied","Data":"75e9521003bd0fd877234e008e36945cca4473dbcbdbf6d9cd59ca7b9a4cd952"} Dec 03 22:26:05.179732 master-0 kubenswrapper[36504]: I1203 22:26:05.179706 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" event={"ID":"6f6ab462-046f-4f8a-8250-41ae4d6ace65","Type":"ContainerStarted","Data":"30eabb1dafc1457f95362442e2be4ad8f0ed80310af061fe64c0795a74dd58c0"} Dec 03 22:26:05.179732 master-0 kubenswrapper[36504]: I1203 22:26:05.179722 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerStarted","Data":"39b633a884a9121e66c91ba2fc7e06cf8f415a2a1201873ac497ae4210b224a3"} Dec 03 22:26:05.179886 master-0 kubenswrapper[36504]: I1203 22:26:05.179750 36504 scope.go:117] "RemoveContainer" containerID="625c91988e3d7a4c337f61823b2e54ed2afb765ad1bbce57532c4e4c2ca31d03" Dec 03 22:26:05.307683 master-0 kubenswrapper[36504]: I1203 22:26:05.307621 36504 scope.go:117] "RemoveContainer" containerID="980eabc1cd233ce4cb2f3e940e5ee016977ff58fffe1e9d3c0848840a2d4345b" Dec 03 22:26:05.384162 master-0 kubenswrapper[36504]: I1203 22:26:05.382518 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-baebb-scheduler-0"] Dec 03 22:26:05.407073 master-0 
kubenswrapper[36504]: I1203 22:26:05.406998 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-baebb-scheduler-0"] Dec 03 22:26:05.426598 master-0 kubenswrapper[36504]: I1203 22:26:05.424140 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-baebb-scheduler-0"] Dec 03 22:26:05.426598 master-0 kubenswrapper[36504]: E1203 22:26:05.425206 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerName="probe" Dec 03 22:26:05.426598 master-0 kubenswrapper[36504]: I1203 22:26:05.425224 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerName="probe" Dec 03 22:26:05.426598 master-0 kubenswrapper[36504]: E1203 22:26:05.425261 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerName="cinder-scheduler" Dec 03 22:26:05.426598 master-0 kubenswrapper[36504]: I1203 22:26:05.425271 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerName="cinder-scheduler" Dec 03 22:26:05.431334 master-0 kubenswrapper[36504]: I1203 22:26:05.430191 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerName="cinder-scheduler" Dec 03 22:26:05.431334 master-0 kubenswrapper[36504]: I1203 22:26:05.430253 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" containerName="probe" Dec 03 22:26:05.432423 master-0 kubenswrapper[36504]: I1203 22:26:05.432069 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.436381 master-0 kubenswrapper[36504]: I1203 22:26:05.436311 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-scheduler-config-data" Dec 03 22:26:05.442887 master-0 kubenswrapper[36504]: I1203 22:26:05.442705 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-scheduler-0"] Dec 03 22:26:05.489637 master-0 kubenswrapper[36504]: I1203 22:26:05.481596 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:05.506870 master-0 kubenswrapper[36504]: I1203 22:26:05.504185 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:26:05.570763 master-0 kubenswrapper[36504]: I1203 22:26:05.570656 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-config-data-custom\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.572482 master-0 kubenswrapper[36504]: I1203 22:26:05.572400 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-scripts\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.573586 master-0 kubenswrapper[36504]: I1203 22:26:05.573288 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88h8r\" (UniqueName: \"kubernetes.io/projected/738da7ca-4034-4602-a382-696d5416843b-kube-api-access-88h8r\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.573586 master-0 kubenswrapper[36504]: I1203 22:26:05.573464 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-combined-ca-bundle\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.574564 master-0 kubenswrapper[36504]: I1203 22:26:05.574288 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-config-data\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.574564 master-0 kubenswrapper[36504]: I1203 22:26:05.574333 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/738da7ca-4034-4602-a382-696d5416843b-etc-machine-id\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.677482 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-scripts\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.677870 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-sys\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678096 36504 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-iscsi\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678208 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qv82\" (UniqueName: \"kubernetes.io/projected/23860378-b004-4ef4-9a8b-847fd6277fd8-kube-api-access-6qv82\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678251 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-brick\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678385 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-dev\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678456 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-lib-modules\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678499 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678558 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-cinder\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678601 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-run\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678656 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-combined-ca-bundle\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.678923 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-lib-cinder\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 
master-0 kubenswrapper[36504]: I1203 22:26:05.679013 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data-custom\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.679045 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-nvme\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.679075 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-machine-id\") pod \"23860378-b004-4ef4-9a8b-847fd6277fd8\" (UID: \"23860378-b004-4ef4-9a8b-847fd6277fd8\") " Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.679962 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.680143 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.680444 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-sys" (OuterVolumeSpecName: "sys") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.680523 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.680583 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.680633 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.680491 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-run" (OuterVolumeSpecName: "run") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.680599 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-dev" (OuterVolumeSpecName: "dev") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.680726 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.680816 master-0 kubenswrapper[36504]: I1203 22:26:05.680758 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 22:26:05.682022 master-0 kubenswrapper[36504]: I1203 22:26:05.681136 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-config-data\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.682022 master-0 kubenswrapper[36504]: I1203 22:26:05.681422 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/738da7ca-4034-4602-a382-696d5416843b-etc-machine-id\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.685806 master-0 kubenswrapper[36504]: I1203 22:26:05.682944 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-config-data-custom\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.685806 master-0 kubenswrapper[36504]: I1203 22:26:05.684855 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-scripts\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.685806 master-0 kubenswrapper[36504]: I1203 22:26:05.685011 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/738da7ca-4034-4602-a382-696d5416843b-etc-machine-id\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.689033 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-scripts" (OuterVolumeSpecName: "scripts") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.689980 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88h8r\" (UniqueName: \"kubernetes.io/projected/738da7ca-4034-4602-a382-696d5416843b-kube-api-access-88h8r\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.690541 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-combined-ca-bundle\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691068 36504 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-dev\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691112 36504 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-lib-modules\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691128 36504 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691142 36504 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-run\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691156 36504 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691190 36504 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691206 36504 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-nvme\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691225 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691239 36504 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-sys\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691275 36504 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691292 36504 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/23860378-b004-4ef4-9a8b-847fd6277fd8-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.692793 master-0 kubenswrapper[36504]: I1203 22:26:05.691496 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-config-data-custom\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.693361 master-0 kubenswrapper[36504]: I1203 22:26:05.692940 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:05.698797 master-0 kubenswrapper[36504]: I1203 22:26:05.696208 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-combined-ca-bundle\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.698797 master-0 kubenswrapper[36504]: I1203 22:26:05.696409 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-config-data\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.698797 master-0 kubenswrapper[36504]: I1203 22:26:05.698295 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23860378-b004-4ef4-9a8b-847fd6277fd8-kube-api-access-6qv82" (OuterVolumeSpecName: "kube-api-access-6qv82") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "kube-api-access-6qv82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:05.714807 master-0 kubenswrapper[36504]: I1203 22:26:05.712416 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/738da7ca-4034-4602-a382-696d5416843b-scripts\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.747903 master-0 kubenswrapper[36504]: I1203 22:26:05.743746 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88h8r\" (UniqueName: \"kubernetes.io/projected/738da7ca-4034-4602-a382-696d5416843b-kube-api-access-88h8r\") pod \"cinder-baebb-scheduler-0\" (UID: \"738da7ca-4034-4602-a382-696d5416843b\") " pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.796103 master-0 kubenswrapper[36504]: I1203 22:26:05.796004 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.796103 master-0 kubenswrapper[36504]: I1203 22:26:05.796072 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qv82\" (UniqueName: \"kubernetes.io/projected/23860378-b004-4ef4-9a8b-847fd6277fd8-kube-api-access-6qv82\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.841575 master-0 kubenswrapper[36504]: I1203 22:26:05.841233 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:05.893256 master-0 kubenswrapper[36504]: I1203 22:26:05.890097 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:05.899109 master-0 kubenswrapper[36504]: I1203 22:26:05.899026 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:05.972121 master-0 kubenswrapper[36504]: I1203 22:26:05.971976 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data" (OuterVolumeSpecName: "config-data") pod "23860378-b004-4ef4-9a8b-847fd6277fd8" (UID: "23860378-b004-4ef4-9a8b-847fd6277fd8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:06.002722 master-0 kubenswrapper[36504]: I1203 22:26:06.002167 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23860378-b004-4ef4-9a8b-847fd6277fd8-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:06.502795 master-0 kubenswrapper[36504]: I1203 22:26:06.492051 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-backup-0" event={"ID":"23860378-b004-4ef4-9a8b-847fd6277fd8","Type":"ContainerDied","Data":"ba5a5c91ed9eb8a2d6d4e2f7102d983f17bb3d36300e51fb997e4e0bb9d6bd5f"} Dec 03 22:26:06.502795 master-0 kubenswrapper[36504]: I1203 22:26:06.492134 36504 scope.go:117] "RemoveContainer" containerID="1b3a9d23e30459db4271a30e0bcde374db3a3e8413db043d91601740ebe1418d" Dec 03 22:26:06.502795 master-0 kubenswrapper[36504]: I1203 22:26:06.492341 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:06.827366 master-0 kubenswrapper[36504]: I1203 22:26:06.576442 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" event={"ID":"6f6ab462-046f-4f8a-8250-41ae4d6ace65","Type":"ContainerStarted","Data":"d5af7c61959c2013a8e6bb704a1240c72a81bc29f25cb6d0e539b990217b6e97"} Dec 03 22:26:06.827366 master-0 kubenswrapper[36504]: I1203 22:26:06.576527 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" event={"ID":"6f6ab462-046f-4f8a-8250-41ae4d6ace65","Type":"ContainerStarted","Data":"48131f2c81b3a745e26baa71d96b1b9252ad73627a445187c2c0555268c17a57"} Dec 03 22:26:06.827366 master-0 kubenswrapper[36504]: I1203 22:26:06.613202 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerStarted","Data":"9d1880855cae54518b13082bf80b8b1d33fd67efb0d20af3423da71e912ac9d1"} Dec 03 22:26:06.827366 master-0 kubenswrapper[36504]: I1203 22:26:06.640176 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" podStartSLOduration=3.640141714 podStartE2EDuration="3.640141714s" podCreationTimestamp="2025-12-03 22:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:06.618851174 +0000 UTC m=+931.838623211" watchObservedRunningTime="2025-12-03 22:26:06.640141714 +0000 UTC m=+931.859913721" Dec 03 22:26:06.827366 master-0 kubenswrapper[36504]: I1203 22:26:06.715159 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-scheduler-0"] Dec 03 22:26:07.063375 master-0 kubenswrapper[36504]: I1203 22:26:07.063307 36504 scope.go:117] "RemoveContainer" containerID="75e9521003bd0fd877234e008e36945cca4473dbcbdbf6d9cd59ca7b9a4cd952" Dec 03 22:26:07.091271 master-0 kubenswrapper[36504]: I1203 22:26:07.091175 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-baebb-backup-0"] Dec 03 22:26:07.186696 master-0 kubenswrapper[36504]: I1203 22:26:07.186592 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f5cca7-d1b3-4572-859e-e77f1f4055ae" path="/var/lib/kubelet/pods/c2f5cca7-d1b3-4572-859e-e77f1f4055ae/volumes" Dec 03 22:26:07.193659 master-0 kubenswrapper[36504]: I1203 22:26:07.193561 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-baebb-backup-0"] Dec 03 22:26:07.216814 master-0 kubenswrapper[36504]: I1203 22:26:07.216730 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-baebb-backup-0"] Dec 03 22:26:07.217900 master-0 kubenswrapper[36504]: E1203 22:26:07.217877 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerName="cinder-backup" Dec 03 22:26:07.217900 master-0 kubenswrapper[36504]: I1203 22:26:07.217900 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerName="cinder-backup" Dec 03 22:26:07.218016 master-0 kubenswrapper[36504]: E1203 22:26:07.217943 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerName="probe" Dec 03 22:26:07.218016 master-0 kubenswrapper[36504]: I1203 22:26:07.217950 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerName="probe" Dec 03 22:26:07.218315 master-0 kubenswrapper[36504]: I1203 22:26:07.218293 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerName="probe" Dec 03 22:26:07.218377 master-0 kubenswrapper[36504]: I1203 22:26:07.218352 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" containerName="cinder-backup" Dec 03 22:26:07.220356 master-0 kubenswrapper[36504]: I1203 22:26:07.220319 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.223154 master-0 kubenswrapper[36504]: I1203 22:26:07.223119 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-baebb-backup-config-data" Dec 03 22:26:07.248731 master-0 kubenswrapper[36504]: I1203 22:26:07.248593 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-backup-0"] Dec 03 22:26:07.274036 master-0 kubenswrapper[36504]: I1203 22:26:07.273944 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-etc-machine-id\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274385 master-0 kubenswrapper[36504]: I1203 22:26:07.274070 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-etc-iscsi\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274385 master-0 kubenswrapper[36504]: I1203 22:26:07.274102 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdtwx\" (UniqueName: \"kubernetes.io/projected/33dab541-cf59-45a8-9abf-389189f691d2-kube-api-access-qdtwx\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274385 master-0 kubenswrapper[36504]: I1203 22:26:07.274141 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-config-data-custom\") pod 
\"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274385 master-0 kubenswrapper[36504]: I1203 22:26:07.274209 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-var-locks-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274385 master-0 kubenswrapper[36504]: I1203 22:26:07.274254 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-dev\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274385 master-0 kubenswrapper[36504]: I1203 22:26:07.274286 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-run\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274385 master-0 kubenswrapper[36504]: I1203 22:26:07.274313 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-scripts\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274385 master-0 kubenswrapper[36504]: I1203 22:26:07.274330 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-lib-modules\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274385 master-0 kubenswrapper[36504]: I1203 22:26:07.274355 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-var-lib-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274722 master-0 kubenswrapper[36504]: I1203 22:26:07.274412 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-etc-nvme\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274722 master-0 kubenswrapper[36504]: I1203 22:26:07.274446 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-combined-ca-bundle\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274722 master-0 kubenswrapper[36504]: I1203 22:26:07.274476 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-config-data\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274722 master-0 kubenswrapper[36504]: I1203 22:26:07.274536 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-sys\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.274722 master-0 kubenswrapper[36504]: I1203 22:26:07.274567 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-var-locks-brick\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377503 master-0 kubenswrapper[36504]: I1203 22:26:07.377394 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-etc-iscsi\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377518 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdtwx\" (UniqueName: \"kubernetes.io/projected/33dab541-cf59-45a8-9abf-389189f691d2-kube-api-access-qdtwx\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377554 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-config-data-custom\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377606 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-var-locks-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377642 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-dev\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377666 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-run\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377687 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-scripts\") pod 
\"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377704 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-lib-modules\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377727 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-var-lib-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377789 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-etc-nvme\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377818 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-combined-ca-bundle\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377854 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-config-data\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377906 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-sys\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.377936 master-0 kubenswrapper[36504]: I1203 22:26:07.377935 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-var-locks-brick\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.378461 master-0 kubenswrapper[36504]: I1203 22:26:07.377974 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-dev\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.378461 master-0 kubenswrapper[36504]: I1203 22:26:07.378057 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-etc-machine-id\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 
22:26:07.378461 master-0 kubenswrapper[36504]: I1203 22:26:07.377999 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-etc-machine-id\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.378461 master-0 kubenswrapper[36504]: I1203 22:26:07.378115 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-run\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.378639 master-0 kubenswrapper[36504]: I1203 22:26:07.378595 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-var-locks-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.385820 master-0 kubenswrapper[36504]: I1203 22:26:07.385749 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-sys\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.385923 master-0 kubenswrapper[36504]: I1203 22:26:07.385850 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-var-locks-brick\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.385923 master-0 kubenswrapper[36504]: I1203 22:26:07.385885 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-etc-iscsi\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.385923 master-0 kubenswrapper[36504]: I1203 22:26:07.385923 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-var-lib-cinder\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.386112 master-0 kubenswrapper[36504]: I1203 22:26:07.385945 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-lib-modules\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.388476 master-0 kubenswrapper[36504]: I1203 22:26:07.388408 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/33dab541-cf59-45a8-9abf-389189f691d2-etc-nvme\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.388576 master-0 kubenswrapper[36504]: I1203 22:26:07.388493 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-scripts\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.389188 master-0 kubenswrapper[36504]: I1203 22:26:07.389163 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-config-data\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.411137 master-0 kubenswrapper[36504]: I1203 22:26:07.403286 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-config-data-custom\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.415906 master-0 kubenswrapper[36504]: I1203 22:26:07.415832 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 22:26:07.418259 master-0 kubenswrapper[36504]: I1203 22:26:07.418225 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 22:26:07.424408 master-0 kubenswrapper[36504]: I1203 22:26:07.424309 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 22:26:07.424757 master-0 kubenswrapper[36504]: I1203 22:26:07.424717 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 22:26:07.429609 master-0 kubenswrapper[36504]: I1203 22:26:07.427924 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33dab541-cf59-45a8-9abf-389189f691d2-combined-ca-bundle\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.430666 master-0 kubenswrapper[36504]: I1203 22:26:07.430600 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdtwx\" (UniqueName: \"kubernetes.io/projected/33dab541-cf59-45a8-9abf-389189f691d2-kube-api-access-qdtwx\") pod \"cinder-baebb-backup-0\" (UID: \"33dab541-cf59-45a8-9abf-389189f691d2\") " pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.479826 master-0 kubenswrapper[36504]: I1203 22:26:07.475161 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 22:26:07.480732 master-0 kubenswrapper[36504]: I1203 22:26:07.480653 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60c66f18-6a0e-47e1-8b5d-964324197521-openstack-config\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.480945 master-0 kubenswrapper[36504]: I1203 22:26:07.480729 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60c66f18-6a0e-47e1-8b5d-964324197521-openstack-config-secret\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.480945 master-0 kubenswrapper[36504]: I1203 22:26:07.480836 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c66f18-6a0e-47e1-8b5d-964324197521-combined-ca-bundle\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.481084 master-0 kubenswrapper[36504]: I1203 22:26:07.480958 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf72g\" (UniqueName: \"kubernetes.io/projected/60c66f18-6a0e-47e1-8b5d-964324197521-kube-api-access-wf72g\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.579804 master-0 kubenswrapper[36504]: I1203 22:26:07.579006 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:07.585435 master-0 kubenswrapper[36504]: I1203 22:26:07.582854 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c66f18-6a0e-47e1-8b5d-964324197521-combined-ca-bundle\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.585435 master-0 kubenswrapper[36504]: I1203 22:26:07.582975 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf72g\" (UniqueName: \"kubernetes.io/projected/60c66f18-6a0e-47e1-8b5d-964324197521-kube-api-access-wf72g\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.585435 master-0 kubenswrapper[36504]: I1203 22:26:07.583075 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60c66f18-6a0e-47e1-8b5d-964324197521-openstack-config\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.585435 master-0 kubenswrapper[36504]: I1203 22:26:07.583106 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60c66f18-6a0e-47e1-8b5d-964324197521-openstack-config-secret\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.590001 master-0 kubenswrapper[36504]: I1203 22:26:07.587964 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/60c66f18-6a0e-47e1-8b5d-964324197521-openstack-config\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.597259 master-0 kubenswrapper[36504]: I1203 22:26:07.591740 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/60c66f18-6a0e-47e1-8b5d-964324197521-openstack-config-secret\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.597259 master-0 kubenswrapper[36504]: I1203 22:26:07.592867 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60c66f18-6a0e-47e1-8b5d-964324197521-combined-ca-bundle\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " 
pod="openstack/openstackclient" Dec 03 22:26:07.615823 master-0 kubenswrapper[36504]: I1203 22:26:07.607008 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf72g\" (UniqueName: \"kubernetes.io/projected/60c66f18-6a0e-47e1-8b5d-964324197521-kube-api-access-wf72g\") pod \"openstackclient\" (UID: \"60c66f18-6a0e-47e1-8b5d-964324197521\") " pod="openstack/openstackclient" Dec 03 22:26:07.710349 master-0 kubenswrapper[36504]: I1203 22:26:07.710259 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-scheduler-0" event={"ID":"738da7ca-4034-4602-a382-696d5416843b","Type":"ContainerStarted","Data":"0587b7d9378eae06c459bb12599b200ca0f1a0da258c518b42093852f63826ca"} Dec 03 22:26:07.744951 master-0 kubenswrapper[36504]: I1203 22:26:07.744847 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerStarted","Data":"47dd760a1f4c95b9511d5891cca800c8be2498b111745e0ac32ab91438c4dc2a"} Dec 03 22:26:07.841795 master-0 kubenswrapper[36504]: I1203 22:26:07.833495 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 22:26:08.069187 master-0 kubenswrapper[36504]: I1203 22:26:08.069111 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-baebb-api-0" Dec 03 22:26:08.526932 master-0 kubenswrapper[36504]: I1203 22:26:08.521466 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:26:08.652980 master-0 kubenswrapper[36504]: I1203 22:26:08.652868 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-baebb-backup-0"] Dec 03 22:26:08.658885 master-0 kubenswrapper[36504]: I1203 22:26:08.657233 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data\") pod \"1293e094-4f1e-4c59-8050-7eeb11695c40\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " Dec 03 22:26:08.658885 master-0 kubenswrapper[36504]: I1203 22:26:08.657413 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-combined-ca-bundle\") pod \"1293e094-4f1e-4c59-8050-7eeb11695c40\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " Dec 03 22:26:08.658885 master-0 kubenswrapper[36504]: I1203 22:26:08.657616 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data-custom\") pod \"1293e094-4f1e-4c59-8050-7eeb11695c40\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " Dec 03 22:26:08.658885 master-0 kubenswrapper[36504]: I1203 22:26:08.657860 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1293e094-4f1e-4c59-8050-7eeb11695c40-logs\") pod \"1293e094-4f1e-4c59-8050-7eeb11695c40\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " Dec 03 22:26:08.658885 master-0 kubenswrapper[36504]: I1203 22:26:08.657990 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsk4l\" (UniqueName: \"kubernetes.io/projected/1293e094-4f1e-4c59-8050-7eeb11695c40-kube-api-access-vsk4l\") pod 
\"1293e094-4f1e-4c59-8050-7eeb11695c40\" (UID: \"1293e094-4f1e-4c59-8050-7eeb11695c40\") " Dec 03 22:26:08.675841 master-0 kubenswrapper[36504]: I1203 22:26:08.664502 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1293e094-4f1e-4c59-8050-7eeb11695c40-logs" (OuterVolumeSpecName: "logs") pod "1293e094-4f1e-4c59-8050-7eeb11695c40" (UID: "1293e094-4f1e-4c59-8050-7eeb11695c40"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:08.675841 master-0 kubenswrapper[36504]: I1203 22:26:08.667545 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 22:26:08.675841 master-0 kubenswrapper[36504]: I1203 22:26:08.674631 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1293e094-4f1e-4c59-8050-7eeb11695c40" (UID: "1293e094-4f1e-4c59-8050-7eeb11695c40"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:08.694811 master-0 kubenswrapper[36504]: I1203 22:26:08.686535 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1293e094-4f1e-4c59-8050-7eeb11695c40-kube-api-access-vsk4l" (OuterVolumeSpecName: "kube-api-access-vsk4l") pod "1293e094-4f1e-4c59-8050-7eeb11695c40" (UID: "1293e094-4f1e-4c59-8050-7eeb11695c40"). InnerVolumeSpecName "kube-api-access-vsk4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:08.730855 master-0 kubenswrapper[36504]: I1203 22:26:08.725198 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1293e094-4f1e-4c59-8050-7eeb11695c40" (UID: "1293e094-4f1e-4c59-8050-7eeb11695c40"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:08.763799 master-0 kubenswrapper[36504]: I1203 22:26:08.763544 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1293e094-4f1e-4c59-8050-7eeb11695c40-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:08.763799 master-0 kubenswrapper[36504]: I1203 22:26:08.763617 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsk4l\" (UniqueName: \"kubernetes.io/projected/1293e094-4f1e-4c59-8050-7eeb11695c40-kube-api-access-vsk4l\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:08.763799 master-0 kubenswrapper[36504]: I1203 22:26:08.763630 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:08.763799 master-0 kubenswrapper[36504]: I1203 22:26:08.763644 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:08.852069 master-0 kubenswrapper[36504]: I1203 22:26:08.851879 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:08.870800 master-0 kubenswrapper[36504]: I1203 22:26:08.869558 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-backup-0" event={"ID":"33dab541-cf59-45a8-9abf-389189f691d2","Type":"ContainerStarted","Data":"30b5a9eaf3f00e9da53968bc7ec47299eae92d2c30b3fe05d87c294f0a9cabee"} Dec 03 22:26:08.880104 master-0 kubenswrapper[36504]: I1203 22:26:08.878010 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-scheduler-0" event={"ID":"738da7ca-4034-4602-a382-696d5416843b","Type":"ContainerStarted","Data":"d2876a47760b21a47dc1cbd82d388ebd2a2c564df3b946733b704fdbd9d8cb3c"} Dec 03 22:26:08.883794 master-0 kubenswrapper[36504]: I1203 22:26:08.880551 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data" (OuterVolumeSpecName: "config-data") pod "1293e094-4f1e-4c59-8050-7eeb11695c40" (UID: "1293e094-4f1e-4c59-8050-7eeb11695c40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:08.883794 master-0 kubenswrapper[36504]: I1203 22:26:08.881245 36504 generic.go:334] "Generic (PLEG): container finished" podID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerID="e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a" exitCode=0 Dec 03 22:26:08.883794 master-0 kubenswrapper[36504]: I1203 22:26:08.881301 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7997d6f544-ljwhp" event={"ID":"1293e094-4f1e-4c59-8050-7eeb11695c40","Type":"ContainerDied","Data":"e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a"} Dec 03 22:26:08.883794 master-0 kubenswrapper[36504]: I1203 22:26:08.881326 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7997d6f544-ljwhp" event={"ID":"1293e094-4f1e-4c59-8050-7eeb11695c40","Type":"ContainerDied","Data":"f2d1b8b582c171b83578d67685e800fe9005b2e52d696f8a8225e597cd69883d"} Dec 03 22:26:08.883794 master-0 kubenswrapper[36504]: I1203 22:26:08.881349 36504 scope.go:117] "RemoveContainer" containerID="e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a" Dec 03 22:26:08.883794 master-0 kubenswrapper[36504]: I1203 22:26:08.881476 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7997d6f544-ljwhp" Dec 03 22:26:08.925794 master-0 kubenswrapper[36504]: I1203 22:26:08.924922 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"60c66f18-6a0e-47e1-8b5d-964324197521","Type":"ContainerStarted","Data":"4a6e38d7df802d9e7de0d7ecf9c40c411c37965b4a26781dea34330bf1287fb7"} Dec 03 22:26:08.976801 master-0 kubenswrapper[36504]: I1203 22:26:08.975950 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1293e094-4f1e-4c59-8050-7eeb11695c40-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:09.197280 master-0 kubenswrapper[36504]: I1203 22:26:09.121451 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23860378-b004-4ef4-9a8b-847fd6277fd8" path="/var/lib/kubelet/pods/23860378-b004-4ef4-9a8b-847fd6277fd8/volumes" Dec 03 22:26:09.202705 master-0 kubenswrapper[36504]: I1203 22:26:09.196604 36504 scope.go:117] "RemoveContainer" containerID="33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc" Dec 03 22:26:09.232428 master-0 kubenswrapper[36504]: I1203 22:26:09.232374 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7997d6f544-ljwhp"] Dec 03 22:26:09.261502 master-0 kubenswrapper[36504]: I1203 22:26:09.261442 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7997d6f544-ljwhp"] Dec 03 22:26:09.312800 master-0 kubenswrapper[36504]: I1203 22:26:09.309763 36504 scope.go:117] "RemoveContainer" containerID="e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a" Dec 03 22:26:09.312800 master-0 kubenswrapper[36504]: E1203 22:26:09.310520 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a\": container with ID starting with e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a not found: ID does not exist" containerID="e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a" Dec 03 22:26:09.312800 master-0 kubenswrapper[36504]: I1203 22:26:09.310558 36504 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a"} err="failed to get container status \"e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a\": rpc error: code = NotFound desc = could not find container \"e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a\": container with ID starting with e3351cc2a0e1738864286504da08294c44dc56e1c26680a66520984656b1d22a not found: ID does not exist" Dec 03 22:26:09.312800 master-0 kubenswrapper[36504]: I1203 22:26:09.310588 36504 scope.go:117] "RemoveContainer" containerID="33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc" Dec 03 22:26:09.312800 master-0 kubenswrapper[36504]: E1203 22:26:09.310858 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc\": container with ID starting with 33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc not found: ID does not exist" containerID="33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc" Dec 03 22:26:09.312800 master-0 kubenswrapper[36504]: I1203 22:26:09.310876 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc"} err="failed to get container status \"33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc\": rpc error: code = NotFound desc = could not find container \"33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc\": container with ID starting with 33fc76a7478f1d1fc74c5c9f61b7ae07e6623549685c9b10b2361a37e8d34bdc not found: ID does not exist" Dec 03 22:26:09.960797 master-0 kubenswrapper[36504]: I1203 22:26:09.957942 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-scheduler-0" event={"ID":"738da7ca-4034-4602-a382-696d5416843b","Type":"ContainerStarted","Data":"afdfc457c6e3cb3ecbf9defae4529d79c9317e65d3408be84fce5ffb9712b33d"} Dec 03 22:26:09.973796 master-0 kubenswrapper[36504]: I1203 22:26:09.971296 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerStarted","Data":"2af1e8281d53c0fea991fee2eb07fa8e8b5dd879bd4163846dc853263f052f0c"} Dec 03 22:26:09.973796 master-0 kubenswrapper[36504]: I1203 22:26:09.971935 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:26:09.977790 master-0 kubenswrapper[36504]: I1203 22:26:09.976109 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-backup-0" event={"ID":"33dab541-cf59-45a8-9abf-389189f691d2","Type":"ContainerStarted","Data":"a8073a4558416abb35c6a87db360768c3a8f12c96602d5d969a861dd430ff529"} Dec 03 22:26:09.977790 master-0 kubenswrapper[36504]: I1203 22:26:09.976182 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-baebb-backup-0" event={"ID":"33dab541-cf59-45a8-9abf-389189f691d2","Type":"ContainerStarted","Data":"e918207e4fc531592d3235c24b7d99e8c9112922140d0faba2aff94c44a8d2b5"} Dec 03 22:26:10.059998 master-0 kubenswrapper[36504]: I1203 22:26:10.059709 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-baebb-scheduler-0" podStartSLOduration=5.059671717 podStartE2EDuration="5.059671717s" podCreationTimestamp="2025-12-03 
22:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:10.001264052 +0000 UTC m=+935.221036059" watchObservedRunningTime="2025-12-03 22:26:10.059671717 +0000 UTC m=+935.279443724" Dec 03 22:26:10.097450 master-0 kubenswrapper[36504]: I1203 22:26:10.097273 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:26:10.161048 master-0 kubenswrapper[36504]: I1203 22:26:10.160924 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.492014169 podStartE2EDuration="9.160892549s" podCreationTimestamp="2025-12-03 22:26:01 +0000 UTC" firstStartedPulling="2025-12-03 22:26:03.228175426 +0000 UTC m=+928.447947433" lastFinishedPulling="2025-12-03 22:26:08.897053806 +0000 UTC m=+934.116825813" observedRunningTime="2025-12-03 22:26:10.082165924 +0000 UTC m=+935.301937931" watchObservedRunningTime="2025-12-03 22:26:10.160892549 +0000 UTC m=+935.380664556" Dec 03 22:26:10.193456 master-0 kubenswrapper[36504]: I1203 22:26:10.193356 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-baebb-backup-0" podStartSLOduration=3.193332448 podStartE2EDuration="3.193332448s" podCreationTimestamp="2025-12-03 22:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:10.122345927 +0000 UTC m=+935.342117934" watchObservedRunningTime="2025-12-03 22:26:10.193332448 +0000 UTC m=+935.413104455" Dec 03 22:26:10.843065 master-0 kubenswrapper[36504]: I1203 22:26:10.842930 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:11.136367 master-0 kubenswrapper[36504]: I1203 22:26:11.136228 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1293e094-4f1e-4c59-8050-7eeb11695c40" path="/var/lib/kubelet/pods/1293e094-4f1e-4c59-8050-7eeb11695c40/volumes" Dec 03 22:26:12.581912 master-0 kubenswrapper[36504]: I1203 22:26:12.581854 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:13.189801 master-0 kubenswrapper[36504]: I1203 22:26:13.183360 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-65dc9c979b-4jgcr"] Dec 03 22:26:13.189801 master-0 kubenswrapper[36504]: E1203 22:26:13.184084 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerName="barbican-api-log" Dec 03 22:26:13.189801 master-0 kubenswrapper[36504]: I1203 22:26:13.184110 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerName="barbican-api-log" Dec 03 22:26:13.189801 master-0 kubenswrapper[36504]: E1203 22:26:13.184149 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerName="barbican-api" Dec 03 22:26:13.189801 master-0 kubenswrapper[36504]: I1203 22:26:13.184160 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerName="barbican-api" Dec 03 22:26:13.189801 master-0 kubenswrapper[36504]: I1203 22:26:13.184683 36504 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerName="barbican-api" Dec 03 22:26:13.189801 master-0 kubenswrapper[36504]: I1203 22:26:13.184703 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="1293e094-4f1e-4c59-8050-7eeb11695c40" containerName="barbican-api-log" Dec 03 22:26:13.189801 master-0 kubenswrapper[36504]: I1203 22:26:13.186876 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.189801 master-0 kubenswrapper[36504]: I1203 22:26:13.188726 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-65dc9c979b-4jgcr"] Dec 03 22:26:13.206854 master-0 kubenswrapper[36504]: I1203 22:26:13.192214 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 22:26:13.206854 master-0 kubenswrapper[36504]: I1203 22:26:13.192546 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 22:26:13.206854 master-0 kubenswrapper[36504]: I1203 22:26:13.192711 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 22:26:13.326083 master-0 kubenswrapper[36504]: I1203 22:26:13.325895 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-internal-tls-certs\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.326083 master-0 kubenswrapper[36504]: I1203 22:26:13.325953 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/93b454f7-f13a-4fa7-9b55-06cc102dd59e-etc-swift\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.326083 master-0 kubenswrapper[36504]: I1203 22:26:13.325992 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-config-data\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.326083 master-0 kubenswrapper[36504]: I1203 22:26:13.326065 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-combined-ca-bundle\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.327142 master-0 kubenswrapper[36504]: I1203 22:26:13.326105 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh28g\" (UniqueName: \"kubernetes.io/projected/93b454f7-f13a-4fa7-9b55-06cc102dd59e-kube-api-access-lh28g\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.327142 master-0 kubenswrapper[36504]: I1203 22:26:13.326158 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/93b454f7-f13a-4fa7-9b55-06cc102dd59e-run-httpd\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.327142 master-0 kubenswrapper[36504]: I1203 22:26:13.326227 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b454f7-f13a-4fa7-9b55-06cc102dd59e-log-httpd\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.327142 master-0 kubenswrapper[36504]: I1203 22:26:13.326264 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-public-tls-certs\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.431849 master-0 kubenswrapper[36504]: I1203 22:26:13.431753 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-internal-tls-certs\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.432197 master-0 kubenswrapper[36504]: I1203 22:26:13.431883 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/93b454f7-f13a-4fa7-9b55-06cc102dd59e-etc-swift\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.432197 master-0 kubenswrapper[36504]: I1203 22:26:13.431984 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-config-data\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.432696 master-0 kubenswrapper[36504]: I1203 22:26:13.432664 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-combined-ca-bundle\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.435031 master-0 kubenswrapper[36504]: I1203 22:26:13.433409 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh28g\" (UniqueName: \"kubernetes.io/projected/93b454f7-f13a-4fa7-9b55-06cc102dd59e-kube-api-access-lh28g\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.435031 master-0 kubenswrapper[36504]: I1203 22:26:13.433681 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b454f7-f13a-4fa7-9b55-06cc102dd59e-run-httpd\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.435031 master-0 kubenswrapper[36504]: I1203 22:26:13.433944 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b454f7-f13a-4fa7-9b55-06cc102dd59e-log-httpd\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.435031 master-0 kubenswrapper[36504]: I1203 22:26:13.434321 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-public-tls-certs\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.435031 master-0 kubenswrapper[36504]: I1203 22:26:13.434577 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b454f7-f13a-4fa7-9b55-06cc102dd59e-run-httpd\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.435861 master-0 kubenswrapper[36504]: I1203 22:26:13.435137 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93b454f7-f13a-4fa7-9b55-06cc102dd59e-log-httpd\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.441834 master-0 kubenswrapper[36504]: I1203 22:26:13.441786 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-public-tls-certs\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.442153 master-0 kubenswrapper[36504]: I1203 22:26:13.441938 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-combined-ca-bundle\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.445105 master-0 kubenswrapper[36504]: I1203 22:26:13.444755 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/93b454f7-f13a-4fa7-9b55-06cc102dd59e-etc-swift\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.446823 master-0 kubenswrapper[36504]: I1203 22:26:13.446783 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-internal-tls-certs\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.452578 master-0 kubenswrapper[36504]: I1203 22:26:13.452523 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93b454f7-f13a-4fa7-9b55-06cc102dd59e-config-data\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.475433 master-0 kubenswrapper[36504]: I1203 22:26:13.469836 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh28g\" (UniqueName: \"kubernetes.io/projected/93b454f7-f13a-4fa7-9b55-06cc102dd59e-kube-api-access-lh28g\") pod \"swift-proxy-65dc9c979b-4jgcr\" (UID: \"93b454f7-f13a-4fa7-9b55-06cc102dd59e\") " pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.520061 master-0 kubenswrapper[36504]: I1203 22:26:13.519983 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:13.946955 master-0 kubenswrapper[36504]: I1203 22:26:13.937819 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:13.946955 master-0 kubenswrapper[36504]: I1203 22:26:13.938910 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="ceilometer-central-agent" containerID="cri-o://39b633a884a9121e66c91ba2fc7e06cf8f415a2a1201873ac497ae4210b224a3" gracePeriod=30 Dec 03 22:26:13.946955 master-0 kubenswrapper[36504]: I1203 22:26:13.939888 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="proxy-httpd" containerID="cri-o://2af1e8281d53c0fea991fee2eb07fa8e8b5dd879bd4163846dc853263f052f0c" gracePeriod=30 Dec 03 22:26:13.946955 master-0 kubenswrapper[36504]: I1203 22:26:13.939937 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="sg-core" containerID="cri-o://47dd760a1f4c95b9511d5891cca800c8be2498b111745e0ac32ab91438c4dc2a" gracePeriod=30 Dec 03 22:26:13.946955 master-0 kubenswrapper[36504]: I1203 22:26:13.939969 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="ceilometer-notification-agent" containerID="cri-o://9d1880855cae54518b13082bf80b8b1d33fd67efb0d20af3423da71e912ac9d1" gracePeriod=30 Dec 03 22:26:14.152593 master-0 kubenswrapper[36504]: I1203 22:26:14.152417 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-65dc9c979b-4jgcr"] Dec 03 22:26:14.399848 master-0 kubenswrapper[36504]: I1203 22:26:14.398072 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-baebb-volume-lvm-iscsi-0" Dec 03 22:26:15.100073 master-0 kubenswrapper[36504]: E1203 22:26:15.099984 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod089feb3e_dc06_4ffc_b27d_661e6eb95f1a.slice/crio-conmon-9d1880855cae54518b13082bf80b8b1d33fd67efb0d20af3423da71e912ac9d1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod089feb3e_dc06_4ffc_b27d_661e6eb95f1a.slice/crio-9d1880855cae54518b13082bf80b8b1d33fd67efb0d20af3423da71e912ac9d1.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:26:15.202800 master-0 kubenswrapper[36504]: I1203 22:26:15.191645 36504 generic.go:334] "Generic (PLEG): container finished" podID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerID="2af1e8281d53c0fea991fee2eb07fa8e8b5dd879bd4163846dc853263f052f0c" exitCode=0 Dec 03 22:26:15.202800 master-0 kubenswrapper[36504]: I1203 22:26:15.191722 36504 
generic.go:334] "Generic (PLEG): container finished" podID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerID="47dd760a1f4c95b9511d5891cca800c8be2498b111745e0ac32ab91438c4dc2a" exitCode=2 Dec 03 22:26:15.202800 master-0 kubenswrapper[36504]: I1203 22:26:15.191737 36504 generic.go:334] "Generic (PLEG): container finished" podID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerID="9d1880855cae54518b13082bf80b8b1d33fd67efb0d20af3423da71e912ac9d1" exitCode=0 Dec 03 22:26:15.202800 master-0 kubenswrapper[36504]: I1203 22:26:15.191745 36504 generic.go:334] "Generic (PLEG): container finished" podID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerID="39b633a884a9121e66c91ba2fc7e06cf8f415a2a1201873ac497ae4210b224a3" exitCode=0 Dec 03 22:26:15.202800 master-0 kubenswrapper[36504]: I1203 22:26:15.191828 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerDied","Data":"2af1e8281d53c0fea991fee2eb07fa8e8b5dd879bd4163846dc853263f052f0c"} Dec 03 22:26:15.202800 master-0 kubenswrapper[36504]: I1203 22:26:15.191877 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerDied","Data":"47dd760a1f4c95b9511d5891cca800c8be2498b111745e0ac32ab91438c4dc2a"} Dec 03 22:26:15.202800 master-0 kubenswrapper[36504]: I1203 22:26:15.191893 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerDied","Data":"9d1880855cae54518b13082bf80b8b1d33fd67efb0d20af3423da71e912ac9d1"} Dec 03 22:26:15.202800 master-0 kubenswrapper[36504]: I1203 22:26:15.191903 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerDied","Data":"39b633a884a9121e66c91ba2fc7e06cf8f415a2a1201873ac497ae4210b224a3"} Dec 03 22:26:15.224005 master-0 kubenswrapper[36504]: I1203 22:26:15.210508 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-65dc9c979b-4jgcr" event={"ID":"93b454f7-f13a-4fa7-9b55-06cc102dd59e","Type":"ContainerStarted","Data":"10ef283cf71c6aac49a86e6472f2b8111aa4406f0974d996342991c93cfab588"} Dec 03 22:26:15.224005 master-0 kubenswrapper[36504]: I1203 22:26:15.210606 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-65dc9c979b-4jgcr" event={"ID":"93b454f7-f13a-4fa7-9b55-06cc102dd59e","Type":"ContainerStarted","Data":"9edb079f1f6e65ff005a9801600bebe60933ebcbf5fd11c5af197fdcd98fab11"} Dec 03 22:26:15.224005 master-0 kubenswrapper[36504]: I1203 22:26:15.210623 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-65dc9c979b-4jgcr" event={"ID":"93b454f7-f13a-4fa7-9b55-06cc102dd59e","Type":"ContainerStarted","Data":"dd8b10e388e649f373f3a48d28293cef3a64887aea90d93f2dada7c6384f67eb"} Dec 03 22:26:15.224005 master-0 kubenswrapper[36504]: I1203 22:26:15.211303 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:15.224005 master-0 kubenswrapper[36504]: I1203 22:26:15.211347 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:15.265081 master-0 kubenswrapper[36504]: I1203 22:26:15.264882 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:15.287808 master-0 kubenswrapper[36504]: I1203 22:26:15.286180 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-65dc9c979b-4jgcr" podStartSLOduration=2.286149933 podStartE2EDuration="2.286149933s" podCreationTimestamp="2025-12-03 22:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:15.253256129 +0000 UTC m=+940.473028156" watchObservedRunningTime="2025-12-03 22:26:15.286149933 +0000 UTC m=+940.505921940" Dec 03 22:26:15.366858 master-0 kubenswrapper[36504]: I1203 22:26:15.366807 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-log-httpd\") pod \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " Dec 03 22:26:15.367504 master-0 kubenswrapper[36504]: I1203 22:26:15.367173 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-combined-ca-bundle\") pod \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " Dec 03 22:26:15.367504 master-0 kubenswrapper[36504]: I1203 22:26:15.367323 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-run-httpd\") pod \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " Dec 03 22:26:15.367639 master-0 kubenswrapper[36504]: I1203 22:26:15.367503 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-sg-core-conf-yaml\") pod \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " Dec 03 22:26:15.367639 master-0 kubenswrapper[36504]: I1203 22:26:15.367555 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-scripts\") pod \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " Dec 03 22:26:15.367639 master-0 kubenswrapper[36504]: I1203 22:26:15.367622 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klh6k\" (UniqueName: \"kubernetes.io/projected/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-kube-api-access-klh6k\") pod \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " Dec 03 22:26:15.367810 master-0 kubenswrapper[36504]: I1203 22:26:15.367651 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-config-data\") pod \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\" (UID: \"089feb3e-dc06-4ffc-b27d-661e6eb95f1a\") " Dec 03 22:26:15.368785 master-0 kubenswrapper[36504]: I1203 22:26:15.368070 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "089feb3e-dc06-4ffc-b27d-661e6eb95f1a" (UID: "089feb3e-dc06-4ffc-b27d-661e6eb95f1a"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:15.368785 master-0 kubenswrapper[36504]: I1203 22:26:15.368186 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "089feb3e-dc06-4ffc-b27d-661e6eb95f1a" (UID: "089feb3e-dc06-4ffc-b27d-661e6eb95f1a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:15.369085 master-0 kubenswrapper[36504]: I1203 22:26:15.369062 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:15.369222 master-0 kubenswrapper[36504]: I1203 22:26:15.369207 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:15.383132 master-0 kubenswrapper[36504]: I1203 22:26:15.381661 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-kube-api-access-klh6k" (OuterVolumeSpecName: "kube-api-access-klh6k") pod "089feb3e-dc06-4ffc-b27d-661e6eb95f1a" (UID: "089feb3e-dc06-4ffc-b27d-661e6eb95f1a"). InnerVolumeSpecName "kube-api-access-klh6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:15.387166 master-0 kubenswrapper[36504]: I1203 22:26:15.387073 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-scripts" (OuterVolumeSpecName: "scripts") pod "089feb3e-dc06-4ffc-b27d-661e6eb95f1a" (UID: "089feb3e-dc06-4ffc-b27d-661e6eb95f1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:15.423618 master-0 kubenswrapper[36504]: I1203 22:26:15.423549 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "089feb3e-dc06-4ffc-b27d-661e6eb95f1a" (UID: "089feb3e-dc06-4ffc-b27d-661e6eb95f1a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:15.473633 master-0 kubenswrapper[36504]: I1203 22:26:15.473575 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:15.473985 master-0 kubenswrapper[36504]: I1203 22:26:15.473972 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:15.474059 master-0 kubenswrapper[36504]: I1203 22:26:15.474048 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klh6k\" (UniqueName: \"kubernetes.io/projected/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-kube-api-access-klh6k\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:15.543062 master-0 kubenswrapper[36504]: I1203 22:26:15.542979 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "089feb3e-dc06-4ffc-b27d-661e6eb95f1a" (UID: "089feb3e-dc06-4ffc-b27d-661e6eb95f1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:15.544581 master-0 kubenswrapper[36504]: I1203 22:26:15.544515 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-config-data" (OuterVolumeSpecName: "config-data") pod "089feb3e-dc06-4ffc-b27d-661e6eb95f1a" (UID: "089feb3e-dc06-4ffc-b27d-661e6eb95f1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:15.577655 master-0 kubenswrapper[36504]: I1203 22:26:15.577548 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:15.577655 master-0 kubenswrapper[36504]: I1203 22:26:15.577610 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/089feb3e-dc06-4ffc-b27d-661e6eb95f1a-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:16.096444 master-0 kubenswrapper[36504]: I1203 22:26:16.096162 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:26:16.194980 master-0 kubenswrapper[36504]: I1203 22:26:16.194905 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-baebb-scheduler-0" Dec 03 22:26:16.310825 master-0 kubenswrapper[36504]: I1203 22:26:16.310228 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:16.310825 master-0 kubenswrapper[36504]: I1203 22:26:16.310338 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"089feb3e-dc06-4ffc-b27d-661e6eb95f1a","Type":"ContainerDied","Data":"48759c024e96c437dea4715e25b0b13e01eeefc864e663712b06f57f8952a715"} Dec 03 22:26:16.310825 master-0 kubenswrapper[36504]: I1203 22:26:16.310442 36504 scope.go:117] "RemoveContainer" containerID="2af1e8281d53c0fea991fee2eb07fa8e8b5dd879bd4163846dc853263f052f0c" Dec 03 22:26:16.468227 master-0 kubenswrapper[36504]: I1203 22:26:16.468127 36504 scope.go:117] "RemoveContainer" containerID="47dd760a1f4c95b9511d5891cca800c8be2498b111745e0ac32ab91438c4dc2a" Dec 03 22:26:16.508902 master-0 kubenswrapper[36504]: I1203 22:26:16.508824 36504 scope.go:117] "RemoveContainer" containerID="9d1880855cae54518b13082bf80b8b1d33fd67efb0d20af3423da71e912ac9d1" Dec 03 22:26:16.604491 master-0 kubenswrapper[36504]: I1203 22:26:16.604444 36504 scope.go:117] "RemoveContainer" containerID="39b633a884a9121e66c91ba2fc7e06cf8f415a2a1201873ac497ae4210b224a3" Dec 03 22:26:16.863056 master-0 kubenswrapper[36504]: I1203 22:26:16.861780 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:16.901967 master-0 kubenswrapper[36504]: I1203 22:26:16.900735 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:16.931909 master-0 kubenswrapper[36504]: I1203 22:26:16.931811 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:16.932846 master-0 kubenswrapper[36504]: E1203 22:26:16.932788 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="ceilometer-central-agent" Dec 03 22:26:16.932846 master-0 kubenswrapper[36504]: I1203 22:26:16.932820 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="ceilometer-central-agent" Dec 03 22:26:16.932981 master-0 kubenswrapper[36504]: E1203 22:26:16.932900 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="sg-core" Dec 03 22:26:16.932981 master-0 kubenswrapper[36504]: I1203 22:26:16.932913 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="sg-core" Dec 03 22:26:16.932981 master-0 kubenswrapper[36504]: E1203 22:26:16.932959 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="proxy-httpd" Dec 03 22:26:16.932981 master-0 kubenswrapper[36504]: I1203 22:26:16.932970 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="proxy-httpd" Dec 03 22:26:16.933166 master-0 kubenswrapper[36504]: E1203 22:26:16.932997 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="ceilometer-notification-agent" Dec 03 22:26:16.933166 master-0 kubenswrapper[36504]: I1203 22:26:16.933007 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="ceilometer-notification-agent" Dec 03 22:26:16.933391 master-0 kubenswrapper[36504]: I1203 22:26:16.933363 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="proxy-httpd" 
Dec 03 22:26:16.933456 master-0 kubenswrapper[36504]: I1203 22:26:16.933393 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="ceilometer-notification-agent" Dec 03 22:26:16.933456 master-0 kubenswrapper[36504]: I1203 22:26:16.933418 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="sg-core" Dec 03 22:26:16.933545 master-0 kubenswrapper[36504]: I1203 22:26:16.933467 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" containerName="ceilometer-central-agent" Dec 03 22:26:16.947637 master-0 kubenswrapper[36504]: I1203 22:26:16.947554 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:16.950727 master-0 kubenswrapper[36504]: I1203 22:26:16.950667 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:26:16.951093 master-0 kubenswrapper[36504]: I1203 22:26:16.951070 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:26:16.952729 master-0 kubenswrapper[36504]: I1203 22:26:16.952674 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:17.005311 master-0 kubenswrapper[36504]: I1203 22:26:17.004609 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-log-httpd\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.005311 master-0 kubenswrapper[36504]: I1203 22:26:17.004681 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-run-httpd\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.005311 master-0 kubenswrapper[36504]: I1203 22:26:17.004742 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-config-data\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.005311 master-0 kubenswrapper[36504]: I1203 22:26:17.004784 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s77bf\" (UniqueName: \"kubernetes.io/projected/31234ea0-415e-4bc1-9581-af21cdc9db72-kube-api-access-s77bf\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.005311 master-0 kubenswrapper[36504]: I1203 22:26:17.004875 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-scripts\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.005311 master-0 kubenswrapper[36504]: I1203 22:26:17.004935 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.005311 master-0 kubenswrapper[36504]: I1203 22:26:17.004994 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.108518 master-0 kubenswrapper[36504]: I1203 22:26:17.108411 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-config-data\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.108518 master-0 kubenswrapper[36504]: I1203 22:26:17.108494 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s77bf\" (UniqueName: \"kubernetes.io/projected/31234ea0-415e-4bc1-9581-af21cdc9db72-kube-api-access-s77bf\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.109023 master-0 kubenswrapper[36504]: I1203 22:26:17.108636 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-scripts\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.109023 master-0 kubenswrapper[36504]: I1203 22:26:17.108696 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.109023 master-0 kubenswrapper[36504]: I1203 22:26:17.108837 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.109214 master-0 kubenswrapper[36504]: I1203 22:26:17.109032 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-log-httpd\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.109214 master-0 kubenswrapper[36504]: I1203 22:26:17.109092 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-run-httpd\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.110351 master-0 kubenswrapper[36504]: I1203 22:26:17.109835 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-run-httpd\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.110351 master-0 
kubenswrapper[36504]: I1203 22:26:17.110214 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-log-httpd\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.115063 master-0 kubenswrapper[36504]: I1203 22:26:17.113259 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-scripts\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.118954 master-0 kubenswrapper[36504]: I1203 22:26:17.116538 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.118954 master-0 kubenswrapper[36504]: I1203 22:26:17.117566 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-config-data\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.122635 master-0 kubenswrapper[36504]: I1203 22:26:17.122585 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="089feb3e-dc06-4ffc-b27d-661e6eb95f1a" path="/var/lib/kubelet/pods/089feb3e-dc06-4ffc-b27d-661e6eb95f1a/volumes" Dec 03 22:26:17.123135 master-0 kubenswrapper[36504]: I1203 22:26:17.122990 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.159552 master-0 kubenswrapper[36504]: I1203 22:26:17.159395 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s77bf\" (UniqueName: \"kubernetes.io/projected/31234ea0-415e-4bc1-9581-af21cdc9db72-kube-api-access-s77bf\") pod \"ceilometer-0\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " pod="openstack/ceilometer-0" Dec 03 22:26:17.279920 master-0 kubenswrapper[36504]: I1203 22:26:17.278902 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:17.281057 master-0 kubenswrapper[36504]: I1203 22:26:17.280852 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:17.914408 master-0 kubenswrapper[36504]: I1203 22:26:17.914320 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-baebb-backup-0" Dec 03 22:26:18.857428 master-0 kubenswrapper[36504]: I1203 22:26:18.857198 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-48g2l"] Dec 03 22:26:18.863201 master-0 kubenswrapper[36504]: I1203 22:26:18.860910 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:18.893917 master-0 kubenswrapper[36504]: I1203 22:26:18.893351 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-catalog-content\") pod \"certified-operators-48g2l\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:18.893917 master-0 kubenswrapper[36504]: I1203 22:26:18.893419 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkfvv\" (UniqueName: \"kubernetes.io/projected/21102a00-2eed-4a6b-84e6-07403fbda79c-kube-api-access-vkfvv\") pod \"certified-operators-48g2l\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:18.893917 master-0 kubenswrapper[36504]: I1203 22:26:18.893486 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-utilities\") pod \"certified-operators-48g2l\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:18.893917 master-0 kubenswrapper[36504]: I1203 22:26:18.893635 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48g2l"] Dec 03 22:26:18.996923 master-0 kubenswrapper[36504]: I1203 22:26:18.996756 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-catalog-content\") pod \"certified-operators-48g2l\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:18.997246 master-0 kubenswrapper[36504]: I1203 22:26:18.996953 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkfvv\" (UniqueName: \"kubernetes.io/projected/21102a00-2eed-4a6b-84e6-07403fbda79c-kube-api-access-vkfvv\") pod \"certified-operators-48g2l\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:18.997711 master-0 kubenswrapper[36504]: I1203 22:26:18.997690 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-utilities\") pod \"certified-operators-48g2l\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:19.022232 master-0 kubenswrapper[36504]: I1203 22:26:19.016516 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-catalog-content\") pod \"certified-operators-48g2l\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:19.022232 master-0 kubenswrapper[36504]: I1203 22:26:19.016516 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-utilities\") pod \"certified-operators-48g2l\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " 
pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:19.043930 master-0 kubenswrapper[36504]: I1203 22:26:19.042011 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkfvv\" (UniqueName: \"kubernetes.io/projected/21102a00-2eed-4a6b-84e6-07403fbda79c-kube-api-access-vkfvv\") pod \"certified-operators-48g2l\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:19.237209 master-0 kubenswrapper[36504]: I1203 22:26:19.236519 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:23.533465 master-0 kubenswrapper[36504]: I1203 22:26:23.533371 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:23.539821 master-0 kubenswrapper[36504]: I1203 22:26:23.539745 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-65dc9c979b-4jgcr" Dec 03 22:26:24.095801 master-0 kubenswrapper[36504]: I1203 22:26:24.092327 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5cd76d4c4c-pwkqf"] Dec 03 22:26:24.095801 master-0 kubenswrapper[36504]: I1203 22:26:24.095366 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.102899 master-0 kubenswrapper[36504]: I1203 22:26:24.100439 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 03 22:26:24.131803 master-0 kubenswrapper[36504]: I1203 22:26:24.123175 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 22:26:24.160040 master-0 kubenswrapper[36504]: I1203 22:26:24.158280 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5cd76d4c4c-pwkqf"] Dec 03 22:26:24.308833 master-0 kubenswrapper[36504]: I1203 22:26:24.300483 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-combined-ca-bundle\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.310983 master-0 kubenswrapper[36504]: I1203 22:26:24.310915 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.311132 master-0 kubenswrapper[36504]: I1203 22:26:24.311108 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvph\" (UniqueName: \"kubernetes.io/projected/df6c2617-5f7b-41c0-a97a-534404895c92-kube-api-access-jsvph\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.328300 master-0 kubenswrapper[36504]: I1203 22:26:24.328209 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data-custom\") pod 
\"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.393929 master-0 kubenswrapper[36504]: I1203 22:26:24.386950 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-747f594c7b-dwjx4"] Dec 03 22:26:24.398055 master-0 kubenswrapper[36504]: I1203 22:26:24.397134 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.408696 master-0 kubenswrapper[36504]: I1203 22:26:24.406229 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 03 22:26:24.433028 master-0 kubenswrapper[36504]: I1203 22:26:24.432898 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-combined-ca-bundle\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.433028 master-0 kubenswrapper[36504]: I1203 22:26:24.432985 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.433028 master-0 kubenswrapper[36504]: I1203 22:26:24.433047 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvph\" (UniqueName: \"kubernetes.io/projected/df6c2617-5f7b-41c0-a97a-534404895c92-kube-api-access-jsvph\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.433456 master-0 kubenswrapper[36504]: I1203 22:26:24.433428 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data-custom\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.465659 master-0 kubenswrapper[36504]: I1203 22:26:24.465607 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.469342 master-0 kubenswrapper[36504]: I1203 22:26:24.469257 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-747f594c7b-dwjx4"] Dec 03 22:26:24.492710 master-0 kubenswrapper[36504]: I1203 22:26:24.473607 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvph\" (UniqueName: \"kubernetes.io/projected/df6c2617-5f7b-41c0-a97a-534404895c92-kube-api-access-jsvph\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.492710 master-0 kubenswrapper[36504]: I1203 22:26:24.486877 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-combined-ca-bundle\") pod 
\"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.502810 master-0 kubenswrapper[36504]: I1203 22:26:24.496905 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data-custom\") pod \"heat-engine-5cd76d4c4c-pwkqf\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.529053 master-0 kubenswrapper[36504]: I1203 22:26:24.528978 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d65b86965-zlwpd"] Dec 03 22:26:24.535408 master-0 kubenswrapper[36504]: I1203 22:26:24.535352 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.539538 master-0 kubenswrapper[36504]: I1203 22:26:24.536851 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnmmj\" (UniqueName: \"kubernetes.io/projected/7fad818c-c25b-40e8-aeea-e4e9dae1b839-kube-api-access-qnmmj\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.539538 master-0 kubenswrapper[36504]: I1203 22:26:24.537765 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-combined-ca-bundle\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.539538 master-0 kubenswrapper[36504]: I1203 22:26:24.538057 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data-custom\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.539538 master-0 kubenswrapper[36504]: I1203 22:26:24.538710 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.579802 master-0 kubenswrapper[36504]: I1203 22:26:24.568873 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-64967779f-xr54t"] Dec 03 22:26:24.590840 master-0 kubenswrapper[36504]: I1203 22:26:24.585518 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.590840 master-0 kubenswrapper[36504]: I1203 22:26:24.589964 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 03 22:26:24.610931 master-0 kubenswrapper[36504]: I1203 22:26:24.608973 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d65b86965-zlwpd"] Dec 03 22:26:24.631322 master-0 kubenswrapper[36504]: I1203 22:26:24.628174 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.642316 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-config\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.642413 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfzd9\" (UniqueName: \"kubernetes.io/projected/edd801a2-0ddf-475b-81d5-8421a26d7011-kube-api-access-lfzd9\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.642466 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqkkv\" (UniqueName: \"kubernetes.io/projected/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-kube-api-access-zqkkv\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.642750 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data-custom\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.642792 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.642930 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-swift-storage-0\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.643124 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-svc\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.643175 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-nb\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.643243 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.643325 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-combined-ca-bundle\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.643438 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnmmj\" (UniqueName: \"kubernetes.io/projected/7fad818c-c25b-40e8-aeea-e4e9dae1b839-kube-api-access-qnmmj\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.643463 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-edpm\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.643687 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-combined-ca-bundle\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.643793 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data-custom\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.643849 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-sb\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.648471 master-0 kubenswrapper[36504]: I1203 22:26:24.645492 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-64967779f-xr54t"] Dec 03 22:26:24.649611 master-0 kubenswrapper[36504]: I1203 22:26:24.648897 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-combined-ca-bundle\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.655816 master-0 kubenswrapper[36504]: I1203 22:26:24.651425 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data-custom\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.655816 master-0 kubenswrapper[36504]: I1203 22:26:24.652604 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.682150 master-0 kubenswrapper[36504]: I1203 22:26:24.680800 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnmmj\" (UniqueName: \"kubernetes.io/projected/7fad818c-c25b-40e8-aeea-e4e9dae1b839-kube-api-access-qnmmj\") pod \"heat-cfnapi-747f594c7b-dwjx4\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.752523 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-sb\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.752689 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-config\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.752788 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfzd9\" (UniqueName: \"kubernetes.io/projected/edd801a2-0ddf-475b-81d5-8421a26d7011-kube-api-access-lfzd9\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.754681 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-sb\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.754789 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqkkv\" (UniqueName: \"kubernetes.io/projected/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-kube-api-access-zqkkv\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.754901 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data-custom\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.754922 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.755068 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-swift-storage-0\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.755080 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-config\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.755290 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-svc\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.755328 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-nb\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.755464 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-combined-ca-bundle\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.755679 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-edpm\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.755783 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-swift-storage-0\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.756848 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-svc\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.757447 36504 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-nb\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.762762 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-edpm\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.774985 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data-custom\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.778014 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-combined-ca-bundle\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.781374 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.802107 master-0 kubenswrapper[36504]: I1203 22:26:24.793424 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfzd9\" (UniqueName: \"kubernetes.io/projected/edd801a2-0ddf-475b-81d5-8421a26d7011-kube-api-access-lfzd9\") pod \"dnsmasq-dns-6d65b86965-zlwpd\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:24.826285 master-0 kubenswrapper[36504]: I1203 22:26:24.810021 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqkkv\" (UniqueName: \"kubernetes.io/projected/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-kube-api-access-zqkkv\") pod \"heat-api-64967779f-xr54t\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:24.835802 master-0 kubenswrapper[36504]: I1203 22:26:24.832467 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:25.076585 master-0 kubenswrapper[36504]: I1203 22:26:25.076429 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:25.108131 master-0 kubenswrapper[36504]: I1203 22:26:25.107760 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:25.559898 master-0 kubenswrapper[36504]: I1203 22:26:25.559808 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"60c66f18-6a0e-47e1-8b5d-964324197521","Type":"ContainerStarted","Data":"7f664dcf943cb58f4e90a292a80283cfdea13abca5edc2e57c3b350bf0bbfe31"} Dec 03 22:26:25.597807 master-0 kubenswrapper[36504]: I1203 22:26:25.597606 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:25.622935 master-0 kubenswrapper[36504]: W1203 22:26:25.622853 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21102a00_2eed_4a6b_84e6_07403fbda79c.slice/crio-31492ae3c7c9dfbe4e71bc9dce8f250311c64a5c9f9117012ac25ee869ac15b5 WatchSource:0}: Error finding container 31492ae3c7c9dfbe4e71bc9dce8f250311c64a5c9f9117012ac25ee869ac15b5: Status 404 returned error can't find the container with id 31492ae3c7c9dfbe4e71bc9dce8f250311c64a5c9f9117012ac25ee869ac15b5 Dec 03 22:26:25.628640 master-0 kubenswrapper[36504]: I1203 22:26:25.628575 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48g2l"] Dec 03 22:26:25.741343 master-0 kubenswrapper[36504]: I1203 22:26:25.741210 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.886852776 podStartE2EDuration="18.741181899s" podCreationTimestamp="2025-12-03 22:26:07 +0000 UTC" firstStartedPulling="2025-12-03 22:26:08.676748333 +0000 UTC m=+933.896520340" lastFinishedPulling="2025-12-03 22:26:24.531077456 +0000 UTC m=+949.750849463" observedRunningTime="2025-12-03 22:26:25.730004838 +0000 UTC m=+950.949776845" watchObservedRunningTime="2025-12-03 22:26:25.741181899 +0000 UTC m=+950.960953906" Dec 03 22:26:26.466796 master-0 kubenswrapper[36504]: W1203 22:26:26.464047 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c74d2a7_a3b0_4f33_8abc_fea1aaca54a8.slice/crio-9bcb666a695155355cf19d51ad022971318f6208d6a3d8da68d77e00ff62db2f WatchSource:0}: Error finding container 9bcb666a695155355cf19d51ad022971318f6208d6a3d8da68d77e00ff62db2f: Status 404 returned error can't find the container with id 9bcb666a695155355cf19d51ad022971318f6208d6a3d8da68d77e00ff62db2f Dec 03 22:26:26.472039 master-0 kubenswrapper[36504]: I1203 22:26:26.471561 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5cd76d4c4c-pwkqf"] Dec 03 22:26:26.518937 master-0 kubenswrapper[36504]: I1203 22:26:26.518856 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-747f594c7b-dwjx4"] Dec 03 22:26:26.577800 master-0 kubenswrapper[36504]: I1203 22:26:26.577345 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-64967779f-xr54t"] Dec 03 22:26:26.603803 master-0 kubenswrapper[36504]: I1203 22:26:26.600158 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d65b86965-zlwpd"] Dec 03 22:26:26.604673 master-0 kubenswrapper[36504]: I1203 22:26:26.603897 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" event={"ID":"df6c2617-5f7b-41c0-a97a-534404895c92","Type":"ContainerStarted","Data":"b59ecfc52fa9311940d33af8d69b44977538658bdb35adeb1d1430457ed38ee1"} Dec 03 22:26:26.621803 master-0 kubenswrapper[36504]: 
I1203 22:26:26.614030 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" event={"ID":"7fad818c-c25b-40e8-aeea-e4e9dae1b839","Type":"ContainerStarted","Data":"052b5fd2b6cada2cb4a19ff714354d7d254ddc442197f2725e94596f5d0d5158"} Dec 03 22:26:26.625795 master-0 kubenswrapper[36504]: I1203 22:26:26.622440 36504 generic.go:334] "Generic (PLEG): container finished" podID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerID="c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554" exitCode=0 Dec 03 22:26:26.625795 master-0 kubenswrapper[36504]: I1203 22:26:26.623366 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48g2l" event={"ID":"21102a00-2eed-4a6b-84e6-07403fbda79c","Type":"ContainerDied","Data":"c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554"} Dec 03 22:26:26.625795 master-0 kubenswrapper[36504]: I1203 22:26:26.623496 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48g2l" event={"ID":"21102a00-2eed-4a6b-84e6-07403fbda79c","Type":"ContainerStarted","Data":"31492ae3c7c9dfbe4e71bc9dce8f250311c64a5c9f9117012ac25ee869ac15b5"} Dec 03 22:26:26.636211 master-0 kubenswrapper[36504]: I1203 22:26:26.631008 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" event={"ID":"edd801a2-0ddf-475b-81d5-8421a26d7011","Type":"ContainerStarted","Data":"5e6cb4f14b56ad949c576e714976697830a41cb41e9a1fe3e48315f72cef06da"} Dec 03 22:26:26.643827 master-0 kubenswrapper[36504]: I1203 22:26:26.640222 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-64967779f-xr54t" event={"ID":"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8","Type":"ContainerStarted","Data":"9bcb666a695155355cf19d51ad022971318f6208d6a3d8da68d77e00ff62db2f"} Dec 03 22:26:26.660826 master-0 kubenswrapper[36504]: I1203 22:26:26.657056 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerStarted","Data":"1f9053a17daacb9ebc58dd0216413ecb234aba833ccd3b34fb112e460040d7fd"} Dec 03 22:26:27.685610 master-0 kubenswrapper[36504]: I1203 22:26:27.685149 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerStarted","Data":"2e8794d524e657298018ea78b1996c2e7c5772cf22d07e3ef88a9fe77101fe84"} Dec 03 22:26:27.696615 master-0 kubenswrapper[36504]: I1203 22:26:27.694247 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" event={"ID":"df6c2617-5f7b-41c0-a97a-534404895c92","Type":"ContainerStarted","Data":"4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176"} Dec 03 22:26:27.696615 master-0 kubenswrapper[36504]: I1203 22:26:27.694382 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:27.705995 master-0 kubenswrapper[36504]: I1203 22:26:27.705927 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48g2l" event={"ID":"21102a00-2eed-4a6b-84e6-07403fbda79c","Type":"ContainerStarted","Data":"268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe"} Dec 03 22:26:27.713361 master-0 kubenswrapper[36504]: I1203 22:26:27.713295 36504 generic.go:334] "Generic (PLEG): container finished" podID="edd801a2-0ddf-475b-81d5-8421a26d7011" 
containerID="8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6" exitCode=0 Dec 03 22:26:27.713461 master-0 kubenswrapper[36504]: I1203 22:26:27.713384 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" event={"ID":"edd801a2-0ddf-475b-81d5-8421a26d7011","Type":"ContainerDied","Data":"8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6"} Dec 03 22:26:27.735853 master-0 kubenswrapper[36504]: I1203 22:26:27.735722 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" podStartSLOduration=4.735690275 podStartE2EDuration="4.735690275s" podCreationTimestamp="2025-12-03 22:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:27.729329285 +0000 UTC m=+952.949101312" watchObservedRunningTime="2025-12-03 22:26:27.735690275 +0000 UTC m=+952.955462282" Dec 03 22:26:28.735485 master-0 kubenswrapper[36504]: I1203 22:26:28.735386 36504 generic.go:334] "Generic (PLEG): container finished" podID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerID="268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe" exitCode=0 Dec 03 22:26:28.736295 master-0 kubenswrapper[36504]: I1203 22:26:28.735522 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48g2l" event={"ID":"21102a00-2eed-4a6b-84e6-07403fbda79c","Type":"ContainerDied","Data":"268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe"} Dec 03 22:26:29.760181 master-0 kubenswrapper[36504]: I1203 22:26:29.756737 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-64967779f-xr54t" event={"ID":"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8","Type":"ContainerStarted","Data":"979a6f05ddb40cd49bdfca416b102462f2afe9382b6f0e57dc84fd3f58196ff9"} Dec 03 22:26:29.760181 master-0 kubenswrapper[36504]: I1203 22:26:29.760149 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:29.764179 master-0 kubenswrapper[36504]: I1203 22:26:29.764006 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerStarted","Data":"9cb97a288a8cc6c60b03f48b83b8cb5462e833944a2d877db6eeba08104849e1"} Dec 03 22:26:29.770807 master-0 kubenswrapper[36504]: I1203 22:26:29.769841 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" event={"ID":"7fad818c-c25b-40e8-aeea-e4e9dae1b839","Type":"ContainerStarted","Data":"56cf6bcaa752a1320bb456cb3340911d8ab586f6773586d8d08b4660feb3f087"} Dec 03 22:26:29.774799 master-0 kubenswrapper[36504]: I1203 22:26:29.771134 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:29.774799 master-0 kubenswrapper[36504]: I1203 22:26:29.774411 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48g2l" event={"ID":"21102a00-2eed-4a6b-84e6-07403fbda79c","Type":"ContainerStarted","Data":"71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b"} Dec 03 22:26:29.778800 master-0 kubenswrapper[36504]: I1203 22:26:29.777715 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" 
event={"ID":"edd801a2-0ddf-475b-81d5-8421a26d7011","Type":"ContainerStarted","Data":"cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f"} Dec 03 22:26:29.778800 master-0 kubenswrapper[36504]: I1203 22:26:29.778445 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:30.484550 master-0 kubenswrapper[36504]: I1203 22:26:30.484244 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-64967779f-xr54t" podStartSLOduration=3.95673075 podStartE2EDuration="6.484214759s" podCreationTimestamp="2025-12-03 22:26:24 +0000 UTC" firstStartedPulling="2025-12-03 22:26:26.466818594 +0000 UTC m=+951.686590601" lastFinishedPulling="2025-12-03 22:26:28.994302603 +0000 UTC m=+954.214074610" observedRunningTime="2025-12-03 22:26:30.457340335 +0000 UTC m=+955.677112342" watchObservedRunningTime="2025-12-03 22:26:30.484214759 +0000 UTC m=+955.703986766" Dec 03 22:26:30.517914 master-0 kubenswrapper[36504]: I1203 22:26:30.517815 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-48g2l" podStartSLOduration=9.87523531 podStartE2EDuration="12.517792855s" podCreationTimestamp="2025-12-03 22:26:18 +0000 UTC" firstStartedPulling="2025-12-03 22:26:26.629341863 +0000 UTC m=+951.849113870" lastFinishedPulling="2025-12-03 22:26:29.271899408 +0000 UTC m=+954.491671415" observedRunningTime="2025-12-03 22:26:30.515640867 +0000 UTC m=+955.735412894" watchObservedRunningTime="2025-12-03 22:26:30.517792855 +0000 UTC m=+955.737564862" Dec 03 22:26:30.562133 master-0 kubenswrapper[36504]: I1203 22:26:30.562029 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" podStartSLOduration=4.040272368 podStartE2EDuration="6.562007515s" podCreationTimestamp="2025-12-03 22:26:24 +0000 UTC" firstStartedPulling="2025-12-03 22:26:26.472680799 +0000 UTC m=+951.692452806" lastFinishedPulling="2025-12-03 22:26:28.994415946 +0000 UTC m=+954.214187953" observedRunningTime="2025-12-03 22:26:30.559829866 +0000 UTC m=+955.779601883" watchObservedRunningTime="2025-12-03 22:26:30.562007515 +0000 UTC m=+955.781779522" Dec 03 22:26:30.618804 master-0 kubenswrapper[36504]: I1203 22:26:30.617717 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" podStartSLOduration=6.617677075 podStartE2EDuration="6.617677075s" podCreationTimestamp="2025-12-03 22:26:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:30.61150143 +0000 UTC m=+955.831273457" watchObservedRunningTime="2025-12-03 22:26:30.617677075 +0000 UTC m=+955.837449082" Dec 03 22:26:30.895791 master-0 kubenswrapper[36504]: I1203 22:26:30.895622 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerStarted","Data":"af142d4c97eacef14a1e881d0f2f458ecba634b394d8a57d05e355f156e654aa"} Dec 03 22:26:31.927000 master-0 kubenswrapper[36504]: I1203 22:26:31.926917 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerStarted","Data":"8f44fa923472c223312e20d78df81a2ebc042fede7ad28a3979557212d5e90c8"} Dec 03 22:26:31.927959 master-0 kubenswrapper[36504]: I1203 22:26:31.927209 36504 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="ceilometer-central-agent" containerID="cri-o://2e8794d524e657298018ea78b1996c2e7c5772cf22d07e3ef88a9fe77101fe84" gracePeriod=30 Dec 03 22:26:31.927959 master-0 kubenswrapper[36504]: I1203 22:26:31.927365 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:26:31.927959 master-0 kubenswrapper[36504]: I1203 22:26:31.927424 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="sg-core" containerID="cri-o://af142d4c97eacef14a1e881d0f2f458ecba634b394d8a57d05e355f156e654aa" gracePeriod=30 Dec 03 22:26:31.927959 master-0 kubenswrapper[36504]: I1203 22:26:31.927406 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="proxy-httpd" containerID="cri-o://8f44fa923472c223312e20d78df81a2ebc042fede7ad28a3979557212d5e90c8" gracePeriod=30 Dec 03 22:26:31.927959 master-0 kubenswrapper[36504]: I1203 22:26:31.927497 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="ceilometer-notification-agent" containerID="cri-o://9cb97a288a8cc6c60b03f48b83b8cb5462e833944a2d877db6eeba08104849e1" gracePeriod=30 Dec 03 22:26:31.973377 master-0 kubenswrapper[36504]: I1203 22:26:31.961904 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=10.157975739 podStartE2EDuration="15.961877692s" podCreationTimestamp="2025-12-03 22:26:16 +0000 UTC" firstStartedPulling="2025-12-03 22:26:25.618164293 +0000 UTC m=+950.837936300" lastFinishedPulling="2025-12-03 22:26:31.422066246 +0000 UTC m=+956.641838253" observedRunningTime="2025-12-03 22:26:31.958428454 +0000 UTC m=+957.178200461" watchObservedRunningTime="2025-12-03 22:26:31.961877692 +0000 UTC m=+957.181649709" Dec 03 22:26:32.956057 master-0 kubenswrapper[36504]: I1203 22:26:32.955880 36504 generic.go:334] "Generic (PLEG): container finished" podID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerID="8f44fa923472c223312e20d78df81a2ebc042fede7ad28a3979557212d5e90c8" exitCode=0 Dec 03 22:26:32.956057 master-0 kubenswrapper[36504]: I1203 22:26:32.955944 36504 generic.go:334] "Generic (PLEG): container finished" podID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerID="af142d4c97eacef14a1e881d0f2f458ecba634b394d8a57d05e355f156e654aa" exitCode=2 Dec 03 22:26:32.956057 master-0 kubenswrapper[36504]: I1203 22:26:32.955954 36504 generic.go:334] "Generic (PLEG): container finished" podID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerID="9cb97a288a8cc6c60b03f48b83b8cb5462e833944a2d877db6eeba08104849e1" exitCode=0 Dec 03 22:26:32.956057 master-0 kubenswrapper[36504]: I1203 22:26:32.955962 36504 generic.go:334] "Generic (PLEG): container finished" podID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerID="2e8794d524e657298018ea78b1996c2e7c5772cf22d07e3ef88a9fe77101fe84" exitCode=0 Dec 03 22:26:32.956057 master-0 kubenswrapper[36504]: I1203 22:26:32.955991 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerDied","Data":"8f44fa923472c223312e20d78df81a2ebc042fede7ad28a3979557212d5e90c8"} Dec 03 22:26:32.956057 master-0 kubenswrapper[36504]: I1203 22:26:32.956029 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerDied","Data":"af142d4c97eacef14a1e881d0f2f458ecba634b394d8a57d05e355f156e654aa"} Dec 03 22:26:32.956057 master-0 kubenswrapper[36504]: I1203 22:26:32.956039 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerDied","Data":"9cb97a288a8cc6c60b03f48b83b8cb5462e833944a2d877db6eeba08104849e1"} Dec 03 22:26:32.956057 master-0 kubenswrapper[36504]: I1203 22:26:32.956050 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerDied","Data":"2e8794d524e657298018ea78b1996c2e7c5772cf22d07e3ef88a9fe77101fe84"} Dec 03 22:26:33.127864 master-0 kubenswrapper[36504]: I1203 22:26:33.127791 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:33.234627 master-0 kubenswrapper[36504]: I1203 22:26:33.234391 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-log-httpd\") pod \"31234ea0-415e-4bc1-9581-af21cdc9db72\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " Dec 03 22:26:33.234627 master-0 kubenswrapper[36504]: I1203 22:26:33.234528 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-run-httpd\") pod \"31234ea0-415e-4bc1-9581-af21cdc9db72\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " Dec 03 22:26:33.235061 master-0 kubenswrapper[36504]: I1203 22:26:33.234786 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-sg-core-conf-yaml\") pod \"31234ea0-415e-4bc1-9581-af21cdc9db72\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " Dec 03 22:26:33.235228 master-0 kubenswrapper[36504]: I1203 22:26:33.235158 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-scripts\") pod \"31234ea0-415e-4bc1-9581-af21cdc9db72\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " Dec 03 22:26:33.235228 master-0 kubenswrapper[36504]: I1203 22:26:33.235150 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "31234ea0-415e-4bc1-9581-af21cdc9db72" (UID: "31234ea0-415e-4bc1-9581-af21cdc9db72"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:33.235692 master-0 kubenswrapper[36504]: I1203 22:26:33.235252 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-config-data\") pod \"31234ea0-415e-4bc1-9581-af21cdc9db72\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " Dec 03 22:26:33.235692 master-0 kubenswrapper[36504]: I1203 22:26:33.235287 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s77bf\" (UniqueName: \"kubernetes.io/projected/31234ea0-415e-4bc1-9581-af21cdc9db72-kube-api-access-s77bf\") pod \"31234ea0-415e-4bc1-9581-af21cdc9db72\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " Dec 03 22:26:33.235692 master-0 kubenswrapper[36504]: I1203 22:26:33.235418 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-combined-ca-bundle\") pod \"31234ea0-415e-4bc1-9581-af21cdc9db72\" (UID: \"31234ea0-415e-4bc1-9581-af21cdc9db72\") " Dec 03 22:26:33.236854 master-0 kubenswrapper[36504]: I1203 22:26:33.236817 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:33.240754 master-0 kubenswrapper[36504]: I1203 22:26:33.240666 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "31234ea0-415e-4bc1-9581-af21cdc9db72" (UID: "31234ea0-415e-4bc1-9581-af21cdc9db72"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:33.249585 master-0 kubenswrapper[36504]: I1203 22:26:33.249506 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31234ea0-415e-4bc1-9581-af21cdc9db72-kube-api-access-s77bf" (OuterVolumeSpecName: "kube-api-access-s77bf") pod "31234ea0-415e-4bc1-9581-af21cdc9db72" (UID: "31234ea0-415e-4bc1-9581-af21cdc9db72"). InnerVolumeSpecName "kube-api-access-s77bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:33.258348 master-0 kubenswrapper[36504]: I1203 22:26:33.258235 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-scripts" (OuterVolumeSpecName: "scripts") pod "31234ea0-415e-4bc1-9581-af21cdc9db72" (UID: "31234ea0-415e-4bc1-9581-af21cdc9db72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:33.290798 master-0 kubenswrapper[36504]: I1203 22:26:33.286951 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "31234ea0-415e-4bc1-9581-af21cdc9db72" (UID: "31234ea0-415e-4bc1-9581-af21cdc9db72"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:33.352809 master-0 kubenswrapper[36504]: I1203 22:26:33.346358 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31234ea0-415e-4bc1-9581-af21cdc9db72-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:33.352809 master-0 kubenswrapper[36504]: I1203 22:26:33.346423 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:33.352809 master-0 kubenswrapper[36504]: I1203 22:26:33.346437 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:33.352809 master-0 kubenswrapper[36504]: I1203 22:26:33.346451 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s77bf\" (UniqueName: \"kubernetes.io/projected/31234ea0-415e-4bc1-9581-af21cdc9db72-kube-api-access-s77bf\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:33.400584 master-0 kubenswrapper[36504]: I1203 22:26:33.400495 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31234ea0-415e-4bc1-9581-af21cdc9db72" (UID: "31234ea0-415e-4bc1-9581-af21cdc9db72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:33.450543 master-0 kubenswrapper[36504]: I1203 22:26:33.449698 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:33.458193 master-0 kubenswrapper[36504]: I1203 22:26:33.458085 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-config-data" (OuterVolumeSpecName: "config-data") pod "31234ea0-415e-4bc1-9581-af21cdc9db72" (UID: "31234ea0-415e-4bc1-9581-af21cdc9db72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:33.462696 master-0 kubenswrapper[36504]: I1203 22:26:33.462560 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6clwd"] Dec 03 22:26:33.463906 master-0 kubenswrapper[36504]: E1203 22:26:33.463593 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="ceilometer-central-agent" Dec 03 22:26:33.463906 master-0 kubenswrapper[36504]: I1203 22:26:33.463625 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="ceilometer-central-agent" Dec 03 22:26:33.463906 master-0 kubenswrapper[36504]: E1203 22:26:33.463726 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="proxy-httpd" Dec 03 22:26:33.463906 master-0 kubenswrapper[36504]: I1203 22:26:33.463736 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="proxy-httpd" Dec 03 22:26:33.463906 master-0 kubenswrapper[36504]: E1203 22:26:33.463797 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="sg-core" Dec 03 22:26:33.463906 master-0 kubenswrapper[36504]: I1203 22:26:33.463808 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="sg-core" Dec 03 22:26:33.463906 master-0 kubenswrapper[36504]: E1203 22:26:33.463833 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="ceilometer-notification-agent" Dec 03 22:26:33.463906 master-0 kubenswrapper[36504]: I1203 22:26:33.463842 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="ceilometer-notification-agent" Dec 03 22:26:33.464216 master-0 kubenswrapper[36504]: I1203 22:26:33.464178 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="proxy-httpd" Dec 03 22:26:33.485186 master-0 kubenswrapper[36504]: I1203 22:26:33.464256 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="ceilometer-central-agent" Dec 03 22:26:33.485186 master-0 kubenswrapper[36504]: I1203 22:26:33.464282 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="ceilometer-notification-agent" Dec 03 22:26:33.485186 master-0 kubenswrapper[36504]: I1203 22:26:33.464314 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" containerName="sg-core" Dec 03 22:26:33.485186 master-0 kubenswrapper[36504]: I1203 22:26:33.475189 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-baebb-default-external-api-0"] Dec 03 22:26:33.485186 master-0 kubenswrapper[36504]: I1203 22:26:33.475652 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-baebb-default-external-api-0" podUID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerName="glance-log" containerID="cri-o://1134b64cbd728364a1ecbcc5af5ff8ad9874f65a096416a5ff243337bacd7f54" gracePeriod=30 Dec 03 22:26:33.485186 master-0 kubenswrapper[36504]: I1203 22:26:33.475899 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.485186 master-0 kubenswrapper[36504]: I1203 22:26:33.477278 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-baebb-default-external-api-0" podUID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerName="glance-httpd" containerID="cri-o://3a0cd1e926d33bb474ede50ac54ad7dac595519714f4608d71bc2d26bdc7afa9" gracePeriod=30 Dec 03 22:26:33.542194 master-0 kubenswrapper[36504]: I1203 22:26:33.516324 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6clwd"] Dec 03 22:26:33.553785 master-0 kubenswrapper[36504]: I1203 22:26:33.553681 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkzfs\" (UniqueName: \"kubernetes.io/projected/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-kube-api-access-pkzfs\") pod \"redhat-operators-6clwd\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.554093 master-0 kubenswrapper[36504]: I1203 22:26:33.553919 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-utilities\") pod \"redhat-operators-6clwd\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.554093 master-0 kubenswrapper[36504]: I1203 22:26:33.554078 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-catalog-content\") pod \"redhat-operators-6clwd\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.554289 master-0 kubenswrapper[36504]: I1203 22:26:33.554193 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31234ea0-415e-4bc1-9581-af21cdc9db72-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:33.662507 master-0 kubenswrapper[36504]: I1203 22:26:33.659237 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-utilities\") pod \"redhat-operators-6clwd\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.662507 master-0 kubenswrapper[36504]: I1203 22:26:33.659459 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-catalog-content\") pod \"redhat-operators-6clwd\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.662507 master-0 kubenswrapper[36504]: I1203 22:26:33.659520 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkzfs\" (UniqueName: \"kubernetes.io/projected/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-kube-api-access-pkzfs\") pod \"redhat-operators-6clwd\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.662507 master-0 kubenswrapper[36504]: I1203 22:26:33.660582 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-utilities\") pod \"redhat-operators-6clwd\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.662507 master-0 kubenswrapper[36504]: I1203 22:26:33.660590 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-catalog-content\") pod \"redhat-operators-6clwd\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.684810 master-0 kubenswrapper[36504]: I1203 22:26:33.678607 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkzfs\" (UniqueName: \"kubernetes.io/projected/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-kube-api-access-pkzfs\") pod \"redhat-operators-6clwd\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.819348 master-0 kubenswrapper[36504]: I1203 22:26:33.815276 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:33.921799 master-0 kubenswrapper[36504]: I1203 22:26:33.921610 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6bbd7fd869-fccfd"] Dec 03 22:26:33.929799 master-0 kubenswrapper[36504]: I1203 22:26:33.924185 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.005898 master-0 kubenswrapper[36504]: I1203 22:26:34.004329 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-config-data-custom\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.005898 master-0 kubenswrapper[36504]: I1203 22:26:34.004484 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-config-data\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.005898 master-0 kubenswrapper[36504]: I1203 22:26:34.004610 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp7tj\" (UniqueName: \"kubernetes.io/projected/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-kube-api-access-xp7tj\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.005898 master-0 kubenswrapper[36504]: I1203 22:26:34.004809 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-combined-ca-bundle\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.055703 master-0 kubenswrapper[36504]: I1203 22:26:34.052661 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6bbd7fd869-fccfd"] Dec 03 22:26:34.127245 master-0 
kubenswrapper[36504]: I1203 22:26:34.112090 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-combined-ca-bundle\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.127245 master-0 kubenswrapper[36504]: I1203 22:26:34.112210 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-config-data-custom\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.127245 master-0 kubenswrapper[36504]: I1203 22:26:34.112329 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-config-data\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.143078 master-0 kubenswrapper[36504]: I1203 22:26:34.136173 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-combined-ca-bundle\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.143448 master-0 kubenswrapper[36504]: I1203 22:26:34.143082 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-config-data\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.159873 master-0 kubenswrapper[36504]: I1203 22:26:34.112459 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp7tj\" (UniqueName: \"kubernetes.io/projected/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-kube-api-access-xp7tj\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.171820 master-0 kubenswrapper[36504]: I1203 22:26:34.171286 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp7tj\" (UniqueName: \"kubernetes.io/projected/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-kube-api-access-xp7tj\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.183812 master-0 kubenswrapper[36504]: I1203 22:26:34.173856 36504 generic.go:334] "Generic (PLEG): container finished" podID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerID="1134b64cbd728364a1ecbcc5af5ff8ad9874f65a096416a5ff243337bacd7f54" exitCode=143 Dec 03 22:26:34.183812 master-0 kubenswrapper[36504]: I1203 22:26:34.174013 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-external-api-0" event={"ID":"dc35c451-c612-46b7-8328-9c8bfa7891e7","Type":"ContainerDied","Data":"1134b64cbd728364a1ecbcc5af5ff8ad9874f65a096416a5ff243337bacd7f54"} Dec 03 22:26:34.183812 master-0 kubenswrapper[36504]: I1203 22:26:34.175100 36504 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-api-768fc575ff-n7992"] Dec 03 22:26:34.183812 master-0 kubenswrapper[36504]: I1203 22:26:34.178317 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ee74ea5-dc86-4509-92eb-3aae671aa8cf-config-data-custom\") pod \"heat-engine-6bbd7fd869-fccfd\" (UID: \"4ee74ea5-dc86-4509-92eb-3aae671aa8cf\") " pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.183812 master-0 kubenswrapper[36504]: I1203 22:26:34.181362 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.208803 master-0 kubenswrapper[36504]: I1203 22:26:34.197349 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31234ea0-415e-4bc1-9581-af21cdc9db72","Type":"ContainerDied","Data":"1f9053a17daacb9ebc58dd0216413ecb234aba833ccd3b34fb112e460040d7fd"} Dec 03 22:26:34.208803 master-0 kubenswrapper[36504]: I1203 22:26:34.197509 36504 scope.go:117] "RemoveContainer" containerID="8f44fa923472c223312e20d78df81a2ebc042fede7ad28a3979557212d5e90c8" Dec 03 22:26:34.208803 master-0 kubenswrapper[36504]: I1203 22:26:34.203182 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:34.208803 master-0 kubenswrapper[36504]: I1203 22:26:34.208245 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-768fc575ff-n7992"] Dec 03 22:26:34.237762 master-0 kubenswrapper[36504]: I1203 22:26:34.237681 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-64cdb44f7-mvlf8"] Dec 03 22:26:34.283236 master-0 kubenswrapper[36504]: I1203 22:26:34.283174 36504 scope.go:117] "RemoveContainer" containerID="af142d4c97eacef14a1e881d0f2f458ecba634b394d8a57d05e355f156e654aa" Dec 03 22:26:34.315218 master-0 kubenswrapper[36504]: I1203 22:26:34.315108 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6gg6\" (UniqueName: \"kubernetes.io/projected/28b645fa-2039-4145-9f32-a07ee0028539-kube-api-access-w6gg6\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.316994 master-0 kubenswrapper[36504]: I1203 22:26:34.315320 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.316994 master-0 kubenswrapper[36504]: I1203 22:26:34.315937 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b95hb\" (UniqueName: \"kubernetes.io/projected/8423bf36-7022-4d07-b528-2deb47088921-kube-api-access-b95hb\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.316994 master-0 kubenswrapper[36504]: I1203 22:26:34.316028 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-combined-ca-bundle\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " 
pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.316994 master-0 kubenswrapper[36504]: I1203 22:26:34.316254 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-combined-ca-bundle\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.316994 master-0 kubenswrapper[36504]: I1203 22:26:34.316491 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data-custom\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.316994 master-0 kubenswrapper[36504]: I1203 22:26:34.316660 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.317434 master-0 kubenswrapper[36504]: I1203 22:26:34.317166 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data-custom\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.321125 master-0 kubenswrapper[36504]: I1203 22:26:34.321074 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.345702 master-0 kubenswrapper[36504]: I1203 22:26:34.344784 36504 scope.go:117] "RemoveContainer" containerID="9cb97a288a8cc6c60b03f48b83b8cb5462e833944a2d877db6eeba08104849e1" Dec 03 22:26:34.345702 master-0 kubenswrapper[36504]: I1203 22:26:34.344867 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-64cdb44f7-mvlf8"] Dec 03 22:26:34.386665 master-0 kubenswrapper[36504]: I1203 22:26:34.386594 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:34.412504 master-0 kubenswrapper[36504]: I1203 22:26:34.410291 36504 scope.go:117] "RemoveContainer" containerID="2e8794d524e657298018ea78b1996c2e7c5772cf22d07e3ef88a9fe77101fe84" Dec 03 22:26:34.413451 master-0 kubenswrapper[36504]: I1203 22:26:34.413379 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:34.421569 master-0 kubenswrapper[36504]: I1203 22:26:34.421066 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6gg6\" (UniqueName: \"kubernetes.io/projected/28b645fa-2039-4145-9f32-a07ee0028539-kube-api-access-w6gg6\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.421569 master-0 kubenswrapper[36504]: I1203 22:26:34.421151 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.421569 master-0 kubenswrapper[36504]: I1203 22:26:34.421192 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-combined-ca-bundle\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.421569 master-0 kubenswrapper[36504]: I1203 22:26:34.421212 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b95hb\" (UniqueName: \"kubernetes.io/projected/8423bf36-7022-4d07-b528-2deb47088921-kube-api-access-b95hb\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.421569 master-0 kubenswrapper[36504]: I1203 22:26:34.421261 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-combined-ca-bundle\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.421569 master-0 kubenswrapper[36504]: I1203 22:26:34.421316 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data-custom\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.421569 master-0 kubenswrapper[36504]: I1203 22:26:34.421356 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.423178 master-0 kubenswrapper[36504]: I1203 22:26:34.423139 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data-custom\") pod \"heat-api-768fc575ff-n7992\" 
(UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.431509 master-0 kubenswrapper[36504]: I1203 22:26:34.431446 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data-custom\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.432924 master-0 kubenswrapper[36504]: I1203 22:26:34.432369 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data-custom\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.432924 master-0 kubenswrapper[36504]: I1203 22:26:34.432378 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-combined-ca-bundle\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.432924 master-0 kubenswrapper[36504]: I1203 22:26:34.432499 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.448103 master-0 kubenswrapper[36504]: I1203 22:26:34.436857 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-combined-ca-bundle\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.448103 master-0 kubenswrapper[36504]: I1203 22:26:34.436968 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:34.464941 master-0 kubenswrapper[36504]: I1203 22:26:34.461840 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b95hb\" (UniqueName: \"kubernetes.io/projected/8423bf36-7022-4d07-b528-2deb47088921-kube-api-access-b95hb\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.464941 master-0 kubenswrapper[36504]: I1203 22:26:34.463540 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data\") pod \"heat-cfnapi-64cdb44f7-mvlf8\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.481835 master-0 kubenswrapper[36504]: I1203 22:26:34.481750 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6gg6\" (UniqueName: \"kubernetes.io/projected/28b645fa-2039-4145-9f32-a07ee0028539-kube-api-access-w6gg6\") pod \"heat-api-768fc575ff-n7992\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.498869 master-0 kubenswrapper[36504]: I1203 22:26:34.496653 36504 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Dec 03 22:26:34.502418 master-0 kubenswrapper[36504]: I1203 22:26:34.502370 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:34.510164 master-0 kubenswrapper[36504]: I1203 22:26:34.510100 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:26:34.511660 master-0 kubenswrapper[36504]: I1203 22:26:34.511511 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:34.515731 master-0 kubenswrapper[36504]: I1203 22:26:34.515669 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:26:34.524486 master-0 kubenswrapper[36504]: I1203 22:26:34.524404 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptzqt\" (UniqueName: \"kubernetes.io/projected/dabfa833-e6ab-4beb-a721-454726a03e5e-kube-api-access-ptzqt\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.524796 master-0 kubenswrapper[36504]: I1203 22:26:34.524495 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.524796 master-0 kubenswrapper[36504]: I1203 22:26:34.524531 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-run-httpd\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.524796 master-0 kubenswrapper[36504]: I1203 22:26:34.524576 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-log-httpd\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.524796 master-0 kubenswrapper[36504]: I1203 22:26:34.524615 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-config-data\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.524796 master-0 kubenswrapper[36504]: I1203 22:26:34.524632 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-scripts\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.524796 master-0 kubenswrapper[36504]: I1203 22:26:34.524659 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.563521 master-0 kubenswrapper[36504]: I1203 22:26:34.561184 36504 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:34.629584 master-0 kubenswrapper[36504]: I1203 22:26:34.629491 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.629959 master-0 kubenswrapper[36504]: I1203 22:26:34.629823 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptzqt\" (UniqueName: \"kubernetes.io/projected/dabfa833-e6ab-4beb-a721-454726a03e5e-kube-api-access-ptzqt\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.629959 master-0 kubenswrapper[36504]: I1203 22:26:34.629910 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.630048 master-0 kubenswrapper[36504]: I1203 22:26:34.629967 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-run-httpd\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.630048 master-0 kubenswrapper[36504]: I1203 22:26:34.630028 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-log-httpd\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.630166 master-0 kubenswrapper[36504]: I1203 22:26:34.630086 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-config-data\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.630166 master-0 kubenswrapper[36504]: I1203 22:26:34.630105 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-scripts\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.636361 master-0 kubenswrapper[36504]: I1203 22:26:34.633906 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-run-httpd\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.636361 master-0 kubenswrapper[36504]: I1203 22:26:34.634021 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-log-httpd\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.641910 master-0 kubenswrapper[36504]: I1203 22:26:34.638620 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-scripts\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.641910 master-0 kubenswrapper[36504]: I1203 22:26:34.639125 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.641910 master-0 kubenswrapper[36504]: I1203 22:26:34.641870 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-config-data\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.648853 master-0 kubenswrapper[36504]: I1203 22:26:34.646490 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.656824 master-0 kubenswrapper[36504]: I1203 22:26:34.654680 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6clwd"] Dec 03 22:26:34.677741 master-0 kubenswrapper[36504]: I1203 22:26:34.677684 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptzqt\" (UniqueName: \"kubernetes.io/projected/dabfa833-e6ab-4beb-a721-454726a03e5e-kube-api-access-ptzqt\") pod \"ceilometer-0\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " pod="openstack/ceilometer-0" Dec 03 22:26:34.685889 master-0 kubenswrapper[36504]: I1203 22:26:34.681480 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:34.933588 master-0 kubenswrapper[36504]: I1203 22:26:34.933505 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:26:35.219811 master-0 kubenswrapper[36504]: I1203 22:26:35.218968 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31234ea0-415e-4bc1-9581-af21cdc9db72" path="/var/lib/kubelet/pods/31234ea0-415e-4bc1-9581-af21cdc9db72/volumes" Dec 03 22:26:35.220502 master-0 kubenswrapper[36504]: I1203 22:26:35.220398 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:26:35.255832 master-0 kubenswrapper[36504]: I1203 22:26:35.225091 36504 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podc2f5cca7-d1b3-4572-859e-e77f1f4055ae"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podc2f5cca7-d1b3-4572-859e-e77f1f4055ae] : Timed out while waiting for systemd to remove kubepods-besteffort-podc2f5cca7_d1b3_4572_859e_e77f1f4055ae.slice" Dec 03 22:26:35.255832 master-0 kubenswrapper[36504]: I1203 22:26:35.230686 36504 generic.go:334] "Generic (PLEG): container finished" podID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerID="6e837fe13ea120260e23266e0eb7beb38aeedfd5f5de9cf44e30e99e132605a4" exitCode=0 Dec 03 22:26:35.255832 master-0 kubenswrapper[36504]: I1203 22:26:35.230761 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6clwd" event={"ID":"8447ec5c-34fd-44ce-8bd8-2174ca072e6a","Type":"ContainerDied","Data":"6e837fe13ea120260e23266e0eb7beb38aeedfd5f5de9cf44e30e99e132605a4"} Dec 03 22:26:35.255832 master-0 kubenswrapper[36504]: I1203 22:26:35.230811 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6clwd" event={"ID":"8447ec5c-34fd-44ce-8bd8-2174ca072e6a","Type":"ContainerStarted","Data":"934a0720c431c88071a3e46909f80f2d0ea6c3fe7c39a7ab5e6169acef99cdb4"} Dec 03 22:26:35.383800 master-0 kubenswrapper[36504]: I1203 22:26:35.383604 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6bbd7fd869-fccfd"] Dec 03 22:26:35.516801 master-0 kubenswrapper[36504]: I1203 22:26:35.511804 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5759b6fcdf-jh9hb"] Dec 03 22:26:35.516801 master-0 kubenswrapper[36504]: I1203 22:26:35.512262 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" podUID="d7bf8267-d5ba-4e78-b1aa-d73517d74730" containerName="dnsmasq-dns" containerID="cri-o://c205221b927124c9b9a8263ebb568182ce030dfce8c7d6b4a27fa1fdafc43a70" gracePeriod=10 Dec 03 22:26:35.803445 master-0 kubenswrapper[36504]: I1203 22:26:35.803277 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-64cdb44f7-mvlf8"] Dec 03 22:26:35.875175 master-0 kubenswrapper[36504]: I1203 22:26:35.871841 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-768fc575ff-n7992"] Dec 03 22:26:35.958965 master-0 kubenswrapper[36504]: I1203 22:26:35.958849 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:35.980464 master-0 kubenswrapper[36504]: W1203 22:26:35.980390 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28b645fa_2039_4145_9f32_a07ee0028539.slice/crio-289a7a77cd156289dfff5ac72aec9491bd5efb87d5e421d5eef0bea5aa494aa0 WatchSource:0}: Error finding container 
289a7a77cd156289dfff5ac72aec9491bd5efb87d5e421d5eef0bea5aa494aa0: Status 404 returned error can't find the container with id 289a7a77cd156289dfff5ac72aec9491bd5efb87d5e421d5eef0bea5aa494aa0 Dec 03 22:26:36.029310 master-0 kubenswrapper[36504]: E1203 22:26:36.029122 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:26:36.361513 master-0 kubenswrapper[36504]: I1203 22:26:36.337843 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" event={"ID":"8423bf36-7022-4d07-b528-2deb47088921","Type":"ContainerStarted","Data":"baeca8efd1a0fbae599d0ffad89e4d9090ab3bee9cc6fabd6abcd859272eb723"} Dec 03 22:26:36.401216 master-0 kubenswrapper[36504]: I1203 22:26:36.401145 36504 generic.go:334] "Generic (PLEG): container finished" podID="d7bf8267-d5ba-4e78-b1aa-d73517d74730" containerID="c205221b927124c9b9a8263ebb568182ce030dfce8c7d6b4a27fa1fdafc43a70" exitCode=0 Dec 03 22:26:36.401681 master-0 kubenswrapper[36504]: I1203 22:26:36.401653 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" event={"ID":"d7bf8267-d5ba-4e78-b1aa-d73517d74730","Type":"ContainerDied","Data":"c205221b927124c9b9a8263ebb568182ce030dfce8c7d6b4a27fa1fdafc43a70"} Dec 03 22:26:36.428870 master-0 kubenswrapper[36504]: I1203 22:26:36.428790 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6bbd7fd869-fccfd" event={"ID":"4ee74ea5-dc86-4509-92eb-3aae671aa8cf","Type":"ContainerStarted","Data":"2b0859a3c21153cb6cd1125b2222535f9ff726037ebd21ee328bacb57e04a133"} Dec 03 22:26:36.429242 master-0 kubenswrapper[36504]: I1203 22:26:36.429225 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6bbd7fd869-fccfd" event={"ID":"4ee74ea5-dc86-4509-92eb-3aae671aa8cf","Type":"ContainerStarted","Data":"2d8858f642915fa91e10930eac8eb6a1cf315e2e0784f983a2eb246f35b6dda0"} Dec 03 22:26:36.429678 master-0 kubenswrapper[36504]: I1203 22:26:36.429661 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:36.477807 master-0 kubenswrapper[36504]: I1203 22:26:36.477621 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-768fc575ff-n7992" event={"ID":"28b645fa-2039-4145-9f32-a07ee0028539","Type":"ContainerStarted","Data":"289a7a77cd156289dfff5ac72aec9491bd5efb87d5e421d5eef0bea5aa494aa0"} Dec 03 22:26:36.497796 master-0 kubenswrapper[36504]: I1203 22:26:36.497668 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerStarted","Data":"038802ca96e4e4466028e596698ba5c96e49932850999914a86b878e768cc958"} Dec 03 22:26:36.513674 master-0 kubenswrapper[36504]: I1203 22:26:36.513022 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6bbd7fd869-fccfd" podStartSLOduration=3.512993941 podStartE2EDuration="3.512993941s" podCreationTimestamp="2025-12-03 22:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 22:26:36.50881554 +0000 UTC m=+961.728587547" watchObservedRunningTime="2025-12-03 22:26:36.512993941 +0000 UTC m=+961.732765948" Dec 03 22:26:37.063733 master-0 kubenswrapper[36504]: I1203 22:26:37.063528 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:26:37.197386 master-0 kubenswrapper[36504]: I1203 22:26:37.195977 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-sb\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:37.197386 master-0 kubenswrapper[36504]: I1203 22:26:37.196264 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-config\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:37.197386 master-0 kubenswrapper[36504]: I1203 22:26:37.196342 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-svc\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:37.214811 master-0 kubenswrapper[36504]: I1203 22:26:37.212133 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6xbn\" (UniqueName: \"kubernetes.io/projected/d7bf8267-d5ba-4e78-b1aa-d73517d74730-kube-api-access-d6xbn\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:37.214811 master-0 kubenswrapper[36504]: I1203 22:26:37.212291 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-nb\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:37.214811 master-0 kubenswrapper[36504]: I1203 22:26:37.212446 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-swift-storage-0\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:37.214811 master-0 kubenswrapper[36504]: I1203 22:26:37.212631 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-edpm\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:37.233344 master-0 kubenswrapper[36504]: I1203 22:26:37.220184 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bf8267-d5ba-4e78-b1aa-d73517d74730-kube-api-access-d6xbn" (OuterVolumeSpecName: "kube-api-access-d6xbn") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "kube-api-access-d6xbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:37.280677 master-0 kubenswrapper[36504]: I1203 22:26:37.279795 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-whrpw"] Dec 03 22:26:37.281103 master-0 kubenswrapper[36504]: E1203 22:26:37.281066 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bf8267-d5ba-4e78-b1aa-d73517d74730" containerName="init" Dec 03 22:26:37.281103 master-0 kubenswrapper[36504]: I1203 22:26:37.281103 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bf8267-d5ba-4e78-b1aa-d73517d74730" containerName="init" Dec 03 22:26:37.281255 master-0 kubenswrapper[36504]: E1203 22:26:37.281186 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bf8267-d5ba-4e78-b1aa-d73517d74730" containerName="dnsmasq-dns" Dec 03 22:26:37.281255 master-0 kubenswrapper[36504]: I1203 22:26:37.281198 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bf8267-d5ba-4e78-b1aa-d73517d74730" containerName="dnsmasq-dns" Dec 03 22:26:37.281565 master-0 kubenswrapper[36504]: I1203 22:26:37.281536 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bf8267-d5ba-4e78-b1aa-d73517d74730" containerName="dnsmasq-dns" Dec 03 22:26:37.286331 master-0 kubenswrapper[36504]: I1203 22:26:37.283891 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:37.322322 master-0 kubenswrapper[36504]: I1203 22:26:37.322237 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6xbn\" (UniqueName: \"kubernetes.io/projected/d7bf8267-d5ba-4e78-b1aa-d73517d74730-kube-api-access-d6xbn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:37.426962 master-0 kubenswrapper[36504]: I1203 22:26:37.426115 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-whrpw"] Dec 03 22:26:37.426962 master-0 kubenswrapper[36504]: I1203 22:26:37.429396 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjc79\" (UniqueName: \"kubernetes.io/projected/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-kube-api-access-zjc79\") pod \"nova-api-db-create-whrpw\" (UID: \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\") " pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:37.426962 master-0 kubenswrapper[36504]: I1203 22:26:37.433595 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-operator-scripts\") pod \"nova-api-db-create-whrpw\" (UID: \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\") " pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:37.552324 master-0 kubenswrapper[36504]: I1203 22:26:37.552238 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjc79\" (UniqueName: \"kubernetes.io/projected/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-kube-api-access-zjc79\") pod \"nova-api-db-create-whrpw\" (UID: \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\") " pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:37.552519 master-0 kubenswrapper[36504]: I1203 22:26:37.552408 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-operator-scripts\") pod \"nova-api-db-create-whrpw\" (UID: \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\") " 
pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:37.556156 master-0 kubenswrapper[36504]: I1203 22:26:37.553527 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-operator-scripts\") pod \"nova-api-db-create-whrpw\" (UID: \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\") " pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:37.569363 master-0 kubenswrapper[36504]: I1203 22:26:37.569198 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-config" (OuterVolumeSpecName: "config") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:37.605140 master-0 kubenswrapper[36504]: I1203 22:26:37.594825 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6clwd" event={"ID":"8447ec5c-34fd-44ce-8bd8-2174ca072e6a","Type":"ContainerStarted","Data":"1665e2cd852155cf522a89f2d86a7767e3ee7e15ae5b99b3dd16367fc196584d"} Dec 03 22:26:37.610563 master-0 kubenswrapper[36504]: I1203 22:26:37.605732 36504 generic.go:334] "Generic (PLEG): container finished" podID="8423bf36-7022-4d07-b528-2deb47088921" containerID="aa7367ca326fb2ef55a6f82b1af50d71950c398c99b930c2ee66bff2d631d4ca" exitCode=1 Dec 03 22:26:37.610563 master-0 kubenswrapper[36504]: I1203 22:26:37.606008 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" event={"ID":"8423bf36-7022-4d07-b528-2deb47088921","Type":"ContainerDied","Data":"aa7367ca326fb2ef55a6f82b1af50d71950c398c99b930c2ee66bff2d631d4ca"} Dec 03 22:26:37.610563 master-0 kubenswrapper[36504]: I1203 22:26:37.607490 36504 scope.go:117] "RemoveContainer" containerID="aa7367ca326fb2ef55a6f82b1af50d71950c398c99b930c2ee66bff2d631d4ca" Dec 03 22:26:37.640538 master-0 kubenswrapper[36504]: I1203 22:26:37.640446 36504 generic.go:334] "Generic (PLEG): container finished" podID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerID="3a0cd1e926d33bb474ede50ac54ad7dac595519714f4608d71bc2d26bdc7afa9" exitCode=0 Dec 03 22:26:37.640946 master-0 kubenswrapper[36504]: I1203 22:26:37.640640 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-external-api-0" event={"ID":"dc35c451-c612-46b7-8328-9c8bfa7891e7","Type":"ContainerDied","Data":"3a0cd1e926d33bb474ede50ac54ad7dac595519714f4608d71bc2d26bdc7afa9"} Dec 03 22:26:37.643992 master-0 kubenswrapper[36504]: I1203 22:26:37.643903 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" event={"ID":"d7bf8267-d5ba-4e78-b1aa-d73517d74730","Type":"ContainerDied","Data":"734107ca3c92d21497bb97dec6aea2fb61825d02af3113cf7ab38a6c14b820a7"} Dec 03 22:26:37.643992 master-0 kubenswrapper[36504]: I1203 22:26:37.643966 36504 scope.go:117] "RemoveContainer" containerID="c205221b927124c9b9a8263ebb568182ce030dfce8c7d6b4a27fa1fdafc43a70" Dec 03 22:26:37.644265 master-0 kubenswrapper[36504]: I1203 22:26:37.644239 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5759b6fcdf-jh9hb" Dec 03 22:26:37.648182 master-0 kubenswrapper[36504]: I1203 22:26:37.646813 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjc79\" (UniqueName: \"kubernetes.io/projected/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-kube-api-access-zjc79\") pod \"nova-api-db-create-whrpw\" (UID: \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\") " pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:37.658143 master-0 kubenswrapper[36504]: I1203 22:26:37.651946 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerStarted","Data":"489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e"} Dec 03 22:26:37.662625 master-0 kubenswrapper[36504]: I1203 22:26:37.661418 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:37.663671 master-0 kubenswrapper[36504]: I1203 22:26:37.663097 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:37.821168 master-0 kubenswrapper[36504]: I1203 22:26:37.820920 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-c69rr"] Dec 03 22:26:37.979394 master-0 kubenswrapper[36504]: I1203 22:26:37.979264 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:37.998627 master-0 kubenswrapper[36504]: I1203 22:26:37.998501 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:38.011860 master-0 kubenswrapper[36504]: I1203 22:26:38.010598 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:38.014607 master-0 kubenswrapper[36504]: I1203 22:26:38.014567 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:38.019004 master-0 kubenswrapper[36504]: I1203 22:26:38.018876 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-edpm" (OuterVolumeSpecName: "edpm") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "edpm". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:38.027654 master-0 kubenswrapper[36504]: I1203 22:26:38.027535 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-edpm\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:38.027863 master-0 kubenswrapper[36504]: I1203 22:26:38.027808 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-sb\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:38.028339 master-0 kubenswrapper[36504]: I1203 22:26:38.028308 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-nb\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:38.028480 master-0 kubenswrapper[36504]: I1203 22:26:38.028431 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-swift-storage-0\") pod \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\" (UID: \"d7bf8267-d5ba-4e78-b1aa-d73517d74730\") " Dec 03 22:26:38.030321 master-0 kubenswrapper[36504]: W1203 22:26:38.029887 36504 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d7bf8267-d5ba-4e78-b1aa-d73517d74730/volumes/kubernetes.io~configmap/dns-swift-storage-0 Dec 03 22:26:38.030321 master-0 kubenswrapper[36504]: I1203 22:26:38.029915 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:38.030321 master-0 kubenswrapper[36504]: W1203 22:26:38.029992 36504 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d7bf8267-d5ba-4e78-b1aa-d73517d74730/volumes/kubernetes.io~configmap/ovsdbserver-sb Dec 03 22:26:38.030321 master-0 kubenswrapper[36504]: I1203 22:26:38.030006 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:38.030321 master-0 kubenswrapper[36504]: W1203 22:26:38.030059 36504 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d7bf8267-d5ba-4e78-b1aa-d73517d74730/volumes/kubernetes.io~configmap/ovsdbserver-nb Dec 03 22:26:38.030321 master-0 kubenswrapper[36504]: I1203 22:26:38.030069 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:38.030750 master-0 kubenswrapper[36504]: W1203 22:26:38.029193 36504 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/d7bf8267-d5ba-4e78-b1aa-d73517d74730/volumes/kubernetes.io~configmap/edpm Dec 03 22:26:38.030889 master-0 kubenswrapper[36504]: I1203 22:26:38.030858 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-edpm" (OuterVolumeSpecName: "edpm") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "edpm". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:38.048402 master-0 kubenswrapper[36504]: I1203 22:26:38.047294 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-3ad6-account-create-update-gj27c"] Dec 03 22:26:38.048402 master-0 kubenswrapper[36504]: I1203 22:26:38.048284 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7bf8267-d5ba-4e78-b1aa-d73517d74730" (UID: "d7bf8267-d5ba-4e78-b1aa-d73517d74730"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:38.053515 master-0 kubenswrapper[36504]: I1203 22:26:38.053475 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:38.053875 master-0 kubenswrapper[36504]: I1203 22:26:38.053846 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:38.062322 master-0 kubenswrapper[36504]: I1203 22:26:38.056804 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 22:26:38.072492 master-0 kubenswrapper[36504]: I1203 22:26:38.065553 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-c69rr"] Dec 03 22:26:38.107408 master-0 kubenswrapper[36504]: I1203 22:26:38.106854 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3ad6-account-create-update-gj27c"] Dec 03 22:26:38.123743 master-0 kubenswrapper[36504]: I1203 22:26:38.120257 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-w6cn8"] Dec 03 22:26:38.161489 master-0 kubenswrapper[36504]: I1203 22:26:38.160853 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:38.164811 master-0 kubenswrapper[36504]: I1203 22:26:38.164583 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e258-account-create-update-lf7cr"] Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.166246 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2h9j\" (UniqueName: \"kubernetes.io/projected/1b698a34-7bd9-4125-ad5c-885db2cf4959-kube-api-access-h2h9j\") pod \"nova-api-3ad6-account-create-update-gj27c\" (UID: \"1b698a34-7bd9-4125-ad5c-885db2cf4959\") " pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.166324 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrvfc\" (UniqueName: \"kubernetes.io/projected/6193a696-93bd-4303-82fd-bf39ce403d80-kube-api-access-lrvfc\") pod \"nova-cell0-db-create-c69rr\" (UID: \"6193a696-93bd-4303-82fd-bf39ce403d80\") " pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.166518 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6193a696-93bd-4303-82fd-bf39ce403d80-operator-scripts\") pod \"nova-cell0-db-create-c69rr\" (UID: \"6193a696-93bd-4303-82fd-bf39ce403d80\") " pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.166605 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b698a34-7bd9-4125-ad5c-885db2cf4959-operator-scripts\") pod \"nova-api-3ad6-account-create-update-gj27c\" (UID: \"1b698a34-7bd9-4125-ad5c-885db2cf4959\") " pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.166706 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.166720 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.166731 36504 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.166741 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.166753 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7bf8267-d5ba-4e78-b1aa-d73517d74730-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.167235 36504 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.182879 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.217980 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w6cn8"] Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.242644 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e258-account-create-update-lf7cr"] Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.292445 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2h9j\" (UniqueName: \"kubernetes.io/projected/1b698a34-7bd9-4125-ad5c-885db2cf4959-kube-api-access-h2h9j\") pod \"nova-api-3ad6-account-create-update-gj27c\" (UID: \"1b698a34-7bd9-4125-ad5c-885db2cf4959\") " pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.292516 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrvfc\" (UniqueName: \"kubernetes.io/projected/6193a696-93bd-4303-82fd-bf39ce403d80-kube-api-access-lrvfc\") pod \"nova-cell0-db-create-c69rr\" (UID: \"6193a696-93bd-4303-82fd-bf39ce403d80\") " pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.292577 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6954864a-d445-46ef-8b09-dcb0543b8b23-operator-scripts\") pod \"nova-cell1-db-create-w6cn8\" (UID: \"6954864a-d445-46ef-8b09-dcb0543b8b23\") " pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.292659 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a046886b-4b70-4fb1-82ce-ef92db1677f7-operator-scripts\") pod \"nova-cell0-e258-account-create-update-lf7cr\" (UID: \"a046886b-4b70-4fb1-82ce-ef92db1677f7\") " pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.292742 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6193a696-93bd-4303-82fd-bf39ce403d80-operator-scripts\") pod \"nova-cell0-db-create-c69rr\" (UID: \"6193a696-93bd-4303-82fd-bf39ce403d80\") " pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.292823 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rctkw\" (UniqueName: \"kubernetes.io/projected/a046886b-4b70-4fb1-82ce-ef92db1677f7-kube-api-access-rctkw\") pod \"nova-cell0-e258-account-create-update-lf7cr\" (UID: \"a046886b-4b70-4fb1-82ce-ef92db1677f7\") " pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.292870 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl6vb\" (UniqueName: \"kubernetes.io/projected/6954864a-d445-46ef-8b09-dcb0543b8b23-kube-api-access-dl6vb\") pod 
\"nova-cell1-db-create-w6cn8\" (UID: \"6954864a-d445-46ef-8b09-dcb0543b8b23\") " pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.292903 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b698a34-7bd9-4125-ad5c-885db2cf4959-operator-scripts\") pod \"nova-api-3ad6-account-create-update-gj27c\" (UID: \"1b698a34-7bd9-4125-ad5c-885db2cf4959\") " pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.293897 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b698a34-7bd9-4125-ad5c-885db2cf4959-operator-scripts\") pod \"nova-api-3ad6-account-create-update-gj27c\" (UID: \"1b698a34-7bd9-4125-ad5c-885db2cf4959\") " pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.295857 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6193a696-93bd-4303-82fd-bf39ce403d80-operator-scripts\") pod \"nova-cell0-db-create-c69rr\" (UID: \"6193a696-93bd-4303-82fd-bf39ce403d80\") " pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.343695 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrvfc\" (UniqueName: \"kubernetes.io/projected/6193a696-93bd-4303-82fd-bf39ce403d80-kube-api-access-lrvfc\") pod \"nova-cell0-db-create-c69rr\" (UID: \"6193a696-93bd-4303-82fd-bf39ce403d80\") " pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.367410 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3742-account-create-update-frr2z"] Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.370492 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.380081 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2h9j\" (UniqueName: \"kubernetes.io/projected/1b698a34-7bd9-4125-ad5c-885db2cf4959-kube-api-access-h2h9j\") pod \"nova-api-3ad6-account-create-update-gj27c\" (UID: \"1b698a34-7bd9-4125-ad5c-885db2cf4959\") " pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.384427 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3742-account-create-update-frr2z"] Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.416337 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6954864a-d445-46ef-8b09-dcb0543b8b23-operator-scripts\") pod \"nova-cell1-db-create-w6cn8\" (UID: \"6954864a-d445-46ef-8b09-dcb0543b8b23\") " pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.416730 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a046886b-4b70-4fb1-82ce-ef92db1677f7-operator-scripts\") pod \"nova-cell0-e258-account-create-update-lf7cr\" (UID: \"a046886b-4b70-4fb1-82ce-ef92db1677f7\") " pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.417149 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rctkw\" (UniqueName: \"kubernetes.io/projected/a046886b-4b70-4fb1-82ce-ef92db1677f7-kube-api-access-rctkw\") pod \"nova-cell0-e258-account-create-update-lf7cr\" (UID: \"a046886b-4b70-4fb1-82ce-ef92db1677f7\") " pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.417247 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl6vb\" (UniqueName: \"kubernetes.io/projected/6954864a-d445-46ef-8b09-dcb0543b8b23-kube-api-access-dl6vb\") pod \"nova-cell1-db-create-w6cn8\" (UID: \"6954864a-d445-46ef-8b09-dcb0543b8b23\") " pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.418298 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a046886b-4b70-4fb1-82ce-ef92db1677f7-operator-scripts\") pod \"nova-cell0-e258-account-create-update-lf7cr\" (UID: \"a046886b-4b70-4fb1-82ce-ef92db1677f7\") " pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.419405 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6954864a-d445-46ef-8b09-dcb0543b8b23-operator-scripts\") pod \"nova-cell1-db-create-w6cn8\" (UID: \"6954864a-d445-46ef-8b09-dcb0543b8b23\") " pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.436069 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.436378 36504 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.465865 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rctkw\" (UniqueName: \"kubernetes.io/projected/a046886b-4b70-4fb1-82ce-ef92db1677f7-kube-api-access-rctkw\") pod \"nova-cell0-e258-account-create-update-lf7cr\" (UID: \"a046886b-4b70-4fb1-82ce-ef92db1677f7\") " pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.469081 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl6vb\" (UniqueName: \"kubernetes.io/projected/6954864a-d445-46ef-8b09-dcb0543b8b23-kube-api-access-dl6vb\") pod \"nova-cell1-db-create-w6cn8\" (UID: \"6954864a-d445-46ef-8b09-dcb0543b8b23\") " pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.476897 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-baebb-default-internal-api-0"] Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.480012 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-baebb-default-internal-api-0" podUID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerName="glance-log" containerID="cri-o://c28adb8871b8d1df40a0ebc7d67cde64a957e8e8087902237524a1d002d90584" gracePeriod=30 Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.480411 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-baebb-default-internal-api-0" podUID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerName="glance-httpd" containerID="cri-o://41577981d3efce579ffaefd5b26dba879a677a4785d74d6355a99e12e822d046" gracePeriod=30 Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.520690 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-operator-scripts\") pod \"nova-cell1-3742-account-create-update-frr2z\" (UID: \"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\") " pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:38.548651 master-0 kubenswrapper[36504]: I1203 22:26:38.520906 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8b98\" (UniqueName: \"kubernetes.io/projected/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-kube-api-access-k8b98\") pod \"nova-cell1-3742-account-create-update-frr2z\" (UID: \"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\") " pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:38.650864 master-0 kubenswrapper[36504]: I1203 22:26:38.650799 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-operator-scripts\") pod \"nova-cell1-3742-account-create-update-frr2z\" (UID: \"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\") " pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:38.651204 master-0 kubenswrapper[36504]: I1203 22:26:38.651181 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8b98\" (UniqueName: \"kubernetes.io/projected/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-kube-api-access-k8b98\") pod \"nova-cell1-3742-account-create-update-frr2z\" (UID: 
\"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\") " pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:38.660162 master-0 kubenswrapper[36504]: E1203 22:26:38.660112 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8447ec5c_34fd_44ce_8bd8_2174ca072e6a.slice/crio-conmon-1665e2cd852155cf522a89f2d86a7767e3ee7e15ae5b99b3dd16367fc196584d.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:26:38.675160 master-0 kubenswrapper[36504]: I1203 22:26:38.675085 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-external-api-0" event={"ID":"dc35c451-c612-46b7-8328-9c8bfa7891e7","Type":"ContainerDied","Data":"11a26d88261e713a0b7c6cbe9e545c9bc81ce59c9cc3b35e5b72318e70b98f24"} Dec 03 22:26:38.675160 master-0 kubenswrapper[36504]: I1203 22:26:38.675174 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11a26d88261e713a0b7c6cbe9e545c9bc81ce59c9cc3b35e5b72318e70b98f24" Dec 03 22:26:38.676894 master-0 kubenswrapper[36504]: I1203 22:26:38.676833 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-64967779f-xr54t"] Dec 03 22:26:38.682745 master-0 kubenswrapper[36504]: I1203 22:26:38.682488 36504 generic.go:334] "Generic (PLEG): container finished" podID="28b645fa-2039-4145-9f32-a07ee0028539" containerID="91ce4107475da686fe17eff8003505fc7acc50ed25cebb28517ea88a4ab63767" exitCode=1 Dec 03 22:26:38.682745 master-0 kubenswrapper[36504]: I1203 22:26:38.682624 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-768fc575ff-n7992" event={"ID":"28b645fa-2039-4145-9f32-a07ee0028539","Type":"ContainerDied","Data":"91ce4107475da686fe17eff8003505fc7acc50ed25cebb28517ea88a4ab63767"} Dec 03 22:26:38.687148 master-0 kubenswrapper[36504]: I1203 22:26:38.687097 36504 generic.go:334] "Generic (PLEG): container finished" podID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerID="c28adb8871b8d1df40a0ebc7d67cde64a957e8e8087902237524a1d002d90584" exitCode=143 Dec 03 22:26:38.687367 master-0 kubenswrapper[36504]: I1203 22:26:38.687189 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-internal-api-0" event={"ID":"6cb8747f-1989-4028-b62a-5bf9ea57f609","Type":"ContainerDied","Data":"c28adb8871b8d1df40a0ebc7d67cde64a957e8e8087902237524a1d002d90584"} Dec 03 22:26:38.688471 master-0 kubenswrapper[36504]: I1203 22:26:38.688443 36504 scope.go:117] "RemoveContainer" containerID="91ce4107475da686fe17eff8003505fc7acc50ed25cebb28517ea88a4ab63767" Dec 03 22:26:38.691255 master-0 kubenswrapper[36504]: I1203 22:26:38.691211 36504 generic.go:334] "Generic (PLEG): container finished" podID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerID="1665e2cd852155cf522a89f2d86a7767e3ee7e15ae5b99b3dd16367fc196584d" exitCode=0 Dec 03 22:26:38.691474 master-0 kubenswrapper[36504]: I1203 22:26:38.691450 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-64967779f-xr54t" podUID="9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" containerName="heat-api" containerID="cri-o://979a6f05ddb40cd49bdfca416b102462f2afe9382b6f0e57dc84fd3f58196ff9" gracePeriod=60 Dec 03 22:26:38.692240 master-0 kubenswrapper[36504]: I1203 22:26:38.692178 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6clwd" 
event={"ID":"8447ec5c-34fd-44ce-8bd8-2174ca072e6a","Type":"ContainerDied","Data":"1665e2cd852155cf522a89f2d86a7767e3ee7e15ae5b99b3dd16367fc196584d"} Dec 03 22:26:38.698006 master-0 kubenswrapper[36504]: I1203 22:26:38.697917 36504 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-api-64967779f-xr54t" podUID="9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" containerName="heat-api" probeResult="failure" output="Get \"http://10.128.1.3:8004/healthcheck\": EOF" Dec 03 22:26:38.702797 master-0 kubenswrapper[36504]: I1203 22:26:38.700935 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-operator-scripts\") pod \"nova-cell1-3742-account-create-update-frr2z\" (UID: \"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\") " pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:38.726613 master-0 kubenswrapper[36504]: I1203 22:26:38.726510 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-64cdb44f7-mvlf8"] Dec 03 22:26:38.740937 master-0 kubenswrapper[36504]: I1203 22:26:38.740845 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-578bc969b7-l5g64"] Dec 03 22:26:38.744472 master-0 kubenswrapper[36504]: I1203 22:26:38.744421 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:38.747073 master-0 kubenswrapper[36504]: I1203 22:26:38.747024 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 03 22:26:38.747874 master-0 kubenswrapper[36504]: I1203 22:26:38.747474 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 03 22:26:38.756313 master-0 kubenswrapper[36504]: I1203 22:26:38.756197 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-687b5cdd65-qsjb4"] Dec 03 22:26:38.786941 master-0 kubenswrapper[36504]: I1203 22:26:38.785614 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:38.790855 master-0 kubenswrapper[36504]: I1203 22:26:38.790802 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 03 22:26:38.802628 master-0 kubenswrapper[36504]: I1203 22:26:38.797012 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 03 22:26:38.804231 master-0 kubenswrapper[36504]: I1203 22:26:38.804157 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-578bc969b7-l5g64"] Dec 03 22:26:38.873752 master-0 kubenswrapper[36504]: I1203 22:26:38.873698 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-687b5cdd65-qsjb4"] Dec 03 22:26:38.876331 master-0 kubenswrapper[36504]: I1203 22:26:38.875911 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk8cc\" (UniqueName: \"kubernetes.io/projected/ebe65521-d7e1-4619-839d-a36739b7d286-kube-api-access-hk8cc\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:38.876603 master-0 kubenswrapper[36504]: I1203 22:26:38.876579 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqp8x\" (UniqueName: \"kubernetes.io/projected/420506d7-d2f5-44f2-8124-c8a488cf2ee7-kube-api-access-sqp8x\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:38.876853 master-0 kubenswrapper[36504]: I1203 22:26:38.876216 36504 scope.go:117] "RemoveContainer" containerID="14c9e6117142ac738f00be96830ac4be84aafafa5ba4ee95ab41ba1647abae62" Dec 03 22:26:38.877096 master-0 kubenswrapper[36504]: I1203 22:26:38.876836 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-combined-ca-bundle\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:38.877266 master-0 kubenswrapper[36504]: I1203 22:26:38.877246 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-config-data-custom\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:38.877708 master-0 kubenswrapper[36504]: I1203 22:26:38.877683 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-public-tls-certs\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:38.884713 master-0 kubenswrapper[36504]: I1203 22:26:38.884648 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-config-data\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 
22:26:38.890528 master-0 kubenswrapper[36504]: I1203 22:26:38.890478 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-internal-tls-certs\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:38.891075 master-0 kubenswrapper[36504]: I1203 22:26:38.891051 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-internal-tls-certs\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:38.891242 master-0 kubenswrapper[36504]: I1203 22:26:38.891225 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-config-data\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:38.891393 master-0 kubenswrapper[36504]: I1203 22:26:38.891380 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-public-tls-certs\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:38.891628 master-0 kubenswrapper[36504]: I1203 22:26:38.891607 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-combined-ca-bundle\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:38.892432 master-0 kubenswrapper[36504]: I1203 22:26:38.892318 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-config-data-custom\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:38.918621 master-0 kubenswrapper[36504]: I1203 22:26:38.918555 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:38.925701 master-0 kubenswrapper[36504]: I1203 22:26:38.925644 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8b98\" (UniqueName: \"kubernetes.io/projected/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-kube-api-access-k8b98\") pod \"nova-cell1-3742-account-create-update-frr2z\" (UID: \"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\") " pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:38.949990 master-0 kubenswrapper[36504]: I1203 22:26:38.941312 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-whrpw"] Dec 03 22:26:38.997995 master-0 kubenswrapper[36504]: I1203 22:26:38.997902 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") pod \"dc35c451-c612-46b7-8328-9c8bfa7891e7\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " Dec 03 22:26:38.998486 master-0 kubenswrapper[36504]: I1203 22:26:38.998463 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-config-data\") pod \"dc35c451-c612-46b7-8328-9c8bfa7891e7\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " Dec 03 22:26:38.998663 master-0 kubenswrapper[36504]: I1203 22:26:38.998646 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-httpd-run\") pod \"dc35c451-c612-46b7-8328-9c8bfa7891e7\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " Dec 03 22:26:38.999245 master-0 kubenswrapper[36504]: I1203 22:26:38.999228 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-public-tls-certs\") pod \"dc35c451-c612-46b7-8328-9c8bfa7891e7\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " Dec 03 22:26:38.999854 master-0 kubenswrapper[36504]: I1203 22:26:38.999834 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-scripts\") pod \"dc35c451-c612-46b7-8328-9c8bfa7891e7\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " Dec 03 22:26:39.000009 master-0 kubenswrapper[36504]: I1203 22:26:38.999995 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-logs\") pod \"dc35c451-c612-46b7-8328-9c8bfa7891e7\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " Dec 03 22:26:39.000106 master-0 kubenswrapper[36504]: I1203 22:26:39.000088 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-combined-ca-bundle\") pod \"dc35c451-c612-46b7-8328-9c8bfa7891e7\" (UID: \"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " Dec 03 22:26:39.000223 master-0 kubenswrapper[36504]: I1203 22:26:39.000209 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv2hz\" (UniqueName: \"kubernetes.io/projected/dc35c451-c612-46b7-8328-9c8bfa7891e7-kube-api-access-hv2hz\") pod \"dc35c451-c612-46b7-8328-9c8bfa7891e7\" (UID: 
\"dc35c451-c612-46b7-8328-9c8bfa7891e7\") " Dec 03 22:26:39.007996 master-0 kubenswrapper[36504]: I1203 22:26:38.999140 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dc35c451-c612-46b7-8328-9c8bfa7891e7" (UID: "dc35c451-c612-46b7-8328-9c8bfa7891e7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:39.007996 master-0 kubenswrapper[36504]: I1203 22:26:39.002537 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-logs" (OuterVolumeSpecName: "logs") pod "dc35c451-c612-46b7-8328-9c8bfa7891e7" (UID: "dc35c451-c612-46b7-8328-9c8bfa7891e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:39.014967 master-0 kubenswrapper[36504]: I1203 22:26:39.014916 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk8cc\" (UniqueName: \"kubernetes.io/projected/ebe65521-d7e1-4619-839d-a36739b7d286-kube-api-access-hk8cc\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.015343 master-0 kubenswrapper[36504]: I1203 22:26:39.015294 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqp8x\" (UniqueName: \"kubernetes.io/projected/420506d7-d2f5-44f2-8124-c8a488cf2ee7-kube-api-access-sqp8x\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.015584 master-0 kubenswrapper[36504]: I1203 22:26:39.015552 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-combined-ca-bundle\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.015688 master-0 kubenswrapper[36504]: I1203 22:26:39.015642 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-config-data-custom\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.015898 master-0 kubenswrapper[36504]: I1203 22:26:39.015871 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-public-tls-certs\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.016166 master-0 kubenswrapper[36504]: I1203 22:26:39.016124 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-config-data\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.017572 master-0 kubenswrapper[36504]: I1203 22:26:39.016511 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-internal-tls-certs\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.018103 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-scripts" (OuterVolumeSpecName: "scripts") pod "dc35c451-c612-46b7-8328-9c8bfa7891e7" (UID: "dc35c451-c612-46b7-8328-9c8bfa7891e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.018481 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-internal-tls-certs\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.018593 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-config-data\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.018675 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-public-tls-certs\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.018859 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-combined-ca-bundle\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.018924 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-config-data-custom\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.019407 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.019431 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.019445 36504 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc35c451-c612-46b7-8328-9c8bfa7891e7-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.019976 36504 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc35c451-c612-46b7-8328-9c8bfa7891e7-kube-api-access-hv2hz" (OuterVolumeSpecName: "kube-api-access-hv2hz") pod "dc35c451-c612-46b7-8328-9c8bfa7891e7" (UID: "dc35c451-c612-46b7-8328-9c8bfa7891e7"). InnerVolumeSpecName "kube-api-access-hv2hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:39.035016 master-0 kubenswrapper[36504]: I1203 22:26:39.033237 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1" (OuterVolumeSpecName: "glance") pod "dc35c451-c612-46b7-8328-9c8bfa7891e7" (UID: "dc35c451-c612-46b7-8328-9c8bfa7891e7"). InnerVolumeSpecName "pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:26:39.065958 master-0 kubenswrapper[36504]: I1203 22:26:39.065754 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-internal-tls-certs\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.067965 master-0 kubenswrapper[36504]: I1203 22:26:39.067901 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-combined-ca-bundle\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.072087 master-0 kubenswrapper[36504]: I1203 22:26:39.072027 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-config-data-custom\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.080646 master-0 kubenswrapper[36504]: I1203 22:26:39.075904 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqp8x\" (UniqueName: \"kubernetes.io/projected/420506d7-d2f5-44f2-8124-c8a488cf2ee7-kube-api-access-sqp8x\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.080646 master-0 kubenswrapper[36504]: I1203 22:26:39.076215 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-config-data\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.082059 master-0 kubenswrapper[36504]: I1203 22:26:39.081983 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5759b6fcdf-jh9hb"] Dec 03 22:26:39.082193 master-0 kubenswrapper[36504]: I1203 22:26:39.082065 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-combined-ca-bundle\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.102456 master-0 kubenswrapper[36504]: I1203 22:26:39.101893 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-public-tls-certs\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.116692 master-0 kubenswrapper[36504]: I1203 22:26:39.116628 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-config-data-custom\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.119945 master-0 kubenswrapper[36504]: I1203 22:26:39.119898 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-internal-tls-certs\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.121201 master-0 kubenswrapper[36504]: I1203 22:26:39.121153 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebe65521-d7e1-4619-839d-a36739b7d286-public-tls-certs\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.123076 master-0 kubenswrapper[36504]: I1203 22:26:39.123040 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk8cc\" (UniqueName: \"kubernetes.io/projected/ebe65521-d7e1-4619-839d-a36739b7d286-kube-api-access-hk8cc\") pod \"heat-cfnapi-687b5cdd65-qsjb4\" (UID: \"ebe65521-d7e1-4619-839d-a36739b7d286\") " pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.124226 master-0 kubenswrapper[36504]: I1203 22:26:39.124197 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv2hz\" (UniqueName: \"kubernetes.io/projected/dc35c451-c612-46b7-8328-9c8bfa7891e7-kube-api-access-hv2hz\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:39.124368 master-0 kubenswrapper[36504]: I1203 22:26:39.124349 36504 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") on node \"master-0\" " Dec 03 22:26:39.173158 master-0 kubenswrapper[36504]: I1203 22:26:39.170812 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420506d7-d2f5-44f2-8124-c8a488cf2ee7-config-data\") pod \"heat-api-578bc969b7-l5g64\" (UID: \"420506d7-d2f5-44f2-8124-c8a488cf2ee7\") " pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.228821 master-0 kubenswrapper[36504]: I1203 22:26:39.226544 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5759b6fcdf-jh9hb"] Dec 03 22:26:39.238068 master-0 kubenswrapper[36504]: I1203 22:26:39.238006 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:39.238360 master-0 kubenswrapper[36504]: I1203 22:26:39.238341 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:39.301200 master-0 kubenswrapper[36504]: I1203 22:26:39.299296 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:39.319418 master-0 kubenswrapper[36504]: I1203 22:26:39.318496 36504 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 22:26:39.319418 master-0 kubenswrapper[36504]: I1203 22:26:39.318744 36504 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c" (UniqueName: "kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1") on node "master-0" Dec 03 22:26:39.332602 master-0 kubenswrapper[36504]: I1203 22:26:39.332525 36504 reconciler_common.go:293] "Volume detached for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:39.338490 master-0 kubenswrapper[36504]: I1203 22:26:39.338443 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:39.341762 master-0 kubenswrapper[36504]: I1203 22:26:39.341679 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc35c451-c612-46b7-8328-9c8bfa7891e7" (UID: "dc35c451-c612-46b7-8328-9c8bfa7891e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:39.352291 master-0 kubenswrapper[36504]: I1203 22:26:39.352219 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc35c451-c612-46b7-8328-9c8bfa7891e7" (UID: "dc35c451-c612-46b7-8328-9c8bfa7891e7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:39.352291 master-0 kubenswrapper[36504]: I1203 22:26:39.352242 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:39.395609 master-0 kubenswrapper[36504]: I1203 22:26:39.395542 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:39.401318 master-0 kubenswrapper[36504]: I1203 22:26:39.401216 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:39.409882 master-0 kubenswrapper[36504]: I1203 22:26:39.409818 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:39.416561 master-0 kubenswrapper[36504]: I1203 22:26:39.416480 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-config-data" (OuterVolumeSpecName: "config-data") pod "dc35c451-c612-46b7-8328-9c8bfa7891e7" (UID: "dc35c451-c612-46b7-8328-9c8bfa7891e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:39.439639 master-0 kubenswrapper[36504]: I1203 22:26:39.439578 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:39.441936 master-0 kubenswrapper[36504]: I1203 22:26:39.441918 36504 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:39.442163 master-0 kubenswrapper[36504]: I1203 22:26:39.442149 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc35c451-c612-46b7-8328-9c8bfa7891e7-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:39.466247 master-0 kubenswrapper[36504]: I1203 22:26:39.466073 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:39.531966 master-0 kubenswrapper[36504]: I1203 22:26:39.531263 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:39.564230 master-0 kubenswrapper[36504]: I1203 22:26:39.563794 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:39.564230 master-0 kubenswrapper[36504]: I1203 22:26:39.563899 36504 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:39.682741 master-0 kubenswrapper[36504]: I1203 22:26:39.682487 36504 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:39.682741 master-0 kubenswrapper[36504]: I1203 22:26:39.682577 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:39.812798 master-0 kubenswrapper[36504]: I1203 22:26:39.810898 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-768fc575ff-n7992" event={"ID":"28b645fa-2039-4145-9f32-a07ee0028539","Type":"ContainerStarted","Data":"d6fb0970510291553b13e0b3bfb37785787e94130a821f1b646819ee43863437"} Dec 03 22:26:39.813022 master-0 kubenswrapper[36504]: I1203 22:26:39.812805 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:39.851559 master-0 kubenswrapper[36504]: I1203 22:26:39.846932 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerStarted","Data":"f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29"} Dec 03 22:26:39.866380 master-0 kubenswrapper[36504]: I1203 22:26:39.865176 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-whrpw" event={"ID":"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1","Type":"ContainerStarted","Data":"201f5fbe76e0dc298839f7cbdb657b234ec60e96e7371f4dc1c1dcd52f284159"} Dec 03 22:26:39.912131 master-0 kubenswrapper[36504]: I1203 22:26:39.909162 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" 
event={"ID":"8423bf36-7022-4d07-b528-2deb47088921","Type":"ContainerStarted","Data":"2b17fd0d1c421e388e02f8c3354dbe7cb58c4913def3bcf0bb7e542d0ff8bcc5"} Dec 03 22:26:39.912131 master-0 kubenswrapper[36504]: I1203 22:26:39.911124 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" podUID="8423bf36-7022-4d07-b528-2deb47088921" containerName="heat-cfnapi" containerID="cri-o://2b17fd0d1c421e388e02f8c3354dbe7cb58c4913def3bcf0bb7e542d0ff8bcc5" gracePeriod=60 Dec 03 22:26:39.912131 master-0 kubenswrapper[36504]: I1203 22:26:39.911225 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:39.912487 master-0 kubenswrapper[36504]: I1203 22:26:39.912454 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:39.996883 master-0 kubenswrapper[36504]: I1203 22:26:39.979853 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-768fc575ff-n7992" podStartSLOduration=6.979819522 podStartE2EDuration="6.979819522s" podCreationTimestamp="2025-12-03 22:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:39.854175373 +0000 UTC m=+965.073947390" watchObservedRunningTime="2025-12-03 22:26:39.979819522 +0000 UTC m=+965.199591529" Dec 03 22:26:39.999889 master-0 kubenswrapper[36504]: I1203 22:26:39.998303 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:40.115908 master-0 kubenswrapper[36504]: I1203 22:26:40.112612 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" podStartSLOduration=7.112558054 podStartE2EDuration="7.112558054s" podCreationTimestamp="2025-12-03 22:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:40.081548189 +0000 UTC m=+965.301320196" watchObservedRunningTime="2025-12-03 22:26:40.112558054 +0000 UTC m=+965.332330081" Dec 03 22:26:40.999980 master-0 kubenswrapper[36504]: I1203 22:26:40.999758 36504 generic.go:334] "Generic (PLEG): container finished" podID="28b645fa-2039-4145-9f32-a07ee0028539" containerID="d6fb0970510291553b13e0b3bfb37785787e94130a821f1b646819ee43863437" exitCode=1 Dec 03 22:26:40.999980 master-0 kubenswrapper[36504]: I1203 22:26:40.999860 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-768fc575ff-n7992" event={"ID":"28b645fa-2039-4145-9f32-a07ee0028539","Type":"ContainerDied","Data":"d6fb0970510291553b13e0b3bfb37785787e94130a821f1b646819ee43863437"} Dec 03 22:26:40.999980 master-0 kubenswrapper[36504]: I1203 22:26:40.999918 36504 scope.go:117] "RemoveContainer" containerID="91ce4107475da686fe17eff8003505fc7acc50ed25cebb28517ea88a4ab63767" Dec 03 22:26:41.001186 master-0 kubenswrapper[36504]: I1203 22:26:41.001161 36504 scope.go:117] "RemoveContainer" containerID="d6fb0970510291553b13e0b3bfb37785787e94130a821f1b646819ee43863437" Dec 03 22:26:41.001653 master-0 kubenswrapper[36504]: E1203 22:26:41.001571 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api 
pod=heat-api-768fc575ff-n7992_openstack(28b645fa-2039-4145-9f32-a07ee0028539)\"" pod="openstack/heat-api-768fc575ff-n7992" podUID="28b645fa-2039-4145-9f32-a07ee0028539" Dec 03 22:26:41.007760 master-0 kubenswrapper[36504]: I1203 22:26:41.007608 36504 generic.go:334] "Generic (PLEG): container finished" podID="8423bf36-7022-4d07-b528-2deb47088921" containerID="2b17fd0d1c421e388e02f8c3354dbe7cb58c4913def3bcf0bb7e542d0ff8bcc5" exitCode=1 Dec 03 22:26:41.010587 master-0 kubenswrapper[36504]: I1203 22:26:41.010561 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" event={"ID":"8423bf36-7022-4d07-b528-2deb47088921","Type":"ContainerDied","Data":"2b17fd0d1c421e388e02f8c3354dbe7cb58c4913def3bcf0bb7e542d0ff8bcc5"} Dec 03 22:26:41.140822 master-0 kubenswrapper[36504]: I1203 22:26:41.140701 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bf8267-d5ba-4e78-b1aa-d73517d74730" path="/var/lib/kubelet/pods/d7bf8267-d5ba-4e78-b1aa-d73517d74730/volumes" Dec 03 22:26:41.645811 master-0 kubenswrapper[36504]: I1203 22:26:41.645364 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3742-account-create-update-frr2z"] Dec 03 22:26:41.660463 master-0 kubenswrapper[36504]: I1203 22:26:41.660246 36504 scope.go:117] "RemoveContainer" containerID="aa7367ca326fb2ef55a6f82b1af50d71950c398c99b930c2ee66bff2d631d4ca" Dec 03 22:26:41.788113 master-0 kubenswrapper[36504]: I1203 22:26:41.774520 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-baebb-default-external-api-0"] Dec 03 22:26:41.788113 master-0 kubenswrapper[36504]: I1203 22:26:41.774634 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-578bc969b7-l5g64"] Dec 03 22:26:41.795741 master-0 kubenswrapper[36504]: I1203 22:26:41.795659 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-3ad6-account-create-update-gj27c"] Dec 03 22:26:41.797945 master-0 kubenswrapper[36504]: W1203 22:26:41.796241 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b698a34_7bd9_4125_ad5c_885db2cf4959.slice/crio-38abee6b9ea8aa55e492adbfb90adbc8592471c8318cef5b13da366b2557da56 WatchSource:0}: Error finding container 38abee6b9ea8aa55e492adbfb90adbc8592471c8318cef5b13da366b2557da56: Status 404 returned error can't find the container with id 38abee6b9ea8aa55e492adbfb90adbc8592471c8318cef5b13da366b2557da56 Dec 03 22:26:41.808875 master-0 kubenswrapper[36504]: I1203 22:26:41.807665 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-w6cn8"] Dec 03 22:26:41.824435 master-0 kubenswrapper[36504]: I1203 22:26:41.824257 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e258-account-create-update-lf7cr"] Dec 03 22:26:41.859118 master-0 kubenswrapper[36504]: I1203 22:26:41.859032 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-baebb-default-external-api-0"] Dec 03 22:26:41.898081 master-0 kubenswrapper[36504]: I1203 22:26:41.884272 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-c69rr"] Dec 03 22:26:41.904344 master-0 kubenswrapper[36504]: I1203 22:26:41.904283 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-687b5cdd65-qsjb4"] Dec 03 22:26:41.957810 master-0 kubenswrapper[36504]: I1203 22:26:41.940505 36504 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/ceilometer-0"] Dec 03 22:26:41.962809 master-0 kubenswrapper[36504]: I1203 22:26:41.960922 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48g2l"] Dec 03 22:26:41.992944 master-0 kubenswrapper[36504]: I1203 22:26:41.992853 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-baebb-default-external-api-0"] Dec 03 22:26:41.994080 master-0 kubenswrapper[36504]: E1203 22:26:41.994048 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerName="glance-log" Dec 03 22:26:41.994156 master-0 kubenswrapper[36504]: I1203 22:26:41.994083 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerName="glance-log" Dec 03 22:26:41.994156 master-0 kubenswrapper[36504]: E1203 22:26:41.994111 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerName="glance-httpd" Dec 03 22:26:41.994156 master-0 kubenswrapper[36504]: I1203 22:26:41.994125 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerName="glance-httpd" Dec 03 22:26:41.994559 master-0 kubenswrapper[36504]: I1203 22:26:41.994529 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerName="glance-log" Dec 03 22:26:41.994655 master-0 kubenswrapper[36504]: I1203 22:26:41.994635 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc35c451-c612-46b7-8328-9c8bfa7891e7" containerName="glance-httpd" Dec 03 22:26:41.996640 master-0 kubenswrapper[36504]: I1203 22:26:41.996606 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.010145 master-0 kubenswrapper[36504]: I1203 22:26:42.005403 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-baebb-default-external-config-data" Dec 03 22:26:42.010145 master-0 kubenswrapper[36504]: I1203 22:26:42.005544 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 22:26:42.055219 master-0 kubenswrapper[36504]: I1203 22:26:42.053128 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-baebb-default-external-api-0"] Dec 03 22:26:42.096608 master-0 kubenswrapper[36504]: I1203 22:26:42.094530 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6clwd" event={"ID":"8447ec5c-34fd-44ce-8bd8-2174ca072e6a","Type":"ContainerStarted","Data":"82b6b4d2789b2a777ecf613850368bf695b90d7a90bec3bf2d1c73de6c234082"} Dec 03 22:26:42.097006 master-0 kubenswrapper[36504]: I1203 22:26:42.096908 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w6cn8" event={"ID":"6954864a-d445-46ef-8b09-dcb0543b8b23","Type":"ContainerStarted","Data":"3592e458010f28657f8ee972756f9459af5ba631b6d743c08e6babb385cb1c88"} Dec 03 22:26:42.107391 master-0 kubenswrapper[36504]: I1203 22:26:42.107273 36504 scope.go:117] "RemoveContainer" containerID="d6fb0970510291553b13e0b3bfb37785787e94130a821f1b646819ee43863437" Dec 03 22:26:42.108200 master-0 kubenswrapper[36504]: E1203 22:26:42.107617 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-768fc575ff-n7992_openstack(28b645fa-2039-4145-9f32-a07ee0028539)\"" pod="openstack/heat-api-768fc575ff-n7992" podUID="28b645fa-2039-4145-9f32-a07ee0028539" Dec 03 22:26:42.110979 master-0 kubenswrapper[36504]: I1203 22:26:42.110915 36504 generic.go:334] "Generic (PLEG): container finished" podID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerID="41577981d3efce579ffaefd5b26dba879a677a4785d74d6355a99e12e822d046" exitCode=0 Dec 03 22:26:42.111074 master-0 kubenswrapper[36504]: I1203 22:26:42.111023 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-internal-api-0" event={"ID":"6cb8747f-1989-4028-b62a-5bf9ea57f609","Type":"ContainerDied","Data":"41577981d3efce579ffaefd5b26dba879a677a4785d74d6355a99e12e822d046"} Dec 03 22:26:42.113924 master-0 kubenswrapper[36504]: I1203 22:26:42.113846 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-578bc969b7-l5g64" event={"ID":"420506d7-d2f5-44f2-8124-c8a488cf2ee7","Type":"ContainerStarted","Data":"73a394ccc7d4ce5eded0f3d03ae1f92c2a902b07b51a328d95ea6f04f4db03a4"} Dec 03 22:26:42.121271 master-0 kubenswrapper[36504]: I1203 22:26:42.120977 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ad6-account-create-update-gj27c" event={"ID":"1b698a34-7bd9-4125-ad5c-885db2cf4959","Type":"ContainerStarted","Data":"38abee6b9ea8aa55e492adbfb90adbc8592471c8318cef5b13da366b2557da56"} Dec 03 22:26:42.133899 master-0 kubenswrapper[36504]: I1203 22:26:42.133842 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" event={"ID":"ebe65521-d7e1-4619-839d-a36739b7d286","Type":"ContainerStarted","Data":"97dd1fff042d3b2fb64444d6fa1de97ad91a0b80232b48fa746f55db9db3a139"} Dec 03 
22:26:42.134923 master-0 kubenswrapper[36504]: I1203 22:26:42.134362 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6clwd" podStartSLOduration=4.478426844 podStartE2EDuration="9.134333627s" podCreationTimestamp="2025-12-03 22:26:33 +0000 UTC" firstStartedPulling="2025-12-03 22:26:35.237881865 +0000 UTC m=+960.457653872" lastFinishedPulling="2025-12-03 22:26:39.893788648 +0000 UTC m=+965.113560655" observedRunningTime="2025-12-03 22:26:42.119077208 +0000 UTC m=+967.338849225" watchObservedRunningTime="2025-12-03 22:26:42.134333627 +0000 UTC m=+967.354105634" Dec 03 22:26:42.137643 master-0 kubenswrapper[36504]: I1203 22:26:42.137572 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e258-account-create-update-lf7cr" event={"ID":"a046886b-4b70-4fb1-82ce-ef92db1677f7","Type":"ContainerStarted","Data":"fd288fbe5e903bd90133b39ddc9e5e6eae57685afd63a13075c36e258ce57d15"} Dec 03 22:26:42.138753 master-0 kubenswrapper[36504]: I1203 22:26:42.138672 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3742-account-create-update-frr2z" event={"ID":"ad30900d-dcdc-49cb-ab6e-601c1d18a77c","Type":"ContainerStarted","Data":"1965ec6d0163d4d84d6411b3238e5708fb91a265594a72017fbc23c8ef749b70"} Dec 03 22:26:42.178334 master-0 kubenswrapper[36504]: I1203 22:26:42.172276 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-whrpw" event={"ID":"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1","Type":"ContainerStarted","Data":"a4978bc2541d6eea4f37344e4ff6e83411dd9e7f06b5a532e0f86331d9b5593b"} Dec 03 22:26:42.178334 master-0 kubenswrapper[36504]: I1203 22:26:42.174127 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4ml5\" (UniqueName: \"kubernetes.io/projected/17576109-b684-4aa1-8895-6283460be452-kube-api-access-f4ml5\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.178334 master-0 kubenswrapper[36504]: I1203 22:26:42.174435 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.178334 master-0 kubenswrapper[36504]: I1203 22:26:42.174495 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-public-tls-certs\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.178334 master-0 kubenswrapper[36504]: I1203 22:26:42.174539 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-combined-ca-bundle\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.178334 master-0 kubenswrapper[36504]: I1203 22:26:42.174577 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17576109-b684-4aa1-8895-6283460be452-httpd-run\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.178334 master-0 kubenswrapper[36504]: I1203 22:26:42.174730 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17576109-b684-4aa1-8895-6283460be452-logs\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.178334 master-0 kubenswrapper[36504]: I1203 22:26:42.174919 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-config-data\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.181820 master-0 kubenswrapper[36504]: I1203 22:26:42.179161 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-scripts\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.182826 master-0 kubenswrapper[36504]: I1203 22:26:42.182705 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-48g2l" podUID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerName="registry-server" containerID="cri-o://71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b" gracePeriod=2 Dec 03 22:26:42.183069 master-0 kubenswrapper[36504]: I1203 22:26:42.182982 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c69rr" event={"ID":"6193a696-93bd-4303-82fd-bf39ce403d80","Type":"ContainerStarted","Data":"1c1040767e42cf973894d98b7797f8cf62e8c91ba4884517113ca34b32a7124f"} Dec 03 22:26:42.283532 master-0 kubenswrapper[36504]: I1203 22:26:42.282373 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-config-data\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.283532 master-0 kubenswrapper[36504]: I1203 22:26:42.282600 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-scripts\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.283532 master-0 kubenswrapper[36504]: I1203 22:26:42.282645 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4ml5\" (UniqueName: \"kubernetes.io/projected/17576109-b684-4aa1-8895-6283460be452-kube-api-access-f4ml5\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.283532 
master-0 kubenswrapper[36504]: I1203 22:26:42.282746 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.283532 master-0 kubenswrapper[36504]: I1203 22:26:42.282793 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-public-tls-certs\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.283532 master-0 kubenswrapper[36504]: I1203 22:26:42.282819 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-combined-ca-bundle\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.283532 master-0 kubenswrapper[36504]: I1203 22:26:42.282860 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17576109-b684-4aa1-8895-6283460be452-httpd-run\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.283532 master-0 kubenswrapper[36504]: I1203 22:26:42.282922 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17576109-b684-4aa1-8895-6283460be452-logs\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.286255 master-0 kubenswrapper[36504]: I1203 22:26:42.286171 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17576109-b684-4aa1-8895-6283460be452-logs\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.286560 master-0 kubenswrapper[36504]: I1203 22:26:42.286515 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/17576109-b684-4aa1-8895-6283460be452-httpd-run\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.297384 master-0 kubenswrapper[36504]: I1203 22:26:42.290510 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-combined-ca-bundle\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.297384 master-0 kubenswrapper[36504]: I1203 22:26:42.291112 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:26:42.297384 master-0 kubenswrapper[36504]: I1203 22:26:42.291174 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a7e57e719e859925dd772afc983e918084ef377c946533a5bd384e3451c0939d/globalmount\"" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.297384 master-0 kubenswrapper[36504]: I1203 22:26:42.291221 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-public-tls-certs\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.297384 master-0 kubenswrapper[36504]: I1203 22:26:42.293133 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-scripts\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.307149 master-0 kubenswrapper[36504]: I1203 22:26:42.307076 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17576109-b684-4aa1-8895-6283460be452-config-data\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.322993 master-0 kubenswrapper[36504]: I1203 22:26:42.322903 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4ml5\" (UniqueName: \"kubernetes.io/projected/17576109-b684-4aa1-8895-6283460be452-kube-api-access-f4ml5\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:42.543839 master-0 kubenswrapper[36504]: I1203 22:26:42.543064 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:42.698898 master-0 kubenswrapper[36504]: I1203 22:26:42.695835 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data-custom\") pod \"8423bf36-7022-4d07-b528-2deb47088921\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " Dec 03 22:26:42.698898 master-0 kubenswrapper[36504]: I1203 22:26:42.695972 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-combined-ca-bundle\") pod \"8423bf36-7022-4d07-b528-2deb47088921\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " Dec 03 22:26:42.698898 master-0 kubenswrapper[36504]: I1203 22:26:42.695997 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b95hb\" (UniqueName: \"kubernetes.io/projected/8423bf36-7022-4d07-b528-2deb47088921-kube-api-access-b95hb\") pod \"8423bf36-7022-4d07-b528-2deb47088921\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " Dec 03 22:26:42.698898 master-0 kubenswrapper[36504]: I1203 22:26:42.696420 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data\") pod \"8423bf36-7022-4d07-b528-2deb47088921\" (UID: \"8423bf36-7022-4d07-b528-2deb47088921\") " Dec 03 22:26:42.717957 master-0 kubenswrapper[36504]: I1203 22:26:42.717243 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8423bf36-7022-4d07-b528-2deb47088921" (UID: "8423bf36-7022-4d07-b528-2deb47088921"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:42.727302 master-0 kubenswrapper[36504]: I1203 22:26:42.726704 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8423bf36-7022-4d07-b528-2deb47088921-kube-api-access-b95hb" (OuterVolumeSpecName: "kube-api-access-b95hb") pod "8423bf36-7022-4d07-b528-2deb47088921" (UID: "8423bf36-7022-4d07-b528-2deb47088921"). InnerVolumeSpecName "kube-api-access-b95hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:42.755612 master-0 kubenswrapper[36504]: I1203 22:26:42.755512 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8423bf36-7022-4d07-b528-2deb47088921" (UID: "8423bf36-7022-4d07-b528-2deb47088921"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:42.802112 master-0 kubenswrapper[36504]: I1203 22:26:42.800579 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:42.802112 master-0 kubenswrapper[36504]: I1203 22:26:42.800638 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:42.802112 master-0 kubenswrapper[36504]: I1203 22:26:42.800656 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b95hb\" (UniqueName: \"kubernetes.io/projected/8423bf36-7022-4d07-b528-2deb47088921-kube-api-access-b95hb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:42.805892 master-0 kubenswrapper[36504]: I1203 22:26:42.805787 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data" (OuterVolumeSpecName: "config-data") pod "8423bf36-7022-4d07-b528-2deb47088921" (UID: "8423bf36-7022-4d07-b528-2deb47088921"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:42.905433 master-0 kubenswrapper[36504]: I1203 22:26:42.904682 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8423bf36-7022-4d07-b528-2deb47088921-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:43.132946 master-0 kubenswrapper[36504]: I1203 22:26:43.132892 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:43.205932 master-0 kubenswrapper[36504]: I1203 22:26:43.190012 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc35c451-c612-46b7-8328-9c8bfa7891e7" path="/var/lib/kubelet/pods/dc35c451-c612-46b7-8328-9c8bfa7891e7/volumes" Dec 03 22:26:43.238622 master-0 kubenswrapper[36504]: I1203 22:26:43.231093 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:43.238622 master-0 kubenswrapper[36504]: I1203 22:26:43.231843 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" event={"ID":"8423bf36-7022-4d07-b528-2deb47088921","Type":"ContainerDied","Data":"baeca8efd1a0fbae599d0ffad89e4d9090ab3bee9cc6fabd6abcd859272eb723"} Dec 03 22:26:43.238622 master-0 kubenswrapper[36504]: I1203 22:26:43.231917 36504 scope.go:117] "RemoveContainer" containerID="2b17fd0d1c421e388e02f8c3354dbe7cb58c4913def3bcf0bb7e542d0ff8bcc5" Dec 03 22:26:43.238622 master-0 kubenswrapper[36504]: I1203 22:26:43.232076 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-64cdb44f7-mvlf8" Dec 03 22:26:43.252361 master-0 kubenswrapper[36504]: I1203 22:26:43.250051 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-combined-ca-bundle\") pod \"6cb8747f-1989-4028-b62a-5bf9ea57f609\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " Dec 03 22:26:43.268738 master-0 kubenswrapper[36504]: I1203 22:26:43.268212 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-64967779f-xr54t" podUID="9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" containerName="heat-api" probeResult="failure" output="Get \"http://10.128.1.3:8004/healthcheck\": read tcp 10.128.0.2:60218->10.128.1.3:8004: read: connection reset by peer" Dec 03 22:26:43.287217 master-0 kubenswrapper[36504]: I1203 22:26:43.287144 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-internal-tls-certs\") pod \"6cb8747f-1989-4028-b62a-5bf9ea57f609\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " Dec 03 22:26:43.287520 master-0 kubenswrapper[36504]: I1203 22:26:43.287367 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-logs\") pod \"6cb8747f-1989-4028-b62a-5bf9ea57f609\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " Dec 03 22:26:43.287657 master-0 kubenswrapper[36504]: I1203 22:26:43.287633 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") pod \"6cb8747f-1989-4028-b62a-5bf9ea57f609\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " Dec 03 22:26:43.287793 master-0 kubenswrapper[36504]: I1203 22:26:43.287760 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-scripts\") pod \"6cb8747f-1989-4028-b62a-5bf9ea57f609\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " Dec 03 22:26:43.287987 master-0 kubenswrapper[36504]: I1203 22:26:43.287964 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-config-data\") pod \"6cb8747f-1989-4028-b62a-5bf9ea57f609\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " Dec 03 22:26:43.288122 master-0 kubenswrapper[36504]: I1203 22:26:43.288078 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-572qt\" (UniqueName: \"kubernetes.io/projected/6cb8747f-1989-4028-b62a-5bf9ea57f609-kube-api-access-572qt\") pod \"6cb8747f-1989-4028-b62a-5bf9ea57f609\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " Dec 03 22:26:43.288198 master-0 kubenswrapper[36504]: I1203 22:26:43.288137 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-httpd-run\") pod \"6cb8747f-1989-4028-b62a-5bf9ea57f609\" (UID: \"6cb8747f-1989-4028-b62a-5bf9ea57f609\") " Dec 03 22:26:43.288404 master-0 kubenswrapper[36504]: I1203 22:26:43.288344 36504 generic.go:334] "Generic (PLEG): container finished" podID="21102a00-2eed-4a6b-84e6-07403fbda79c" 
containerID="71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b" exitCode=0 Dec 03 22:26:43.290098 master-0 kubenswrapper[36504]: I1203 22:26:43.289207 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-logs" (OuterVolumeSpecName: "logs") pod "6cb8747f-1989-4028-b62a-5bf9ea57f609" (UID: "6cb8747f-1989-4028-b62a-5bf9ea57f609"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:43.290349 master-0 kubenswrapper[36504]: I1203 22:26:43.290328 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48g2l" Dec 03 22:26:43.291881 master-0 kubenswrapper[36504]: I1203 22:26:43.291527 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48g2l" event={"ID":"21102a00-2eed-4a6b-84e6-07403fbda79c","Type":"ContainerDied","Data":"71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b"} Dec 03 22:26:43.292043 master-0 kubenswrapper[36504]: I1203 22:26:43.292015 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48g2l" event={"ID":"21102a00-2eed-4a6b-84e6-07403fbda79c","Type":"ContainerDied","Data":"31492ae3c7c9dfbe4e71bc9dce8f250311c64a5c9f9117012ac25ee869ac15b5"} Dec 03 22:26:43.292157 master-0 kubenswrapper[36504]: I1203 22:26:43.292141 36504 scope.go:117] "RemoveContainer" containerID="71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b" Dec 03 22:26:43.292340 master-0 kubenswrapper[36504]: I1203 22:26:43.292167 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6cb8747f-1989-4028-b62a-5bf9ea57f609" (UID: "6cb8747f-1989-4028-b62a-5bf9ea57f609"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:43.294036 master-0 kubenswrapper[36504]: I1203 22:26:43.294010 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkfvv\" (UniqueName: \"kubernetes.io/projected/21102a00-2eed-4a6b-84e6-07403fbda79c-kube-api-access-vkfvv\") pod \"21102a00-2eed-4a6b-84e6-07403fbda79c\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " Dec 03 22:26:43.296239 master-0 kubenswrapper[36504]: I1203 22:26:43.296208 36504 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-httpd-run\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:43.296669 master-0 kubenswrapper[36504]: I1203 22:26:43.296653 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cb8747f-1989-4028-b62a-5bf9ea57f609-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:43.299539 master-0 kubenswrapper[36504]: I1203 22:26:43.299355 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-scripts" (OuterVolumeSpecName: "scripts") pod "6cb8747f-1989-4028-b62a-5bf9ea57f609" (UID: "6cb8747f-1989-4028-b62a-5bf9ea57f609"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:43.310971 master-0 kubenswrapper[36504]: I1203 22:26:43.310244 36504 generic.go:334] "Generic (PLEG): container finished" podID="76c228f8-3d7c-49c0-b63c-ce25dbe64ca1" containerID="a4978bc2541d6eea4f37344e4ff6e83411dd9e7f06b5a532e0f86331d9b5593b" exitCode=0 Dec 03 22:26:43.310971 master-0 kubenswrapper[36504]: I1203 22:26:43.310426 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-whrpw" event={"ID":"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1","Type":"ContainerDied","Data":"a4978bc2541d6eea4f37344e4ff6e83411dd9e7f06b5a532e0f86331d9b5593b"} Dec 03 22:26:43.326458 master-0 kubenswrapper[36504]: I1203 22:26:43.317007 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c69rr" event={"ID":"6193a696-93bd-4303-82fd-bf39ce403d80","Type":"ContainerStarted","Data":"f086d403457a9dff4d9e008bda0b8e7a8feda5e5f24a4bb4be068d364eaf654d"} Dec 03 22:26:43.326458 master-0 kubenswrapper[36504]: I1203 22:26:43.323926 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cb8747f-1989-4028-b62a-5bf9ea57f609-kube-api-access-572qt" (OuterVolumeSpecName: "kube-api-access-572qt") pod "6cb8747f-1989-4028-b62a-5bf9ea57f609" (UID: "6cb8747f-1989-4028-b62a-5bf9ea57f609"). InnerVolumeSpecName "kube-api-access-572qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:43.326458 master-0 kubenswrapper[36504]: I1203 22:26:43.324740 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:43.326458 master-0 kubenswrapper[36504]: I1203 22:26:43.325107 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-internal-api-0" event={"ID":"6cb8747f-1989-4028-b62a-5bf9ea57f609","Type":"ContainerDied","Data":"26e5f37107d13dae26d6bb143d727e2399393df0bf5e4ea3e311026466f6d9a9"} Dec 03 22:26:43.344094 master-0 kubenswrapper[36504]: I1203 22:26:43.343681 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21102a00-2eed-4a6b-84e6-07403fbda79c-kube-api-access-vkfvv" (OuterVolumeSpecName: "kube-api-access-vkfvv") pod "21102a00-2eed-4a6b-84e6-07403fbda79c" (UID: "21102a00-2eed-4a6b-84e6-07403fbda79c"). InnerVolumeSpecName "kube-api-access-vkfvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:43.404809 master-0 kubenswrapper[36504]: I1203 22:26:43.404714 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-catalog-content\") pod \"21102a00-2eed-4a6b-84e6-07403fbda79c\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " Dec 03 22:26:43.405140 master-0 kubenswrapper[36504]: I1203 22:26:43.404953 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-utilities\") pod \"21102a00-2eed-4a6b-84e6-07403fbda79c\" (UID: \"21102a00-2eed-4a6b-84e6-07403fbda79c\") " Dec 03 22:26:43.406235 master-0 kubenswrapper[36504]: I1203 22:26:43.406180 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:43.406235 master-0 kubenswrapper[36504]: I1203 22:26:43.406217 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-572qt\" (UniqueName: \"kubernetes.io/projected/6cb8747f-1989-4028-b62a-5bf9ea57f609-kube-api-access-572qt\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:43.406235 master-0 kubenswrapper[36504]: I1203 22:26:43.406236 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkfvv\" (UniqueName: \"kubernetes.io/projected/21102a00-2eed-4a6b-84e6-07403fbda79c-kube-api-access-vkfvv\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:43.409937 master-0 kubenswrapper[36504]: I1203 22:26:43.409875 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-utilities" (OuterVolumeSpecName: "utilities") pod "21102a00-2eed-4a6b-84e6-07403fbda79c" (UID: "21102a00-2eed-4a6b-84e6-07403fbda79c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:43.511205 master-0 kubenswrapper[36504]: I1203 22:26:43.511027 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:43.815761 master-0 kubenswrapper[36504]: I1203 22:26:43.815645 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:43.815761 master-0 kubenswrapper[36504]: I1203 22:26:43.815705 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:26:43.877912 master-0 kubenswrapper[36504]: I1203 22:26:43.870125 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21102a00-2eed-4a6b-84e6-07403fbda79c" (UID: "21102a00-2eed-4a6b-84e6-07403fbda79c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:26:43.936789 master-0 kubenswrapper[36504]: I1203 22:26:43.936724 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21102a00-2eed-4a6b-84e6-07403fbda79c-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:44.025251 master-0 kubenswrapper[36504]: I1203 22:26:44.025160 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cb8747f-1989-4028-b62a-5bf9ea57f609" (UID: "6cb8747f-1989-4028-b62a-5bf9ea57f609"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:44.025571 master-0 kubenswrapper[36504]: I1203 22:26:44.025455 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334" (OuterVolumeSpecName: "glance") pod "6cb8747f-1989-4028-b62a-5bf9ea57f609" (UID: "6cb8747f-1989-4028-b62a-5bf9ea57f609"). InnerVolumeSpecName "pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:26:44.047355 master-0 kubenswrapper[36504]: I1203 22:26:44.047267 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:44.047355 master-0 kubenswrapper[36504]: I1203 22:26:44.047370 36504 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") on node \"master-0\" " Dec 03 22:26:44.160920 master-0 kubenswrapper[36504]: I1203 22:26:44.160300 36504 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 22:26:44.160920 master-0 kubenswrapper[36504]: I1203 22:26:44.160545 36504 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465" (UniqueName: "kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334") on node "master-0" Dec 03 22:26:44.236164 master-0 kubenswrapper[36504]: I1203 22:26:44.235898 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-config-data" (OuterVolumeSpecName: "config-data") pod "6cb8747f-1989-4028-b62a-5bf9ea57f609" (UID: "6cb8747f-1989-4028-b62a-5bf9ea57f609"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:44.250107 master-0 kubenswrapper[36504]: I1203 22:26:44.249922 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6cb8747f-1989-4028-b62a-5bf9ea57f609" (UID: "6cb8747f-1989-4028-b62a-5bf9ea57f609"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:44.284007 master-0 kubenswrapper[36504]: I1203 22:26:44.283912 36504 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:44.284007 master-0 kubenswrapper[36504]: I1203 22:26:44.283986 36504 reconciler_common.go:293] "Volume detached for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:44.284007 master-0 kubenswrapper[36504]: I1203 22:26:44.284011 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cb8747f-1989-4028-b62a-5bf9ea57f609-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:44.300848 master-0 kubenswrapper[36504]: I1203 22:26:44.300489 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6fcbeb2a-5b44-4f40-a263-b35390af0e0c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^17133a71-d34d-426c-92cb-6e9f8b5523d1\") pod \"glance-baebb-default-external-api-0\" (UID: \"17576109-b684-4aa1-8895-6283460be452\") " pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:44.408710 master-0 kubenswrapper[36504]: I1203 22:26:44.406439 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ad6-account-create-update-gj27c" event={"ID":"1b698a34-7bd9-4125-ad5c-885db2cf4959","Type":"ContainerStarted","Data":"e3dc98d1f10f75204ba94845d8f275947fd6f668502cc1755ae53718f00e44e5"} Dec 03 22:26:44.413475 master-0 kubenswrapper[36504]: I1203 22:26:44.409262 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" event={"ID":"ebe65521-d7e1-4619-839d-a36739b7d286","Type":"ContainerStarted","Data":"2794647b8fa1ffeb21f216edd54536ba15998cf36a99c6e6a9f79f93a1e5f72d"} Dec 03 22:26:44.413475 master-0 kubenswrapper[36504]: I1203 22:26:44.411106 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:44.416673 master-0 kubenswrapper[36504]: I1203 22:26:44.416597 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e258-account-create-update-lf7cr" event={"ID":"a046886b-4b70-4fb1-82ce-ef92db1677f7","Type":"ContainerStarted","Data":"82aa7af98106cf32c5d8e139da882a326a2f55e112fee3d8177eecbc252ab61d"} Dec 03 22:26:44.429440 master-0 kubenswrapper[36504]: I1203 22:26:44.428062 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w6cn8" event={"ID":"6954864a-d445-46ef-8b09-dcb0543b8b23","Type":"ContainerStarted","Data":"556fdebdc51cb83ab8b96b22316114542b4ce68660f4fcb52774343ec06bac65"} Dec 03 22:26:44.435424 master-0 kubenswrapper[36504]: I1203 22:26:44.435331 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-whrpw" event={"ID":"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1","Type":"ContainerDied","Data":"201f5fbe76e0dc298839f7cbdb657b234ec60e96e7371f4dc1c1dcd52f284159"} Dec 03 22:26:44.435689 master-0 kubenswrapper[36504]: I1203 22:26:44.435416 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="201f5fbe76e0dc298839f7cbdb657b234ec60e96e7371f4dc1c1dcd52f284159" Dec 03 22:26:44.444145 master-0 kubenswrapper[36504]: I1203 22:26:44.444004 36504 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-3ad6-account-create-update-gj27c" podStartSLOduration=7.443976698 podStartE2EDuration="7.443976698s" podCreationTimestamp="2025-12-03 22:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:44.435731239 +0000 UTC m=+969.655503266" watchObservedRunningTime="2025-12-03 22:26:44.443976698 +0000 UTC m=+969.663748705" Dec 03 22:26:44.449221 master-0 kubenswrapper[36504]: I1203 22:26:44.449151 36504 generic.go:334] "Generic (PLEG): container finished" podID="6193a696-93bd-4303-82fd-bf39ce403d80" containerID="f086d403457a9dff4d9e008bda0b8e7a8feda5e5f24a4bb4be068d364eaf654d" exitCode=0 Dec 03 22:26:44.449493 master-0 kubenswrapper[36504]: I1203 22:26:44.449461 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c69rr" event={"ID":"6193a696-93bd-4303-82fd-bf39ce403d80","Type":"ContainerDied","Data":"f086d403457a9dff4d9e008bda0b8e7a8feda5e5f24a4bb4be068d364eaf654d"} Dec 03 22:26:44.472493 master-0 kubenswrapper[36504]: I1203 22:26:44.471728 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-e258-account-create-update-lf7cr" podStartSLOduration=7.4716995 podStartE2EDuration="7.4716995s" podCreationTimestamp="2025-12-03 22:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:44.470466861 +0000 UTC m=+969.690238868" watchObservedRunningTime="2025-12-03 22:26:44.4716995 +0000 UTC m=+969.691471507" Dec 03 22:26:44.475449 master-0 kubenswrapper[36504]: I1203 22:26:44.475388 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3742-account-create-update-frr2z" event={"ID":"ad30900d-dcdc-49cb-ab6e-601c1d18a77c","Type":"ContainerStarted","Data":"821cb11c51b954e4d57104a6195f336f4d6ac6d743668fa80deb00a7ba68d5cc"} Dec 03 22:26:44.476357 master-0 kubenswrapper[36504]: I1203 22:26:44.476321 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:44.485093 master-0 kubenswrapper[36504]: I1203 22:26:44.485017 36504 generic.go:334] "Generic (PLEG): container finished" podID="9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" containerID="979a6f05ddb40cd49bdfca416b102462f2afe9382b6f0e57dc84fd3f58196ff9" exitCode=0 Dec 03 22:26:44.485431 master-0 kubenswrapper[36504]: I1203 22:26:44.485146 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-64967779f-xr54t" event={"ID":"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8","Type":"ContainerDied","Data":"979a6f05ddb40cd49bdfca416b102462f2afe9382b6f0e57dc84fd3f58196ff9"} Dec 03 22:26:44.485431 master-0 kubenswrapper[36504]: I1203 22:26:44.485187 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-64967779f-xr54t" event={"ID":"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8","Type":"ContainerDied","Data":"9bcb666a695155355cf19d51ad022971318f6208d6a3d8da68d77e00ff62db2f"} Dec 03 22:26:44.485431 master-0 kubenswrapper[36504]: I1203 22:26:44.485215 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bcb666a695155355cf19d51ad022971318f6208d6a3d8da68d77e00ff62db2f" Dec 03 22:26:44.542981 master-0 kubenswrapper[36504]: I1203 22:26:44.534154 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-w6cn8" podStartSLOduration=7.534126442 podStartE2EDuration="7.534126442s" podCreationTimestamp="2025-12-03 22:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:44.492719071 +0000 UTC m=+969.712491078" watchObservedRunningTime="2025-12-03 22:26:44.534126442 +0000 UTC m=+969.753898449" Dec 03 22:26:44.567795 master-0 kubenswrapper[36504]: I1203 22:26:44.564977 36504 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:44.567795 master-0 kubenswrapper[36504]: I1203 22:26:44.566965 36504 scope.go:117] "RemoveContainer" containerID="d6fb0970510291553b13e0b3bfb37785787e94130a821f1b646819ee43863437" Dec 03 22:26:44.567795 master-0 kubenswrapper[36504]: E1203 22:26:44.567414 36504 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-768fc575ff-n7992_openstack(28b645fa-2039-4145-9f32-a07ee0028539)\"" pod="openstack/heat-api-768fc575ff-n7992" podUID="28b645fa-2039-4145-9f32-a07ee0028539" Dec 03 22:26:44.588027 master-0 kubenswrapper[36504]: I1203 22:26:44.587919 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" podStartSLOduration=6.587886382 podStartE2EDuration="6.587886382s" podCreationTimestamp="2025-12-03 22:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:44.54201167 +0000 UTC m=+969.761783677" watchObservedRunningTime="2025-12-03 22:26:44.587886382 +0000 UTC m=+969.807658389" Dec 03 22:26:44.593006 master-0 kubenswrapper[36504]: I1203 22:26:44.592901 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-3742-account-create-update-frr2z" podStartSLOduration=7.592869358 podStartE2EDuration="7.592869358s" podCreationTimestamp="2025-12-03 22:26:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:44.583473612 +0000 UTC m=+969.803245619" watchObservedRunningTime="2025-12-03 22:26:44.592869358 +0000 UTC m=+969.812641365" Dec 03 22:26:44.731800 master-0 kubenswrapper[36504]: I1203 22:26:44.730902 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:26:44.832342 master-0 kubenswrapper[36504]: I1203 22:26:44.832279 36504 scope.go:117] "RemoveContainer" containerID="268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe" Dec 03 22:26:44.846004 master-0 kubenswrapper[36504]: I1203 22:26:44.845953 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:44.880689 master-0 kubenswrapper[36504]: I1203 22:26:44.876571 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:44.902723 master-0 kubenswrapper[36504]: I1203 22:26:44.902128 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-64cdb44f7-mvlf8"] Dec 03 22:26:44.924843 master-0 kubenswrapper[36504]: I1203 22:26:44.924724 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqkkv\" (UniqueName: \"kubernetes.io/projected/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-kube-api-access-zqkkv\") pod \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " Dec 03 22:26:44.925461 master-0 kubenswrapper[36504]: I1203 22:26:44.925242 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-combined-ca-bundle\") pod \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " Dec 03 22:26:44.925461 master-0 kubenswrapper[36504]: I1203 22:26:44.925310 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjc79\" (UniqueName: \"kubernetes.io/projected/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-kube-api-access-zjc79\") pod \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\" (UID: \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\") " Dec 03 22:26:44.925461 master-0 kubenswrapper[36504]: I1203 22:26:44.925413 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-operator-scripts\") pod \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\" (UID: \"76c228f8-3d7c-49c0-b63c-ce25dbe64ca1\") " Dec 03 22:26:44.925808 master-0 kubenswrapper[36504]: I1203 22:26:44.925561 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data-custom\") pod \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " Dec 03 22:26:44.925808 master-0 kubenswrapper[36504]: I1203 22:26:44.925735 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data\") pod \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\" (UID: \"9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8\") " Dec 03 22:26:44.935801 master-0 kubenswrapper[36504]: I1203 22:26:44.928985 
36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "76c228f8-3d7c-49c0-b63c-ce25dbe64ca1" (UID: "76c228f8-3d7c-49c0-b63c-ce25dbe64ca1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:44.976837 master-0 kubenswrapper[36504]: I1203 22:26:44.969643 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-kube-api-access-zjc79" (OuterVolumeSpecName: "kube-api-access-zjc79") pod "76c228f8-3d7c-49c0-b63c-ce25dbe64ca1" (UID: "76c228f8-3d7c-49c0-b63c-ce25dbe64ca1"). InnerVolumeSpecName "kube-api-access-zjc79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:44.993040 master-0 kubenswrapper[36504]: I1203 22:26:44.991558 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-kube-api-access-zqkkv" (OuterVolumeSpecName: "kube-api-access-zqkkv") pod "9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" (UID: "9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8"). InnerVolumeSpecName "kube-api-access-zqkkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:45.007012 master-0 kubenswrapper[36504]: I1203 22:26:45.006903 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" (UID: "9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:45.016485 master-0 kubenswrapper[36504]: I1203 22:26:45.014861 36504 scope.go:117] "RemoveContainer" containerID="c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554" Dec 03 22:26:45.030338 master-0 kubenswrapper[36504]: I1203 22:26:45.030206 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-64cdb44f7-mvlf8"] Dec 03 22:26:45.033176 master-0 kubenswrapper[36504]: I1203 22:26:45.032438 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqkkv\" (UniqueName: \"kubernetes.io/projected/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-kube-api-access-zqkkv\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:45.033176 master-0 kubenswrapper[36504]: I1203 22:26:45.032499 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjc79\" (UniqueName: \"kubernetes.io/projected/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-kube-api-access-zjc79\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:45.033176 master-0 kubenswrapper[36504]: I1203 22:26:45.032515 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:45.033176 master-0 kubenswrapper[36504]: I1203 22:26:45.032530 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:45.076191 master-0 kubenswrapper[36504]: I1203 22:26:45.076094 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48g2l"] Dec 03 
22:26:45.109444 master-0 kubenswrapper[36504]: I1203 22:26:45.104077 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-48g2l"] Dec 03 22:26:45.111393 master-0 kubenswrapper[36504]: I1203 22:26:45.111254 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" (UID: "9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:45.136563 master-0 kubenswrapper[36504]: I1203 22:26:45.136496 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:45.160247 master-0 kubenswrapper[36504]: I1203 22:26:45.159507 36504 scope.go:117] "RemoveContainer" containerID="71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b" Dec 03 22:26:45.164116 master-0 kubenswrapper[36504]: E1203 22:26:45.160615 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b\": container with ID starting with 71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b not found: ID does not exist" containerID="71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b" Dec 03 22:26:45.164116 master-0 kubenswrapper[36504]: I1203 22:26:45.160708 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b"} err="failed to get container status \"71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b\": rpc error: code = NotFound desc = could not find container \"71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b\": container with ID starting with 71238a2ca797f8c9d3c3511a1fd149aab9acc9f3473bdf04bee5ff628e2a871b not found: ID does not exist" Dec 03 22:26:45.164116 master-0 kubenswrapper[36504]: I1203 22:26:45.160796 36504 scope.go:117] "RemoveContainer" containerID="268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe" Dec 03 22:26:45.164116 master-0 kubenswrapper[36504]: E1203 22:26:45.161329 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe\": container with ID starting with 268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe not found: ID does not exist" containerID="268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe" Dec 03 22:26:45.164116 master-0 kubenswrapper[36504]: I1203 22:26:45.161390 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe"} err="failed to get container status \"268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe\": rpc error: code = NotFound desc = could not find container \"268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe\": container with ID starting with 268a28978cccd0b65a1eb903ff5c4fc1ab4f2b18d8db4591c01d16cedd8d0efe not found: ID does not exist" Dec 03 22:26:45.164116 master-0 kubenswrapper[36504]: I1203 
22:26:45.161432 36504 scope.go:117] "RemoveContainer" containerID="c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554" Dec 03 22:26:45.164116 master-0 kubenswrapper[36504]: E1203 22:26:45.162323 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554\": container with ID starting with c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554 not found: ID does not exist" containerID="c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554" Dec 03 22:26:45.164116 master-0 kubenswrapper[36504]: I1203 22:26:45.162344 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554"} err="failed to get container status \"c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554\": rpc error: code = NotFound desc = could not find container \"c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554\": container with ID starting with c7d55926792f3d50507610f5a43f257bfda4bc6e749ff89f72075b84a5567554 not found: ID does not exist" Dec 03 22:26:45.164116 master-0 kubenswrapper[36504]: I1203 22:26:45.162358 36504 scope.go:117] "RemoveContainer" containerID="41577981d3efce579ffaefd5b26dba879a677a4785d74d6355a99e12e822d046" Dec 03 22:26:45.189618 master-0 kubenswrapper[36504]: I1203 22:26:45.188777 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:45.189618 master-0 kubenswrapper[36504]: I1203 22:26:45.189153 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21102a00-2eed-4a6b-84e6-07403fbda79c" path="/var/lib/kubelet/pods/21102a00-2eed-4a6b-84e6-07403fbda79c/volumes" Dec 03 22:26:45.202181 master-0 kubenswrapper[36504]: I1203 22:26:45.198148 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8423bf36-7022-4d07-b528-2deb47088921" path="/var/lib/kubelet/pods/8423bf36-7022-4d07-b528-2deb47088921/volumes" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.226144 36504 scope.go:117] "RemoveContainer" containerID="c28adb8871b8d1df40a0ebc7d67cde64a957e8e8087902237524a1d002d90584" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.228838 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-baebb-default-internal-api-0"] Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.228888 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-baebb-default-internal-api-0"] Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.228909 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-baebb-default-internal-api-0"] Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.229561 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerName="extract-content" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.229579 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerName="extract-content" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.229618 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerName="glance-log" Dec 03 22:26:45.235901 
master-0 kubenswrapper[36504]: I1203 22:26:45.229625 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerName="glance-log" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.229655 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerName="extract-utilities" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.229662 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerName="extract-utilities" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.229683 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8423bf36-7022-4d07-b528-2deb47088921" containerName="heat-cfnapi" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.229690 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8423bf36-7022-4d07-b528-2deb47088921" containerName="heat-cfnapi" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.229720 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6193a696-93bd-4303-82fd-bf39ce403d80" containerName="mariadb-database-create" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.229728 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6193a696-93bd-4303-82fd-bf39ce403d80" containerName="mariadb-database-create" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.229743 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c228f8-3d7c-49c0-b63c-ce25dbe64ca1" containerName="mariadb-database-create" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.229752 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c228f8-3d7c-49c0-b63c-ce25dbe64ca1" containerName="mariadb-database-create" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.231608 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerName="registry-server" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.231622 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerName="registry-server" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.231641 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" containerName="heat-api" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.231651 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" containerName="heat-api" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.231689 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerName="glance-httpd" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.231696 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerName="glance-httpd" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.232001 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="21102a00-2eed-4a6b-84e6-07403fbda79c" containerName="registry-server" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.232018 36504 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8423bf36-7022-4d07-b528-2deb47088921" containerName="heat-cfnapi" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.232048 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8423bf36-7022-4d07-b528-2deb47088921" containerName="heat-cfnapi" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.232061 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6193a696-93bd-4303-82fd-bf39ce403d80" containerName="mariadb-database-create" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.232078 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" containerName="heat-api" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.232091 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerName="glance-httpd" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.232112 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c228f8-3d7c-49c0-b63c-ce25dbe64ca1" containerName="mariadb-database-create" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.232140 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb8747f-1989-4028-b62a-5bf9ea57f609" containerName="glance-log" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: E1203 22:26:45.232395 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8423bf36-7022-4d07-b528-2deb47088921" containerName="heat-cfnapi" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.232404 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8423bf36-7022-4d07-b528-2deb47088921" containerName="heat-cfnapi" Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: I1203 22:26:45.234031 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6clwd" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerName="registry-server" probeResult="failure" output=< Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: timeout: failed to connect service ":50051" within 1s Dec 03 22:26:45.235901 master-0 kubenswrapper[36504]: > Dec 03 22:26:45.237267 master-0 kubenswrapper[36504]: I1203 22:26:45.236537 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.272697 master-0 kubenswrapper[36504]: I1203 22:26:45.260490 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-baebb-default-internal-api-0"] Dec 03 22:26:45.289500 master-0 kubenswrapper[36504]: I1203 22:26:45.284624 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-baebb-default-internal-config-data" Dec 03 22:26:45.289500 master-0 kubenswrapper[36504]: I1203 22:26:45.286328 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 22:26:45.320868 master-0 kubenswrapper[36504]: I1203 22:26:45.320795 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data" (OuterVolumeSpecName: "config-data") pod "9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" (UID: "9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:45.345815 master-0 kubenswrapper[36504]: I1203 22:26:45.345646 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6193a696-93bd-4303-82fd-bf39ce403d80-operator-scripts\") pod \"6193a696-93bd-4303-82fd-bf39ce403d80\" (UID: \"6193a696-93bd-4303-82fd-bf39ce403d80\") " Dec 03 22:26:45.345815 master-0 kubenswrapper[36504]: I1203 22:26:45.345716 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrvfc\" (UniqueName: \"kubernetes.io/projected/6193a696-93bd-4303-82fd-bf39ce403d80-kube-api-access-lrvfc\") pod \"6193a696-93bd-4303-82fd-bf39ce403d80\" (UID: \"6193a696-93bd-4303-82fd-bf39ce403d80\") " Dec 03 22:26:45.346802 master-0 kubenswrapper[36504]: I1203 22:26:45.346590 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69b8b4a-7bc2-407c-80f3-ea88c8467153-httpd-run\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.346802 master-0 kubenswrapper[36504]: I1203 22:26:45.346672 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds5k7\" (UniqueName: \"kubernetes.io/projected/e69b8b4a-7bc2-407c-80f3-ea88c8467153-kube-api-access-ds5k7\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.346802 master-0 kubenswrapper[36504]: I1203 22:26:45.346747 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-scripts\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.346953 master-0 kubenswrapper[36504]: I1203 22:26:45.346868 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-config-data\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.346953 master-0 kubenswrapper[36504]: I1203 22:26:45.346917 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-combined-ca-bundle\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.346953 master-0 kubenswrapper[36504]: I1203 22:26:45.346944 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69b8b4a-7bc2-407c-80f3-ea88c8467153-logs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.347047 master-0 kubenswrapper[36504]: I1203 22:26:45.346988 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-internal-tls-certs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.347083 master-0 kubenswrapper[36504]: I1203 22:26:45.347044 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.347171 master-0 kubenswrapper[36504]: I1203 22:26:45.347151 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:45.348895 master-0 kubenswrapper[36504]: I1203 22:26:45.348858 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6193a696-93bd-4303-82fd-bf39ce403d80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6193a696-93bd-4303-82fd-bf39ce403d80" (UID: "6193a696-93bd-4303-82fd-bf39ce403d80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:45.352915 master-0 kubenswrapper[36504]: I1203 22:26:45.352870 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6193a696-93bd-4303-82fd-bf39ce403d80-kube-api-access-lrvfc" (OuterVolumeSpecName: "kube-api-access-lrvfc") pod "6193a696-93bd-4303-82fd-bf39ce403d80" (UID: "6193a696-93bd-4303-82fd-bf39ce403d80"). InnerVolumeSpecName "kube-api-access-lrvfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:45.455340 master-0 kubenswrapper[36504]: I1203 22:26:45.454864 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-combined-ca-bundle\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.455340 master-0 kubenswrapper[36504]: I1203 22:26:45.454961 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69b8b4a-7bc2-407c-80f3-ea88c8467153-logs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.455340 master-0 kubenswrapper[36504]: I1203 22:26:45.455072 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-internal-tls-certs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.456617 master-0 kubenswrapper[36504]: I1203 22:26:45.456592 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69b8b4a-7bc2-407c-80f3-ea88c8467153-logs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.456683 master-0 kubenswrapper[36504]: I1203 22:26:45.455198 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.474404 master-0 kubenswrapper[36504]: I1203 22:26:45.467485 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69b8b4a-7bc2-407c-80f3-ea88c8467153-httpd-run\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.475015 master-0 kubenswrapper[36504]: I1203 22:26:45.472321 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e69b8b4a-7bc2-407c-80f3-ea88c8467153-httpd-run\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.475015 master-0 kubenswrapper[36504]: I1203 22:26:45.474668 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds5k7\" (UniqueName: \"kubernetes.io/projected/e69b8b4a-7bc2-407c-80f3-ea88c8467153-kube-api-access-ds5k7\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.479996 master-0 kubenswrapper[36504]: I1203 22:26:45.476196 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-scripts\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.479996 master-0 kubenswrapper[36504]: I1203 22:26:45.476471 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-config-data\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.479996 master-0 kubenswrapper[36504]: I1203 22:26:45.476804 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6193a696-93bd-4303-82fd-bf39ce403d80-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:45.479996 master-0 kubenswrapper[36504]: I1203 22:26:45.476821 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrvfc\" (UniqueName: \"kubernetes.io/projected/6193a696-93bd-4303-82fd-bf39ce403d80-kube-api-access-lrvfc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:45.499491 master-0 kubenswrapper[36504]: I1203 22:26:45.499435 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-internal-tls-certs\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.499985 master-0 kubenswrapper[36504]: I1203 22:26:45.499796 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-combined-ca-bundle\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.500607 master-0 kubenswrapper[36504]: I1203 22:26:45.500557 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 22:26:45.500663 master-0 kubenswrapper[36504]: I1203 22:26:45.500631 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2072068328f8c9fdd4d126c82ed7a5705e28f66beec6cc4a6e19712168c74b28/globalmount\"" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.500735 master-0 kubenswrapper[36504]: I1203 22:26:45.500685 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-config-data\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.500828 master-0 kubenswrapper[36504]: I1203 22:26:45.500763 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e69b8b4a-7bc2-407c-80f3-ea88c8467153-scripts\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.513353 master-0 kubenswrapper[36504]: I1203 22:26:45.513275 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds5k7\" (UniqueName: \"kubernetes.io/projected/e69b8b4a-7bc2-407c-80f3-ea88c8467153-kube-api-access-ds5k7\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:45.516941 master-0 kubenswrapper[36504]: I1203 22:26:45.516866 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-baebb-default-external-api-0"] Dec 03 22:26:45.594145 master-0 kubenswrapper[36504]: I1203 22:26:45.592909 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-external-api-0" event={"ID":"17576109-b684-4aa1-8895-6283460be452","Type":"ContainerStarted","Data":"6d06953ee3fbc7cfe5fcfdf2d771c8fbd96c548767522d4d8e257e1998a06b5f"} Dec 03 22:26:45.638816 master-0 kubenswrapper[36504]: I1203 22:26:45.637058 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerStarted","Data":"8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0"} Dec 03 22:26:45.645818 master-0 kubenswrapper[36504]: I1203 22:26:45.642343 36504 generic.go:334] "Generic (PLEG): container finished" podID="6954864a-d445-46ef-8b09-dcb0543b8b23" containerID="556fdebdc51cb83ab8b96b22316114542b4ce68660f4fcb52774343ec06bac65" exitCode=0 Dec 03 22:26:45.645818 master-0 kubenswrapper[36504]: I1203 22:26:45.642450 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w6cn8" event={"ID":"6954864a-d445-46ef-8b09-dcb0543b8b23","Type":"ContainerDied","Data":"556fdebdc51cb83ab8b96b22316114542b4ce68660f4fcb52774343ec06bac65"} Dec 03 22:26:45.668316 master-0 kubenswrapper[36504]: I1203 22:26:45.665857 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-c69rr" Dec 03 22:26:45.668316 master-0 kubenswrapper[36504]: I1203 22:26:45.665895 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-c69rr" event={"ID":"6193a696-93bd-4303-82fd-bf39ce403d80","Type":"ContainerDied","Data":"1c1040767e42cf973894d98b7797f8cf62e8c91ba4884517113ca34b32a7124f"} Dec 03 22:26:45.668316 master-0 kubenswrapper[36504]: I1203 22:26:45.665986 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c1040767e42cf973894d98b7797f8cf62e8c91ba4884517113ca34b32a7124f" Dec 03 22:26:45.703253 master-0 kubenswrapper[36504]: I1203 22:26:45.702511 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-578bc969b7-l5g64" event={"ID":"420506d7-d2f5-44f2-8124-c8a488cf2ee7","Type":"ContainerStarted","Data":"6e53b2df3d1753d001a2614005895b1579d88725d8a56ce7d90cc2d7133a7b7a"} Dec 03 22:26:45.703253 master-0 kubenswrapper[36504]: I1203 22:26:45.702605 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:45.715620 master-0 kubenswrapper[36504]: I1203 22:26:45.714589 36504 generic.go:334] "Generic (PLEG): container finished" podID="1b698a34-7bd9-4125-ad5c-885db2cf4959" containerID="e3dc98d1f10f75204ba94845d8f275947fd6f668502cc1755ae53718f00e44e5" exitCode=0 Dec 03 22:26:45.715620 master-0 kubenswrapper[36504]: I1203 22:26:45.714697 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ad6-account-create-update-gj27c" event={"ID":"1b698a34-7bd9-4125-ad5c-885db2cf4959","Type":"ContainerDied","Data":"e3dc98d1f10f75204ba94845d8f275947fd6f668502cc1755ae53718f00e44e5"} Dec 03 22:26:45.721782 master-0 kubenswrapper[36504]: I1203 22:26:45.721274 36504 generic.go:334] "Generic (PLEG): container finished" podID="a046886b-4b70-4fb1-82ce-ef92db1677f7" containerID="82aa7af98106cf32c5d8e139da882a326a2f55e112fee3d8177eecbc252ab61d" exitCode=0 Dec 03 22:26:45.721782 master-0 kubenswrapper[36504]: I1203 22:26:45.721352 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e258-account-create-update-lf7cr" event={"ID":"a046886b-4b70-4fb1-82ce-ef92db1677f7","Type":"ContainerDied","Data":"82aa7af98106cf32c5d8e139da882a326a2f55e112fee3d8177eecbc252ab61d"} Dec 03 22:26:45.723002 master-0 kubenswrapper[36504]: I1203 22:26:45.722960 36504 generic.go:334] "Generic (PLEG): container finished" podID="ad30900d-dcdc-49cb-ab6e-601c1d18a77c" containerID="821cb11c51b954e4d57104a6195f336f4d6ac6d743668fa80deb00a7ba68d5cc" exitCode=0 Dec 03 22:26:45.723083 master-0 kubenswrapper[36504]: I1203 22:26:45.723042 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-64967779f-xr54t" Dec 03 22:26:45.730953 master-0 kubenswrapper[36504]: I1203 22:26:45.730525 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3742-account-create-update-frr2z" event={"ID":"ad30900d-dcdc-49cb-ab6e-601c1d18a77c","Type":"ContainerDied","Data":"821cb11c51b954e4d57104a6195f336f4d6ac6d743668fa80deb00a7ba68d5cc"} Dec 03 22:26:45.731318 master-0 kubenswrapper[36504]: I1203 22:26:45.731209 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-whrpw" Dec 03 22:26:45.748011 master-0 kubenswrapper[36504]: I1203 22:26:45.747899 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-578bc969b7-l5g64" podStartSLOduration=7.747866599 podStartE2EDuration="7.747866599s" podCreationTimestamp="2025-12-03 22:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:45.730804852 +0000 UTC m=+970.950576869" watchObservedRunningTime="2025-12-03 22:26:45.747866599 +0000 UTC m=+970.967638606" Dec 03 22:26:45.885317 master-0 kubenswrapper[36504]: I1203 22:26:45.885101 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-64967779f-xr54t"] Dec 03 22:26:45.903442 master-0 kubenswrapper[36504]: I1203 22:26:45.903344 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-64967779f-xr54t"] Dec 03 22:26:46.451903 master-0 kubenswrapper[36504]: I1203 22:26:46.451366 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-08878fdb-89ab-4989-8ae2-b19ea6a70465\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb7f42b1-c240-4b0e-a779-3b622215a334\") pod \"glance-baebb-default-internal-api-0\" (UID: \"e69b8b4a-7bc2-407c-80f3-ea88c8467153\") " pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:46.536001 master-0 kubenswrapper[36504]: I1203 22:26:46.535929 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:46.874981 master-0 kubenswrapper[36504]: I1203 22:26:46.874893 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-external-api-0" event={"ID":"17576109-b684-4aa1-8895-6283460be452","Type":"ContainerStarted","Data":"8eb5dcd5a0a5483002ed91a7af435c7e393ec9e626a2ad00b68b6af6e12475ff"} Dec 03 22:26:46.903448 master-0 kubenswrapper[36504]: I1203 22:26:46.903294 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="ceilometer-central-agent" containerID="cri-o://489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e" gracePeriod=30 Dec 03 22:26:46.905063 master-0 kubenswrapper[36504]: I1203 22:26:46.904978 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="sg-core" containerID="cri-o://8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0" gracePeriod=30 Dec 03 22:26:46.905270 master-0 kubenswrapper[36504]: I1203 22:26:46.905227 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="proxy-httpd" containerID="cri-o://ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f" gracePeriod=30 Dec 03 22:26:46.908928 master-0 kubenswrapper[36504]: I1203 22:26:46.908863 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:26:46.910135 master-0 kubenswrapper[36504]: I1203 22:26:46.910004 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="ceilometer-notification-agent" 
containerID="cri-o://f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29" gracePeriod=30 Dec 03 22:26:47.096656 master-0 kubenswrapper[36504]: I1203 22:26:47.096476 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.639995093 podStartE2EDuration="13.096440094s" podCreationTimestamp="2025-12-03 22:26:34 +0000 UTC" firstStartedPulling="2025-12-03 22:26:35.997690146 +0000 UTC m=+961.217462153" lastFinishedPulling="2025-12-03 22:26:46.454135147 +0000 UTC m=+971.673907154" observedRunningTime="2025-12-03 22:26:47.046539425 +0000 UTC m=+972.266311622" watchObservedRunningTime="2025-12-03 22:26:47.096440094 +0000 UTC m=+972.316212091" Dec 03 22:26:47.133419 master-0 kubenswrapper[36504]: I1203 22:26:47.133305 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cb8747f-1989-4028-b62a-5bf9ea57f609" path="/var/lib/kubelet/pods/6cb8747f-1989-4028-b62a-5bf9ea57f609/volumes" Dec 03 22:26:47.134392 master-0 kubenswrapper[36504]: I1203 22:26:47.134359 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8" path="/var/lib/kubelet/pods/9c74d2a7-a3b0-4f33-8abc-fea1aaca54a8/volumes" Dec 03 22:26:47.279711 master-0 kubenswrapper[36504]: I1203 22:26:47.279612 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-baebb-default-internal-api-0"] Dec 03 22:26:47.785225 master-0 kubenswrapper[36504]: I1203 22:26:47.768119 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:47.861895 master-0 kubenswrapper[36504]: I1203 22:26:47.861334 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8b98\" (UniqueName: \"kubernetes.io/projected/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-kube-api-access-k8b98\") pod \"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\" (UID: \"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\") " Dec 03 22:26:47.861895 master-0 kubenswrapper[36504]: I1203 22:26:47.861535 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-operator-scripts\") pod \"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\" (UID: \"ad30900d-dcdc-49cb-ab6e-601c1d18a77c\") " Dec 03 22:26:47.863883 master-0 kubenswrapper[36504]: I1203 22:26:47.863328 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad30900d-dcdc-49cb-ab6e-601c1d18a77c" (UID: "ad30900d-dcdc-49cb-ab6e-601c1d18a77c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:47.881261 master-0 kubenswrapper[36504]: I1203 22:26:47.880285 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-kube-api-access-k8b98" (OuterVolumeSpecName: "kube-api-access-k8b98") pod "ad30900d-dcdc-49cb-ab6e-601c1d18a77c" (UID: "ad30900d-dcdc-49cb-ab6e-601c1d18a77c"). InnerVolumeSpecName "kube-api-access-k8b98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:47.934961 master-0 kubenswrapper[36504]: I1203 22:26:47.934892 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:47.937142 master-0 kubenswrapper[36504]: I1203 22:26:47.937068 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-external-api-0" event={"ID":"17576109-b684-4aa1-8895-6283460be452","Type":"ContainerStarted","Data":"26f00a9d077d4b3d235e5aa8b13bb688c455ebd30f20ade4e97e26152a34f674"} Dec 03 22:26:47.941643 master-0 kubenswrapper[36504]: I1203 22:26:47.941596 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e258-account-create-update-lf7cr" event={"ID":"a046886b-4b70-4fb1-82ce-ef92db1677f7","Type":"ContainerDied","Data":"fd288fbe5e903bd90133b39ddc9e5e6eae57685afd63a13075c36e258ce57d15"} Dec 03 22:26:47.941643 master-0 kubenswrapper[36504]: I1203 22:26:47.941637 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd288fbe5e903bd90133b39ddc9e5e6eae57685afd63a13075c36e258ce57d15" Dec 03 22:26:47.941799 master-0 kubenswrapper[36504]: I1203 22:26:47.941730 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e258-account-create-update-lf7cr" Dec 03 22:26:47.942422 master-0 kubenswrapper[36504]: I1203 22:26:47.942388 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:47.943963 master-0 kubenswrapper[36504]: I1203 22:26:47.943851 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3742-account-create-update-frr2z" event={"ID":"ad30900d-dcdc-49cb-ab6e-601c1d18a77c","Type":"ContainerDied","Data":"1965ec6d0163d4d84d6411b3238e5708fb91a265594a72017fbc23c8ef749b70"} Dec 03 22:26:47.943963 master-0 kubenswrapper[36504]: I1203 22:26:47.943885 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1965ec6d0163d4d84d6411b3238e5708fb91a265594a72017fbc23c8ef749b70" Dec 03 22:26:47.943963 master-0 kubenswrapper[36504]: I1203 22:26:47.943964 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3742-account-create-update-frr2z" Dec 03 22:26:47.953997 master-0 kubenswrapper[36504]: I1203 22:26:47.953929 36504 generic.go:334] "Generic (PLEG): container finished" podID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerID="8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0" exitCode=2 Dec 03 22:26:47.953997 master-0 kubenswrapper[36504]: I1203 22:26:47.953988 36504 generic.go:334] "Generic (PLEG): container finished" podID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerID="f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29" exitCode=0 Dec 03 22:26:47.953997 master-0 kubenswrapper[36504]: I1203 22:26:47.953998 36504 generic.go:334] "Generic (PLEG): container finished" podID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerID="489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e" exitCode=0 Dec 03 22:26:47.954264 master-0 kubenswrapper[36504]: I1203 22:26:47.954067 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerStarted","Data":"ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f"} Dec 03 22:26:47.954264 master-0 kubenswrapper[36504]: I1203 22:26:47.954109 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerDied","Data":"8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0"} Dec 03 22:26:47.954264 master-0 kubenswrapper[36504]: I1203 22:26:47.954123 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerDied","Data":"f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29"} Dec 03 22:26:47.954264 master-0 kubenswrapper[36504]: I1203 22:26:47.954136 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerDied","Data":"489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e"} Dec 03 22:26:47.971633 master-0 kubenswrapper[36504]: I1203 22:26:47.971548 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-internal-api-0" event={"ID":"e69b8b4a-7bc2-407c-80f3-ea88c8467153","Type":"ContainerStarted","Data":"4202d9fc666b9561631c46142ad441253e451ff627db329f16ad787a5cb241d5"} Dec 03 22:26:47.973153 master-0 kubenswrapper[36504]: I1203 22:26:47.973105 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8b98\" (UniqueName: \"kubernetes.io/projected/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-kube-api-access-k8b98\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:47.973221 master-0 kubenswrapper[36504]: I1203 22:26:47.973174 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad30900d-dcdc-49cb-ab6e-601c1d18a77c-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:47.998869 master-0 kubenswrapper[36504]: I1203 22:26:47.998819 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:48.074024 master-0 kubenswrapper[36504]: I1203 22:26:48.073892 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-baebb-default-external-api-0" podStartSLOduration=7.073859445 podStartE2EDuration="7.073859445s" podCreationTimestamp="2025-12-03 22:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:48.064091407 +0000 UTC m=+973.283863414" watchObservedRunningTime="2025-12-03 22:26:48.073859445 +0000 UTC m=+973.293631452" Dec 03 22:26:48.078941 master-0 kubenswrapper[36504]: I1203 22:26:48.076187 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b698a34-7bd9-4125-ad5c-885db2cf4959-operator-scripts\") pod \"1b698a34-7bd9-4125-ad5c-885db2cf4959\" (UID: \"1b698a34-7bd9-4125-ad5c-885db2cf4959\") " Dec 03 22:26:48.078941 master-0 kubenswrapper[36504]: I1203 22:26:48.076892 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b698a34-7bd9-4125-ad5c-885db2cf4959-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b698a34-7bd9-4125-ad5c-885db2cf4959" (UID: "1b698a34-7bd9-4125-ad5c-885db2cf4959"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:48.078941 master-0 kubenswrapper[36504]: I1203 22:26:48.076988 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6954864a-d445-46ef-8b09-dcb0543b8b23-operator-scripts\") pod \"6954864a-d445-46ef-8b09-dcb0543b8b23\" (UID: \"6954864a-d445-46ef-8b09-dcb0543b8b23\") " Dec 03 22:26:48.078941 master-0 kubenswrapper[36504]: I1203 22:26:48.077031 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2h9j\" (UniqueName: \"kubernetes.io/projected/1b698a34-7bd9-4125-ad5c-885db2cf4959-kube-api-access-h2h9j\") pod \"1b698a34-7bd9-4125-ad5c-885db2cf4959\" (UID: \"1b698a34-7bd9-4125-ad5c-885db2cf4959\") " Dec 03 22:26:48.078941 master-0 kubenswrapper[36504]: I1203 22:26:48.077132 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a046886b-4b70-4fb1-82ce-ef92db1677f7-operator-scripts\") pod \"a046886b-4b70-4fb1-82ce-ef92db1677f7\" (UID: \"a046886b-4b70-4fb1-82ce-ef92db1677f7\") " Dec 03 22:26:48.078941 master-0 kubenswrapper[36504]: I1203 22:26:48.077221 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rctkw\" (UniqueName: \"kubernetes.io/projected/a046886b-4b70-4fb1-82ce-ef92db1677f7-kube-api-access-rctkw\") pod \"a046886b-4b70-4fb1-82ce-ef92db1677f7\" (UID: \"a046886b-4b70-4fb1-82ce-ef92db1677f7\") " Dec 03 22:26:48.078941 master-0 kubenswrapper[36504]: I1203 22:26:48.077252 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl6vb\" (UniqueName: \"kubernetes.io/projected/6954864a-d445-46ef-8b09-dcb0543b8b23-kube-api-access-dl6vb\") pod \"6954864a-d445-46ef-8b09-dcb0543b8b23\" (UID: \"6954864a-d445-46ef-8b09-dcb0543b8b23\") " Dec 03 22:26:48.078941 master-0 kubenswrapper[36504]: I1203 22:26:48.078446 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6954864a-d445-46ef-8b09-dcb0543b8b23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6954864a-d445-46ef-8b09-dcb0543b8b23" (UID: "6954864a-d445-46ef-8b09-dcb0543b8b23"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:48.085369 master-0 kubenswrapper[36504]: I1203 22:26:48.078990 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b698a34-7bd9-4125-ad5c-885db2cf4959-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:48.085369 master-0 kubenswrapper[36504]: I1203 22:26:48.079008 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6954864a-d445-46ef-8b09-dcb0543b8b23-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:48.085369 master-0 kubenswrapper[36504]: I1203 22:26:48.080424 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a046886b-4b70-4fb1-82ce-ef92db1677f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a046886b-4b70-4fb1-82ce-ef92db1677f7" (UID: "a046886b-4b70-4fb1-82ce-ef92db1677f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:26:48.085369 master-0 kubenswrapper[36504]: I1203 22:26:48.083146 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a046886b-4b70-4fb1-82ce-ef92db1677f7-kube-api-access-rctkw" (OuterVolumeSpecName: "kube-api-access-rctkw") pod "a046886b-4b70-4fb1-82ce-ef92db1677f7" (UID: "a046886b-4b70-4fb1-82ce-ef92db1677f7"). InnerVolumeSpecName "kube-api-access-rctkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:48.100263 master-0 kubenswrapper[36504]: I1203 22:26:48.099371 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6954864a-d445-46ef-8b09-dcb0543b8b23-kube-api-access-dl6vb" (OuterVolumeSpecName: "kube-api-access-dl6vb") pod "6954864a-d445-46ef-8b09-dcb0543b8b23" (UID: "6954864a-d445-46ef-8b09-dcb0543b8b23"). InnerVolumeSpecName "kube-api-access-dl6vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:48.107252 master-0 kubenswrapper[36504]: I1203 22:26:48.107179 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b698a34-7bd9-4125-ad5c-885db2cf4959-kube-api-access-h2h9j" (OuterVolumeSpecName: "kube-api-access-h2h9j") pod "1b698a34-7bd9-4125-ad5c-885db2cf4959" (UID: "1b698a34-7bd9-4125-ad5c-885db2cf4959"). InnerVolumeSpecName "kube-api-access-h2h9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:48.182907 master-0 kubenswrapper[36504]: I1203 22:26:48.182811 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2h9j\" (UniqueName: \"kubernetes.io/projected/1b698a34-7bd9-4125-ad5c-885db2cf4959-kube-api-access-h2h9j\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:48.182907 master-0 kubenswrapper[36504]: I1203 22:26:48.182883 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a046886b-4b70-4fb1-82ce-ef92db1677f7-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:48.182907 master-0 kubenswrapper[36504]: I1203 22:26:48.182901 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rctkw\" (UniqueName: \"kubernetes.io/projected/a046886b-4b70-4fb1-82ce-ef92db1677f7-kube-api-access-rctkw\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:48.182907 master-0 kubenswrapper[36504]: I1203 22:26:48.182918 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl6vb\" (UniqueName: \"kubernetes.io/projected/6954864a-d445-46ef-8b09-dcb0543b8b23-kube-api-access-dl6vb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:48.998798 master-0 kubenswrapper[36504]: I1203 22:26:48.995872 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-w6cn8" Dec 03 22:26:48.998798 master-0 kubenswrapper[36504]: I1203 22:26:48.998173 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-w6cn8" event={"ID":"6954864a-d445-46ef-8b09-dcb0543b8b23","Type":"ContainerDied","Data":"3592e458010f28657f8ee972756f9459af5ba631b6d743c08e6babb385cb1c88"} Dec 03 22:26:48.998798 master-0 kubenswrapper[36504]: I1203 22:26:48.998268 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3592e458010f28657f8ee972756f9459af5ba631b6d743c08e6babb385cb1c88" Dec 03 22:26:49.003116 master-0 kubenswrapper[36504]: I1203 22:26:49.003062 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-internal-api-0" event={"ID":"e69b8b4a-7bc2-407c-80f3-ea88c8467153","Type":"ContainerStarted","Data":"31d9a9122ee5a6770a4b5d95520622508b59148ecc3f3bfba4ee6991560359de"} Dec 03 22:26:49.007699 master-0 kubenswrapper[36504]: I1203 22:26:49.006341 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-3ad6-account-create-update-gj27c" event={"ID":"1b698a34-7bd9-4125-ad5c-885db2cf4959","Type":"ContainerDied","Data":"38abee6b9ea8aa55e492adbfb90adbc8592471c8318cef5b13da366b2557da56"} Dec 03 22:26:49.007699 master-0 kubenswrapper[36504]: I1203 22:26:49.006390 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-3ad6-account-create-update-gj27c" Dec 03 22:26:49.007699 master-0 kubenswrapper[36504]: I1203 22:26:49.006433 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38abee6b9ea8aa55e492adbfb90adbc8592471c8318cef5b13da366b2557da56" Dec 03 22:26:50.034452 master-0 kubenswrapper[36504]: I1203 22:26:50.034366 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-baebb-default-internal-api-0" event={"ID":"e69b8b4a-7bc2-407c-80f3-ea88c8467153","Type":"ContainerStarted","Data":"178b461b97920305e49416a82f36afc9bd35bc9ab45d99efc4062917b7201074"} Dec 03 22:26:50.075536 master-0 kubenswrapper[36504]: I1203 22:26:50.075409 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-baebb-default-internal-api-0" podStartSLOduration=6.075375631 podStartE2EDuration="6.075375631s" podCreationTimestamp="2025-12-03 22:26:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:26:50.063105345 +0000 UTC m=+975.282877362" watchObservedRunningTime="2025-12-03 22:26:50.075375631 +0000 UTC m=+975.295147638" Dec 03 22:26:50.987413 master-0 kubenswrapper[36504]: I1203 22:26:50.987135 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-687b5cdd65-qsjb4" Dec 03 22:26:51.151536 master-0 kubenswrapper[36504]: I1203 22:26:51.151411 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-747f594c7b-dwjx4"] Dec 03 22:26:51.152431 master-0 kubenswrapper[36504]: I1203 22:26:51.152092 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" podUID="7fad818c-c25b-40e8-aeea-e4e9dae1b839" containerName="heat-cfnapi" containerID="cri-o://56cf6bcaa752a1320bb456cb3340911d8ab586f6773586d8d08b4660feb3f087" gracePeriod=60 Dec 03 22:26:51.163430 master-0 kubenswrapper[36504]: I1203 22:26:51.162829 36504 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" podUID="7fad818c-c25b-40e8-aeea-e4e9dae1b839" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.128.1.1:8000/healthcheck\": EOF" Dec 03 22:26:51.163430 master-0 kubenswrapper[36504]: I1203 22:26:51.162934 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" podUID="7fad818c-c25b-40e8-aeea-e4e9dae1b839" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.128.1.1:8000/healthcheck\": EOF" Dec 03 22:26:52.966130 master-0 kubenswrapper[36504]: I1203 22:26:52.966030 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h9tlt"] Dec 03 22:26:52.967167 master-0 kubenswrapper[36504]: E1203 22:26:52.967133 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad30900d-dcdc-49cb-ab6e-601c1d18a77c" containerName="mariadb-account-create-update" Dec 03 22:26:52.967167 master-0 kubenswrapper[36504]: I1203 22:26:52.967162 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad30900d-dcdc-49cb-ab6e-601c1d18a77c" containerName="mariadb-account-create-update" Dec 03 22:26:52.967320 master-0 kubenswrapper[36504]: E1203 22:26:52.967205 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6954864a-d445-46ef-8b09-dcb0543b8b23" containerName="mariadb-database-create" Dec 03 22:26:52.967320 master-0 
kubenswrapper[36504]: I1203 22:26:52.967215 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6954864a-d445-46ef-8b09-dcb0543b8b23" containerName="mariadb-database-create" Dec 03 22:26:52.967320 master-0 kubenswrapper[36504]: E1203 22:26:52.967279 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a046886b-4b70-4fb1-82ce-ef92db1677f7" containerName="mariadb-account-create-update" Dec 03 22:26:52.967320 master-0 kubenswrapper[36504]: I1203 22:26:52.967290 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a046886b-4b70-4fb1-82ce-ef92db1677f7" containerName="mariadb-account-create-update" Dec 03 22:26:52.967504 master-0 kubenswrapper[36504]: E1203 22:26:52.967343 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b698a34-7bd9-4125-ad5c-885db2cf4959" containerName="mariadb-account-create-update" Dec 03 22:26:52.967504 master-0 kubenswrapper[36504]: I1203 22:26:52.967352 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b698a34-7bd9-4125-ad5c-885db2cf4959" containerName="mariadb-account-create-update" Dec 03 22:26:52.967707 master-0 kubenswrapper[36504]: I1203 22:26:52.967679 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad30900d-dcdc-49cb-ab6e-601c1d18a77c" containerName="mariadb-account-create-update" Dec 03 22:26:52.967797 master-0 kubenswrapper[36504]: I1203 22:26:52.967727 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6954864a-d445-46ef-8b09-dcb0543b8b23" containerName="mariadb-database-create" Dec 03 22:26:52.967797 master-0 kubenswrapper[36504]: I1203 22:26:52.967740 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b698a34-7bd9-4125-ad5c-885db2cf4959" containerName="mariadb-account-create-update" Dec 03 22:26:52.967797 master-0 kubenswrapper[36504]: I1203 22:26:52.967783 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="a046886b-4b70-4fb1-82ce-ef92db1677f7" containerName="mariadb-account-create-update" Dec 03 22:26:52.969218 master-0 kubenswrapper[36504]: I1203 22:26:52.969184 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:52.972556 master-0 kubenswrapper[36504]: I1203 22:26:52.972491 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 22:26:52.973271 master-0 kubenswrapper[36504]: I1203 22:26:52.973237 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 22:26:52.981763 master-0 kubenswrapper[36504]: I1203 22:26:52.981667 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h9tlt"] Dec 03 22:26:53.060922 master-0 kubenswrapper[36504]: I1203 22:26:53.060831 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhns\" (UniqueName: \"kubernetes.io/projected/900943d0-fa02-401a-ab6a-7ed811802669-kube-api-access-xdhns\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.060922 master-0 kubenswrapper[36504]: I1203 22:26:53.060935 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-config-data\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.061281 master-0 kubenswrapper[36504]: I1203 22:26:53.061000 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-scripts\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.061281 master-0 kubenswrapper[36504]: I1203 22:26:53.061183 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.168390 master-0 kubenswrapper[36504]: I1203 22:26:53.168292 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhns\" (UniqueName: \"kubernetes.io/projected/900943d0-fa02-401a-ab6a-7ed811802669-kube-api-access-xdhns\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.168390 master-0 kubenswrapper[36504]: I1203 22:26:53.168404 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-config-data\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.168798 master-0 kubenswrapper[36504]: I1203 22:26:53.168544 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-scripts\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " 
pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.168798 master-0 kubenswrapper[36504]: I1203 22:26:53.168683 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.173367 master-0 kubenswrapper[36504]: I1203 22:26:53.173309 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.173531 master-0 kubenswrapper[36504]: I1203 22:26:53.173374 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-config-data\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.177790 master-0 kubenswrapper[36504]: I1203 22:26:53.175438 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-scripts\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.193308 master-0 kubenswrapper[36504]: I1203 22:26:53.193238 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhns\" (UniqueName: \"kubernetes.io/projected/900943d0-fa02-401a-ab6a-7ed811802669-kube-api-access-xdhns\") pod \"nova-cell0-conductor-db-sync-h9tlt\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.321190 master-0 kubenswrapper[36504]: I1203 22:26:53.320960 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:26:53.879064 master-0 kubenswrapper[36504]: I1203 22:26:53.878851 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h9tlt"] Dec 03 22:26:53.892847 master-0 kubenswrapper[36504]: W1203 22:26:53.892714 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod900943d0_fa02_401a_ab6a_7ed811802669.slice/crio-060905482e1d11119343323b1a22294f13e729979271e06575a301545c5463b9 WatchSource:0}: Error finding container 060905482e1d11119343323b1a22294f13e729979271e06575a301545c5463b9: Status 404 returned error can't find the container with id 060905482e1d11119343323b1a22294f13e729979271e06575a301545c5463b9 Dec 03 22:26:54.131110 master-0 kubenswrapper[36504]: I1203 22:26:54.130942 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h9tlt" event={"ID":"900943d0-fa02-401a-ab6a-7ed811802669","Type":"ContainerStarted","Data":"060905482e1d11119343323b1a22294f13e729979271e06575a301545c5463b9"} Dec 03 22:26:54.438081 master-0 kubenswrapper[36504]: I1203 22:26:54.437895 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6bbd7fd869-fccfd" Dec 03 22:26:54.477392 master-0 kubenswrapper[36504]: I1203 22:26:54.477313 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:54.477392 master-0 kubenswrapper[36504]: I1203 22:26:54.477388 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:54.535809 master-0 kubenswrapper[36504]: I1203 22:26:54.529581 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5cd76d4c4c-pwkqf"] Dec 03 22:26:54.535809 master-0 kubenswrapper[36504]: I1203 22:26:54.530647 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" podUID="df6c2617-5f7b-41c0-a97a-534404895c92" containerName="heat-engine" containerID="cri-o://4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176" gracePeriod=60 Dec 03 22:26:54.564320 master-0 kubenswrapper[36504]: I1203 22:26:54.562800 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:54.602364 master-0 kubenswrapper[36504]: I1203 22:26:54.602248 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:54.669916 master-0 kubenswrapper[36504]: E1203 22:26:54.668462 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 22:26:54.679480 master-0 kubenswrapper[36504]: E1203 22:26:54.679362 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 22:26:54.704298 master-0 
kubenswrapper[36504]: E1203 22:26:54.700457 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 22:26:54.704298 master-0 kubenswrapper[36504]: E1203 22:26:54.700638 36504 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" podUID="df6c2617-5f7b-41c0-a97a-534404895c92" containerName="heat-engine" Dec 03 22:26:54.835656 master-0 kubenswrapper[36504]: I1203 22:26:54.835562 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" podUID="7fad818c-c25b-40e8-aeea-e4e9dae1b839" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.128.1.1:8000/healthcheck\": dial tcp 10.128.1.1:8000: connect: connection refused" Dec 03 22:26:54.908890 master-0 kubenswrapper[36504]: I1203 22:26:54.908748 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6clwd" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerName="registry-server" probeResult="failure" output=< Dec 03 22:26:54.908890 master-0 kubenswrapper[36504]: timeout: failed to connect service ":50051" within 1s Dec 03 22:26:54.908890 master-0 kubenswrapper[36504]: > Dec 03 22:26:55.180067 master-0 kubenswrapper[36504]: I1203 22:26:55.179999 36504 generic.go:334] "Generic (PLEG): container finished" podID="7fad818c-c25b-40e8-aeea-e4e9dae1b839" containerID="56cf6bcaa752a1320bb456cb3340911d8ab586f6773586d8d08b4660feb3f087" exitCode=0 Dec 03 22:26:55.180646 master-0 kubenswrapper[36504]: I1203 22:26:55.180100 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" event={"ID":"7fad818c-c25b-40e8-aeea-e4e9dae1b839","Type":"ContainerDied","Data":"56cf6bcaa752a1320bb456cb3340911d8ab586f6773586d8d08b4660feb3f087"} Dec 03 22:26:55.181227 master-0 kubenswrapper[36504]: I1203 22:26:55.181190 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:55.181558 master-0 kubenswrapper[36504]: I1203 22:26:55.181233 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:55.337796 master-0 kubenswrapper[36504]: I1203 22:26:55.337085 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:55.468167 master-0 kubenswrapper[36504]: I1203 22:26:55.466518 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data-custom\") pod \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " Dec 03 22:26:55.468167 master-0 kubenswrapper[36504]: I1203 22:26:55.466748 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnmmj\" (UniqueName: \"kubernetes.io/projected/7fad818c-c25b-40e8-aeea-e4e9dae1b839-kube-api-access-qnmmj\") pod \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " Dec 03 22:26:55.468167 master-0 kubenswrapper[36504]: I1203 22:26:55.466836 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-combined-ca-bundle\") pod \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " Dec 03 22:26:55.468167 master-0 kubenswrapper[36504]: I1203 22:26:55.467439 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data\") pod \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\" (UID: \"7fad818c-c25b-40e8-aeea-e4e9dae1b839\") " Dec 03 22:26:55.478791 master-0 kubenswrapper[36504]: I1203 22:26:55.474790 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fad818c-c25b-40e8-aeea-e4e9dae1b839-kube-api-access-qnmmj" (OuterVolumeSpecName: "kube-api-access-qnmmj") pod "7fad818c-c25b-40e8-aeea-e4e9dae1b839" (UID: "7fad818c-c25b-40e8-aeea-e4e9dae1b839"). InnerVolumeSpecName "kube-api-access-qnmmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:55.478791 master-0 kubenswrapper[36504]: I1203 22:26:55.475868 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7fad818c-c25b-40e8-aeea-e4e9dae1b839" (UID: "7fad818c-c25b-40e8-aeea-e4e9dae1b839"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:55.510861 master-0 kubenswrapper[36504]: I1203 22:26:55.510496 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7fad818c-c25b-40e8-aeea-e4e9dae1b839" (UID: "7fad818c-c25b-40e8-aeea-e4e9dae1b839"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:55.566489 master-0 kubenswrapper[36504]: I1203 22:26:55.566395 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data" (OuterVolumeSpecName: "config-data") pod "7fad818c-c25b-40e8-aeea-e4e9dae1b839" (UID: "7fad818c-c25b-40e8-aeea-e4e9dae1b839"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:55.574250 master-0 kubenswrapper[36504]: I1203 22:26:55.573656 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:55.574250 master-0 kubenswrapper[36504]: I1203 22:26:55.573720 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:55.574250 master-0 kubenswrapper[36504]: I1203 22:26:55.573730 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fad818c-c25b-40e8-aeea-e4e9dae1b839-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:55.574250 master-0 kubenswrapper[36504]: I1203 22:26:55.573744 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnmmj\" (UniqueName: \"kubernetes.io/projected/7fad818c-c25b-40e8-aeea-e4e9dae1b839-kube-api-access-qnmmj\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:56.256760 master-0 kubenswrapper[36504]: I1203 22:26:56.256696 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" Dec 03 22:26:56.257719 master-0 kubenswrapper[36504]: I1203 22:26:56.257669 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-747f594c7b-dwjx4" event={"ID":"7fad818c-c25b-40e8-aeea-e4e9dae1b839","Type":"ContainerDied","Data":"052b5fd2b6cada2cb4a19ff714354d7d254ddc442197f2725e94596f5d0d5158"} Dec 03 22:26:56.257824 master-0 kubenswrapper[36504]: I1203 22:26:56.257763 36504 scope.go:117] "RemoveContainer" containerID="56cf6bcaa752a1320bb456cb3340911d8ab586f6773586d8d08b4660feb3f087" Dec 03 22:26:56.326892 master-0 kubenswrapper[36504]: I1203 22:26:56.323848 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-747f594c7b-dwjx4"] Dec 03 22:26:56.342853 master-0 kubenswrapper[36504]: I1203 22:26:56.342734 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-747f594c7b-dwjx4"] Dec 03 22:26:56.401620 master-0 kubenswrapper[36504]: I1203 22:26:56.401545 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-578bc969b7-l5g64" Dec 03 22:26:56.533382 master-0 kubenswrapper[36504]: I1203 22:26:56.533291 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-768fc575ff-n7992"] Dec 03 22:26:56.543082 master-0 kubenswrapper[36504]: I1203 22:26:56.541122 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:56.547839 master-0 kubenswrapper[36504]: I1203 22:26:56.545982 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:56.614255 master-0 kubenswrapper[36504]: I1203 22:26:56.613238 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:56.650516 master-0 kubenswrapper[36504]: I1203 22:26:56.650147 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:57.123809 master-0 kubenswrapper[36504]: I1203 22:26:57.123716 
36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fad818c-c25b-40e8-aeea-e4e9dae1b839" path="/var/lib/kubelet/pods/7fad818c-c25b-40e8-aeea-e4e9dae1b839/volumes" Dec 03 22:26:57.253528 master-0 kubenswrapper[36504]: I1203 22:26:57.250533 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:57.277543 master-0 kubenswrapper[36504]: I1203 22:26:57.277391 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-768fc575ff-n7992" event={"ID":"28b645fa-2039-4145-9f32-a07ee0028539","Type":"ContainerDied","Data":"289a7a77cd156289dfff5ac72aec9491bd5efb87d5e421d5eef0bea5aa494aa0"} Dec 03 22:26:57.277543 master-0 kubenswrapper[36504]: I1203 22:26:57.277544 36504 scope.go:117] "RemoveContainer" containerID="d6fb0970510291553b13e0b3bfb37785787e94130a821f1b646819ee43863437" Dec 03 22:26:57.278336 master-0 kubenswrapper[36504]: I1203 22:26:57.277604 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-768fc575ff-n7992" Dec 03 22:26:57.291136 master-0 kubenswrapper[36504]: I1203 22:26:57.290825 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:57.293160 master-0 kubenswrapper[36504]: I1203 22:26:57.292982 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:57.413016 master-0 kubenswrapper[36504]: I1203 22:26:57.412932 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data-custom\") pod \"28b645fa-2039-4145-9f32-a07ee0028539\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " Dec 03 22:26:57.413016 master-0 kubenswrapper[36504]: I1203 22:26:57.413027 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-combined-ca-bundle\") pod \"28b645fa-2039-4145-9f32-a07ee0028539\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " Dec 03 22:26:57.413410 master-0 kubenswrapper[36504]: I1203 22:26:57.413069 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6gg6\" (UniqueName: \"kubernetes.io/projected/28b645fa-2039-4145-9f32-a07ee0028539-kube-api-access-w6gg6\") pod \"28b645fa-2039-4145-9f32-a07ee0028539\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " Dec 03 22:26:57.413957 master-0 kubenswrapper[36504]: I1203 22:26:57.413559 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data\") pod \"28b645fa-2039-4145-9f32-a07ee0028539\" (UID: \"28b645fa-2039-4145-9f32-a07ee0028539\") " Dec 03 22:26:57.422079 master-0 kubenswrapper[36504]: I1203 22:26:57.421998 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28b645fa-2039-4145-9f32-a07ee0028539" (UID: "28b645fa-2039-4145-9f32-a07ee0028539"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:57.449319 master-0 kubenswrapper[36504]: I1203 22:26:57.446053 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b645fa-2039-4145-9f32-a07ee0028539-kube-api-access-w6gg6" (OuterVolumeSpecName: "kube-api-access-w6gg6") pod "28b645fa-2039-4145-9f32-a07ee0028539" (UID: "28b645fa-2039-4145-9f32-a07ee0028539"). InnerVolumeSpecName "kube-api-access-w6gg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:26:57.488578 master-0 kubenswrapper[36504]: I1203 22:26:57.485232 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28b645fa-2039-4145-9f32-a07ee0028539" (UID: "28b645fa-2039-4145-9f32-a07ee0028539"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:57.492881 master-0 kubenswrapper[36504]: I1203 22:26:57.490915 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data" (OuterVolumeSpecName: "config-data") pod "28b645fa-2039-4145-9f32-a07ee0028539" (UID: "28b645fa-2039-4145-9f32-a07ee0028539"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:26:57.523954 master-0 kubenswrapper[36504]: I1203 22:26:57.523697 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:57.523954 master-0 kubenswrapper[36504]: I1203 22:26:57.523786 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:57.523954 master-0 kubenswrapper[36504]: I1203 22:26:57.523803 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6gg6\" (UniqueName: \"kubernetes.io/projected/28b645fa-2039-4145-9f32-a07ee0028539-kube-api-access-w6gg6\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:57.523954 master-0 kubenswrapper[36504]: I1203 22:26:57.523816 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28b645fa-2039-4145-9f32-a07ee0028539-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:26:57.655722 master-0 kubenswrapper[36504]: I1203 22:26:57.655510 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-768fc575ff-n7992"] Dec 03 22:26:57.673158 master-0 kubenswrapper[36504]: I1203 22:26:57.672398 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-768fc575ff-n7992"] Dec 03 22:26:57.939108 master-0 kubenswrapper[36504]: I1203 22:26:57.934802 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:57.939108 master-0 kubenswrapper[36504]: I1203 22:26:57.935068 36504 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:26:58.059283 master-0 kubenswrapper[36504]: I1203 22:26:58.059199 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-baebb-default-external-api-0" Dec 03 22:26:59.122526 master-0 
kubenswrapper[36504]: I1203 22:26:59.122436 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28b645fa-2039-4145-9f32-a07ee0028539" path="/var/lib/kubelet/pods/28b645fa-2039-4145-9f32-a07ee0028539/volumes" Dec 03 22:26:59.777883 master-0 kubenswrapper[36504]: I1203 22:26:59.777808 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:26:59.778332 master-0 kubenswrapper[36504]: I1203 22:26:59.777953 36504 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 22:26:59.865811 master-0 kubenswrapper[36504]: I1203 22:26:59.864095 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-baebb-default-internal-api-0" Dec 03 22:27:03.879429 master-0 kubenswrapper[36504]: I1203 22:27:03.879338 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:27:03.948131 master-0 kubenswrapper[36504]: I1203 22:27:03.948047 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:27:04.132226 master-0 kubenswrapper[36504]: I1203 22:27:04.132035 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6clwd"] Dec 03 22:27:04.631670 master-0 kubenswrapper[36504]: E1203 22:27:04.631555 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 22:27:04.636238 master-0 kubenswrapper[36504]: E1203 22:27:04.635343 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 22:27:04.642131 master-0 kubenswrapper[36504]: E1203 22:27:04.642073 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 22:27:04.642238 master-0 kubenswrapper[36504]: E1203 22:27:04.642125 36504 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" podUID="df6c2617-5f7b-41c0-a97a-534404895c92" containerName="heat-engine" Dec 03 22:27:04.952712 master-0 kubenswrapper[36504]: I1203 22:27:04.952497 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 22:27:05.479781 master-0 kubenswrapper[36504]: I1203 22:27:05.479647 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6clwd" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" 
containerName="registry-server" containerID="cri-o://82b6b4d2789b2a777ecf613850368bf695b90d7a90bec3bf2d1c73de6c234082" gracePeriod=2 Dec 03 22:27:06.496441 master-0 kubenswrapper[36504]: I1203 22:27:06.496382 36504 generic.go:334] "Generic (PLEG): container finished" podID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerID="82b6b4d2789b2a777ecf613850368bf695b90d7a90bec3bf2d1c73de6c234082" exitCode=0 Dec 03 22:27:06.497429 master-0 kubenswrapper[36504]: I1203 22:27:06.497402 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6clwd" event={"ID":"8447ec5c-34fd-44ce-8bd8-2174ca072e6a","Type":"ContainerDied","Data":"82b6b4d2789b2a777ecf613850368bf695b90d7a90bec3bf2d1c73de6c234082"} Dec 03 22:27:06.500949 master-0 kubenswrapper[36504]: I1203 22:27:06.500923 36504 generic.go:334] "Generic (PLEG): container finished" podID="df6c2617-5f7b-41c0-a97a-534404895c92" containerID="4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176" exitCode=0 Dec 03 22:27:06.501191 master-0 kubenswrapper[36504]: I1203 22:27:06.501107 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" event={"ID":"df6c2617-5f7b-41c0-a97a-534404895c92","Type":"ContainerDied","Data":"4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176"} Dec 03 22:27:07.803178 master-0 kubenswrapper[36504]: I1203 22:27:07.803102 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:27:07.905312 master-0 kubenswrapper[36504]: I1203 22:27:07.905249 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:27:07.957797 master-0 kubenswrapper[36504]: I1203 22:27:07.957510 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkzfs\" (UniqueName: \"kubernetes.io/projected/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-kube-api-access-pkzfs\") pod \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " Dec 03 22:27:07.957797 master-0 kubenswrapper[36504]: I1203 22:27:07.957740 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-utilities\") pod \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " Dec 03 22:27:07.958164 master-0 kubenswrapper[36504]: I1203 22:27:07.958146 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-catalog-content\") pod \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\" (UID: \"8447ec5c-34fd-44ce-8bd8-2174ca072e6a\") " Dec 03 22:27:07.965736 master-0 kubenswrapper[36504]: I1203 22:27:07.965480 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-kube-api-access-pkzfs" (OuterVolumeSpecName: "kube-api-access-pkzfs") pod "8447ec5c-34fd-44ce-8bd8-2174ca072e6a" (UID: "8447ec5c-34fd-44ce-8bd8-2174ca072e6a"). InnerVolumeSpecName "kube-api-access-pkzfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:07.968490 master-0 kubenswrapper[36504]: I1203 22:27:07.966075 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-utilities" (OuterVolumeSpecName: "utilities") pod "8447ec5c-34fd-44ce-8bd8-2174ca072e6a" (UID: "8447ec5c-34fd-44ce-8bd8-2174ca072e6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:08.062325 master-0 kubenswrapper[36504]: I1203 22:27:08.062153 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-combined-ca-bundle\") pod \"df6c2617-5f7b-41c0-a97a-534404895c92\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " Dec 03 22:27:08.062631 master-0 kubenswrapper[36504]: I1203 22:27:08.062471 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsvph\" (UniqueName: \"kubernetes.io/projected/df6c2617-5f7b-41c0-a97a-534404895c92-kube-api-access-jsvph\") pod \"df6c2617-5f7b-41c0-a97a-534404895c92\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " Dec 03 22:27:08.062631 master-0 kubenswrapper[36504]: I1203 22:27:08.062522 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data\") pod \"df6c2617-5f7b-41c0-a97a-534404895c92\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " Dec 03 22:27:08.062631 master-0 kubenswrapper[36504]: I1203 22:27:08.062617 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data-custom\") pod \"df6c2617-5f7b-41c0-a97a-534404895c92\" (UID: \"df6c2617-5f7b-41c0-a97a-534404895c92\") " Dec 03 22:27:08.063566 master-0 kubenswrapper[36504]: I1203 22:27:08.063538 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkzfs\" (UniqueName: \"kubernetes.io/projected/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-kube-api-access-pkzfs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:08.063614 master-0 kubenswrapper[36504]: I1203 22:27:08.063564 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:08.071052 master-0 kubenswrapper[36504]: I1203 22:27:08.070964 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "df6c2617-5f7b-41c0-a97a-534404895c92" (UID: "df6c2617-5f7b-41c0-a97a-534404895c92"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:08.071349 master-0 kubenswrapper[36504]: I1203 22:27:08.071117 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6c2617-5f7b-41c0-a97a-534404895c92-kube-api-access-jsvph" (OuterVolumeSpecName: "kube-api-access-jsvph") pod "df6c2617-5f7b-41c0-a97a-534404895c92" (UID: "df6c2617-5f7b-41c0-a97a-534404895c92"). InnerVolumeSpecName "kube-api-access-jsvph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:08.080640 master-0 kubenswrapper[36504]: I1203 22:27:08.080484 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8447ec5c-34fd-44ce-8bd8-2174ca072e6a" (UID: "8447ec5c-34fd-44ce-8bd8-2174ca072e6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:08.105045 master-0 kubenswrapper[36504]: I1203 22:27:08.104962 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df6c2617-5f7b-41c0-a97a-534404895c92" (UID: "df6c2617-5f7b-41c0-a97a-534404895c92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:08.138705 master-0 kubenswrapper[36504]: I1203 22:27:08.138589 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data" (OuterVolumeSpecName: "config-data") pod "df6c2617-5f7b-41c0-a97a-534404895c92" (UID: "df6c2617-5f7b-41c0-a97a-534404895c92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:08.167403 master-0 kubenswrapper[36504]: I1203 22:27:08.167307 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:08.167403 master-0 kubenswrapper[36504]: I1203 22:27:08.167375 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8447ec5c-34fd-44ce-8bd8-2174ca072e6a-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:08.167403 master-0 kubenswrapper[36504]: I1203 22:27:08.167390 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsvph\" (UniqueName: \"kubernetes.io/projected/df6c2617-5f7b-41c0-a97a-534404895c92-kube-api-access-jsvph\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:08.167867 master-0 kubenswrapper[36504]: I1203 22:27:08.167468 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:08.167867 master-0 kubenswrapper[36504]: I1203 22:27:08.167526 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df6c2617-5f7b-41c0-a97a-534404895c92-config-data-custom\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:08.554809 master-0 kubenswrapper[36504]: I1203 22:27:08.554694 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" event={"ID":"df6c2617-5f7b-41c0-a97a-534404895c92","Type":"ContainerDied","Data":"b59ecfc52fa9311940d33af8d69b44977538658bdb35adeb1d1430457ed38ee1"} Dec 03 22:27:08.555238 master-0 kubenswrapper[36504]: I1203 22:27:08.554878 36504 scope.go:117] "RemoveContainer" containerID="4163f1edf6f442ae9aeb1a8e56e4a02009a5110ecee409419281fbd59ee41176" Dec 03 22:27:08.555238 master-0 kubenswrapper[36504]: I1203 22:27:08.554789 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5cd76d4c4c-pwkqf" Dec 03 22:27:08.559837 master-0 kubenswrapper[36504]: I1203 22:27:08.559568 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h9tlt" event={"ID":"900943d0-fa02-401a-ab6a-7ed811802669","Type":"ContainerStarted","Data":"f0e9c5ed4f553b79cb34acd474a2008633c4b1f9f75e4cd26e4e2843eabf92f6"} Dec 03 22:27:08.565638 master-0 kubenswrapper[36504]: I1203 22:27:08.565571 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6clwd" event={"ID":"8447ec5c-34fd-44ce-8bd8-2174ca072e6a","Type":"ContainerDied","Data":"934a0720c431c88071a3e46909f80f2d0ea6c3fe7c39a7ab5e6169acef99cdb4"} Dec 03 22:27:08.565786 master-0 kubenswrapper[36504]: I1203 22:27:08.565750 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6clwd" Dec 03 22:27:08.595859 master-0 kubenswrapper[36504]: I1203 22:27:08.595699 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-h9tlt" podStartSLOduration=3.088606743 podStartE2EDuration="16.595665364s" podCreationTimestamp="2025-12-03 22:26:52 +0000 UTC" firstStartedPulling="2025-12-03 22:26:53.90010338 +0000 UTC m=+979.119875387" lastFinishedPulling="2025-12-03 22:27:07.407162001 +0000 UTC m=+992.626934008" observedRunningTime="2025-12-03 22:27:08.582534022 +0000 UTC m=+993.802306019" watchObservedRunningTime="2025-12-03 22:27:08.595665364 +0000 UTC m=+993.815437371" Dec 03 22:27:08.612938 master-0 kubenswrapper[36504]: I1203 22:27:08.612886 36504 scope.go:117] "RemoveContainer" containerID="82b6b4d2789b2a777ecf613850368bf695b90d7a90bec3bf2d1c73de6c234082" Dec 03 22:27:08.651867 master-0 kubenswrapper[36504]: I1203 22:27:08.651691 36504 scope.go:117] "RemoveContainer" containerID="1665e2cd852155cf522a89f2d86a7767e3ee7e15ae5b99b3dd16367fc196584d" Dec 03 22:27:08.656757 master-0 kubenswrapper[36504]: I1203 22:27:08.656701 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5cd76d4c4c-pwkqf"] Dec 03 22:27:08.670628 master-0 kubenswrapper[36504]: I1203 22:27:08.670500 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5cd76d4c4c-pwkqf"] Dec 03 22:27:08.684260 master-0 kubenswrapper[36504]: I1203 22:27:08.684207 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6clwd"] Dec 03 22:27:08.692742 master-0 kubenswrapper[36504]: I1203 22:27:08.692669 36504 scope.go:117] "RemoveContainer" containerID="6e837fe13ea120260e23266e0eb7beb38aeedfd5f5de9cf44e30e99e132605a4" Dec 03 22:27:08.698338 master-0 kubenswrapper[36504]: I1203 22:27:08.698254 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6clwd"] Dec 03 22:27:09.114071 master-0 kubenswrapper[36504]: I1203 22:27:09.113839 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" path="/var/lib/kubelet/pods/8447ec5c-34fd-44ce-8bd8-2174ca072e6a/volumes" Dec 03 22:27:09.114782 master-0 kubenswrapper[36504]: I1203 22:27:09.114697 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6c2617-5f7b-41c0-a97a-534404895c92" path="/var/lib/kubelet/pods/df6c2617-5f7b-41c0-a97a-534404895c92/volumes" Dec 03 22:27:12.500987 master-0 kubenswrapper[36504]: I1203 22:27:12.500899 36504 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-7gmnn"] Dec 03 22:27:12.501899 master-0 kubenswrapper[36504]: E1203 22:27:12.501863 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b645fa-2039-4145-9f32-a07ee0028539" containerName="heat-api" Dec 03 22:27:12.501899 master-0 kubenswrapper[36504]: I1203 22:27:12.501889 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b645fa-2039-4145-9f32-a07ee0028539" containerName="heat-api" Dec 03 22:27:12.501986 master-0 kubenswrapper[36504]: E1203 22:27:12.501916 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28b645fa-2039-4145-9f32-a07ee0028539" containerName="heat-api" Dec 03 22:27:12.501986 master-0 kubenswrapper[36504]: I1203 22:27:12.501926 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b645fa-2039-4145-9f32-a07ee0028539" containerName="heat-api" Dec 03 22:27:12.501986 master-0 kubenswrapper[36504]: E1203 22:27:12.501942 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerName="registry-server" Dec 03 22:27:12.501986 master-0 kubenswrapper[36504]: I1203 22:27:12.501952 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerName="registry-server" Dec 03 22:27:12.501986 master-0 kubenswrapper[36504]: E1203 22:27:12.501979 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fad818c-c25b-40e8-aeea-e4e9dae1b839" containerName="heat-cfnapi" Dec 03 22:27:12.501986 master-0 kubenswrapper[36504]: I1203 22:27:12.501989 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fad818c-c25b-40e8-aeea-e4e9dae1b839" containerName="heat-cfnapi" Dec 03 22:27:12.502293 master-0 kubenswrapper[36504]: E1203 22:27:12.502029 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerName="extract-content" Dec 03 22:27:12.502293 master-0 kubenswrapper[36504]: I1203 22:27:12.502039 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerName="extract-content" Dec 03 22:27:12.502293 master-0 kubenswrapper[36504]: E1203 22:27:12.502055 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerName="extract-utilities" Dec 03 22:27:12.502293 master-0 kubenswrapper[36504]: I1203 22:27:12.502064 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerName="extract-utilities" Dec 03 22:27:12.502293 master-0 kubenswrapper[36504]: E1203 22:27:12.502098 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6c2617-5f7b-41c0-a97a-534404895c92" containerName="heat-engine" Dec 03 22:27:12.502293 master-0 kubenswrapper[36504]: I1203 22:27:12.502107 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6c2617-5f7b-41c0-a97a-534404895c92" containerName="heat-engine" Dec 03 22:27:12.502474 master-0 kubenswrapper[36504]: I1203 22:27:12.502433 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b645fa-2039-4145-9f32-a07ee0028539" containerName="heat-api" Dec 03 22:27:12.502514 master-0 kubenswrapper[36504]: I1203 22:27:12.502473 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b645fa-2039-4145-9f32-a07ee0028539" containerName="heat-api" Dec 03 22:27:12.502551 master-0 kubenswrapper[36504]: I1203 22:27:12.502518 36504 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="df6c2617-5f7b-41c0-a97a-534404895c92" containerName="heat-engine" Dec 03 22:27:12.502586 master-0 kubenswrapper[36504]: I1203 22:27:12.502557 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fad818c-c25b-40e8-aeea-e4e9dae1b839" containerName="heat-cfnapi" Dec 03 22:27:12.502586 master-0 kubenswrapper[36504]: I1203 22:27:12.502571 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8447ec5c-34fd-44ce-8bd8-2174ca072e6a" containerName="registry-server" Dec 03 22:27:12.505948 master-0 kubenswrapper[36504]: I1203 22:27:12.505253 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.529801 master-0 kubenswrapper[36504]: I1203 22:27:12.525167 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7gmnn"] Dec 03 22:27:12.611902 master-0 kubenswrapper[36504]: I1203 22:27:12.611807 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-catalog-content\") pod \"community-operators-7gmnn\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.611902 master-0 kubenswrapper[36504]: I1203 22:27:12.611914 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npw48\" (UniqueName: \"kubernetes.io/projected/78a46fde-0f07-4027-a8d4-9557a26c52d7-kube-api-access-npw48\") pod \"community-operators-7gmnn\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.612564 master-0 kubenswrapper[36504]: I1203 22:27:12.612479 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-utilities\") pod \"community-operators-7gmnn\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.716582 master-0 kubenswrapper[36504]: I1203 22:27:12.716497 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-catalog-content\") pod \"community-operators-7gmnn\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.716955 master-0 kubenswrapper[36504]: I1203 22:27:12.716604 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npw48\" (UniqueName: \"kubernetes.io/projected/78a46fde-0f07-4027-a8d4-9557a26c52d7-kube-api-access-npw48\") pod \"community-operators-7gmnn\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.717332 master-0 kubenswrapper[36504]: I1203 22:27:12.717287 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-catalog-content\") pod \"community-operators-7gmnn\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.717603 master-0 kubenswrapper[36504]: I1203 22:27:12.717571 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-utilities\") pod \"community-operators-7gmnn\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.718275 master-0 kubenswrapper[36504]: I1203 22:27:12.718239 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-utilities\") pod \"community-operators-7gmnn\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.739098 master-0 kubenswrapper[36504]: I1203 22:27:12.739036 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npw48\" (UniqueName: \"kubernetes.io/projected/78a46fde-0f07-4027-a8d4-9557a26c52d7-kube-api-access-npw48\") pod \"community-operators-7gmnn\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:12.840140 master-0 kubenswrapper[36504]: I1203 22:27:12.839851 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:13.440672 master-0 kubenswrapper[36504]: I1203 22:27:13.440346 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7gmnn"] Dec 03 22:27:13.659635 master-0 kubenswrapper[36504]: I1203 22:27:13.659537 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7gmnn" event={"ID":"78a46fde-0f07-4027-a8d4-9557a26c52d7","Type":"ContainerStarted","Data":"7413e537f36fd63b6916d99975e5f93f1216ee7f0a238b692881b99902938a28"} Dec 03 22:27:14.683050 master-0 kubenswrapper[36504]: I1203 22:27:14.682961 36504 generic.go:334] "Generic (PLEG): container finished" podID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerID="0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5" exitCode=0 Dec 03 22:27:14.683050 master-0 kubenswrapper[36504]: I1203 22:27:14.683041 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7gmnn" event={"ID":"78a46fde-0f07-4027-a8d4-9557a26c52d7","Type":"ContainerDied","Data":"0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5"} Dec 03 22:27:14.720186 master-0 kubenswrapper[36504]: I1203 22:27:14.719479 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kt9v8"] Dec 03 22:27:14.725874 master-0 kubenswrapper[36504]: I1203 22:27:14.725821 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:14.746650 master-0 kubenswrapper[36504]: I1203 22:27:14.741200 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kt9v8"] Dec 03 22:27:14.814781 master-0 kubenswrapper[36504]: I1203 22:27:14.814688 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-utilities\") pod \"redhat-marketplace-kt9v8\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:14.814781 master-0 kubenswrapper[36504]: I1203 22:27:14.814766 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncll2\" (UniqueName: \"kubernetes.io/projected/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-kube-api-access-ncll2\") pod \"redhat-marketplace-kt9v8\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:14.815144 master-0 kubenswrapper[36504]: I1203 22:27:14.815032 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-catalog-content\") pod \"redhat-marketplace-kt9v8\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:14.921145 master-0 kubenswrapper[36504]: I1203 22:27:14.921051 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-utilities\") pod \"redhat-marketplace-kt9v8\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:14.921145 master-0 kubenswrapper[36504]: I1203 22:27:14.921124 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncll2\" (UniqueName: \"kubernetes.io/projected/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-kube-api-access-ncll2\") pod \"redhat-marketplace-kt9v8\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:14.921545 master-0 kubenswrapper[36504]: I1203 22:27:14.921241 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-catalog-content\") pod \"redhat-marketplace-kt9v8\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:14.921989 master-0 kubenswrapper[36504]: I1203 22:27:14.921949 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-catalog-content\") pod \"redhat-marketplace-kt9v8\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:14.922190 master-0 kubenswrapper[36504]: I1203 22:27:14.922142 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-utilities\") pod \"redhat-marketplace-kt9v8\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " 
pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:14.947130 master-0 kubenswrapper[36504]: I1203 22:27:14.946929 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncll2\" (UniqueName: \"kubernetes.io/projected/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-kube-api-access-ncll2\") pod \"redhat-marketplace-kt9v8\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:15.129147 master-0 kubenswrapper[36504]: I1203 22:27:15.126204 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:15.665604 master-0 kubenswrapper[36504]: I1203 22:27:15.665521 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kt9v8"] Dec 03 22:27:15.716906 master-0 kubenswrapper[36504]: I1203 22:27:15.716749 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kt9v8" event={"ID":"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10","Type":"ContainerStarted","Data":"2b49916efce93537f207dbd31a6a12df07cfcc87e02366e83733667d23f82127"} Dec 03 22:27:15.719304 master-0 kubenswrapper[36504]: I1203 22:27:15.719216 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7gmnn" event={"ID":"78a46fde-0f07-4027-a8d4-9557a26c52d7","Type":"ContainerStarted","Data":"5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5"} Dec 03 22:27:16.744468 master-0 kubenswrapper[36504]: I1203 22:27:16.744392 36504 generic.go:334] "Generic (PLEG): container finished" podID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerID="5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5" exitCode=0 Dec 03 22:27:16.745190 master-0 kubenswrapper[36504]: I1203 22:27:16.744480 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7gmnn" event={"ID":"78a46fde-0f07-4027-a8d4-9557a26c52d7","Type":"ContainerDied","Data":"5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5"} Dec 03 22:27:16.748021 master-0 kubenswrapper[36504]: I1203 22:27:16.747959 36504 generic.go:334] "Generic (PLEG): container finished" podID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerID="450637a8a4c391214dc7621a0706afdb63c289981a94a2630ab16dc81407b41d" exitCode=0 Dec 03 22:27:16.748135 master-0 kubenswrapper[36504]: I1203 22:27:16.748021 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kt9v8" event={"ID":"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10","Type":"ContainerDied","Data":"450637a8a4c391214dc7621a0706afdb63c289981a94a2630ab16dc81407b41d"} Dec 03 22:27:17.580886 master-0 kubenswrapper[36504]: I1203 22:27:17.580829 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:17.645351 master-0 kubenswrapper[36504]: I1203 22:27:17.645261 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-sg-core-conf-yaml\") pod \"dabfa833-e6ab-4beb-a721-454726a03e5e\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " Dec 03 22:27:17.645351 master-0 kubenswrapper[36504]: I1203 22:27:17.645356 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-log-httpd\") pod \"dabfa833-e6ab-4beb-a721-454726a03e5e\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " Dec 03 22:27:17.645796 master-0 kubenswrapper[36504]: I1203 22:27:17.645692 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-config-data\") pod \"dabfa833-e6ab-4beb-a721-454726a03e5e\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " Dec 03 22:27:17.645796 master-0 kubenswrapper[36504]: I1203 22:27:17.645721 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-scripts\") pod \"dabfa833-e6ab-4beb-a721-454726a03e5e\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " Dec 03 22:27:17.645912 master-0 kubenswrapper[36504]: I1203 22:27:17.645817 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptzqt\" (UniqueName: \"kubernetes.io/projected/dabfa833-e6ab-4beb-a721-454726a03e5e-kube-api-access-ptzqt\") pod \"dabfa833-e6ab-4beb-a721-454726a03e5e\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " Dec 03 22:27:17.645912 master-0 kubenswrapper[36504]: I1203 22:27:17.645864 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-combined-ca-bundle\") pod \"dabfa833-e6ab-4beb-a721-454726a03e5e\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " Dec 03 22:27:17.646038 master-0 kubenswrapper[36504]: I1203 22:27:17.646001 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-run-httpd\") pod \"dabfa833-e6ab-4beb-a721-454726a03e5e\" (UID: \"dabfa833-e6ab-4beb-a721-454726a03e5e\") " Dec 03 22:27:17.659411 master-0 kubenswrapper[36504]: I1203 22:27:17.658992 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dabfa833-e6ab-4beb-a721-454726a03e5e" (UID: "dabfa833-e6ab-4beb-a721-454726a03e5e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:17.669628 master-0 kubenswrapper[36504]: I1203 22:27:17.669497 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dabfa833-e6ab-4beb-a721-454726a03e5e" (UID: "dabfa833-e6ab-4beb-a721-454726a03e5e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:17.676062 master-0 kubenswrapper[36504]: I1203 22:27:17.675914 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-scripts" (OuterVolumeSpecName: "scripts") pod "dabfa833-e6ab-4beb-a721-454726a03e5e" (UID: "dabfa833-e6ab-4beb-a721-454726a03e5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:17.704676 master-0 kubenswrapper[36504]: I1203 22:27:17.704394 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabfa833-e6ab-4beb-a721-454726a03e5e-kube-api-access-ptzqt" (OuterVolumeSpecName: "kube-api-access-ptzqt") pod "dabfa833-e6ab-4beb-a721-454726a03e5e" (UID: "dabfa833-e6ab-4beb-a721-454726a03e5e"). InnerVolumeSpecName "kube-api-access-ptzqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:17.759995 master-0 kubenswrapper[36504]: I1203 22:27:17.759905 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:17.759995 master-0 kubenswrapper[36504]: I1203 22:27:17.759957 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:17.759995 master-0 kubenswrapper[36504]: I1203 22:27:17.760011 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptzqt\" (UniqueName: \"kubernetes.io/projected/dabfa833-e6ab-4beb-a721-454726a03e5e-kube-api-access-ptzqt\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:17.759995 master-0 kubenswrapper[36504]: I1203 22:27:17.760022 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dabfa833-e6ab-4beb-a721-454726a03e5e-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:17.786427 master-0 kubenswrapper[36504]: I1203 22:27:17.786352 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7gmnn" event={"ID":"78a46fde-0f07-4027-a8d4-9557a26c52d7","Type":"ContainerStarted","Data":"5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4"} Dec 03 22:27:17.788567 master-0 kubenswrapper[36504]: I1203 22:27:17.788407 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dabfa833-e6ab-4beb-a721-454726a03e5e" (UID: "dabfa833-e6ab-4beb-a721-454726a03e5e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:17.828041 master-0 kubenswrapper[36504]: I1203 22:27:17.826271 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7gmnn" podStartSLOduration=3.308596849 podStartE2EDuration="5.826234677s" podCreationTimestamp="2025-12-03 22:27:12 +0000 UTC" firstStartedPulling="2025-12-03 22:27:14.693074513 +0000 UTC m=+999.912846520" lastFinishedPulling="2025-12-03 22:27:17.210712351 +0000 UTC m=+1002.430484348" observedRunningTime="2025-12-03 22:27:17.8129581 +0000 UTC m=+1003.032730127" watchObservedRunningTime="2025-12-03 22:27:17.826234677 +0000 UTC m=+1003.046006684" Dec 03 22:27:17.835379 master-0 kubenswrapper[36504]: I1203 22:27:17.833809 36504 generic.go:334] "Generic (PLEG): container finished" podID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerID="ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f" exitCode=137 Dec 03 22:27:17.835379 master-0 kubenswrapper[36504]: I1203 22:27:17.834001 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerDied","Data":"ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f"} Dec 03 22:27:17.835379 master-0 kubenswrapper[36504]: I1203 22:27:17.834160 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dabfa833-e6ab-4beb-a721-454726a03e5e","Type":"ContainerDied","Data":"038802ca96e4e4466028e596698ba5c96e49932850999914a86b878e768cc958"} Dec 03 22:27:17.835379 master-0 kubenswrapper[36504]: I1203 22:27:17.834154 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:17.835379 master-0 kubenswrapper[36504]: I1203 22:27:17.834241 36504 scope.go:117] "RemoveContainer" containerID="ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f" Dec 03 22:27:17.864803 master-0 kubenswrapper[36504]: I1203 22:27:17.864705 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:17.880259 master-0 kubenswrapper[36504]: I1203 22:27:17.879271 36504 scope.go:117] "RemoveContainer" containerID="8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0" Dec 03 22:27:17.896663 master-0 kubenswrapper[36504]: I1203 22:27:17.896522 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dabfa833-e6ab-4beb-a721-454726a03e5e" (UID: "dabfa833-e6ab-4beb-a721-454726a03e5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:17.929718 master-0 kubenswrapper[36504]: I1203 22:27:17.929655 36504 scope.go:117] "RemoveContainer" containerID="f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29" Dec 03 22:27:17.948680 master-0 kubenswrapper[36504]: I1203 22:27:17.948581 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-config-data" (OuterVolumeSpecName: "config-data") pod "dabfa833-e6ab-4beb-a721-454726a03e5e" (UID: "dabfa833-e6ab-4beb-a721-454726a03e5e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:17.981265 master-0 kubenswrapper[36504]: I1203 22:27:17.981201 36504 scope.go:117] "RemoveContainer" containerID="489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e" Dec 03 22:27:17.981481 master-0 kubenswrapper[36504]: I1203 22:27:17.981327 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:17.981481 master-0 kubenswrapper[36504]: I1203 22:27:17.981385 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabfa833-e6ab-4beb-a721-454726a03e5e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:18.197937 master-0 kubenswrapper[36504]: I1203 22:27:18.194436 36504 scope.go:117] "RemoveContainer" containerID="ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f" Dec 03 22:27:18.197937 master-0 kubenswrapper[36504]: I1203 22:27:18.194619 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:18.201398 master-0 kubenswrapper[36504]: E1203 22:27:18.200289 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f\": container with ID starting with ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f not found: ID does not exist" containerID="ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f" Dec 03 22:27:18.201398 master-0 kubenswrapper[36504]: I1203 22:27:18.200346 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f"} err="failed to get container status \"ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f\": rpc error: code = NotFound desc = could not find container \"ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f\": container with ID starting with ed06ac706c5d39c7deda3a5fd1d99c2e37de02e62e853d6467109c73c0df2f3f not found: ID does not exist" Dec 03 22:27:18.201398 master-0 kubenswrapper[36504]: I1203 22:27:18.200379 36504 scope.go:117] "RemoveContainer" containerID="8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0" Dec 03 22:27:18.201398 master-0 kubenswrapper[36504]: E1203 22:27:18.200749 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0\": container with ID starting with 8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0 not found: ID does not exist" containerID="8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0" Dec 03 22:27:18.201398 master-0 kubenswrapper[36504]: I1203 22:27:18.200784 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0"} err="failed to get container status \"8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0\": rpc error: code = NotFound desc = could not find container \"8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0\": container with ID starting with 8d72dd5b9ccacae3142d995b9dff0260b6d2ca192024c76731c24dee40c6cbe0 not found: ID does not exist" Dec 03 
22:27:18.201398 master-0 kubenswrapper[36504]: I1203 22:27:18.200800 36504 scope.go:117] "RemoveContainer" containerID="f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29" Dec 03 22:27:18.201398 master-0 kubenswrapper[36504]: E1203 22:27:18.201286 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29\": container with ID starting with f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29 not found: ID does not exist" containerID="f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29" Dec 03 22:27:18.201398 master-0 kubenswrapper[36504]: I1203 22:27:18.201308 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29"} err="failed to get container status \"f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29\": rpc error: code = NotFound desc = could not find container \"f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29\": container with ID starting with f360ed227d5197d872230d60f0b048897f54128f78f643fe5272aeca0cd0ec29 not found: ID does not exist" Dec 03 22:27:18.201398 master-0 kubenswrapper[36504]: I1203 22:27:18.201321 36504 scope.go:117] "RemoveContainer" containerID="489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e" Dec 03 22:27:18.201963 master-0 kubenswrapper[36504]: E1203 22:27:18.201568 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e\": container with ID starting with 489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e not found: ID does not exist" containerID="489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e" Dec 03 22:27:18.201963 master-0 kubenswrapper[36504]: I1203 22:27:18.201591 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e"} err="failed to get container status \"489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e\": rpc error: code = NotFound desc = could not find container \"489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e\": container with ID starting with 489ab18648ec62f71fe9737552884a515133a38b4c77f5b9555938529cce181e not found: ID does not exist" Dec 03 22:27:18.224327 master-0 kubenswrapper[36504]: I1203 22:27:18.221956 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: I1203 22:27:18.235882 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: E1203 22:27:18.237653 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="sg-core" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: I1203 22:27:18.237678 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="sg-core" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: E1203 22:27:18.237690 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="ceilometer-notification-agent" Dec 03 22:27:18.240807 master-0 
kubenswrapper[36504]: I1203 22:27:18.237697 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="ceilometer-notification-agent" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: E1203 22:27:18.237735 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="ceilometer-central-agent" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: I1203 22:27:18.237742 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="ceilometer-central-agent" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: E1203 22:27:18.237786 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="proxy-httpd" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: I1203 22:27:18.237803 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="proxy-httpd" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: I1203 22:27:18.239870 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="ceilometer-central-agent" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: I1203 22:27:18.240004 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="sg-core" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: I1203 22:27:18.240039 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="proxy-httpd" Dec 03 22:27:18.240807 master-0 kubenswrapper[36504]: I1203 22:27:18.240117 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" containerName="ceilometer-notification-agent" Dec 03 22:27:18.245337 master-0 kubenswrapper[36504]: I1203 22:27:18.245276 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:18.254659 master-0 kubenswrapper[36504]: I1203 22:27:18.253377 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:27:18.254659 master-0 kubenswrapper[36504]: I1203 22:27:18.253506 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:27:18.298758 master-0 kubenswrapper[36504]: I1203 22:27:18.298689 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.298758 master-0 kubenswrapper[36504]: I1203 22:27:18.298755 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lxmq\" (UniqueName: \"kubernetes.io/projected/e8d19887-a1c7-4adc-8265-11d01d4b126e-kube-api-access-8lxmq\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.299156 master-0 kubenswrapper[36504]: I1203 22:27:18.298896 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.299156 master-0 kubenswrapper[36504]: I1203 22:27:18.298922 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-scripts\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.299156 master-0 kubenswrapper[36504]: I1203 22:27:18.298942 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-log-httpd\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.299156 master-0 kubenswrapper[36504]: I1203 22:27:18.298977 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-config-data\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.299156 master-0 kubenswrapper[36504]: I1203 22:27:18.299003 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-run-httpd\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.313840 master-0 kubenswrapper[36504]: I1203 22:27:18.313748 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:18.403842 master-0 kubenswrapper[36504]: I1203 22:27:18.403731 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-scripts\") pod \"ceilometer-0\" 
(UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.404653 master-0 kubenswrapper[36504]: I1203 22:27:18.403816 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.404653 master-0 kubenswrapper[36504]: I1203 22:27:18.404633 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-log-httpd\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.405025 master-0 kubenswrapper[36504]: I1203 22:27:18.404705 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-config-data\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.405091 master-0 kubenswrapper[36504]: I1203 22:27:18.405069 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-run-httpd\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.405379 master-0 kubenswrapper[36504]: I1203 22:27:18.405347 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.405428 master-0 kubenswrapper[36504]: I1203 22:27:18.405399 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lxmq\" (UniqueName: \"kubernetes.io/projected/e8d19887-a1c7-4adc-8265-11d01d4b126e-kube-api-access-8lxmq\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.406137 master-0 kubenswrapper[36504]: I1203 22:27:18.406073 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-run-httpd\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.406137 master-0 kubenswrapper[36504]: I1203 22:27:18.406102 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-log-httpd\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.408388 master-0 kubenswrapper[36504]: I1203 22:27:18.408336 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.409844 master-0 kubenswrapper[36504]: I1203 22:27:18.409792 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.410102 master-0 kubenswrapper[36504]: I1203 22:27:18.410048 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-config-data\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.424436 master-0 kubenswrapper[36504]: I1203 22:27:18.424365 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lxmq\" (UniqueName: \"kubernetes.io/projected/e8d19887-a1c7-4adc-8265-11d01d4b126e-kube-api-access-8lxmq\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.425443 master-0 kubenswrapper[36504]: I1203 22:27:18.425382 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-scripts\") pod \"ceilometer-0\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " pod="openstack/ceilometer-0" Dec 03 22:27:18.611868 master-0 kubenswrapper[36504]: I1203 22:27:18.611779 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:18.949047 master-0 kubenswrapper[36504]: I1203 22:27:18.948857 36504 generic.go:334] "Generic (PLEG): container finished" podID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerID="e97bfd2dd06fe8a4fda181f4fe789f73a0920768eff1cc8543ad59f86a3b10ad" exitCode=0 Dec 03 22:27:18.949047 master-0 kubenswrapper[36504]: I1203 22:27:18.949000 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kt9v8" event={"ID":"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10","Type":"ContainerDied","Data":"e97bfd2dd06fe8a4fda181f4fe789f73a0920768eff1cc8543ad59f86a3b10ad"} Dec 03 22:27:19.126017 master-0 kubenswrapper[36504]: I1203 22:27:19.125762 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabfa833-e6ab-4beb-a721-454726a03e5e" path="/var/lib/kubelet/pods/dabfa833-e6ab-4beb-a721-454726a03e5e/volumes" Dec 03 22:27:19.219725 master-0 kubenswrapper[36504]: I1203 22:27:19.219516 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:19.988589 master-0 kubenswrapper[36504]: I1203 22:27:19.988499 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerStarted","Data":"aca5d907e14705f4623725316657bd50a99618701eff1a94f45fb6d3c72277b8"} Dec 03 22:27:20.002218 master-0 kubenswrapper[36504]: I1203 22:27:20.002115 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kt9v8" event={"ID":"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10","Type":"ContainerStarted","Data":"07f29d4c4bcc116b057755eb6d9cd63804aa239b66a77838e377b3e9f4e70370"} Dec 03 22:27:21.026123 master-0 kubenswrapper[36504]: I1203 22:27:21.026040 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerStarted","Data":"005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2"} Dec 03 22:27:21.619898 master-0 kubenswrapper[36504]: I1203 22:27:21.619633 36504 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kt9v8" podStartSLOduration=4.9573805669999995 podStartE2EDuration="7.619599778s" podCreationTimestamp="2025-12-03 22:27:14 +0000 UTC" firstStartedPulling="2025-12-03 22:27:16.750345002 +0000 UTC m=+1001.970117019" lastFinishedPulling="2025-12-03 22:27:19.412564223 +0000 UTC m=+1004.632336230" observedRunningTime="2025-12-03 22:27:20.605464931 +0000 UTC m=+1005.825236938" watchObservedRunningTime="2025-12-03 22:27:21.619599778 +0000 UTC m=+1006.839371785"
Dec 03 22:27:21.628805 master-0 kubenswrapper[36504]: I1203 22:27:21.625559 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 22:27:22.164751 master-0 kubenswrapper[36504]: I1203 22:27:22.153506 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerStarted","Data":"5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a"}
Dec 03 22:27:22.841751 master-0 kubenswrapper[36504]: I1203 22:27:22.841543 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7gmnn"
Dec 03 22:27:22.841751 master-0 kubenswrapper[36504]: I1203 22:27:22.841634 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7gmnn"
Dec 03 22:27:22.914259 master-0 kubenswrapper[36504]: I1203 22:27:22.914195 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7gmnn"
Dec 03 22:27:23.171597 master-0 kubenswrapper[36504]: I1203 22:27:23.171496 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerStarted","Data":"bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a"}
Dec 03 22:27:23.233529 master-0 kubenswrapper[36504]: I1203 22:27:23.232719 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7gmnn"
Dec 03 22:27:23.891265 master-0 kubenswrapper[36504]: I1203 22:27:23.891193 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7gmnn"]
Dec 03 22:27:24.192020 master-0 kubenswrapper[36504]: I1203 22:27:24.191903 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerStarted","Data":"eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36"}
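The pod_startup_latency_tracker entries above and below ("Observed pod startup duration") report both podStartE2EDuration and podStartSLOduration, for redhat-marketplace-kt9v8 and later for ceilometer-0. The SLO figure appears to be the end-to-end startup time with the image-pull window (firstStartedPulling to lastFinishedPulling) subtracted, which matches the printed numbers exactly when the monotonic m=+ offsets are used. The short Python sketch below reproduces that arithmetic for the redhat-marketplace-kt9v8 entry; the input values are copied from the log line, and the formula is inferred from how those numbers fit together rather than quoted from kubelet source.

# A minimal sketch, assuming podStartSLOduration = podStartE2EDuration minus the
# image-pull window. Inputs are copied from the redhat-marketplace-kt9v8 entry above;
# the monotonic m=+ offsets are used so wall-clock adjustments cannot skew the result.
pod_start_e2e = 7.619599778              # podStartE2EDuration, in seconds
first_started_pulling = 1001.970117019   # m=+ offset of firstStartedPulling
last_finished_pulling = 1004.632336230   # m=+ offset of lastFinishedPulling

pull_window = last_finished_pulling - first_started_pulling
slo_duration = pod_start_e2e - pull_window

print(f"image pull window  : {pull_window:.9f}s")
print(f"SLO startup time   : {slo_duration:.9f}s (log reports podStartSLOduration=4.9573805669999995)")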
pod="openstack/ceilometer-0" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="ceilometer-notification-agent" containerID="cri-o://5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a" gracePeriod=30 Dec 03 22:27:24.197465 master-0 kubenswrapper[36504]: I1203 22:27:24.193224 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="proxy-httpd" containerID="cri-o://eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36" gracePeriod=30 Dec 03 22:27:24.242911 master-0 kubenswrapper[36504]: I1203 22:27:24.242749 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9120706090000001 podStartE2EDuration="6.242716745s" podCreationTimestamp="2025-12-03 22:27:18 +0000 UTC" firstStartedPulling="2025-12-03 22:27:19.243993516 +0000 UTC m=+1004.463765523" lastFinishedPulling="2025-12-03 22:27:23.574639652 +0000 UTC m=+1008.794411659" observedRunningTime="2025-12-03 22:27:24.230237541 +0000 UTC m=+1009.450009558" watchObservedRunningTime="2025-12-03 22:27:24.242716745 +0000 UTC m=+1009.462488752" Dec 03 22:27:25.129299 master-0 kubenswrapper[36504]: I1203 22:27:25.129220 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:25.129299 master-0 kubenswrapper[36504]: I1203 22:27:25.129312 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:25.205639 master-0 kubenswrapper[36504]: I1203 22:27:25.205557 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:25.210407 master-0 kubenswrapper[36504]: I1203 22:27:25.210355 36504 generic.go:334] "Generic (PLEG): container finished" podID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerID="eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36" exitCode=0 Dec 03 22:27:25.210407 master-0 kubenswrapper[36504]: I1203 22:27:25.210406 36504 generic.go:334] "Generic (PLEG): container finished" podID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerID="bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a" exitCode=2 Dec 03 22:27:25.210604 master-0 kubenswrapper[36504]: I1203 22:27:25.210421 36504 generic.go:334] "Generic (PLEG): container finished" podID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerID="5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a" exitCode=0 Dec 03 22:27:25.210604 master-0 kubenswrapper[36504]: I1203 22:27:25.210527 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerDied","Data":"eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36"} Dec 03 22:27:25.210882 master-0 kubenswrapper[36504]: I1203 22:27:25.210632 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerDied","Data":"bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a"} Dec 03 22:27:25.210882 master-0 kubenswrapper[36504]: I1203 22:27:25.210651 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerDied","Data":"5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a"} Dec 03 22:27:25.210976 master-0 kubenswrapper[36504]: I1203 22:27:25.210940 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7gmnn" podUID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerName="registry-server" containerID="cri-o://5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4" gracePeriod=2 Dec 03 22:27:25.278026 master-0 kubenswrapper[36504]: I1203 22:27:25.277902 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:25.821638 master-0 kubenswrapper[36504]: I1203 22:27:25.821548 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:25.983814 master-0 kubenswrapper[36504]: I1203 22:27:25.983577 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-utilities\") pod \"78a46fde-0f07-4027-a8d4-9557a26c52d7\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " Dec 03 22:27:25.983814 master-0 kubenswrapper[36504]: I1203 22:27:25.983756 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-catalog-content\") pod \"78a46fde-0f07-4027-a8d4-9557a26c52d7\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " Dec 03 22:27:25.984192 master-0 kubenswrapper[36504]: I1203 22:27:25.983974 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npw48\" (UniqueName: \"kubernetes.io/projected/78a46fde-0f07-4027-a8d4-9557a26c52d7-kube-api-access-npw48\") pod \"78a46fde-0f07-4027-a8d4-9557a26c52d7\" (UID: \"78a46fde-0f07-4027-a8d4-9557a26c52d7\") " Dec 03 22:27:25.984618 master-0 kubenswrapper[36504]: I1203 22:27:25.984544 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-utilities" (OuterVolumeSpecName: "utilities") pod "78a46fde-0f07-4027-a8d4-9557a26c52d7" (UID: "78a46fde-0f07-4027-a8d4-9557a26c52d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:25.985641 master-0 kubenswrapper[36504]: I1203 22:27:25.985600 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:25.989103 master-0 kubenswrapper[36504]: I1203 22:27:25.989068 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a46fde-0f07-4027-a8d4-9557a26c52d7-kube-api-access-npw48" (OuterVolumeSpecName: "kube-api-access-npw48") pod "78a46fde-0f07-4027-a8d4-9557a26c52d7" (UID: "78a46fde-0f07-4027-a8d4-9557a26c52d7"). InnerVolumeSpecName "kube-api-access-npw48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:26.068814 master-0 kubenswrapper[36504]: I1203 22:27:26.063876 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78a46fde-0f07-4027-a8d4-9557a26c52d7" (UID: "78a46fde-0f07-4027-a8d4-9557a26c52d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:26.089759 master-0 kubenswrapper[36504]: I1203 22:27:26.089678 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78a46fde-0f07-4027-a8d4-9557a26c52d7-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:26.089759 master-0 kubenswrapper[36504]: I1203 22:27:26.089743 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npw48\" (UniqueName: \"kubernetes.io/projected/78a46fde-0f07-4027-a8d4-9557a26c52d7-kube-api-access-npw48\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:26.233479 master-0 kubenswrapper[36504]: I1203 22:27:26.233286 36504 generic.go:334] "Generic (PLEG): container finished" podID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerID="5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4" exitCode=0 Dec 03 22:27:26.233479 master-0 kubenswrapper[36504]: I1203 22:27:26.233375 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7gmnn" Dec 03 22:27:26.233479 master-0 kubenswrapper[36504]: I1203 22:27:26.233393 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7gmnn" event={"ID":"78a46fde-0f07-4027-a8d4-9557a26c52d7","Type":"ContainerDied","Data":"5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4"} Dec 03 22:27:26.235384 master-0 kubenswrapper[36504]: I1203 22:27:26.233523 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7gmnn" event={"ID":"78a46fde-0f07-4027-a8d4-9557a26c52d7","Type":"ContainerDied","Data":"7413e537f36fd63b6916d99975e5f93f1216ee7f0a238b692881b99902938a28"} Dec 03 22:27:26.235384 master-0 kubenswrapper[36504]: I1203 22:27:26.233549 36504 scope.go:117] "RemoveContainer" containerID="5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4" Dec 03 22:27:26.239087 master-0 kubenswrapper[36504]: I1203 22:27:26.238997 36504 generic.go:334] "Generic (PLEG): container finished" podID="900943d0-fa02-401a-ab6a-7ed811802669" containerID="f0e9c5ed4f553b79cb34acd474a2008633c4b1f9f75e4cd26e4e2843eabf92f6" exitCode=0 Dec 03 22:27:26.240864 master-0 kubenswrapper[36504]: I1203 22:27:26.240660 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h9tlt" event={"ID":"900943d0-fa02-401a-ab6a-7ed811802669","Type":"ContainerDied","Data":"f0e9c5ed4f553b79cb34acd474a2008633c4b1f9f75e4cd26e4e2843eabf92f6"} Dec 03 22:27:26.296095 master-0 kubenswrapper[36504]: I1203 22:27:26.293282 36504 scope.go:117] "RemoveContainer" containerID="5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5" Dec 03 22:27:26.351645 master-0 kubenswrapper[36504]: I1203 22:27:26.345777 36504 scope.go:117] "RemoveContainer" containerID="0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5" Dec 03 22:27:26.363832 master-0 kubenswrapper[36504]: I1203 22:27:26.363704 36504 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/community-operators-7gmnn"] Dec 03 22:27:26.380807 master-0 kubenswrapper[36504]: I1203 22:27:26.380707 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7gmnn"] Dec 03 22:27:26.411514 master-0 kubenswrapper[36504]: I1203 22:27:26.410977 36504 scope.go:117] "RemoveContainer" containerID="5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4" Dec 03 22:27:26.413335 master-0 kubenswrapper[36504]: E1203 22:27:26.411652 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4\": container with ID starting with 5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4 not found: ID does not exist" containerID="5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4" Dec 03 22:27:26.413335 master-0 kubenswrapper[36504]: I1203 22:27:26.411696 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4"} err="failed to get container status \"5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4\": rpc error: code = NotFound desc = could not find container \"5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4\": container with ID starting with 5ba64319535f79369ea88aaff107479a551f871ea4eab2c1bedf2414b6c452e4 not found: ID does not exist" Dec 03 22:27:26.413335 master-0 kubenswrapper[36504]: I1203 22:27:26.411727 36504 scope.go:117] "RemoveContainer" containerID="5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5" Dec 03 22:27:26.413335 master-0 kubenswrapper[36504]: E1203 22:27:26.412083 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5\": container with ID starting with 5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5 not found: ID does not exist" containerID="5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5" Dec 03 22:27:26.413335 master-0 kubenswrapper[36504]: I1203 22:27:26.412100 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5"} err="failed to get container status \"5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5\": rpc error: code = NotFound desc = could not find container \"5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5\": container with ID starting with 5a33c015de7d17ec8bb3e1ceaf04647bef9f165c73dd00150757203644fbd2a5 not found: ID does not exist" Dec 03 22:27:26.413335 master-0 kubenswrapper[36504]: I1203 22:27:26.412114 36504 scope.go:117] "RemoveContainer" containerID="0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5" Dec 03 22:27:26.413335 master-0 kubenswrapper[36504]: E1203 22:27:26.412487 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5\": container with ID starting with 0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5 not found: ID does not exist" containerID="0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5" Dec 03 22:27:26.413335 master-0 kubenswrapper[36504]: I1203 
22:27:26.412562 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5"} err="failed to get container status \"0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5\": rpc error: code = NotFound desc = could not find container \"0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5\": container with ID starting with 0be83f45433e4bc8efabec2a6e7d77ab189868dbf694717782ea55e0f36862e5 not found: ID does not exist" Dec 03 22:27:27.096192 master-0 kubenswrapper[36504]: I1203 22:27:27.096121 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:27:27.116062 master-0 kubenswrapper[36504]: I1203 22:27:27.114192 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a46fde-0f07-4027-a8d4-9557a26c52d7" path="/var/lib/kubelet/pods/78a46fde-0f07-4027-a8d4-9557a26c52d7/volumes" Dec 03 22:27:27.781218 master-0 kubenswrapper[36504]: I1203 22:27:27.781146 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:27:27.854947 master-0 kubenswrapper[36504]: I1203 22:27:27.854867 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-scripts\") pod \"900943d0-fa02-401a-ab6a-7ed811802669\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " Dec 03 22:27:27.855894 master-0 kubenswrapper[36504]: I1203 22:27:27.855857 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-combined-ca-bundle\") pod \"900943d0-fa02-401a-ab6a-7ed811802669\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " Dec 03 22:27:27.857116 master-0 kubenswrapper[36504]: I1203 22:27:27.857040 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhns\" (UniqueName: \"kubernetes.io/projected/900943d0-fa02-401a-ab6a-7ed811802669-kube-api-access-xdhns\") pod \"900943d0-fa02-401a-ab6a-7ed811802669\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " Dec 03 22:27:27.857832 master-0 kubenswrapper[36504]: I1203 22:27:27.857809 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-config-data\") pod \"900943d0-fa02-401a-ab6a-7ed811802669\" (UID: \"900943d0-fa02-401a-ab6a-7ed811802669\") " Dec 03 22:27:27.860363 master-0 kubenswrapper[36504]: I1203 22:27:27.859660 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-scripts" (OuterVolumeSpecName: "scripts") pod "900943d0-fa02-401a-ab6a-7ed811802669" (UID: "900943d0-fa02-401a-ab6a-7ed811802669"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:27.865155 master-0 kubenswrapper[36504]: I1203 22:27:27.865067 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900943d0-fa02-401a-ab6a-7ed811802669-kube-api-access-xdhns" (OuterVolumeSpecName: "kube-api-access-xdhns") pod "900943d0-fa02-401a-ab6a-7ed811802669" (UID: "900943d0-fa02-401a-ab6a-7ed811802669"). InnerVolumeSpecName "kube-api-access-xdhns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:27.899403 master-0 kubenswrapper[36504]: I1203 22:27:27.899304 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "900943d0-fa02-401a-ab6a-7ed811802669" (UID: "900943d0-fa02-401a-ab6a-7ed811802669"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:27.929558 master-0 kubenswrapper[36504]: I1203 22:27:27.929467 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-config-data" (OuterVolumeSpecName: "config-data") pod "900943d0-fa02-401a-ab6a-7ed811802669" (UID: "900943d0-fa02-401a-ab6a-7ed811802669"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:27.962580 master-0 kubenswrapper[36504]: I1203 22:27:27.962417 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:27.962580 master-0 kubenswrapper[36504]: I1203 22:27:27.962471 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:27.962580 master-0 kubenswrapper[36504]: I1203 22:27:27.962481 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900943d0-fa02-401a-ab6a-7ed811802669-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:27.962580 master-0 kubenswrapper[36504]: I1203 22:27:27.962526 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdhns\" (UniqueName: \"kubernetes.io/projected/900943d0-fa02-401a-ab6a-7ed811802669-kube-api-access-xdhns\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:28.275269 master-0 kubenswrapper[36504]: I1203 22:27:28.275107 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-h9tlt" event={"ID":"900943d0-fa02-401a-ab6a-7ed811802669","Type":"ContainerDied","Data":"060905482e1d11119343323b1a22294f13e729979271e06575a301545c5463b9"} Dec 03 22:27:28.275269 master-0 kubenswrapper[36504]: I1203 22:27:28.275170 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="060905482e1d11119343323b1a22294f13e729979271e06575a301545c5463b9" Dec 03 22:27:28.275698 master-0 kubenswrapper[36504]: I1203 22:27:28.275284 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-h9tlt" Dec 03 22:27:28.450716 master-0 kubenswrapper[36504]: I1203 22:27:28.450609 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 22:27:28.451546 master-0 kubenswrapper[36504]: E1203 22:27:28.451508 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerName="extract-utilities" Dec 03 22:27:28.451546 master-0 kubenswrapper[36504]: I1203 22:27:28.451534 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerName="extract-utilities" Dec 03 22:27:28.451546 master-0 kubenswrapper[36504]: E1203 22:27:28.451552 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerName="registry-server" Dec 03 22:27:28.451732 master-0 kubenswrapper[36504]: I1203 22:27:28.451561 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerName="registry-server" Dec 03 22:27:28.451732 master-0 kubenswrapper[36504]: E1203 22:27:28.451589 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerName="extract-content" Dec 03 22:27:28.451732 master-0 kubenswrapper[36504]: I1203 22:27:28.451596 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerName="extract-content" Dec 03 22:27:28.451732 master-0 kubenswrapper[36504]: E1203 22:27:28.451639 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900943d0-fa02-401a-ab6a-7ed811802669" containerName="nova-cell0-conductor-db-sync" Dec 03 22:27:28.451732 master-0 kubenswrapper[36504]: I1203 22:27:28.451646 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="900943d0-fa02-401a-ab6a-7ed811802669" containerName="nova-cell0-conductor-db-sync" Dec 03 22:27:28.452686 master-0 kubenswrapper[36504]: I1203 22:27:28.452021 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a46fde-0f07-4027-a8d4-9557a26c52d7" containerName="registry-server" Dec 03 22:27:28.452686 master-0 kubenswrapper[36504]: I1203 22:27:28.452115 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="900943d0-fa02-401a-ab6a-7ed811802669" containerName="nova-cell0-conductor-db-sync" Dec 03 22:27:28.453339 master-0 kubenswrapper[36504]: I1203 22:27:28.453311 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.458070 master-0 kubenswrapper[36504]: I1203 22:27:28.457990 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 22:27:28.465359 master-0 kubenswrapper[36504]: I1203 22:27:28.465281 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 22:27:28.544912 master-0 kubenswrapper[36504]: I1203 22:27:28.544756 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-ftthv"] Dec 03 22:27:28.547598 master-0 kubenswrapper[36504]: I1203 22:27:28.547544 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:28.572461 master-0 kubenswrapper[36504]: I1203 22:27:28.572398 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-3663-account-create-update-7qg85"] Dec 03 22:27:28.575224 master-0 kubenswrapper[36504]: I1203 22:27:28.575153 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:28.578239 master-0 kubenswrapper[36504]: I1203 22:27:28.578195 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 03 22:27:28.590615 master-0 kubenswrapper[36504]: I1203 22:27:28.590140 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-ftthv"] Dec 03 22:27:28.593242 master-0 kubenswrapper[36504]: I1203 22:27:28.591637 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f542282-489e-4d26-b122-880156d0fb83-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0f542282-489e-4d26-b122-880156d0fb83\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.593242 master-0 kubenswrapper[36504]: I1203 22:27:28.592169 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74b2\" (UniqueName: \"kubernetes.io/projected/0f542282-489e-4d26-b122-880156d0fb83-kube-api-access-w74b2\") pod \"nova-cell0-conductor-0\" (UID: \"0f542282-489e-4d26-b122-880156d0fb83\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.593621 master-0 kubenswrapper[36504]: I1203 22:27:28.593591 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f542282-489e-4d26-b122-880156d0fb83-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0f542282-489e-4d26-b122-880156d0fb83\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.603588 master-0 kubenswrapper[36504]: I1203 22:27:28.603543 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-3663-account-create-update-7qg85"] Dec 03 22:27:28.696369 master-0 kubenswrapper[36504]: I1203 22:27:28.696283 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w74b2\" (UniqueName: \"kubernetes.io/projected/0f542282-489e-4d26-b122-880156d0fb83-kube-api-access-w74b2\") pod \"nova-cell0-conductor-0\" (UID: \"0f542282-489e-4d26-b122-880156d0fb83\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.696369 master-0 kubenswrapper[36504]: I1203 22:27:28.696366 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc9tx\" (UniqueName: \"kubernetes.io/projected/cac6200f-b04a-499d-befd-180a060fa107-kube-api-access-jc9tx\") pod \"aodh-3663-account-create-update-7qg85\" (UID: \"cac6200f-b04a-499d-befd-180a060fa107\") " pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:28.696737 master-0 kubenswrapper[36504]: I1203 22:27:28.696435 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-operator-scripts\") pod \"aodh-db-create-ftthv\" (UID: \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\") " pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:28.696737 master-0 kubenswrapper[36504]: I1203 
22:27:28.696538 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac6200f-b04a-499d-befd-180a060fa107-operator-scripts\") pod \"aodh-3663-account-create-update-7qg85\" (UID: \"cac6200f-b04a-499d-befd-180a060fa107\") " pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:28.696737 master-0 kubenswrapper[36504]: I1203 22:27:28.696585 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f542282-489e-4d26-b122-880156d0fb83-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0f542282-489e-4d26-b122-880156d0fb83\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.696737 master-0 kubenswrapper[36504]: I1203 22:27:28.696616 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2csj\" (UniqueName: \"kubernetes.io/projected/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-kube-api-access-k2csj\") pod \"aodh-db-create-ftthv\" (UID: \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\") " pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:28.696737 master-0 kubenswrapper[36504]: I1203 22:27:28.696642 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f542282-489e-4d26-b122-880156d0fb83-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0f542282-489e-4d26-b122-880156d0fb83\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.701455 master-0 kubenswrapper[36504]: I1203 22:27:28.701385 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f542282-489e-4d26-b122-880156d0fb83-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0f542282-489e-4d26-b122-880156d0fb83\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.707378 master-0 kubenswrapper[36504]: I1203 22:27:28.707302 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f542282-489e-4d26-b122-880156d0fb83-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0f542282-489e-4d26-b122-880156d0fb83\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.718301 master-0 kubenswrapper[36504]: I1203 22:27:28.718221 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74b2\" (UniqueName: \"kubernetes.io/projected/0f542282-489e-4d26-b122-880156d0fb83-kube-api-access-w74b2\") pod \"nova-cell0-conductor-0\" (UID: \"0f542282-489e-4d26-b122-880156d0fb83\") " pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.800919 master-0 kubenswrapper[36504]: I1203 22:27:28.799533 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-operator-scripts\") pod \"aodh-db-create-ftthv\" (UID: \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\") " pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:28.800919 master-0 kubenswrapper[36504]: I1203 22:27:28.799867 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac6200f-b04a-499d-befd-180a060fa107-operator-scripts\") pod \"aodh-3663-account-create-update-7qg85\" (UID: \"cac6200f-b04a-499d-befd-180a060fa107\") " 
pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:28.800919 master-0 kubenswrapper[36504]: I1203 22:27:28.800023 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2csj\" (UniqueName: \"kubernetes.io/projected/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-kube-api-access-k2csj\") pod \"aodh-db-create-ftthv\" (UID: \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\") " pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:28.800919 master-0 kubenswrapper[36504]: I1203 22:27:28.800230 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc9tx\" (UniqueName: \"kubernetes.io/projected/cac6200f-b04a-499d-befd-180a060fa107-kube-api-access-jc9tx\") pod \"aodh-3663-account-create-update-7qg85\" (UID: \"cac6200f-b04a-499d-befd-180a060fa107\") " pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:28.800919 master-0 kubenswrapper[36504]: I1203 22:27:28.800599 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-operator-scripts\") pod \"aodh-db-create-ftthv\" (UID: \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\") " pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:28.801742 master-0 kubenswrapper[36504]: I1203 22:27:28.801108 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac6200f-b04a-499d-befd-180a060fa107-operator-scripts\") pod \"aodh-3663-account-create-update-7qg85\" (UID: \"cac6200f-b04a-499d-befd-180a060fa107\") " pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:28.809185 master-0 kubenswrapper[36504]: I1203 22:27:28.809103 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:28.818550 master-0 kubenswrapper[36504]: I1203 22:27:28.818480 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc9tx\" (UniqueName: \"kubernetes.io/projected/cac6200f-b04a-499d-befd-180a060fa107-kube-api-access-jc9tx\") pod \"aodh-3663-account-create-update-7qg85\" (UID: \"cac6200f-b04a-499d-befd-180a060fa107\") " pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:28.823256 master-0 kubenswrapper[36504]: I1203 22:27:28.823204 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2csj\" (UniqueName: \"kubernetes.io/projected/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-kube-api-access-k2csj\") pod \"aodh-db-create-ftthv\" (UID: \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\") " pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:28.891562 master-0 kubenswrapper[36504]: I1203 22:27:28.891474 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:28.907409 master-0 kubenswrapper[36504]: I1203 22:27:28.906648 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:29.433765 master-0 kubenswrapper[36504]: I1203 22:27:29.433655 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 22:27:29.651591 master-0 kubenswrapper[36504]: I1203 22:27:29.650920 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-ftthv"] Dec 03 22:27:29.659372 master-0 kubenswrapper[36504]: W1203 22:27:29.659260 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ebd83a6_08e5_4f36_b253_fd52e0c6ab4a.slice/crio-35a7f4b30a857fa520ce86c3d1c67524b18b7eea1a45c13d2209b7fdebce7c18 WatchSource:0}: Error finding container 35a7f4b30a857fa520ce86c3d1c67524b18b7eea1a45c13d2209b7fdebce7c18: Status 404 returned error can't find the container with id 35a7f4b30a857fa520ce86c3d1c67524b18b7eea1a45c13d2209b7fdebce7c18 Dec 03 22:27:29.668805 master-0 kubenswrapper[36504]: I1203 22:27:29.668621 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-3663-account-create-update-7qg85"] Dec 03 22:27:29.921686 master-0 kubenswrapper[36504]: I1203 22:27:29.921475 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kt9v8"] Dec 03 22:27:29.922436 master-0 kubenswrapper[36504]: I1203 22:27:29.921886 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kt9v8" podUID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerName="registry-server" containerID="cri-o://07f29d4c4bcc116b057755eb6d9cd63804aa239b66a77838e377b3e9f4e70370" gracePeriod=2 Dec 03 22:27:30.360906 master-0 kubenswrapper[36504]: I1203 22:27:30.360732 36504 generic.go:334] "Generic (PLEG): container finished" podID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerID="07f29d4c4bcc116b057755eb6d9cd63804aa239b66a77838e377b3e9f4e70370" exitCode=0 Dec 03 22:27:30.361344 master-0 kubenswrapper[36504]: I1203 22:27:30.361317 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kt9v8" event={"ID":"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10","Type":"ContainerDied","Data":"07f29d4c4bcc116b057755eb6d9cd63804aa239b66a77838e377b3e9f4e70370"} Dec 03 22:27:30.367877 master-0 kubenswrapper[36504]: I1203 22:27:30.367795 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0f542282-489e-4d26-b122-880156d0fb83","Type":"ContainerStarted","Data":"af7fe9c72c23f1b7c5ea32368017a31c7ec88e050b47f58a93355e55201caa59"} Dec 03 22:27:30.370560 master-0 kubenswrapper[36504]: I1203 22:27:30.370509 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0f542282-489e-4d26-b122-880156d0fb83","Type":"ContainerStarted","Data":"8a465240c0b8020e4075c300963f95acca213a95070556df52fe07224538d4ab"} Dec 03 22:27:30.370819 master-0 kubenswrapper[36504]: I1203 22:27:30.370804 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:30.375062 master-0 kubenswrapper[36504]: I1203 22:27:30.374979 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3663-account-create-update-7qg85" event={"ID":"cac6200f-b04a-499d-befd-180a060fa107","Type":"ContainerStarted","Data":"c0a35831890dae029eccdf90661de68d520117c870fcf940dfb47072e61bb8bf"} Dec 03 22:27:30.375138 master-0 
kubenswrapper[36504]: I1203 22:27:30.375071 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3663-account-create-update-7qg85" event={"ID":"cac6200f-b04a-499d-befd-180a060fa107","Type":"ContainerStarted","Data":"4dba23c74e9e007f37f9a6b26ee60a54c805fdb7cd1579a6955136f31a970b6b"} Dec 03 22:27:30.382237 master-0 kubenswrapper[36504]: I1203 22:27:30.382168 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-ftthv" event={"ID":"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a","Type":"ContainerStarted","Data":"88bbcb3ed67ebc1dfcb335b10768d701956846acbdefc3dfaac0ca483d36eeeb"} Dec 03 22:27:30.382351 master-0 kubenswrapper[36504]: I1203 22:27:30.382242 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-ftthv" event={"ID":"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a","Type":"ContainerStarted","Data":"35a7f4b30a857fa520ce86c3d1c67524b18b7eea1a45c13d2209b7fdebce7c18"} Dec 03 22:27:30.425622 master-0 kubenswrapper[36504]: I1203 22:27:30.423372 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.423340707 podStartE2EDuration="2.423340707s" podCreationTimestamp="2025-12-03 22:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:30.400182014 +0000 UTC m=+1015.619954021" watchObservedRunningTime="2025-12-03 22:27:30.423340707 +0000 UTC m=+1015.643112714" Dec 03 22:27:30.434166 master-0 kubenswrapper[36504]: I1203 22:27:30.434068 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-ftthv" podStartSLOduration=2.434042888 podStartE2EDuration="2.434042888s" podCreationTimestamp="2025-12-03 22:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:30.426627589 +0000 UTC m=+1015.646399596" watchObservedRunningTime="2025-12-03 22:27:30.434042888 +0000 UTC m=+1015.653814895" Dec 03 22:27:30.499893 master-0 kubenswrapper[36504]: I1203 22:27:30.498488 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-3663-account-create-update-7qg85" podStartSLOduration=2.498455665 podStartE2EDuration="2.498455665s" podCreationTimestamp="2025-12-03 22:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:30.45132179 +0000 UTC m=+1015.671093797" watchObservedRunningTime="2025-12-03 22:27:30.498455665 +0000 UTC m=+1015.718227672" Dec 03 22:27:30.893260 master-0 kubenswrapper[36504]: I1203 22:27:30.893128 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:30.927227 master-0 kubenswrapper[36504]: I1203 22:27:30.926939 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-utilities\") pod \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " Dec 03 22:27:30.927967 master-0 kubenswrapper[36504]: I1203 22:27:30.927246 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-catalog-content\") pod \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " Dec 03 22:27:30.927967 master-0 kubenswrapper[36504]: I1203 22:27:30.927295 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncll2\" (UniqueName: \"kubernetes.io/projected/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-kube-api-access-ncll2\") pod \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\" (UID: \"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10\") " Dec 03 22:27:30.927967 master-0 kubenswrapper[36504]: I1203 22:27:30.927713 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-utilities" (OuterVolumeSpecName: "utilities") pod "7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" (UID: "7491c0dc-b2bf-470a-94ff-5f10fb4e0d10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:30.928794 master-0 kubenswrapper[36504]: I1203 22:27:30.928755 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:30.947109 master-0 kubenswrapper[36504]: I1203 22:27:30.946912 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" (UID: "7491c0dc-b2bf-470a-94ff-5f10fb4e0d10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:30.947924 master-0 kubenswrapper[36504]: I1203 22:27:30.947852 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-kube-api-access-ncll2" (OuterVolumeSpecName: "kube-api-access-ncll2") pod "7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" (UID: "7491c0dc-b2bf-470a-94ff-5f10fb4e0d10"). InnerVolumeSpecName "kube-api-access-ncll2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:31.032453 master-0 kubenswrapper[36504]: I1203 22:27:31.032354 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:31.032453 master-0 kubenswrapper[36504]: I1203 22:27:31.032440 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncll2\" (UniqueName: \"kubernetes.io/projected/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10-kube-api-access-ncll2\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:31.259390 master-0 kubenswrapper[36504]: I1203 22:27:31.259333 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:31.342816 master-0 kubenswrapper[36504]: I1203 22:27:31.342723 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-scripts\") pod \"e8d19887-a1c7-4adc-8265-11d01d4b126e\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " Dec 03 22:27:31.343300 master-0 kubenswrapper[36504]: I1203 22:27:31.342931 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-run-httpd\") pod \"e8d19887-a1c7-4adc-8265-11d01d4b126e\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " Dec 03 22:27:31.343300 master-0 kubenswrapper[36504]: I1203 22:27:31.342962 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-log-httpd\") pod \"e8d19887-a1c7-4adc-8265-11d01d4b126e\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " Dec 03 22:27:31.343300 master-0 kubenswrapper[36504]: I1203 22:27:31.343198 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-config-data\") pod \"e8d19887-a1c7-4adc-8265-11d01d4b126e\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " Dec 03 22:27:31.343756 master-0 kubenswrapper[36504]: I1203 22:27:31.343729 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-sg-core-conf-yaml\") pod \"e8d19887-a1c7-4adc-8265-11d01d4b126e\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " Dec 03 22:27:31.343969 master-0 kubenswrapper[36504]: I1203 22:27:31.343903 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e8d19887-a1c7-4adc-8265-11d01d4b126e" (UID: "e8d19887-a1c7-4adc-8265-11d01d4b126e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:31.344114 master-0 kubenswrapper[36504]: I1203 22:27:31.344082 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lxmq\" (UniqueName: \"kubernetes.io/projected/e8d19887-a1c7-4adc-8265-11d01d4b126e-kube-api-access-8lxmq\") pod \"e8d19887-a1c7-4adc-8265-11d01d4b126e\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " Dec 03 22:27:31.344177 master-0 kubenswrapper[36504]: I1203 22:27:31.344120 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-combined-ca-bundle\") pod \"e8d19887-a1c7-4adc-8265-11d01d4b126e\" (UID: \"e8d19887-a1c7-4adc-8265-11d01d4b126e\") " Dec 03 22:27:31.345024 master-0 kubenswrapper[36504]: I1203 22:27:31.344972 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e8d19887-a1c7-4adc-8265-11d01d4b126e" (UID: "e8d19887-a1c7-4adc-8265-11d01d4b126e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:31.346278 master-0 kubenswrapper[36504]: I1203 22:27:31.346252 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:31.346386 master-0 kubenswrapper[36504]: I1203 22:27:31.346371 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e8d19887-a1c7-4adc-8265-11d01d4b126e-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:31.360778 master-0 kubenswrapper[36504]: I1203 22:27:31.360469 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-scripts" (OuterVolumeSpecName: "scripts") pod "e8d19887-a1c7-4adc-8265-11d01d4b126e" (UID: "e8d19887-a1c7-4adc-8265-11d01d4b126e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:31.361304 master-0 kubenswrapper[36504]: I1203 22:27:31.361119 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d19887-a1c7-4adc-8265-11d01d4b126e-kube-api-access-8lxmq" (OuterVolumeSpecName: "kube-api-access-8lxmq") pod "e8d19887-a1c7-4adc-8265-11d01d4b126e" (UID: "e8d19887-a1c7-4adc-8265-11d01d4b126e"). InnerVolumeSpecName "kube-api-access-8lxmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:31.405919 master-0 kubenswrapper[36504]: I1203 22:27:31.405828 36504 generic.go:334] "Generic (PLEG): container finished" podID="0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a" containerID="88bbcb3ed67ebc1dfcb335b10768d701956846acbdefc3dfaac0ca483d36eeeb" exitCode=0 Dec 03 22:27:31.406175 master-0 kubenswrapper[36504]: I1203 22:27:31.405922 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-ftthv" event={"ID":"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a","Type":"ContainerDied","Data":"88bbcb3ed67ebc1dfcb335b10768d701956846acbdefc3dfaac0ca483d36eeeb"} Dec 03 22:27:31.407715 master-0 kubenswrapper[36504]: I1203 22:27:31.407638 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e8d19887-a1c7-4adc-8265-11d01d4b126e" (UID: "e8d19887-a1c7-4adc-8265-11d01d4b126e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:31.410695 master-0 kubenswrapper[36504]: I1203 22:27:31.410645 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kt9v8" Dec 03 22:27:31.410790 master-0 kubenswrapper[36504]: I1203 22:27:31.410684 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kt9v8" event={"ID":"7491c0dc-b2bf-470a-94ff-5f10fb4e0d10","Type":"ContainerDied","Data":"2b49916efce93537f207dbd31a6a12df07cfcc87e02366e83733667d23f82127"} Dec 03 22:27:31.410956 master-0 kubenswrapper[36504]: I1203 22:27:31.410921 36504 scope.go:117] "RemoveContainer" containerID="07f29d4c4bcc116b057755eb6d9cd63804aa239b66a77838e377b3e9f4e70370" Dec 03 22:27:31.418550 master-0 kubenswrapper[36504]: I1203 22:27:31.418489 36504 generic.go:334] "Generic (PLEG): container finished" podID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerID="005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2" exitCode=0 Dec 03 22:27:31.418735 master-0 kubenswrapper[36504]: I1203 22:27:31.418580 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerDied","Data":"005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2"} Dec 03 22:27:31.418860 master-0 kubenswrapper[36504]: I1203 22:27:31.418672 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:31.418958 master-0 kubenswrapper[36504]: I1203 22:27:31.418797 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e8d19887-a1c7-4adc-8265-11d01d4b126e","Type":"ContainerDied","Data":"aca5d907e14705f4623725316657bd50a99618701eff1a94f45fb6d3c72277b8"} Dec 03 22:27:31.422082 master-0 kubenswrapper[36504]: I1203 22:27:31.422027 36504 generic.go:334] "Generic (PLEG): container finished" podID="cac6200f-b04a-499d-befd-180a060fa107" containerID="c0a35831890dae029eccdf90661de68d520117c870fcf940dfb47072e61bb8bf" exitCode=0 Dec 03 22:27:31.427013 master-0 kubenswrapper[36504]: I1203 22:27:31.425388 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3663-account-create-update-7qg85" event={"ID":"cac6200f-b04a-499d-befd-180a060fa107","Type":"ContainerDied","Data":"c0a35831890dae029eccdf90661de68d520117c870fcf940dfb47072e61bb8bf"} Dec 03 22:27:31.453566 master-0 kubenswrapper[36504]: I1203 22:27:31.452645 36504 scope.go:117] "RemoveContainer" containerID="e97bfd2dd06fe8a4fda181f4fe789f73a0920768eff1cc8543ad59f86a3b10ad" Dec 03 22:27:31.459074 master-0 kubenswrapper[36504]: I1203 22:27:31.459016 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:31.459162 master-0 kubenswrapper[36504]: I1203 22:27:31.459083 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:31.459162 master-0 kubenswrapper[36504]: I1203 22:27:31.459134 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lxmq\" (UniqueName: \"kubernetes.io/projected/e8d19887-a1c7-4adc-8265-11d01d4b126e-kube-api-access-8lxmq\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:31.492804 master-0 kubenswrapper[36504]: I1203 22:27:31.492688 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8d19887-a1c7-4adc-8265-11d01d4b126e" (UID: "e8d19887-a1c7-4adc-8265-11d01d4b126e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:31.495611 master-0 kubenswrapper[36504]: I1203 22:27:31.495575 36504 scope.go:117] "RemoveContainer" containerID="450637a8a4c391214dc7621a0706afdb63c289981a94a2630ab16dc81407b41d" Dec 03 22:27:31.501542 master-0 kubenswrapper[36504]: I1203 22:27:31.501431 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kt9v8"] Dec 03 22:27:31.505526 master-0 kubenswrapper[36504]: I1203 22:27:31.505208 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-config-data" (OuterVolumeSpecName: "config-data") pod "e8d19887-a1c7-4adc-8265-11d01d4b126e" (UID: "e8d19887-a1c7-4adc-8265-11d01d4b126e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:31.519188 master-0 kubenswrapper[36504]: I1203 22:27:31.519082 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kt9v8"] Dec 03 22:27:31.530451 master-0 kubenswrapper[36504]: I1203 22:27:31.529178 36504 scope.go:117] "RemoveContainer" containerID="eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36" Dec 03 22:27:31.555966 master-0 kubenswrapper[36504]: I1203 22:27:31.555902 36504 scope.go:117] "RemoveContainer" containerID="bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a" Dec 03 22:27:31.562949 master-0 kubenswrapper[36504]: I1203 22:27:31.562885 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:31.562949 master-0 kubenswrapper[36504]: I1203 22:27:31.562928 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8d19887-a1c7-4adc-8265-11d01d4b126e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:31.593458 master-0 kubenswrapper[36504]: I1203 22:27:31.593372 36504 scope.go:117] "RemoveContainer" containerID="5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a" Dec 03 22:27:31.619833 master-0 kubenswrapper[36504]: I1203 22:27:31.619735 36504 scope.go:117] "RemoveContainer" containerID="005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2" Dec 03 22:27:31.659720 master-0 kubenswrapper[36504]: I1203 22:27:31.658894 36504 scope.go:117] "RemoveContainer" containerID="eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36" Dec 03 22:27:31.661314 master-0 kubenswrapper[36504]: E1203 22:27:31.660384 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36\": container with ID starting with eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36 not found: ID does not exist" containerID="eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36" Dec 03 22:27:31.661314 master-0 kubenswrapper[36504]: I1203 22:27:31.660443 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36"} err="failed to get container status \"eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36\": rpc error: code = NotFound desc = could not find container \"eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36\": container with ID starting with eda0fa553378f6f11015257369f24c5a1d40ea51c46b14b4ac3ceaca125a2b36 not found: ID does not exist" Dec 03 22:27:31.661314 master-0 kubenswrapper[36504]: I1203 22:27:31.660470 36504 scope.go:117] "RemoveContainer" containerID="bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a" Dec 03 22:27:31.661490 master-0 kubenswrapper[36504]: E1203 22:27:31.661334 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a\": container with ID starting with bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a not found: ID does not exist" containerID="bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a" Dec 03 22:27:31.661490 
master-0 kubenswrapper[36504]: I1203 22:27:31.661354 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a"} err="failed to get container status \"bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a\": rpc error: code = NotFound desc = could not find container \"bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a\": container with ID starting with bc75e402a1357e5cd101686539e8f5543c9e83ecf359d2c811cd2078949a268a not found: ID does not exist" Dec 03 22:27:31.661490 master-0 kubenswrapper[36504]: I1203 22:27:31.661372 36504 scope.go:117] "RemoveContainer" containerID="5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a" Dec 03 22:27:31.662316 master-0 kubenswrapper[36504]: E1203 22:27:31.662284 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a\": container with ID starting with 5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a not found: ID does not exist" containerID="5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a" Dec 03 22:27:31.662383 master-0 kubenswrapper[36504]: I1203 22:27:31.662313 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a"} err="failed to get container status \"5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a\": rpc error: code = NotFound desc = could not find container \"5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a\": container with ID starting with 5df81d6bcaf8da62d92c3a4d59e255c49025e3bcc77c342edfabcfb80a3a280a not found: ID does not exist" Dec 03 22:27:31.662383 master-0 kubenswrapper[36504]: I1203 22:27:31.662330 36504 scope.go:117] "RemoveContainer" containerID="005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2" Dec 03 22:27:31.664673 master-0 kubenswrapper[36504]: E1203 22:27:31.664589 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2\": container with ID starting with 005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2 not found: ID does not exist" containerID="005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2" Dec 03 22:27:31.664792 master-0 kubenswrapper[36504]: I1203 22:27:31.664671 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2"} err="failed to get container status \"005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2\": rpc error: code = NotFound desc = could not find container \"005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2\": container with ID starting with 005cbab78b330003a55d6dc37465f09e4555e8400864a40c01ea5a194d7bcce2 not found: ID does not exist" Dec 03 22:27:31.775936 master-0 kubenswrapper[36504]: I1203 22:27:31.775737 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:31.792414 master-0 kubenswrapper[36504]: I1203 22:27:31.792322 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:31.826296 master-0 kubenswrapper[36504]: I1203 22:27:31.826127 36504 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: E1203 22:27:31.827176 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerName="extract-utilities" Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: I1203 22:27:31.827210 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerName="extract-utilities" Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: E1203 22:27:31.827240 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="sg-core" Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: I1203 22:27:31.827247 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="sg-core" Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: E1203 22:27:31.827274 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="ceilometer-notification-agent" Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: I1203 22:27:31.827282 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="ceilometer-notification-agent" Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: E1203 22:27:31.827307 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerName="extract-content" Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: I1203 22:27:31.827313 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerName="extract-content" Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: E1203 22:27:31.827324 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="proxy-httpd" Dec 03 22:27:31.827320 master-0 kubenswrapper[36504]: I1203 22:27:31.827331 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="proxy-httpd" Dec 03 22:27:31.827689 master-0 kubenswrapper[36504]: E1203 22:27:31.827361 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="ceilometer-central-agent" Dec 03 22:27:31.827689 master-0 kubenswrapper[36504]: I1203 22:27:31.827372 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="ceilometer-central-agent" Dec 03 22:27:31.827689 master-0 kubenswrapper[36504]: E1203 22:27:31.827385 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerName="registry-server" Dec 03 22:27:31.827689 master-0 kubenswrapper[36504]: I1203 22:27:31.827391 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerName="registry-server" Dec 03 22:27:31.827883 master-0 kubenswrapper[36504]: I1203 22:27:31.827729 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" containerName="registry-server" Dec 03 22:27:31.827883 master-0 kubenswrapper[36504]: I1203 22:27:31.827763 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="ceilometer-notification-agent" Dec 03 22:27:31.827883 
master-0 kubenswrapper[36504]: I1203 22:27:31.827800 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="ceilometer-central-agent" Dec 03 22:27:31.827883 master-0 kubenswrapper[36504]: I1203 22:27:31.827822 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="proxy-httpd" Dec 03 22:27:31.827883 master-0 kubenswrapper[36504]: I1203 22:27:31.827835 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" containerName="sg-core" Dec 03 22:27:31.840511 master-0 kubenswrapper[36504]: I1203 22:27:31.840441 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:31.847815 master-0 kubenswrapper[36504]: I1203 22:27:31.847682 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:27:31.849192 master-0 kubenswrapper[36504]: I1203 22:27:31.849106 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:27:31.879058 master-0 kubenswrapper[36504]: I1203 22:27:31.878994 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbzf2\" (UniqueName: \"kubernetes.io/projected/cea2d208-6be3-4b2f-8624-79136866f5b4-kube-api-access-jbzf2\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.879228 master-0 kubenswrapper[36504]: I1203 22:27:31.879177 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-scripts\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.879429 master-0 kubenswrapper[36504]: I1203 22:27:31.879410 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-run-httpd\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.881022 master-0 kubenswrapper[36504]: I1203 22:27:31.879483 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-log-httpd\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.881079 master-0 kubenswrapper[36504]: I1203 22:27:31.881056 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-config-data\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.881242 master-0 kubenswrapper[36504]: I1203 22:27:31.881228 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.881415 master-0 kubenswrapper[36504]: I1203 22:27:31.881293 
36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.893378 master-0 kubenswrapper[36504]: I1203 22:27:31.893273 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:31.984423 master-0 kubenswrapper[36504]: I1203 22:27:31.984337 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbzf2\" (UniqueName: \"kubernetes.io/projected/cea2d208-6be3-4b2f-8624-79136866f5b4-kube-api-access-jbzf2\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.985175 master-0 kubenswrapper[36504]: I1203 22:27:31.984452 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-scripts\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.985175 master-0 kubenswrapper[36504]: I1203 22:27:31.984563 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-run-httpd\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.985175 master-0 kubenswrapper[36504]: I1203 22:27:31.984607 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-log-httpd\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.985175 master-0 kubenswrapper[36504]: I1203 22:27:31.984667 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-config-data\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.985175 master-0 kubenswrapper[36504]: I1203 22:27:31.984733 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.985175 master-0 kubenswrapper[36504]: I1203 22:27:31.984762 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.985491 master-0 kubenswrapper[36504]: I1203 22:27:31.985321 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-run-httpd\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.985491 master-0 kubenswrapper[36504]: I1203 22:27:31.985391 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-log-httpd\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.989717 master-0 kubenswrapper[36504]: I1203 22:27:31.989652 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.989912 master-0 kubenswrapper[36504]: I1203 22:27:31.989869 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.990413 master-0 kubenswrapper[36504]: I1203 22:27:31.990367 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-config-data\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:31.993414 master-0 kubenswrapper[36504]: I1203 22:27:31.993322 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-scripts\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:32.007987 master-0 kubenswrapper[36504]: I1203 22:27:32.007893 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbzf2\" (UniqueName: \"kubernetes.io/projected/cea2d208-6be3-4b2f-8624-79136866f5b4-kube-api-access-jbzf2\") pod \"ceilometer-0\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " pod="openstack/ceilometer-0" Dec 03 22:27:32.096240 master-0 kubenswrapper[36504]: I1203 22:27:32.096042 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:27:32.229010 master-0 kubenswrapper[36504]: I1203 22:27:32.228893 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:27:33.091406 master-0 kubenswrapper[36504]: I1203 22:27:33.089798 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:27:33.091406 master-0 kubenswrapper[36504]: W1203 22:27:33.090263 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcea2d208_6be3_4b2f_8624_79136866f5b4.slice/crio-cad6f657558b3e3f15fab1f51e907b23de08fe072efff8ccb6d25b33d25c3633 WatchSource:0}: Error finding container cad6f657558b3e3f15fab1f51e907b23de08fe072efff8ccb6d25b33d25c3633: Status 404 returned error can't find the container with id cad6f657558b3e3f15fab1f51e907b23de08fe072efff8ccb6d25b33d25c3633 Dec 03 22:27:33.124330 master-0 kubenswrapper[36504]: I1203 22:27:33.124233 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7491c0dc-b2bf-470a-94ff-5f10fb4e0d10" path="/var/lib/kubelet/pods/7491c0dc-b2bf-470a-94ff-5f10fb4e0d10/volumes" Dec 03 22:27:33.125529 master-0 kubenswrapper[36504]: I1203 22:27:33.125488 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d19887-a1c7-4adc-8265-11d01d4b126e" path="/var/lib/kubelet/pods/e8d19887-a1c7-4adc-8265-11d01d4b126e/volumes" Dec 03 22:27:33.215128 master-0 kubenswrapper[36504]: I1203 22:27:33.215072 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:33.225646 master-0 kubenswrapper[36504]: I1203 22:27:33.225524 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:33.234285 master-0 kubenswrapper[36504]: I1203 22:27:33.234149 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2csj\" (UniqueName: \"kubernetes.io/projected/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-kube-api-access-k2csj\") pod \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\" (UID: \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\") " Dec 03 22:27:33.234907 master-0 kubenswrapper[36504]: I1203 22:27:33.234761 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-operator-scripts\") pod \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\" (UID: \"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a\") " Dec 03 22:27:33.236173 master-0 kubenswrapper[36504]: I1203 22:27:33.236105 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a" (UID: "0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:33.242201 master-0 kubenswrapper[36504]: I1203 22:27:33.242089 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-kube-api-access-k2csj" (OuterVolumeSpecName: "kube-api-access-k2csj") pod "0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a" (UID: "0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a"). InnerVolumeSpecName "kube-api-access-k2csj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:33.339652 master-0 kubenswrapper[36504]: I1203 22:27:33.337994 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac6200f-b04a-499d-befd-180a060fa107-operator-scripts\") pod \"cac6200f-b04a-499d-befd-180a060fa107\" (UID: \"cac6200f-b04a-499d-befd-180a060fa107\") " Dec 03 22:27:33.339652 master-0 kubenswrapper[36504]: I1203 22:27:33.338138 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc9tx\" (UniqueName: \"kubernetes.io/projected/cac6200f-b04a-499d-befd-180a060fa107-kube-api-access-jc9tx\") pod \"cac6200f-b04a-499d-befd-180a060fa107\" (UID: \"cac6200f-b04a-499d-befd-180a060fa107\") " Dec 03 22:27:33.339652 master-0 kubenswrapper[36504]: I1203 22:27:33.338577 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac6200f-b04a-499d-befd-180a060fa107-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cac6200f-b04a-499d-befd-180a060fa107" (UID: "cac6200f-b04a-499d-befd-180a060fa107"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:33.340280 master-0 kubenswrapper[36504]: I1203 22:27:33.339705 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2csj\" (UniqueName: \"kubernetes.io/projected/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-kube-api-access-k2csj\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:33.340280 master-0 kubenswrapper[36504]: I1203 22:27:33.339724 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cac6200f-b04a-499d-befd-180a060fa107-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:33.340280 master-0 kubenswrapper[36504]: I1203 22:27:33.339737 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:33.342636 master-0 kubenswrapper[36504]: I1203 22:27:33.341903 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac6200f-b04a-499d-befd-180a060fa107-kube-api-access-jc9tx" (OuterVolumeSpecName: "kube-api-access-jc9tx") pod "cac6200f-b04a-499d-befd-180a060fa107" (UID: "cac6200f-b04a-499d-befd-180a060fa107"). InnerVolumeSpecName "kube-api-access-jc9tx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:33.443339 master-0 kubenswrapper[36504]: I1203 22:27:33.443086 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc9tx\" (UniqueName: \"kubernetes.io/projected/cac6200f-b04a-499d-befd-180a060fa107-kube-api-access-jc9tx\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:33.461132 master-0 kubenswrapper[36504]: I1203 22:27:33.461056 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3663-account-create-update-7qg85" event={"ID":"cac6200f-b04a-499d-befd-180a060fa107","Type":"ContainerDied","Data":"4dba23c74e9e007f37f9a6b26ee60a54c805fdb7cd1579a6955136f31a970b6b"} Dec 03 22:27:33.461132 master-0 kubenswrapper[36504]: I1203 22:27:33.461125 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dba23c74e9e007f37f9a6b26ee60a54c805fdb7cd1579a6955136f31a970b6b" Dec 03 22:27:33.461132 master-0 kubenswrapper[36504]: I1203 22:27:33.461085 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3663-account-create-update-7qg85" Dec 03 22:27:33.463606 master-0 kubenswrapper[36504]: I1203 22:27:33.463582 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-ftthv" Dec 03 22:27:33.463747 master-0 kubenswrapper[36504]: I1203 22:27:33.463682 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-ftthv" event={"ID":"0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a","Type":"ContainerDied","Data":"35a7f4b30a857fa520ce86c3d1c67524b18b7eea1a45c13d2209b7fdebce7c18"} Dec 03 22:27:33.463861 master-0 kubenswrapper[36504]: I1203 22:27:33.463843 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a7f4b30a857fa520ce86c3d1c67524b18b7eea1a45c13d2209b7fdebce7c18" Dec 03 22:27:33.468154 master-0 kubenswrapper[36504]: I1203 22:27:33.466853 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerStarted","Data":"cad6f657558b3e3f15fab1f51e907b23de08fe072efff8ccb6d25b33d25c3633"} Dec 03 22:27:34.485283 master-0 kubenswrapper[36504]: I1203 22:27:34.484806 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerStarted","Data":"a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214"} Dec 03 22:27:34.485283 master-0 kubenswrapper[36504]: I1203 22:27:34.484882 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerStarted","Data":"ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d"} Dec 03 22:27:35.502760 master-0 kubenswrapper[36504]: I1203 22:27:35.502559 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerStarted","Data":"511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18"} Dec 03 22:27:35.734138 master-0 kubenswrapper[36504]: E1203 22:27:35.734044 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 
returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:27:37.542369 master-0 kubenswrapper[36504]: I1203 22:27:37.542271 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerStarted","Data":"1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296"} Dec 03 22:27:37.543235 master-0 kubenswrapper[36504]: I1203 22:27:37.542575 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:27:38.872504 master-0 kubenswrapper[36504]: I1203 22:27:38.872374 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.610668745 podStartE2EDuration="7.872334516s" podCreationTimestamp="2025-12-03 22:27:31 +0000 UTC" firstStartedPulling="2025-12-03 22:27:33.100797141 +0000 UTC m=+1018.320569148" lastFinishedPulling="2025-12-03 22:27:36.362462912 +0000 UTC m=+1021.582234919" observedRunningTime="2025-12-03 22:27:37.584485089 +0000 UTC m=+1022.804257106" watchObservedRunningTime="2025-12-03 22:27:38.872334516 +0000 UTC m=+1024.092106523" Dec 03 22:27:38.873944 master-0 kubenswrapper[36504]: I1203 22:27:38.873887 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-2lkqc"] Dec 03 22:27:38.875305 master-0 kubenswrapper[36504]: E1203 22:27:38.874803 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a" containerName="mariadb-database-create" Dec 03 22:27:38.875305 master-0 kubenswrapper[36504]: I1203 22:27:38.874831 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a" containerName="mariadb-database-create" Dec 03 22:27:38.875305 master-0 kubenswrapper[36504]: E1203 22:27:38.874876 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac6200f-b04a-499d-befd-180a060fa107" containerName="mariadb-account-create-update" Dec 03 22:27:38.875305 master-0 kubenswrapper[36504]: I1203 22:27:38.874884 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac6200f-b04a-499d-befd-180a060fa107" containerName="mariadb-account-create-update" Dec 03 22:27:38.875305 master-0 kubenswrapper[36504]: I1203 22:27:38.875226 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a" containerName="mariadb-database-create" Dec 03 22:27:38.875305 master-0 kubenswrapper[36504]: I1203 22:27:38.875284 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac6200f-b04a-499d-befd-180a060fa107" containerName="mariadb-account-create-update" Dec 03 22:27:38.876566 master-0 kubenswrapper[36504]: I1203 22:27:38.876525 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:38.889310 master-0 kubenswrapper[36504]: I1203 22:27:38.889207 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 22:27:38.889604 master-0 kubenswrapper[36504]: I1203 22:27:38.889332 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 22:27:38.889681 master-0 kubenswrapper[36504]: I1203 22:27:38.889646 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 22:27:38.890150 master-0 kubenswrapper[36504]: I1203 22:27:38.890105 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 22:27:38.911177 master-0 kubenswrapper[36504]: I1203 22:27:38.906732 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-2lkqc"] Dec 03 22:27:38.951115 master-0 kubenswrapper[36504]: I1203 22:27:38.951013 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrszx\" (UniqueName: \"kubernetes.io/projected/a26d6e38-228d-4663-9826-d79e438d324e-kube-api-access-lrszx\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:38.951423 master-0 kubenswrapper[36504]: I1203 22:27:38.951221 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-config-data\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:38.951423 master-0 kubenswrapper[36504]: I1203 22:27:38.951301 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-combined-ca-bundle\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:38.952259 master-0 kubenswrapper[36504]: I1203 22:27:38.952220 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-scripts\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.059422 master-0 kubenswrapper[36504]: I1203 22:27:39.059343 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrszx\" (UniqueName: \"kubernetes.io/projected/a26d6e38-228d-4663-9826-d79e438d324e-kube-api-access-lrszx\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.059824 master-0 kubenswrapper[36504]: I1203 22:27:39.059438 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-config-data\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.059824 master-0 kubenswrapper[36504]: I1203 22:27:39.059475 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-combined-ca-bundle\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.059922 master-0 kubenswrapper[36504]: I1203 22:27:39.059896 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-scripts\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.064533 master-0 kubenswrapper[36504]: I1203 22:27:39.064480 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-config-data\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.064751 master-0 kubenswrapper[36504]: I1203 22:27:39.064576 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-combined-ca-bundle\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.066542 master-0 kubenswrapper[36504]: I1203 22:27:39.066504 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-scripts\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.086103 master-0 kubenswrapper[36504]: I1203 22:27:39.086036 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrszx\" (UniqueName: \"kubernetes.io/projected/a26d6e38-228d-4663-9826-d79e438d324e-kube-api-access-lrszx\") pod \"aodh-db-sync-2lkqc\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.241048 master-0 kubenswrapper[36504]: I1203 22:27:39.240745 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:39.639039 master-0 kubenswrapper[36504]: I1203 22:27:39.620270 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9ckgg"] Dec 03 22:27:39.639039 master-0 kubenswrapper[36504]: I1203 22:27:39.623274 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.639039 master-0 kubenswrapper[36504]: I1203 22:27:39.627652 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 22:27:39.639039 master-0 kubenswrapper[36504]: I1203 22:27:39.633543 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 22:27:39.639039 master-0 kubenswrapper[36504]: I1203 22:27:39.638368 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9ckgg"] Dec 03 22:27:39.712932 master-0 kubenswrapper[36504]: I1203 22:27:39.712834 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-config-data\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.714855 master-0 kubenswrapper[36504]: I1203 22:27:39.714814 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk9tn\" (UniqueName: \"kubernetes.io/projected/aa3c88c3-0ad0-4fc0-941c-52bde77db555-kube-api-access-dk9tn\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.714956 master-0 kubenswrapper[36504]: I1203 22:27:39.714935 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-scripts\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.715107 master-0 kubenswrapper[36504]: I1203 22:27:39.715081 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.821166 master-0 kubenswrapper[36504]: I1203 22:27:39.820645 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk9tn\" (UniqueName: \"kubernetes.io/projected/aa3c88c3-0ad0-4fc0-941c-52bde77db555-kube-api-access-dk9tn\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.821166 master-0 kubenswrapper[36504]: I1203 22:27:39.820744 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-scripts\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.842814 master-0 kubenswrapper[36504]: I1203 22:27:39.837786 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.842814 
master-0 kubenswrapper[36504]: I1203 22:27:39.838470 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-config-data\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.842814 master-0 kubenswrapper[36504]: I1203 22:27:39.838747 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-scripts\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.872870 master-0 kubenswrapper[36504]: I1203 22:27:39.857812 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.884877 master-0 kubenswrapper[36504]: I1203 22:27:39.884482 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-config-data\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.953328 master-0 kubenswrapper[36504]: I1203 22:27:39.942935 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk9tn\" (UniqueName: \"kubernetes.io/projected/aa3c88c3-0ad0-4fc0-941c-52bde77db555-kube-api-access-dk9tn\") pod \"nova-cell0-cell-mapping-9ckgg\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:39.976799 master-0 kubenswrapper[36504]: I1203 22:27:39.962323 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-2lkqc"] Dec 03 22:27:40.089798 master-0 kubenswrapper[36504]: I1203 22:27:40.089183 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:40.164819 master-0 kubenswrapper[36504]: I1203 22:27:40.148656 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 22:27:40.164819 master-0 kubenswrapper[36504]: I1203 22:27:40.156022 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:27:40.192808 master-0 kubenswrapper[36504]: I1203 22:27:40.192080 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 22:27:40.220804 master-0 kubenswrapper[36504]: I1203 22:27:40.218954 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-config-data\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.220804 master-0 kubenswrapper[36504]: I1203 22:27:40.219046 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3485dc52-4df3-4d63-ae03-1fca7630d64a-logs\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.220804 master-0 kubenswrapper[36504]: I1203 22:27:40.219071 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t88q\" (UniqueName: \"kubernetes.io/projected/3485dc52-4df3-4d63-ae03-1fca7630d64a-kube-api-access-2t88q\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.220804 master-0 kubenswrapper[36504]: I1203 22:27:40.219208 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.255606 master-0 kubenswrapper[36504]: I1203 22:27:40.253619 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:40.259531 master-0 kubenswrapper[36504]: I1203 22:27:40.257123 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:40.298145 master-0 kubenswrapper[36504]: I1203 22:27:40.293466 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 22:27:40.345375 master-0 kubenswrapper[36504]: I1203 22:27:40.336156 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.345375 master-0 kubenswrapper[36504]: I1203 22:27:40.336282 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d05ff21-79c6-4875-b5d4-d06641554c9b-logs\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.345375 master-0 kubenswrapper[36504]: I1203 22:27:40.336352 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgdg6\" (UniqueName: \"kubernetes.io/projected/7d05ff21-79c6-4875-b5d4-d06641554c9b-kube-api-access-pgdg6\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.345375 master-0 kubenswrapper[36504]: I1203 22:27:40.336425 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-config-data\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.345375 master-0 kubenswrapper[36504]: I1203 22:27:40.336483 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.345375 master-0 kubenswrapper[36504]: I1203 22:27:40.336524 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-config-data\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.345375 master-0 kubenswrapper[36504]: I1203 22:27:40.336543 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3485dc52-4df3-4d63-ae03-1fca7630d64a-logs\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.345375 master-0 kubenswrapper[36504]: I1203 22:27:40.336560 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t88q\" (UniqueName: \"kubernetes.io/projected/3485dc52-4df3-4d63-ae03-1fca7630d64a-kube-api-access-2t88q\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.345375 master-0 kubenswrapper[36504]: I1203 22:27:40.341670 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:27:40.364805 master-0 kubenswrapper[36504]: I1203 22:27:40.347789 36504 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.364805 master-0 kubenswrapper[36504]: I1203 22:27:40.354503 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3485dc52-4df3-4d63-ae03-1fca7630d64a-logs\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.364805 master-0 kubenswrapper[36504]: I1203 22:27:40.356184 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-config-data\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.441200 master-0 kubenswrapper[36504]: I1203 22:27:40.425945 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t88q\" (UniqueName: \"kubernetes.io/projected/3485dc52-4df3-4d63-ae03-1fca7630d64a-kube-api-access-2t88q\") pod \"nova-api-0\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " pod="openstack/nova-api-0" Dec 03 22:27:40.564559 master-0 kubenswrapper[36504]: I1203 22:27:40.560139 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d05ff21-79c6-4875-b5d4-d06641554c9b-logs\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.564559 master-0 kubenswrapper[36504]: I1203 22:27:40.560420 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgdg6\" (UniqueName: \"kubernetes.io/projected/7d05ff21-79c6-4875-b5d4-d06641554c9b-kube-api-access-pgdg6\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.564559 master-0 kubenswrapper[36504]: I1203 22:27:40.560729 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-config-data\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.564559 master-0 kubenswrapper[36504]: I1203 22:27:40.561469 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d05ff21-79c6-4875-b5d4-d06641554c9b-logs\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.580812 master-0 kubenswrapper[36504]: I1203 22:27:40.580735 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.582385 master-0 kubenswrapper[36504]: I1203 22:27:40.582325 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:27:40.585211 master-0 kubenswrapper[36504]: I1203 22:27:40.585142 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-config-data\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.654187 master-0 kubenswrapper[36504]: I1203 22:27:40.652598 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.654187 master-0 kubenswrapper[36504]: I1203 22:27:40.652670 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgdg6\" (UniqueName: \"kubernetes.io/projected/7d05ff21-79c6-4875-b5d4-d06641554c9b-kube-api-access-pgdg6\") pod \"nova-metadata-0\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:40.668337 master-0 kubenswrapper[36504]: I1203 22:27:40.668081 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:40.672381 master-0 kubenswrapper[36504]: I1203 22:27:40.671284 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:40.759578 master-0 kubenswrapper[36504]: I1203 22:27:40.740896 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:27:40.796579 master-0 kubenswrapper[36504]: I1203 22:27:40.796480 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:27:40.796952 master-0 kubenswrapper[36504]: I1203 22:27:40.796844 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:27:40.802295 master-0 kubenswrapper[36504]: I1203 22:27:40.800349 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:40.803374 master-0 kubenswrapper[36504]: I1203 22:27:40.803313 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:27:40.803561 master-0 kubenswrapper[36504]: I1203 22:27:40.803423 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-2lkqc" event={"ID":"a26d6e38-228d-4663-9826-d79e438d324e","Type":"ContainerStarted","Data":"6ac37113f84ee635add12dececed935af8c56588b39f35ac42e3b58cd3ef151b"} Dec 03 22:27:40.804192 master-0 kubenswrapper[36504]: I1203 22:27:40.803999 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 22:27:40.804192 master-0 kubenswrapper[36504]: I1203 22:27:40.804148 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 22:27:40.858998 master-0 kubenswrapper[36504]: I1203 22:27:40.835975 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:27:40.885801 master-0 kubenswrapper[36504]: I1203 22:27:40.883208 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b5b7f9cc-hv7jl"] Dec 03 22:27:40.887610 master-0 kubenswrapper[36504]: I1203 22:27:40.886338 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:40.938066 master-0 kubenswrapper[36504]: I1203 22:27:40.922183 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b5b7f9cc-hv7jl"] Dec 03 22:27:40.938066 master-0 kubenswrapper[36504]: I1203 22:27:40.936523 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:40.938066 master-0 kubenswrapper[36504]: I1203 22:27:40.936851 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:40.938066 master-0 kubenswrapper[36504]: I1203 22:27:40.937082 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbq98\" (UniqueName: \"kubernetes.io/projected/c380b560-bf38-4838-8e7d-2d65092dc448-kube-api-access-qbq98\") pod \"nova-scheduler-0\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:40.938066 master-0 kubenswrapper[36504]: I1203 22:27:40.937564 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:40.938066 master-0 kubenswrapper[36504]: I1203 22:27:40.937876 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ccx6\" (UniqueName: 
\"kubernetes.io/projected/4f60ec5c-4977-4191-a3a1-399d0859e1f7-kube-api-access-6ccx6\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:40.953821 master-0 kubenswrapper[36504]: I1203 22:27:40.938349 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-config-data\") pod \"nova-scheduler-0\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.046805 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwnwf\" (UniqueName: \"kubernetes.io/projected/13feaa47-c881-462d-b78d-1716d9552aeb-kube-api-access-lwnwf\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.046935 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-svc\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047021 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047041 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-config\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047071 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbq98\" (UniqueName: \"kubernetes.io/projected/c380b560-bf38-4838-8e7d-2d65092dc448-kube-api-access-qbq98\") pod \"nova-scheduler-0\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047117 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047152 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-edpm\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047201 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047251 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ccx6\" (UniqueName: \"kubernetes.io/projected/4f60ec5c-4977-4191-a3a1-399d0859e1f7-kube-api-access-6ccx6\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047339 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-config-data\") pod \"nova-scheduler-0\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047386 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047436 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.049877 master-0 kubenswrapper[36504]: I1203 22:27:41.047475 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:41.087948 master-0 kubenswrapper[36504]: I1203 22:27:41.082218 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9ckgg"] Dec 03 22:27:41.098919 master-0 kubenswrapper[36504]: I1203 22:27:41.097941 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:41.111910 master-0 kubenswrapper[36504]: I1203 22:27:41.111259 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-config-data\") pod \"nova-scheduler-0\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:41.111910 master-0 kubenswrapper[36504]: I1203 22:27:41.111377 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"c380b560-bf38-4838-8e7d-2d65092dc448\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:41.121555 master-0 kubenswrapper[36504]: I1203 22:27:41.121489 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:41.124123 master-0 kubenswrapper[36504]: I1203 22:27:41.124012 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbq98\" (UniqueName: \"kubernetes.io/projected/c380b560-bf38-4838-8e7d-2d65092dc448-kube-api-access-qbq98\") pod \"nova-scheduler-0\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:41.131630 master-0 kubenswrapper[36504]: I1203 22:27:41.131580 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ccx6\" (UniqueName: \"kubernetes.io/projected/4f60ec5c-4977-4191-a3a1-399d0859e1f7-kube-api-access-6ccx6\") pod \"nova-cell1-novncproxy-0\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.150756 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-svc\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.150897 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-config\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.150961 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-edpm\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.151010 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.151134 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.151187 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: 
\"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.151229 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwnwf\" (UniqueName: \"kubernetes.io/projected/13feaa47-c881-462d-b78d-1716d9552aeb-kube-api-access-lwnwf\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.152420 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-svc\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.152735 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-config\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.153037 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-nb\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.153076 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-sb\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.152582 master-0 kubenswrapper[36504]: I1203 22:27:41.153179 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-swift-storage-0\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.155588 master-0 kubenswrapper[36504]: I1203 22:27:41.153564 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-edpm\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.172631 master-0 kubenswrapper[36504]: I1203 22:27:41.172486 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:27:41.173354 master-0 kubenswrapper[36504]: I1203 22:27:41.173108 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwnwf\" (UniqueName: \"kubernetes.io/projected/13feaa47-c881-462d-b78d-1716d9552aeb-kube-api-access-lwnwf\") pod \"dnsmasq-dns-6b5b7f9cc-hv7jl\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.195812 master-0 kubenswrapper[36504]: I1203 22:27:41.195193 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:41.377823 master-0 kubenswrapper[36504]: I1203 22:27:41.377060 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:41.555179 master-0 kubenswrapper[36504]: I1203 22:27:41.550874 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nn2xp"] Dec 03 22:27:41.608880 master-0 kubenswrapper[36504]: I1203 22:27:41.608658 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.614043 master-0 kubenswrapper[36504]: I1203 22:27:41.613641 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 22:27:41.615144 master-0 kubenswrapper[36504]: I1203 22:27:41.614200 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 22:27:41.650897 master-0 kubenswrapper[36504]: I1203 22:27:41.650253 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nn2xp"] Dec 03 22:27:41.674506 master-0 kubenswrapper[36504]: I1203 22:27:41.674398 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:27:41.712453 master-0 kubenswrapper[36504]: I1203 22:27:41.690564 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-config-data\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.712453 master-0 kubenswrapper[36504]: I1203 22:27:41.690706 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgqlx\" (UniqueName: \"kubernetes.io/projected/bc0b5f74-16f0-4381-86a2-4ba49c379be8-kube-api-access-cgqlx\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.712453 master-0 kubenswrapper[36504]: I1203 22:27:41.691032 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-scripts\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.712453 master-0 kubenswrapper[36504]: I1203 22:27:41.691196 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.809914 master-0 kubenswrapper[36504]: I1203 22:27:41.805402 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:41.809914 master-0 kubenswrapper[36504]: I1203 22:27:41.806161 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-scripts\") pod 
\"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.809914 master-0 kubenswrapper[36504]: I1203 22:27:41.806554 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.809914 master-0 kubenswrapper[36504]: I1203 22:27:41.807747 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-config-data\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.809914 master-0 kubenswrapper[36504]: I1203 22:27:41.807866 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgqlx\" (UniqueName: \"kubernetes.io/projected/bc0b5f74-16f0-4381-86a2-4ba49c379be8-kube-api-access-cgqlx\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.817582 master-0 kubenswrapper[36504]: I1203 22:27:41.815558 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-scripts\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.824069 master-0 kubenswrapper[36504]: I1203 22:27:41.818889 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.826418 master-0 kubenswrapper[36504]: I1203 22:27:41.826099 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-config-data\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.836538 master-0 kubenswrapper[36504]: I1203 22:27:41.836357 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgqlx\" (UniqueName: \"kubernetes.io/projected/bc0b5f74-16f0-4381-86a2-4ba49c379be8-kube-api-access-cgqlx\") pod \"nova-cell1-conductor-db-sync-nn2xp\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:41.857230 master-0 kubenswrapper[36504]: I1203 22:27:41.857152 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9ckgg" event={"ID":"aa3c88c3-0ad0-4fc0-941c-52bde77db555","Type":"ContainerStarted","Data":"f8361c0f5cf9a6c8c5afb7de59d0f6b25c830c2067447840afc674bc481aa945"} Dec 03 22:27:41.857230 master-0 kubenswrapper[36504]: I1203 22:27:41.857233 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9ckgg" 
event={"ID":"aa3c88c3-0ad0-4fc0-941c-52bde77db555","Type":"ContainerStarted","Data":"b59cc62bb94f1486aed562c13923d489af660597932c9aff423a9ee6016ee0cd"} Dec 03 22:27:41.870718 master-0 kubenswrapper[36504]: I1203 22:27:41.870639 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3485dc52-4df3-4d63-ae03-1fca7630d64a","Type":"ContainerStarted","Data":"d6b72af6f352c84a9ccbf855f55719e775b20d2c2ef9dc15a5bd359fddbbac26"} Dec 03 22:27:41.875598 master-0 kubenswrapper[36504]: W1203 22:27:41.875155 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d05ff21_79c6_4875_b5d4_d06641554c9b.slice/crio-fe95f352afb522dae2c9a4bbdc596ed379180657c7d9fda158f89fdb9cf393d5 WatchSource:0}: Error finding container fe95f352afb522dae2c9a4bbdc596ed379180657c7d9fda158f89fdb9cf393d5: Status 404 returned error can't find the container with id fe95f352afb522dae2c9a4bbdc596ed379180657c7d9fda158f89fdb9cf393d5 Dec 03 22:27:41.905338 master-0 kubenswrapper[36504]: I1203 22:27:41.905189 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9ckgg" podStartSLOduration=2.905147059 podStartE2EDuration="2.905147059s" podCreationTimestamp="2025-12-03 22:27:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:41.881359486 +0000 UTC m=+1027.101131503" watchObservedRunningTime="2025-12-03 22:27:41.905147059 +0000 UTC m=+1027.124919066" Dec 03 22:27:41.998239 master-0 kubenswrapper[36504]: I1203 22:27:41.998178 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:42.047247 master-0 kubenswrapper[36504]: I1203 22:27:42.047170 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:27:42.270446 master-0 kubenswrapper[36504]: I1203 22:27:42.260731 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:27:42.390937 master-0 kubenswrapper[36504]: I1203 22:27:42.389286 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b5b7f9cc-hv7jl"] Dec 03 22:27:42.660887 master-0 kubenswrapper[36504]: I1203 22:27:42.660077 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nn2xp"] Dec 03 22:27:42.926816 master-0 kubenswrapper[36504]: I1203 22:27:42.925935 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nn2xp" event={"ID":"bc0b5f74-16f0-4381-86a2-4ba49c379be8","Type":"ContainerStarted","Data":"ad4cb37ec3e91a8c25c7827d622de97ed1093c02c33b3d637d8f6a4efadbb64b"} Dec 03 22:27:42.927689 master-0 kubenswrapper[36504]: I1203 22:27:42.927140 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c380b560-bf38-4838-8e7d-2d65092dc448","Type":"ContainerStarted","Data":"6c0772b8f209bb0853f863beeaf3722c224b5b95724a1e2e99b2abf8fdf30781"} Dec 03 22:27:42.951029 master-0 kubenswrapper[36504]: I1203 22:27:42.948153 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d05ff21-79c6-4875-b5d4-d06641554c9b","Type":"ContainerStarted","Data":"fe95f352afb522dae2c9a4bbdc596ed379180657c7d9fda158f89fdb9cf393d5"} Dec 03 22:27:42.958257 master-0 kubenswrapper[36504]: I1203 22:27:42.955759 
36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" event={"ID":"13feaa47-c881-462d-b78d-1716d9552aeb","Type":"ContainerStarted","Data":"86c68dd2c14edd4fbb942b4946db74406df09d3a121089b692c3c2d93892730b"} Dec 03 22:27:42.965801 master-0 kubenswrapper[36504]: I1203 22:27:42.964321 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f60ec5c-4977-4191-a3a1-399d0859e1f7","Type":"ContainerStarted","Data":"108286f14b2d345aabe20ed9c1d9ce8508ebdbed73c6f56f3ef0e27a91e0f456"} Dec 03 22:27:44.017049 master-0 kubenswrapper[36504]: I1203 22:27:44.016760 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nn2xp" event={"ID":"bc0b5f74-16f0-4381-86a2-4ba49c379be8","Type":"ContainerStarted","Data":"204f255c14881341d8537a4d603b58e261b7f1036050cd39c7ecc4c9e13f4d6c"} Dec 03 22:27:44.022217 master-0 kubenswrapper[36504]: I1203 22:27:44.022152 36504 generic.go:334] "Generic (PLEG): container finished" podID="13feaa47-c881-462d-b78d-1716d9552aeb" containerID="2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db" exitCode=0 Dec 03 22:27:44.022217 master-0 kubenswrapper[36504]: I1203 22:27:44.022217 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" event={"ID":"13feaa47-c881-462d-b78d-1716d9552aeb","Type":"ContainerDied","Data":"2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db"} Dec 03 22:27:44.053697 master-0 kubenswrapper[36504]: I1203 22:27:44.051353 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-nn2xp" podStartSLOduration=3.051007179 podStartE2EDuration="3.051007179s" podCreationTimestamp="2025-12-03 22:27:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:44.044222379 +0000 UTC m=+1029.263994406" watchObservedRunningTime="2025-12-03 22:27:44.051007179 +0000 UTC m=+1029.270779186" Dec 03 22:27:44.188517 master-0 kubenswrapper[36504]: I1203 22:27:44.188402 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:44.228430 master-0 kubenswrapper[36504]: I1203 22:27:44.228338 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:27:50.278047 master-0 kubenswrapper[36504]: I1203 22:27:50.277735 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d05ff21-79c6-4875-b5d4-d06641554c9b","Type":"ContainerStarted","Data":"c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595"} Dec 03 22:27:50.279245 master-0 kubenswrapper[36504]: I1203 22:27:50.279209 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-2lkqc" event={"ID":"a26d6e38-228d-4663-9826-d79e438d324e","Type":"ContainerStarted","Data":"649d688dfef8ebcff089c631cb163152da87a58648d7b1a66c36c2a31818aaf6"} Dec 03 22:27:50.280612 master-0 kubenswrapper[36504]: I1203 22:27:50.280577 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3485dc52-4df3-4d63-ae03-1fca7630d64a","Type":"ContainerStarted","Data":"79f7b7aedcb26f75a92630e9087931a3449785ee25307f1900d3f7e8bc356472"} Dec 03 22:27:50.283832 master-0 kubenswrapper[36504]: I1203 22:27:50.283718 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" event={"ID":"13feaa47-c881-462d-b78d-1716d9552aeb","Type":"ContainerStarted","Data":"99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9"} Dec 03 22:27:50.293087 master-0 kubenswrapper[36504]: I1203 22:27:50.293008 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:50.295869 master-0 kubenswrapper[36504]: I1203 22:27:50.295700 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f60ec5c-4977-4191-a3a1-399d0859e1f7","Type":"ContainerStarted","Data":"5ab7979fd89d941a5c3e5b09e4170705d1c54e3de7e98cfb055fb1fdaad5203c"} Dec 03 22:27:50.296089 master-0 kubenswrapper[36504]: I1203 22:27:50.296052 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4f60ec5c-4977-4191-a3a1-399d0859e1f7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5ab7979fd89d941a5c3e5b09e4170705d1c54e3de7e98cfb055fb1fdaad5203c" gracePeriod=30 Dec 03 22:27:50.311807 master-0 kubenswrapper[36504]: I1203 22:27:50.309504 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c380b560-bf38-4838-8e7d-2d65092dc448","Type":"ContainerStarted","Data":"c232908f2fefea86c87df63a72db5ea7ed5d376af80e1e2c5b254e413077eb1a"} Dec 03 22:27:50.350499 master-0 kubenswrapper[36504]: I1203 22:27:50.350164 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-2lkqc" podStartSLOduration=2.826845904 podStartE2EDuration="12.349407823s" podCreationTimestamp="2025-12-03 22:27:38 +0000 UTC" firstStartedPulling="2025-12-03 22:27:40.0034515 +0000 UTC m=+1025.223223507" lastFinishedPulling="2025-12-03 22:27:49.526013419 +0000 UTC m=+1034.745785426" observedRunningTime="2025-12-03 22:27:50.325055562 +0000 UTC m=+1035.544827579" watchObservedRunningTime="2025-12-03 22:27:50.349407823 +0000 UTC m=+1035.569179830" Dec 03 22:27:50.412884 master-0 kubenswrapper[36504]: I1203 22:27:50.412756 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.031046302 podStartE2EDuration="10.412720855s" podCreationTimestamp="2025-12-03 22:27:40 +0000 UTC" firstStartedPulling="2025-12-03 22:27:42.026582695 +0000 UTC m=+1027.246354702" lastFinishedPulling="2025-12-03 22:27:49.408257248 +0000 UTC m=+1034.628029255" observedRunningTime="2025-12-03 22:27:50.359356359 +0000 UTC m=+1035.579128386" watchObservedRunningTime="2025-12-03 22:27:50.412720855 +0000 UTC m=+1035.632492862" Dec 03 22:27:50.443994 master-0 kubenswrapper[36504]: I1203 22:27:50.443808 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.18922408 podStartE2EDuration="10.443781473s" podCreationTimestamp="2025-12-03 22:27:40 +0000 UTC" firstStartedPulling="2025-12-03 22:27:42.207592937 +0000 UTC m=+1027.427364944" lastFinishedPulling="2025-12-03 22:27:49.46215033 +0000 UTC m=+1034.681922337" observedRunningTime="2025-12-03 22:27:50.393670018 +0000 UTC m=+1035.613442025" watchObservedRunningTime="2025-12-03 22:27:50.443781473 +0000 UTC m=+1035.663553480" Dec 03 22:27:50.451755 master-0 kubenswrapper[36504]: I1203 22:27:50.451667 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" podStartSLOduration=10.451642856 
podStartE2EDuration="10.451642856s" podCreationTimestamp="2025-12-03 22:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:50.42257465 +0000 UTC m=+1035.642346697" watchObservedRunningTime="2025-12-03 22:27:50.451642856 +0000 UTC m=+1035.671414863" Dec 03 22:27:51.174005 master-0 kubenswrapper[36504]: I1203 22:27:51.173928 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 22:27:51.178801 master-0 kubenswrapper[36504]: I1203 22:27:51.175622 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 22:27:51.197810 master-0 kubenswrapper[36504]: I1203 22:27:51.197107 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:27:51.211817 master-0 kubenswrapper[36504]: I1203 22:27:51.210967 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 22:27:51.333829 master-0 kubenswrapper[36504]: I1203 22:27:51.333670 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3485dc52-4df3-4d63-ae03-1fca7630d64a","Type":"ContainerStarted","Data":"d380cbbcb4d15196b60838cd2d865338897d8f915f5375fdc293f20e58d06b90"} Dec 03 22:27:51.339791 master-0 kubenswrapper[36504]: I1203 22:27:51.339093 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d05ff21-79c6-4875-b5d4-d06641554c9b","Type":"ContainerStarted","Data":"9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48"} Dec 03 22:27:51.343788 master-0 kubenswrapper[36504]: I1203 22:27:51.340304 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerName="nova-metadata-log" containerID="cri-o://c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595" gracePeriod=30 Dec 03 22:27:51.343788 master-0 kubenswrapper[36504]: I1203 22:27:51.340494 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerName="nova-metadata-metadata" containerID="cri-o://9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48" gracePeriod=30 Dec 03 22:27:51.373790 master-0 kubenswrapper[36504]: I1203 22:27:51.373338 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.604897162 podStartE2EDuration="12.373302771s" podCreationTimestamp="2025-12-03 22:27:39 +0000 UTC" firstStartedPulling="2025-12-03 22:27:41.636965459 +0000 UTC m=+1026.856737466" lastFinishedPulling="2025-12-03 22:27:49.405371068 +0000 UTC m=+1034.625143075" observedRunningTime="2025-12-03 22:27:51.358267067 +0000 UTC m=+1036.578039074" watchObservedRunningTime="2025-12-03 22:27:51.373302771 +0000 UTC m=+1036.593074768" Dec 03 22:27:51.404805 master-0 kubenswrapper[36504]: I1203 22:27:51.404623 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.824285256 podStartE2EDuration="11.404579505s" podCreationTimestamp="2025-12-03 22:27:40 +0000 UTC" firstStartedPulling="2025-12-03 22:27:41.890080474 +0000 UTC m=+1027.109852481" lastFinishedPulling="2025-12-03 22:27:49.470374723 +0000 UTC m=+1034.690146730" 
observedRunningTime="2025-12-03 22:27:51.394219776 +0000 UTC m=+1036.613991793" watchObservedRunningTime="2025-12-03 22:27:51.404579505 +0000 UTC m=+1036.624351512" Dec 03 22:27:51.405141 master-0 kubenswrapper[36504]: I1203 22:27:51.404935 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 22:27:52.115654 master-0 kubenswrapper[36504]: I1203 22:27:52.115604 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:52.202389 master-0 kubenswrapper[36504]: I1203 22:27:52.202292 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-combined-ca-bundle\") pod \"7d05ff21-79c6-4875-b5d4-d06641554c9b\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " Dec 03 22:27:52.203318 master-0 kubenswrapper[36504]: I1203 22:27:52.202757 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgdg6\" (UniqueName: \"kubernetes.io/projected/7d05ff21-79c6-4875-b5d4-d06641554c9b-kube-api-access-pgdg6\") pod \"7d05ff21-79c6-4875-b5d4-d06641554c9b\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " Dec 03 22:27:52.203318 master-0 kubenswrapper[36504]: I1203 22:27:52.202967 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-config-data\") pod \"7d05ff21-79c6-4875-b5d4-d06641554c9b\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " Dec 03 22:27:52.203318 master-0 kubenswrapper[36504]: I1203 22:27:52.203002 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d05ff21-79c6-4875-b5d4-d06641554c9b-logs\") pod \"7d05ff21-79c6-4875-b5d4-d06641554c9b\" (UID: \"7d05ff21-79c6-4875-b5d4-d06641554c9b\") " Dec 03 22:27:52.208333 master-0 kubenswrapper[36504]: I1203 22:27:52.208225 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d05ff21-79c6-4875-b5d4-d06641554c9b-logs" (OuterVolumeSpecName: "logs") pod "7d05ff21-79c6-4875-b5d4-d06641554c9b" (UID: "7d05ff21-79c6-4875-b5d4-d06641554c9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:52.215073 master-0 kubenswrapper[36504]: I1203 22:27:52.214159 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d05ff21-79c6-4875-b5d4-d06641554c9b-kube-api-access-pgdg6" (OuterVolumeSpecName: "kube-api-access-pgdg6") pod "7d05ff21-79c6-4875-b5d4-d06641554c9b" (UID: "7d05ff21-79c6-4875-b5d4-d06641554c9b"). InnerVolumeSpecName "kube-api-access-pgdg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:52.250545 master-0 kubenswrapper[36504]: I1203 22:27:52.250347 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-config-data" (OuterVolumeSpecName: "config-data") pod "7d05ff21-79c6-4875-b5d4-d06641554c9b" (UID: "7d05ff21-79c6-4875-b5d4-d06641554c9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:52.254965 master-0 kubenswrapper[36504]: I1203 22:27:52.254885 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d05ff21-79c6-4875-b5d4-d06641554c9b" (UID: "7d05ff21-79c6-4875-b5d4-d06641554c9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:52.309616 master-0 kubenswrapper[36504]: I1203 22:27:52.309527 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgdg6\" (UniqueName: \"kubernetes.io/projected/7d05ff21-79c6-4875-b5d4-d06641554c9b-kube-api-access-pgdg6\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:52.309616 master-0 kubenswrapper[36504]: I1203 22:27:52.309600 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:52.309616 master-0 kubenswrapper[36504]: I1203 22:27:52.309620 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d05ff21-79c6-4875-b5d4-d06641554c9b-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:52.309616 master-0 kubenswrapper[36504]: I1203 22:27:52.309633 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d05ff21-79c6-4875-b5d4-d06641554c9b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:52.358860 master-0 kubenswrapper[36504]: I1203 22:27:52.358780 36504 generic.go:334] "Generic (PLEG): container finished" podID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerID="9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48" exitCode=0 Dec 03 22:27:52.358860 master-0 kubenswrapper[36504]: I1203 22:27:52.358832 36504 generic.go:334] "Generic (PLEG): container finished" podID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerID="c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595" exitCode=143 Dec 03 22:27:52.359718 master-0 kubenswrapper[36504]: I1203 22:27:52.358896 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d05ff21-79c6-4875-b5d4-d06641554c9b","Type":"ContainerDied","Data":"9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48"} Dec 03 22:27:52.359718 master-0 kubenswrapper[36504]: I1203 22:27:52.358970 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d05ff21-79c6-4875-b5d4-d06641554c9b","Type":"ContainerDied","Data":"c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595"} Dec 03 22:27:52.359718 master-0 kubenswrapper[36504]: I1203 22:27:52.358989 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d05ff21-79c6-4875-b5d4-d06641554c9b","Type":"ContainerDied","Data":"fe95f352afb522dae2c9a4bbdc596ed379180657c7d9fda158f89fdb9cf393d5"} Dec 03 22:27:52.359718 master-0 kubenswrapper[36504]: I1203 22:27:52.358966 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:52.359718 master-0 kubenswrapper[36504]: I1203 22:27:52.358996 36504 scope.go:117] "RemoveContainer" containerID="9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48" Dec 03 22:27:52.361961 master-0 kubenswrapper[36504]: I1203 22:27:52.361882 36504 generic.go:334] "Generic (PLEG): container finished" podID="aa3c88c3-0ad0-4fc0-941c-52bde77db555" containerID="f8361c0f5cf9a6c8c5afb7de59d0f6b25c830c2067447840afc674bc481aa945" exitCode=0 Dec 03 22:27:52.364459 master-0 kubenswrapper[36504]: I1203 22:27:52.364405 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9ckgg" event={"ID":"aa3c88c3-0ad0-4fc0-941c-52bde77db555","Type":"ContainerDied","Data":"f8361c0f5cf9a6c8c5afb7de59d0f6b25c830c2067447840afc674bc481aa945"} Dec 03 22:27:52.386482 master-0 kubenswrapper[36504]: I1203 22:27:52.386378 36504 scope.go:117] "RemoveContainer" containerID="c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595" Dec 03 22:27:52.429062 master-0 kubenswrapper[36504]: I1203 22:27:52.428989 36504 scope.go:117] "RemoveContainer" containerID="9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48" Dec 03 22:27:52.429785 master-0 kubenswrapper[36504]: E1203 22:27:52.429711 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48\": container with ID starting with 9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48 not found: ID does not exist" containerID="9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48" Dec 03 22:27:52.429898 master-0 kubenswrapper[36504]: I1203 22:27:52.429840 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48"} err="failed to get container status \"9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48\": rpc error: code = NotFound desc = could not find container \"9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48\": container with ID starting with 9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48 not found: ID does not exist" Dec 03 22:27:52.429973 master-0 kubenswrapper[36504]: I1203 22:27:52.429902 36504 scope.go:117] "RemoveContainer" containerID="c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595" Dec 03 22:27:52.431978 master-0 kubenswrapper[36504]: E1203 22:27:52.431819 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595\": container with ID starting with c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595 not found: ID does not exist" containerID="c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595" Dec 03 22:27:52.431978 master-0 kubenswrapper[36504]: I1203 22:27:52.431878 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595"} err="failed to get container status \"c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595\": rpc error: code = NotFound desc = could not find container \"c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595\": container with ID starting with 
c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595 not found: ID does not exist" Dec 03 22:27:52.431978 master-0 kubenswrapper[36504]: I1203 22:27:52.431912 36504 scope.go:117] "RemoveContainer" containerID="9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48" Dec 03 22:27:52.434956 master-0 kubenswrapper[36504]: I1203 22:27:52.433271 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48"} err="failed to get container status \"9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48\": rpc error: code = NotFound desc = could not find container \"9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48\": container with ID starting with 9f2c303abc89f3af04b7fadea86e2fbde7894b2e272dce464484ae7beca54d48 not found: ID does not exist" Dec 03 22:27:52.434956 master-0 kubenswrapper[36504]: I1203 22:27:52.433317 36504 scope.go:117] "RemoveContainer" containerID="c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595" Dec 03 22:27:52.434956 master-0 kubenswrapper[36504]: I1203 22:27:52.433908 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595"} err="failed to get container status \"c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595\": rpc error: code = NotFound desc = could not find container \"c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595\": container with ID starting with c35898c3b7194fcb1bff0f32fbb48e4d84d1010e4b66983ca5febdcb8362d595 not found: ID does not exist" Dec 03 22:27:52.508312 master-0 kubenswrapper[36504]: I1203 22:27:52.507960 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:52.601940 master-0 kubenswrapper[36504]: I1203 22:27:52.601851 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: I1203 22:27:52.780675 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: E1203 22:27:52.781378 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerName="nova-metadata-log" Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: I1203 22:27:52.781398 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerName="nova-metadata-log" Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: E1203 22:27:52.781453 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerName="nova-metadata-metadata" Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: I1203 22:27:52.781464 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerName="nova-metadata-metadata" Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: I1203 22:27:52.781853 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerName="nova-metadata-log" Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: I1203 22:27:52.781909 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d05ff21-79c6-4875-b5d4-d06641554c9b" containerName="nova-metadata-metadata" Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: 
I1203 22:27:52.783341 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: I1203 22:27:52.789474 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 22:27:52.796372 master-0 kubenswrapper[36504]: I1203 22:27:52.790096 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 22:27:52.796991 master-0 kubenswrapper[36504]: I1203 22:27:52.796418 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:52.826805 master-0 kubenswrapper[36504]: I1203 22:27:52.824399 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.826805 master-0 kubenswrapper[36504]: I1203 22:27:52.825060 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.826805 master-0 kubenswrapper[36504]: I1203 22:27:52.825193 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4chjc\" (UniqueName: \"kubernetes.io/projected/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-kube-api-access-4chjc\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.826805 master-0 kubenswrapper[36504]: I1203 22:27:52.825263 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-config-data\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.826805 master-0 kubenswrapper[36504]: I1203 22:27:52.825978 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-logs\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.928386 master-0 kubenswrapper[36504]: I1203 22:27:52.928293 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.929001 master-0 kubenswrapper[36504]: I1203 22:27:52.928418 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.930886 master-0 kubenswrapper[36504]: I1203 22:27:52.929014 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4chjc\" (UniqueName: \"kubernetes.io/projected/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-kube-api-access-4chjc\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.930886 master-0 kubenswrapper[36504]: I1203 22:27:52.929643 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-config-data\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.930886 master-0 kubenswrapper[36504]: I1203 22:27:52.929717 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-logs\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.930886 master-0 kubenswrapper[36504]: I1203 22:27:52.930130 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-logs\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.937844 master-0 kubenswrapper[36504]: I1203 22:27:52.937721 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-config-data\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.938433 master-0 kubenswrapper[36504]: I1203 22:27:52.938380 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.943656 master-0 kubenswrapper[36504]: I1203 22:27:52.943603 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:52.947824 master-0 kubenswrapper[36504]: I1203 22:27:52.947749 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4chjc\" (UniqueName: \"kubernetes.io/projected/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-kube-api-access-4chjc\") pod \"nova-metadata-0\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " pod="openstack/nova-metadata-0" Dec 03 22:27:53.113404 master-0 kubenswrapper[36504]: I1203 22:27:53.113311 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d05ff21-79c6-4875-b5d4-d06641554c9b" path="/var/lib/kubelet/pods/7d05ff21-79c6-4875-b5d4-d06641554c9b/volumes" Dec 03 22:27:53.121329 master-0 kubenswrapper[36504]: I1203 22:27:53.121261 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:53.385513 master-0 kubenswrapper[36504]: I1203 22:27:53.385167 36504 generic.go:334] "Generic (PLEG): container finished" podID="a26d6e38-228d-4663-9826-d79e438d324e" containerID="649d688dfef8ebcff089c631cb163152da87a58648d7b1a66c36c2a31818aaf6" exitCode=0 Dec 03 22:27:53.387173 master-0 kubenswrapper[36504]: I1203 22:27:53.387133 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-2lkqc" event={"ID":"a26d6e38-228d-4663-9826-d79e438d324e","Type":"ContainerDied","Data":"649d688dfef8ebcff089c631cb163152da87a58648d7b1a66c36c2a31818aaf6"} Dec 03 22:27:53.644465 master-0 kubenswrapper[36504]: I1203 22:27:53.644364 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:53.673704 master-0 kubenswrapper[36504]: W1203 22:27:53.673597 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacad37ed_16b9_4f18_a27d_5951a2ac0a1b.slice/crio-fc4f02586ec8ace22c9cbf2c9ced399e0e0bdafe768ddb4142fe5442bb9ab357 WatchSource:0}: Error finding container fc4f02586ec8ace22c9cbf2c9ced399e0e0bdafe768ddb4142fe5442bb9ab357: Status 404 returned error can't find the container with id fc4f02586ec8ace22c9cbf2c9ced399e0e0bdafe768ddb4142fe5442bb9ab357 Dec 03 22:27:54.076126 master-0 kubenswrapper[36504]: I1203 22:27:54.075469 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:54.195289 master-0 kubenswrapper[36504]: I1203 22:27:54.195210 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-combined-ca-bundle\") pod \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " Dec 03 22:27:54.195660 master-0 kubenswrapper[36504]: I1203 22:27:54.195627 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-scripts\") pod \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " Dec 03 22:27:54.195785 master-0 kubenswrapper[36504]: I1203 22:27:54.195673 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-config-data\") pod \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " Dec 03 22:27:54.195971 master-0 kubenswrapper[36504]: I1203 22:27:54.195940 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk9tn\" (UniqueName: \"kubernetes.io/projected/aa3c88c3-0ad0-4fc0-941c-52bde77db555-kube-api-access-dk9tn\") pod \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\" (UID: \"aa3c88c3-0ad0-4fc0-941c-52bde77db555\") " Dec 03 22:27:54.200343 master-0 kubenswrapper[36504]: I1203 22:27:54.200297 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa3c88c3-0ad0-4fc0-941c-52bde77db555-kube-api-access-dk9tn" (OuterVolumeSpecName: "kube-api-access-dk9tn") pod "aa3c88c3-0ad0-4fc0-941c-52bde77db555" (UID: "aa3c88c3-0ad0-4fc0-941c-52bde77db555"). InnerVolumeSpecName "kube-api-access-dk9tn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:54.200868 master-0 kubenswrapper[36504]: I1203 22:27:54.200802 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-scripts" (OuterVolumeSpecName: "scripts") pod "aa3c88c3-0ad0-4fc0-941c-52bde77db555" (UID: "aa3c88c3-0ad0-4fc0-941c-52bde77db555"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:54.238794 master-0 kubenswrapper[36504]: I1203 22:27:54.238681 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-config-data" (OuterVolumeSpecName: "config-data") pod "aa3c88c3-0ad0-4fc0-941c-52bde77db555" (UID: "aa3c88c3-0ad0-4fc0-941c-52bde77db555"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:54.259098 master-0 kubenswrapper[36504]: I1203 22:27:54.259023 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa3c88c3-0ad0-4fc0-941c-52bde77db555" (UID: "aa3c88c3-0ad0-4fc0-941c-52bde77db555"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:54.303258 master-0 kubenswrapper[36504]: I1203 22:27:54.303052 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:54.303258 master-0 kubenswrapper[36504]: I1203 22:27:54.303124 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:54.303258 master-0 kubenswrapper[36504]: I1203 22:27:54.303138 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa3c88c3-0ad0-4fc0-941c-52bde77db555-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:54.303258 master-0 kubenswrapper[36504]: I1203 22:27:54.303151 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk9tn\" (UniqueName: \"kubernetes.io/projected/aa3c88c3-0ad0-4fc0-941c-52bde77db555-kube-api-access-dk9tn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:54.430789 master-0 kubenswrapper[36504]: I1203 22:27:54.430645 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9ckgg" event={"ID":"aa3c88c3-0ad0-4fc0-941c-52bde77db555","Type":"ContainerDied","Data":"b59cc62bb94f1486aed562c13923d489af660597932c9aff423a9ee6016ee0cd"} Dec 03 22:27:54.430789 master-0 kubenswrapper[36504]: I1203 22:27:54.430718 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b59cc62bb94f1486aed562c13923d489af660597932c9aff423a9ee6016ee0cd" Dec 03 22:27:54.445877 master-0 kubenswrapper[36504]: I1203 22:27:54.430857 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9ckgg" Dec 03 22:27:54.445877 master-0 kubenswrapper[36504]: I1203 22:27:54.440034 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acad37ed-16b9-4f18-a27d-5951a2ac0a1b","Type":"ContainerStarted","Data":"ab75f1e35642b67b8e1e2427d7a90348649e1d178ab717fe62e7dcf9862399c3"} Dec 03 22:27:54.445877 master-0 kubenswrapper[36504]: I1203 22:27:54.440112 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acad37ed-16b9-4f18-a27d-5951a2ac0a1b","Type":"ContainerStarted","Data":"fc4f02586ec8ace22c9cbf2c9ced399e0e0bdafe768ddb4142fe5442bb9ab357"} Dec 03 22:27:55.042803 master-0 kubenswrapper[36504]: I1203 22:27:55.035953 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:27:55.042803 master-0 kubenswrapper[36504]: I1203 22:27:55.036597 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerName="nova-api-log" containerID="cri-o://79f7b7aedcb26f75a92630e9087931a3449785ee25307f1900d3f7e8bc356472" gracePeriod=30 Dec 03 22:27:55.042803 master-0 kubenswrapper[36504]: I1203 22:27:55.037524 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerName="nova-api-api" containerID="cri-o://d380cbbcb4d15196b60838cd2d865338897d8f915f5375fdc293f20e58d06b90" gracePeriod=30 Dec 03 22:27:55.089710 master-0 kubenswrapper[36504]: I1203 22:27:55.087853 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:27:55.089710 master-0 kubenswrapper[36504]: I1203 22:27:55.088148 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c380b560-bf38-4838-8e7d-2d65092dc448" containerName="nova-scheduler-scheduler" containerID="cri-o://c232908f2fefea86c87df63a72db5ea7ed5d376af80e1e2c5b254e413077eb1a" gracePeriod=30 Dec 03 22:27:55.143261 master-0 kubenswrapper[36504]: I1203 22:27:55.143175 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:55.481111 master-0 kubenswrapper[36504]: I1203 22:27:55.481035 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:55.482694 master-0 kubenswrapper[36504]: I1203 22:27:55.482638 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acad37ed-16b9-4f18-a27d-5951a2ac0a1b","Type":"ContainerStarted","Data":"3e379be897a08685a4c595667f1f5f871e02f7b22d61737bedda10aaa8b41945"} Dec 03 22:27:55.490409 master-0 kubenswrapper[36504]: I1203 22:27:55.489438 36504 generic.go:334] "Generic (PLEG): container finished" podID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerID="d380cbbcb4d15196b60838cd2d865338897d8f915f5375fdc293f20e58d06b90" exitCode=0 Dec 03 22:27:55.490409 master-0 kubenswrapper[36504]: I1203 22:27:55.489513 36504 generic.go:334] "Generic (PLEG): container finished" podID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerID="79f7b7aedcb26f75a92630e9087931a3449785ee25307f1900d3f7e8bc356472" exitCode=143 Dec 03 22:27:55.490409 master-0 kubenswrapper[36504]: I1203 22:27:55.489663 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3485dc52-4df3-4d63-ae03-1fca7630d64a","Type":"ContainerDied","Data":"d380cbbcb4d15196b60838cd2d865338897d8f915f5375fdc293f20e58d06b90"} Dec 03 22:27:55.490409 master-0 kubenswrapper[36504]: I1203 22:27:55.489715 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3485dc52-4df3-4d63-ae03-1fca7630d64a","Type":"ContainerDied","Data":"79f7b7aedcb26f75a92630e9087931a3449785ee25307f1900d3f7e8bc356472"} Dec 03 22:27:55.499083 master-0 kubenswrapper[36504]: I1203 22:27:55.494840 36504 generic.go:334] "Generic (PLEG): container finished" podID="bc0b5f74-16f0-4381-86a2-4ba49c379be8" containerID="204f255c14881341d8537a4d603b58e261b7f1036050cd39c7ecc4c9e13f4d6c" exitCode=0 Dec 03 22:27:55.499083 master-0 kubenswrapper[36504]: I1203 22:27:55.494973 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nn2xp" event={"ID":"bc0b5f74-16f0-4381-86a2-4ba49c379be8","Type":"ContainerDied","Data":"204f255c14881341d8537a4d603b58e261b7f1036050cd39c7ecc4c9e13f4d6c"} Dec 03 22:27:55.519144 master-0 kubenswrapper[36504]: I1203 22:27:55.510538 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-2lkqc" Dec 03 22:27:55.527013 master-0 kubenswrapper[36504]: I1203 22:27:55.526902 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-2lkqc" event={"ID":"a26d6e38-228d-4663-9826-d79e438d324e","Type":"ContainerDied","Data":"6ac37113f84ee635add12dececed935af8c56588b39f35ac42e3b58cd3ef151b"} Dec 03 22:27:55.527371 master-0 kubenswrapper[36504]: I1203 22:27:55.527045 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ac37113f84ee635add12dececed935af8c56588b39f35ac42e3b58cd3ef151b" Dec 03 22:27:55.578337 master-0 kubenswrapper[36504]: I1203 22:27:55.578101 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.578066965 podStartE2EDuration="3.578066965s" podCreationTimestamp="2025-12-03 22:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:55.562352221 +0000 UTC m=+1040.782124228" watchObservedRunningTime="2025-12-03 22:27:55.578066965 +0000 UTC m=+1040.797838972" Dec 03 22:27:55.590970 master-0 kubenswrapper[36504]: I1203 22:27:55.589912 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-config-data\") pod \"a26d6e38-228d-4663-9826-d79e438d324e\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " Dec 03 22:27:55.590970 master-0 kubenswrapper[36504]: I1203 22:27:55.590002 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-scripts\") pod \"a26d6e38-228d-4663-9826-d79e438d324e\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " Dec 03 22:27:55.590970 master-0 kubenswrapper[36504]: I1203 22:27:55.590336 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-combined-ca-bundle\") pod \"a26d6e38-228d-4663-9826-d79e438d324e\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " Dec 03 22:27:55.590970 master-0 kubenswrapper[36504]: I1203 22:27:55.590483 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrszx\" (UniqueName: \"kubernetes.io/projected/a26d6e38-228d-4663-9826-d79e438d324e-kube-api-access-lrszx\") pod \"a26d6e38-228d-4663-9826-d79e438d324e\" (UID: \"a26d6e38-228d-4663-9826-d79e438d324e\") " Dec 03 22:27:55.602949 master-0 kubenswrapper[36504]: I1203 22:27:55.601918 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a26d6e38-228d-4663-9826-d79e438d324e-kube-api-access-lrszx" (OuterVolumeSpecName: "kube-api-access-lrszx") pod "a26d6e38-228d-4663-9826-d79e438d324e" (UID: "a26d6e38-228d-4663-9826-d79e438d324e"). InnerVolumeSpecName "kube-api-access-lrszx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:55.609267 master-0 kubenswrapper[36504]: I1203 22:27:55.609163 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-scripts" (OuterVolumeSpecName: "scripts") pod "a26d6e38-228d-4663-9826-d79e438d324e" (UID: "a26d6e38-228d-4663-9826-d79e438d324e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:55.638874 master-0 kubenswrapper[36504]: I1203 22:27:55.638800 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a26d6e38-228d-4663-9826-d79e438d324e" (UID: "a26d6e38-228d-4663-9826-d79e438d324e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:55.759295 master-0 kubenswrapper[36504]: I1203 22:27:55.759200 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-config-data" (OuterVolumeSpecName: "config-data") pod "a26d6e38-228d-4663-9826-d79e438d324e" (UID: "a26d6e38-228d-4663-9826-d79e438d324e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:55.759799 master-0 kubenswrapper[36504]: I1203 22:27:55.759699 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:55.759799 master-0 kubenswrapper[36504]: I1203 22:27:55.759785 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:55.759904 master-0 kubenswrapper[36504]: I1203 22:27:55.759803 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a26d6e38-228d-4663-9826-d79e438d324e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:55.759904 master-0 kubenswrapper[36504]: I1203 22:27:55.759826 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrszx\" (UniqueName: \"kubernetes.io/projected/a26d6e38-228d-4663-9826-d79e438d324e-kube-api-access-lrszx\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:56.030117 master-0 kubenswrapper[36504]: I1203 22:27:56.030034 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:27:56.069396 master-0 kubenswrapper[36504]: I1203 22:27:56.069316 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t88q\" (UniqueName: \"kubernetes.io/projected/3485dc52-4df3-4d63-ae03-1fca7630d64a-kube-api-access-2t88q\") pod \"3485dc52-4df3-4d63-ae03-1fca7630d64a\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " Dec 03 22:27:56.070110 master-0 kubenswrapper[36504]: I1203 22:27:56.070090 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-config-data\") pod \"3485dc52-4df3-4d63-ae03-1fca7630d64a\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " Dec 03 22:27:56.070818 master-0 kubenswrapper[36504]: I1203 22:27:56.070800 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3485dc52-4df3-4d63-ae03-1fca7630d64a-logs\") pod \"3485dc52-4df3-4d63-ae03-1fca7630d64a\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " Dec 03 22:27:56.071098 master-0 kubenswrapper[36504]: I1203 22:27:56.071079 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-combined-ca-bundle\") pod \"3485dc52-4df3-4d63-ae03-1fca7630d64a\" (UID: \"3485dc52-4df3-4d63-ae03-1fca7630d64a\") " Dec 03 22:27:56.076167 master-0 kubenswrapper[36504]: I1203 22:27:56.076056 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3485dc52-4df3-4d63-ae03-1fca7630d64a-logs" (OuterVolumeSpecName: "logs") pod "3485dc52-4df3-4d63-ae03-1fca7630d64a" (UID: "3485dc52-4df3-4d63-ae03-1fca7630d64a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:56.076930 master-0 kubenswrapper[36504]: I1203 22:27:56.076884 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3485dc52-4df3-4d63-ae03-1fca7630d64a-kube-api-access-2t88q" (OuterVolumeSpecName: "kube-api-access-2t88q") pod "3485dc52-4df3-4d63-ae03-1fca7630d64a" (UID: "3485dc52-4df3-4d63-ae03-1fca7630d64a"). InnerVolumeSpecName "kube-api-access-2t88q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:56.110390 master-0 kubenswrapper[36504]: I1203 22:27:56.110293 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-config-data" (OuterVolumeSpecName: "config-data") pod "3485dc52-4df3-4d63-ae03-1fca7630d64a" (UID: "3485dc52-4df3-4d63-ae03-1fca7630d64a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:56.110956 master-0 kubenswrapper[36504]: I1203 22:27:56.110762 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3485dc52-4df3-4d63-ae03-1fca7630d64a" (UID: "3485dc52-4df3-4d63-ae03-1fca7630d64a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:56.176062 master-0 kubenswrapper[36504]: I1203 22:27:56.175970 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:56.176062 master-0 kubenswrapper[36504]: I1203 22:27:56.176029 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3485dc52-4df3-4d63-ae03-1fca7630d64a-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:56.176062 master-0 kubenswrapper[36504]: I1203 22:27:56.176041 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3485dc52-4df3-4d63-ae03-1fca7630d64a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:56.176062 master-0 kubenswrapper[36504]: I1203 22:27:56.176053 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t88q\" (UniqueName: \"kubernetes.io/projected/3485dc52-4df3-4d63-ae03-1fca7630d64a-kube-api-access-2t88q\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:56.178192 master-0 kubenswrapper[36504]: E1203 22:27:56.178116 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c232908f2fefea86c87df63a72db5ea7ed5d376af80e1e2c5b254e413077eb1a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 22:27:56.179890 master-0 kubenswrapper[36504]: E1203 22:27:56.179849 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c232908f2fefea86c87df63a72db5ea7ed5d376af80e1e2c5b254e413077eb1a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 22:27:56.182935 master-0 kubenswrapper[36504]: E1203 22:27:56.182888 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c232908f2fefea86c87df63a72db5ea7ed5d376af80e1e2c5b254e413077eb1a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 22:27:56.183015 master-0 kubenswrapper[36504]: E1203 22:27:56.182946 36504 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c380b560-bf38-4838-8e7d-2d65092dc448" containerName="nova-scheduler-scheduler" Dec 03 22:27:56.379239 master-0 kubenswrapper[36504]: I1203 22:27:56.379152 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:27:56.503703 master-0 kubenswrapper[36504]: I1203 22:27:56.503608 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d65b86965-zlwpd"] Dec 03 22:27:56.505728 master-0 kubenswrapper[36504]: I1203 22:27:56.505444 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" podUID="edd801a2-0ddf-475b-81d5-8421a26d7011" containerName="dnsmasq-dns" containerID="cri-o://cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f" gracePeriod=10 Dec 
03 22:27:56.598824 master-0 kubenswrapper[36504]: I1203 22:27:56.595352 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3485dc52-4df3-4d63-ae03-1fca7630d64a","Type":"ContainerDied","Data":"d6b72af6f352c84a9ccbf855f55719e775b20d2c2ef9dc15a5bd359fddbbac26"} Dec 03 22:27:56.598824 master-0 kubenswrapper[36504]: I1203 22:27:56.595460 36504 scope.go:117] "RemoveContainer" containerID="d380cbbcb4d15196b60838cd2d865338897d8f915f5375fdc293f20e58d06b90" Dec 03 22:27:56.598824 master-0 kubenswrapper[36504]: I1203 22:27:56.595790 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:27:56.621034 master-0 kubenswrapper[36504]: I1203 22:27:56.603426 36504 generic.go:334] "Generic (PLEG): container finished" podID="c380b560-bf38-4838-8e7d-2d65092dc448" containerID="c232908f2fefea86c87df63a72db5ea7ed5d376af80e1e2c5b254e413077eb1a" exitCode=0 Dec 03 22:27:56.621034 master-0 kubenswrapper[36504]: I1203 22:27:56.603843 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c380b560-bf38-4838-8e7d-2d65092dc448","Type":"ContainerDied","Data":"c232908f2fefea86c87df63a72db5ea7ed5d376af80e1e2c5b254e413077eb1a"} Dec 03 22:27:56.621034 master-0 kubenswrapper[36504]: I1203 22:27:56.604174 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerName="nova-metadata-log" containerID="cri-o://ab75f1e35642b67b8e1e2427d7a90348649e1d178ab717fe62e7dcf9862399c3" gracePeriod=30 Dec 03 22:27:56.621034 master-0 kubenswrapper[36504]: I1203 22:27:56.604425 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerName="nova-metadata-metadata" containerID="cri-o://3e379be897a08685a4c595667f1f5f871e02f7b22d61737bedda10aaa8b41945" gracePeriod=30 Dec 03 22:27:56.715327 master-0 kubenswrapper[36504]: I1203 22:27:56.713943 36504 scope.go:117] "RemoveContainer" containerID="79f7b7aedcb26f75a92630e9087931a3449785ee25307f1900d3f7e8bc356472" Dec 03 22:27:56.886292 master-0 kubenswrapper[36504]: I1203 22:27:56.886218 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:27:56.933923 master-0 kubenswrapper[36504]: I1203 22:27:56.933839 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:27:56.963374 master-0 kubenswrapper[36504]: I1203 22:27:56.962595 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 22:27:56.963957 master-0 kubenswrapper[36504]: E1203 22:27:56.963882 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa3c88c3-0ad0-4fc0-941c-52bde77db555" containerName="nova-manage" Dec 03 22:27:56.963957 master-0 kubenswrapper[36504]: I1203 22:27:56.963911 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3c88c3-0ad0-4fc0-941c-52bde77db555" containerName="nova-manage" Dec 03 22:27:56.964129 master-0 kubenswrapper[36504]: E1203 22:27:56.963965 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerName="nova-api-log" Dec 03 22:27:56.964129 master-0 kubenswrapper[36504]: I1203 22:27:56.963976 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerName="nova-api-log" Dec 03 22:27:56.964129 master-0 
kubenswrapper[36504]: E1203 22:27:56.963987 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26d6e38-228d-4663-9826-d79e438d324e" containerName="aodh-db-sync" Dec 03 22:27:56.964129 master-0 kubenswrapper[36504]: I1203 22:27:56.963996 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26d6e38-228d-4663-9826-d79e438d324e" containerName="aodh-db-sync" Dec 03 22:27:56.964129 master-0 kubenswrapper[36504]: E1203 22:27:56.964009 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerName="nova-api-api" Dec 03 22:27:56.964129 master-0 kubenswrapper[36504]: I1203 22:27:56.964017 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerName="nova-api-api" Dec 03 22:27:56.966107 master-0 kubenswrapper[36504]: I1203 22:27:56.965256 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa3c88c3-0ad0-4fc0-941c-52bde77db555" containerName="nova-manage" Dec 03 22:27:56.966107 master-0 kubenswrapper[36504]: I1203 22:27:56.965292 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerName="nova-api-api" Dec 03 22:27:56.966107 master-0 kubenswrapper[36504]: I1203 22:27:56.965362 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="3485dc52-4df3-4d63-ae03-1fca7630d64a" containerName="nova-api-log" Dec 03 22:27:56.966107 master-0 kubenswrapper[36504]: I1203 22:27:56.965391 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26d6e38-228d-4663-9826-d79e438d324e" containerName="aodh-db-sync" Dec 03 22:27:56.979656 master-0 kubenswrapper[36504]: I1203 22:27:56.979595 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:27:56.983636 master-0 kubenswrapper[36504]: I1203 22:27:56.982460 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 22:27:57.029088 master-0 kubenswrapper[36504]: I1203 22:27:57.026161 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:27:57.131402 master-0 kubenswrapper[36504]: I1203 22:27:57.127554 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-logs\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.131402 master-0 kubenswrapper[36504]: I1203 22:27:57.127713 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98wsh\" (UniqueName: \"kubernetes.io/projected/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-kube-api-access-98wsh\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.131402 master-0 kubenswrapper[36504]: I1203 22:27:57.127796 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-config-data\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.131402 master-0 kubenswrapper[36504]: I1203 22:27:57.127842 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.142192 master-0 kubenswrapper[36504]: I1203 22:27:57.142118 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3485dc52-4df3-4d63-ae03-1fca7630d64a" path="/var/lib/kubelet/pods/3485dc52-4df3-4d63-ae03-1fca7630d64a/volumes" Dec 03 22:27:57.230638 master-0 kubenswrapper[36504]: I1203 22:27:57.230446 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-logs\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.230638 master-0 kubenswrapper[36504]: I1203 22:27:57.230588 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98wsh\" (UniqueName: \"kubernetes.io/projected/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-kube-api-access-98wsh\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.230638 master-0 kubenswrapper[36504]: I1203 22:27:57.230624 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-config-data\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.231115 master-0 kubenswrapper[36504]: I1203 22:27:57.230653 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.238450 master-0 kubenswrapper[36504]: I1203 22:27:57.238391 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-logs\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.239473 master-0 kubenswrapper[36504]: I1203 22:27:57.239428 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-config-data\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.256978 master-0 kubenswrapper[36504]: I1203 22:27:57.254156 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.259424 master-0 kubenswrapper[36504]: I1203 22:27:57.259367 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:27:57.279791 master-0 kubenswrapper[36504]: I1203 22:27:57.279403 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98wsh\" (UniqueName: \"kubernetes.io/projected/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-kube-api-access-98wsh\") pod \"nova-api-0\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " pod="openstack/nova-api-0" Dec 03 22:27:57.286070 master-0 kubenswrapper[36504]: I1203 22:27:57.285751 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:27:57.340832 master-0 kubenswrapper[36504]: I1203 22:27:57.339133 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:27:57.438169 master-0 kubenswrapper[36504]: I1203 22:27:57.438112 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-config\") pod \"edd801a2-0ddf-475b-81d5-8421a26d7011\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " Dec 03 22:27:57.441465 master-0 kubenswrapper[36504]: I1203 22:27:57.438300 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbq98\" (UniqueName: \"kubernetes.io/projected/c380b560-bf38-4838-8e7d-2d65092dc448-kube-api-access-qbq98\") pod \"c380b560-bf38-4838-8e7d-2d65092dc448\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " Dec 03 22:27:57.441465 master-0 kubenswrapper[36504]: I1203 22:27:57.438382 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-config-data\") pod \"c380b560-bf38-4838-8e7d-2d65092dc448\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " Dec 03 22:27:57.441465 master-0 kubenswrapper[36504]: I1203 22:27:57.438463 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-sb\") pod \"edd801a2-0ddf-475b-81d5-8421a26d7011\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " Dec 03 22:27:57.441465 master-0 kubenswrapper[36504]: I1203 22:27:57.438622 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-combined-ca-bundle\") pod \"c380b560-bf38-4838-8e7d-2d65092dc448\" (UID: \"c380b560-bf38-4838-8e7d-2d65092dc448\") " Dec 03 22:27:57.441465 master-0 kubenswrapper[36504]: I1203 22:27:57.438731 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-swift-storage-0\") pod \"edd801a2-0ddf-475b-81d5-8421a26d7011\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " Dec 03 22:27:57.441465 master-0 kubenswrapper[36504]: I1203 22:27:57.438815 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-edpm\") pod \"edd801a2-0ddf-475b-81d5-8421a26d7011\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " Dec 03 22:27:57.441465 master-0 kubenswrapper[36504]: I1203 22:27:57.438892 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-svc\") pod \"edd801a2-0ddf-475b-81d5-8421a26d7011\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " Dec 03 22:27:57.441465 master-0 kubenswrapper[36504]: I1203 22:27:57.439038 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfzd9\" (UniqueName: \"kubernetes.io/projected/edd801a2-0ddf-475b-81d5-8421a26d7011-kube-api-access-lfzd9\") pod \"edd801a2-0ddf-475b-81d5-8421a26d7011\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " Dec 03 22:27:57.441465 master-0 kubenswrapper[36504]: I1203 22:27:57.439118 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-nb\") pod \"edd801a2-0ddf-475b-81d5-8421a26d7011\" (UID: \"edd801a2-0ddf-475b-81d5-8421a26d7011\") " Dec 03 22:27:57.450964 master-0 kubenswrapper[36504]: I1203 22:27:57.447967 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c380b560-bf38-4838-8e7d-2d65092dc448-kube-api-access-qbq98" (OuterVolumeSpecName: "kube-api-access-qbq98") pod "c380b560-bf38-4838-8e7d-2d65092dc448" (UID: "c380b560-bf38-4838-8e7d-2d65092dc448"). InnerVolumeSpecName "kube-api-access-qbq98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:57.454537 master-0 kubenswrapper[36504]: I1203 22:27:57.454478 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd801a2-0ddf-475b-81d5-8421a26d7011-kube-api-access-lfzd9" (OuterVolumeSpecName: "kube-api-access-lfzd9") pod "edd801a2-0ddf-475b-81d5-8421a26d7011" (UID: "edd801a2-0ddf-475b-81d5-8421a26d7011"). InnerVolumeSpecName "kube-api-access-lfzd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:57.493295 master-0 kubenswrapper[36504]: I1203 22:27:57.492544 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:57.510529 master-0 kubenswrapper[36504]: I1203 22:27:57.505582 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-config-data" (OuterVolumeSpecName: "config-data") pod "c380b560-bf38-4838-8e7d-2d65092dc448" (UID: "c380b560-bf38-4838-8e7d-2d65092dc448"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:57.545693 master-0 kubenswrapper[36504]: I1203 22:27:57.544611 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfzd9\" (UniqueName: \"kubernetes.io/projected/edd801a2-0ddf-475b-81d5-8421a26d7011-kube-api-access-lfzd9\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.545693 master-0 kubenswrapper[36504]: I1203 22:27:57.544658 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbq98\" (UniqueName: \"kubernetes.io/projected/c380b560-bf38-4838-8e7d-2d65092dc448-kube-api-access-qbq98\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.545693 master-0 kubenswrapper[36504]: I1203 22:27:57.544673 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.568465 master-0 kubenswrapper[36504]: I1203 22:27:57.566151 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c380b560-bf38-4838-8e7d-2d65092dc448" (UID: "c380b560-bf38-4838-8e7d-2d65092dc448"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:57.594936 master-0 kubenswrapper[36504]: I1203 22:27:57.581942 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "edd801a2-0ddf-475b-81d5-8421a26d7011" (UID: "edd801a2-0ddf-475b-81d5-8421a26d7011"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:57.594936 master-0 kubenswrapper[36504]: I1203 22:27:57.583696 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-config" (OuterVolumeSpecName: "config") pod "edd801a2-0ddf-475b-81d5-8421a26d7011" (UID: "edd801a2-0ddf-475b-81d5-8421a26d7011"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:57.608371 master-0 kubenswrapper[36504]: I1203 22:27:57.606568 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "edd801a2-0ddf-475b-81d5-8421a26d7011" (UID: "edd801a2-0ddf-475b-81d5-8421a26d7011"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:57.608371 master-0 kubenswrapper[36504]: I1203 22:27:57.607334 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-edpm" (OuterVolumeSpecName: "edpm") pod "edd801a2-0ddf-475b-81d5-8421a26d7011" (UID: "edd801a2-0ddf-475b-81d5-8421a26d7011"). InnerVolumeSpecName "edpm". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:57.662682 master-0 kubenswrapper[36504]: I1203 22:27:57.661128 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-scripts\") pod \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " Dec 03 22:27:57.662682 master-0 kubenswrapper[36504]: I1203 22:27:57.661399 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-config-data\") pod \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " Dec 03 22:27:57.662682 master-0 kubenswrapper[36504]: I1203 22:27:57.661554 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgqlx\" (UniqueName: \"kubernetes.io/projected/bc0b5f74-16f0-4381-86a2-4ba49c379be8-kube-api-access-cgqlx\") pod \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " Dec 03 22:27:57.662682 master-0 kubenswrapper[36504]: I1203 22:27:57.661702 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-combined-ca-bundle\") pod \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\" (UID: \"bc0b5f74-16f0-4381-86a2-4ba49c379be8\") " Dec 03 22:27:57.666416 master-0 kubenswrapper[36504]: I1203 22:27:57.666157 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c380b560-bf38-4838-8e7d-2d65092dc448-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.666416 master-0 kubenswrapper[36504]: I1203 22:27:57.666207 36504 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.666416 master-0 kubenswrapper[36504]: I1203 22:27:57.666228 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" 
(UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.666416 master-0 kubenswrapper[36504]: I1203 22:27:57.666243 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.666416 master-0 kubenswrapper[36504]: I1203 22:27:57.666258 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.668268 master-0 kubenswrapper[36504]: I1203 22:27:57.668187 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-scripts" (OuterVolumeSpecName: "scripts") pod "bc0b5f74-16f0-4381-86a2-4ba49c379be8" (UID: "bc0b5f74-16f0-4381-86a2-4ba49c379be8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:57.669455 master-0 kubenswrapper[36504]: I1203 22:27:57.669407 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0b5f74-16f0-4381-86a2-4ba49c379be8-kube-api-access-cgqlx" (OuterVolumeSpecName: "kube-api-access-cgqlx") pod "bc0b5f74-16f0-4381-86a2-4ba49c379be8" (UID: "bc0b5f74-16f0-4381-86a2-4ba49c379be8"). InnerVolumeSpecName "kube-api-access-cgqlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:57.710648 master-0 kubenswrapper[36504]: I1203 22:27:57.710183 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "edd801a2-0ddf-475b-81d5-8421a26d7011" (UID: "edd801a2-0ddf-475b-81d5-8421a26d7011"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:57.711167 master-0 kubenswrapper[36504]: I1203 22:27:57.710884 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "edd801a2-0ddf-475b-81d5-8421a26d7011" (UID: "edd801a2-0ddf-475b-81d5-8421a26d7011"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:27:57.715285 master-0 kubenswrapper[36504]: I1203 22:27:57.715220 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-config-data" (OuterVolumeSpecName: "config-data") pod "bc0b5f74-16f0-4381-86a2-4ba49c379be8" (UID: "bc0b5f74-16f0-4381-86a2-4ba49c379be8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:57.719701 master-0 kubenswrapper[36504]: I1203 22:27:57.719383 36504 generic.go:334] "Generic (PLEG): container finished" podID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerID="3e379be897a08685a4c595667f1f5f871e02f7b22d61737bedda10aaa8b41945" exitCode=0 Dec 03 22:27:57.719701 master-0 kubenswrapper[36504]: I1203 22:27:57.719436 36504 generic.go:334] "Generic (PLEG): container finished" podID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerID="ab75f1e35642b67b8e1e2427d7a90348649e1d178ab717fe62e7dcf9862399c3" exitCode=143 Dec 03 22:27:57.719701 master-0 kubenswrapper[36504]: I1203 22:27:57.719549 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acad37ed-16b9-4f18-a27d-5951a2ac0a1b","Type":"ContainerDied","Data":"3e379be897a08685a4c595667f1f5f871e02f7b22d61737bedda10aaa8b41945"} Dec 03 22:27:57.722336 master-0 kubenswrapper[36504]: I1203 22:27:57.721921 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acad37ed-16b9-4f18-a27d-5951a2ac0a1b","Type":"ContainerDied","Data":"ab75f1e35642b67b8e1e2427d7a90348649e1d178ab717fe62e7dcf9862399c3"} Dec 03 22:27:57.746838 master-0 kubenswrapper[36504]: I1203 22:27:57.746612 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc0b5f74-16f0-4381-86a2-4ba49c379be8" (UID: "bc0b5f74-16f0-4381-86a2-4ba49c379be8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:57.770833 master-0 kubenswrapper[36504]: I1203 22:27:57.770706 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.770833 master-0 kubenswrapper[36504]: I1203 22:27:57.770787 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.770833 master-0 kubenswrapper[36504]: I1203 22:27:57.770802 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.770833 master-0 kubenswrapper[36504]: I1203 22:27:57.770816 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgqlx\" (UniqueName: \"kubernetes.io/projected/bc0b5f74-16f0-4381-86a2-4ba49c379be8-kube-api-access-cgqlx\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.770833 master-0 kubenswrapper[36504]: I1203 22:27:57.770827 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0b5f74-16f0-4381-86a2-4ba49c379be8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.770833 master-0 kubenswrapper[36504]: I1203 22:27:57.770837 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/edd801a2-0ddf-475b-81d5-8421a26d7011-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:57.774917 master-0 kubenswrapper[36504]: I1203 22:27:57.774795 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-nn2xp" event={"ID":"bc0b5f74-16f0-4381-86a2-4ba49c379be8","Type":"ContainerDied","Data":"ad4cb37ec3e91a8c25c7827d622de97ed1093c02c33b3d637d8f6a4efadbb64b"} Dec 03 22:27:57.774917 master-0 kubenswrapper[36504]: I1203 22:27:57.774859 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad4cb37ec3e91a8c25c7827d622de97ed1093c02c33b3d637d8f6a4efadbb64b" Dec 03 22:27:57.775053 master-0 kubenswrapper[36504]: I1203 22:27:57.774947 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nn2xp" Dec 03 22:27:57.784579 master-0 kubenswrapper[36504]: I1203 22:27:57.784213 36504 generic.go:334] "Generic (PLEG): container finished" podID="edd801a2-0ddf-475b-81d5-8421a26d7011" containerID="cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f" exitCode=0 Dec 03 22:27:57.784579 master-0 kubenswrapper[36504]: I1203 22:27:57.784328 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" event={"ID":"edd801a2-0ddf-475b-81d5-8421a26d7011","Type":"ContainerDied","Data":"cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f"} Dec 03 22:27:57.784579 master-0 kubenswrapper[36504]: I1203 22:27:57.784379 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" event={"ID":"edd801a2-0ddf-475b-81d5-8421a26d7011","Type":"ContainerDied","Data":"5e6cb4f14b56ad949c576e714976697830a41cb41e9a1fe3e48315f72cef06da"} Dec 03 22:27:57.784579 master-0 kubenswrapper[36504]: I1203 22:27:57.784404 36504 scope.go:117] "RemoveContainer" containerID="cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f" Dec 03 22:27:57.784847 master-0 kubenswrapper[36504]: I1203 22:27:57.784607 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d65b86965-zlwpd" Dec 03 22:27:57.791445 master-0 kubenswrapper[36504]: I1203 22:27:57.790723 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: E1203 22:27:57.791581 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd801a2-0ddf-475b-81d5-8421a26d7011" containerName="init" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: I1203 22:27:57.791613 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd801a2-0ddf-475b-81d5-8421a26d7011" containerName="init" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: E1203 22:27:57.791672 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c380b560-bf38-4838-8e7d-2d65092dc448" containerName="nova-scheduler-scheduler" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: I1203 22:27:57.791685 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c380b560-bf38-4838-8e7d-2d65092dc448" containerName="nova-scheduler-scheduler" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: E1203 22:27:57.791742 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0b5f74-16f0-4381-86a2-4ba49c379be8" containerName="nova-cell1-conductor-db-sync" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: I1203 22:27:57.791752 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0b5f74-16f0-4381-86a2-4ba49c379be8" containerName="nova-cell1-conductor-db-sync" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: E1203 22:27:57.791832 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd801a2-0ddf-475b-81d5-8421a26d7011" containerName="dnsmasq-dns" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: I1203 22:27:57.791843 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd801a2-0ddf-475b-81d5-8421a26d7011" containerName="dnsmasq-dns" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: I1203 22:27:57.792190 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0b5f74-16f0-4381-86a2-4ba49c379be8" containerName="nova-cell1-conductor-db-sync" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: I1203 22:27:57.792242 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd801a2-0ddf-475b-81d5-8421a26d7011" containerName="dnsmasq-dns" Dec 03 22:27:57.792552 master-0 kubenswrapper[36504]: I1203 22:27:57.792292 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="c380b560-bf38-4838-8e7d-2d65092dc448" containerName="nova-scheduler-scheduler" Dec 03 22:27:57.800116 master-0 kubenswrapper[36504]: I1203 22:27:57.799536 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:57.805158 master-0 kubenswrapper[36504]: I1203 22:27:57.805028 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 22:27:57.810643 master-0 kubenswrapper[36504]: I1203 22:27:57.810061 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c380b560-bf38-4838-8e7d-2d65092dc448","Type":"ContainerDied","Data":"6c0772b8f209bb0853f863beeaf3722c224b5b95724a1e2e99b2abf8fdf30781"} Dec 03 22:27:57.810643 master-0 kubenswrapper[36504]: I1203 22:27:57.810187 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:27:57.810643 master-0 kubenswrapper[36504]: I1203 22:27:57.810410 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 22:27:57.877006 master-0 kubenswrapper[36504]: I1203 22:27:57.876913 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fea920-59c1-439c-a559-e3d921900b47-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"97fea920-59c1-439c-a559-e3d921900b47\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:57.877446 master-0 kubenswrapper[36504]: I1203 22:27:57.877410 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fea920-59c1-439c-a559-e3d921900b47-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"97fea920-59c1-439c-a559-e3d921900b47\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:57.877498 master-0 kubenswrapper[36504]: I1203 22:27:57.877476 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7m7q\" (UniqueName: \"kubernetes.io/projected/97fea920-59c1-439c-a559-e3d921900b47-kube-api-access-h7m7q\") pod \"nova-cell1-conductor-0\" (UID: \"97fea920-59c1-439c-a559-e3d921900b47\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:57.924049 master-0 kubenswrapper[36504]: I1203 22:27:57.920415 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:57.929150 master-0 kubenswrapper[36504]: I1203 22:27:57.929093 36504 scope.go:117] "RemoveContainer" containerID="8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6" Dec 03 22:27:57.979388 master-0 kubenswrapper[36504]: I1203 22:27:57.979303 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d65b86965-zlwpd"] Dec 03 22:27:57.988408 master-0 kubenswrapper[36504]: I1203 22:27:57.988339 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-config-data\") pod \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " Dec 03 22:27:57.988690 master-0 kubenswrapper[36504]: I1203 22:27:57.988652 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4chjc\" (UniqueName: \"kubernetes.io/projected/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-kube-api-access-4chjc\") pod \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " Dec 03 22:27:57.988731 master-0 kubenswrapper[36504]: I1203 22:27:57.988713 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-combined-ca-bundle\") pod \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " Dec 03 22:27:57.992439 master-0 kubenswrapper[36504]: I1203 22:27:57.992355 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-kube-api-access-4chjc" (OuterVolumeSpecName: "kube-api-access-4chjc") pod "acad37ed-16b9-4f18-a27d-5951a2ac0a1b" (UID: "acad37ed-16b9-4f18-a27d-5951a2ac0a1b"). 
InnerVolumeSpecName "kube-api-access-4chjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:27:57.998941 master-0 kubenswrapper[36504]: I1203 22:27:57.998837 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d65b86965-zlwpd"] Dec 03 22:27:58.016843 master-0 kubenswrapper[36504]: I1203 22:27:58.014252 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-logs\") pod \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " Dec 03 22:27:58.016843 master-0 kubenswrapper[36504]: I1203 22:27:58.014400 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-nova-metadata-tls-certs\") pod \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\" (UID: \"acad37ed-16b9-4f18-a27d-5951a2ac0a1b\") " Dec 03 22:27:58.016843 master-0 kubenswrapper[36504]: I1203 22:27:58.015614 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fea920-59c1-439c-a559-e3d921900b47-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"97fea920-59c1-439c-a559-e3d921900b47\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:58.016843 master-0 kubenswrapper[36504]: I1203 22:27:58.015740 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fea920-59c1-439c-a559-e3d921900b47-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"97fea920-59c1-439c-a559-e3d921900b47\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:58.016843 master-0 kubenswrapper[36504]: I1203 22:27:58.015791 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7m7q\" (UniqueName: \"kubernetes.io/projected/97fea920-59c1-439c-a559-e3d921900b47-kube-api-access-h7m7q\") pod \"nova-cell1-conductor-0\" (UID: \"97fea920-59c1-439c-a559-e3d921900b47\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:58.016843 master-0 kubenswrapper[36504]: I1203 22:27:58.016437 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4chjc\" (UniqueName: \"kubernetes.io/projected/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-kube-api-access-4chjc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:58.016843 master-0 kubenswrapper[36504]: I1203 22:27:58.016564 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-logs" (OuterVolumeSpecName: "logs") pod "acad37ed-16b9-4f18-a27d-5951a2ac0a1b" (UID: "acad37ed-16b9-4f18-a27d-5951a2ac0a1b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:27:58.021646 master-0 kubenswrapper[36504]: I1203 22:27:58.021047 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:27:58.033251 master-0 kubenswrapper[36504]: I1203 22:27:58.033156 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97fea920-59c1-439c-a559-e3d921900b47-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"97fea920-59c1-439c-a559-e3d921900b47\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:58.035915 master-0 kubenswrapper[36504]: I1203 22:27:58.035872 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fea920-59c1-439c-a559-e3d921900b47-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"97fea920-59c1-439c-a559-e3d921900b47\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:58.042396 master-0 kubenswrapper[36504]: I1203 22:27:58.042075 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-config-data" (OuterVolumeSpecName: "config-data") pod "acad37ed-16b9-4f18-a27d-5951a2ac0a1b" (UID: "acad37ed-16b9-4f18-a27d-5951a2ac0a1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:58.049435 master-0 kubenswrapper[36504]: I1203 22:27:58.048597 36504 scope.go:117] "RemoveContainer" containerID="cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f" Dec 03 22:27:58.054257 master-0 kubenswrapper[36504]: E1203 22:27:58.050664 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f\": container with ID starting with cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f not found: ID does not exist" containerID="cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f" Dec 03 22:27:58.054257 master-0 kubenswrapper[36504]: I1203 22:27:58.050751 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f"} err="failed to get container status \"cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f\": rpc error: code = NotFound desc = could not find container \"cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f\": container with ID starting with cf1defb13f5e835aff988410a813de1e4ca8acebf4be3209ae11ed911a9d7d1f not found: ID does not exist" Dec 03 22:27:58.054257 master-0 kubenswrapper[36504]: I1203 22:27:58.050817 36504 scope.go:117] "RemoveContainer" containerID="8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6" Dec 03 22:27:58.054257 master-0 kubenswrapper[36504]: E1203 22:27:58.052845 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6\": container with ID starting with 8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6 not found: ID does not exist" containerID="8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6" Dec 03 22:27:58.054257 master-0 kubenswrapper[36504]: I1203 22:27:58.052941 36504 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6"} err="failed to get container status \"8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6\": rpc error: code = NotFound desc = could not find container \"8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6\": container with ID starting with 8d35c94ce31f755d33841936bb0f6c1ea5e3469f6a7b0ba837c9bc6c850467d6 not found: ID does not exist" Dec 03 22:27:58.054257 master-0 kubenswrapper[36504]: I1203 22:27:58.052994 36504 scope.go:117] "RemoveContainer" containerID="c232908f2fefea86c87df63a72db5ea7ed5d376af80e1e2c5b254e413077eb1a" Dec 03 22:27:58.065461 master-0 kubenswrapper[36504]: I1203 22:27:58.065395 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7m7q\" (UniqueName: \"kubernetes.io/projected/97fea920-59c1-439c-a559-e3d921900b47-kube-api-access-h7m7q\") pod \"nova-cell1-conductor-0\" (UID: \"97fea920-59c1-439c-a559-e3d921900b47\") " pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:58.072211 master-0 kubenswrapper[36504]: I1203 22:27:58.072140 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:27:58.092488 master-0 kubenswrapper[36504]: I1203 22:27:58.091027 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:27:58.092488 master-0 kubenswrapper[36504]: E1203 22:27:58.091860 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerName="nova-metadata-log" Dec 03 22:27:58.092488 master-0 kubenswrapper[36504]: I1203 22:27:58.091884 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerName="nova-metadata-log" Dec 03 22:27:58.092488 master-0 kubenswrapper[36504]: E1203 22:27:58.091950 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerName="nova-metadata-metadata" Dec 03 22:27:58.092488 master-0 kubenswrapper[36504]: I1203 22:27:58.091958 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerName="nova-metadata-metadata" Dec 03 22:27:58.092488 master-0 kubenswrapper[36504]: I1203 22:27:58.092351 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerName="nova-metadata-log" Dec 03 22:27:58.092488 master-0 kubenswrapper[36504]: I1203 22:27:58.092422 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" containerName="nova-metadata-metadata" Dec 03 22:27:58.094472 master-0 kubenswrapper[36504]: I1203 22:27:58.093792 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:27:58.099635 master-0 kubenswrapper[36504]: I1203 22:27:58.099543 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 22:27:58.121034 master-0 kubenswrapper[36504]: I1203 22:27:58.119711 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:58.121034 master-0 kubenswrapper[36504]: I1203 22:27:58.119793 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:58.129803 master-0 kubenswrapper[36504]: I1203 22:27:58.129695 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "acad37ed-16b9-4f18-a27d-5951a2ac0a1b" (UID: "acad37ed-16b9-4f18-a27d-5951a2ac0a1b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:58.135828 master-0 kubenswrapper[36504]: I1203 22:27:58.135723 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:27:58.184216 master-0 kubenswrapper[36504]: I1203 22:27:58.184156 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acad37ed-16b9-4f18-a27d-5951a2ac0a1b" (UID: "acad37ed-16b9-4f18-a27d-5951a2ac0a1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:27:58.194908 master-0 kubenswrapper[36504]: I1203 22:27:58.194847 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:27:58.199374 master-0 kubenswrapper[36504]: I1203 22:27:58.199290 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:58.223188 master-0 kubenswrapper[36504]: I1203 22:27:58.223113 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-config-data\") pod \"nova-scheduler-0\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:58.223325 master-0 kubenswrapper[36504]: I1203 22:27:58.223295 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:58.223376 master-0 kubenswrapper[36504]: I1203 22:27:58.223323 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hp6v\" (UniqueName: \"kubernetes.io/projected/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-kube-api-access-7hp6v\") pod \"nova-scheduler-0\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:58.224219 master-0 kubenswrapper[36504]: I1203 22:27:58.223552 36504 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:58.224219 master-0 kubenswrapper[36504]: I1203 22:27:58.223571 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acad37ed-16b9-4f18-a27d-5951a2ac0a1b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:27:58.331820 master-0 kubenswrapper[36504]: I1203 22:27:58.328403 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:58.331820 master-0 kubenswrapper[36504]: I1203 22:27:58.328459 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hp6v\" (UniqueName: \"kubernetes.io/projected/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-kube-api-access-7hp6v\") pod \"nova-scheduler-0\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:58.331820 master-0 kubenswrapper[36504]: I1203 22:27:58.328688 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-config-data\") pod \"nova-scheduler-0\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:58.348798 master-0 kubenswrapper[36504]: I1203 22:27:58.345811 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-config-data\") pod \"nova-scheduler-0\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:58.367803 master-0 kubenswrapper[36504]: I1203 22:27:58.365623 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:58.378158 master-0 kubenswrapper[36504]: I1203 22:27:58.377475 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hp6v\" (UniqueName: \"kubernetes.io/projected/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-kube-api-access-7hp6v\") pod \"nova-scheduler-0\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " pod="openstack/nova-scheduler-0" Dec 03 22:27:58.581269 master-0 kubenswrapper[36504]: I1203 22:27:58.563466 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 22:27:58.581269 master-0 kubenswrapper[36504]: I1203 22:27:58.569629 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:27:58.581269 master-0 kubenswrapper[36504]: I1203 22:27:58.570532 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 22:27:58.599741 master-0 kubenswrapper[36504]: I1203 22:27:58.591463 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 22:27:58.599741 master-0 kubenswrapper[36504]: I1203 22:27:58.591877 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 22:27:58.600722 master-0 kubenswrapper[36504]: I1203 22:27:58.600645 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 22:27:58.677463 master-0 kubenswrapper[36504]: I1203 22:27:58.658598 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.677463 master-0 kubenswrapper[36504]: I1203 22:27:58.658885 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-config-data\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.677463 master-0 kubenswrapper[36504]: I1203 22:27:58.658958 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc8pl\" (UniqueName: \"kubernetes.io/projected/f2bbce50-6964-456d-affb-e8b38e7311b4-kube-api-access-tc8pl\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.677463 master-0 kubenswrapper[36504]: I1203 22:27:58.658998 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-scripts\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.766629 master-0 kubenswrapper[36504]: I1203 22:27:58.764419 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-config-data\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.766629 master-0 kubenswrapper[36504]: I1203 22:27:58.764618 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc8pl\" (UniqueName: \"kubernetes.io/projected/f2bbce50-6964-456d-affb-e8b38e7311b4-kube-api-access-tc8pl\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.766629 master-0 kubenswrapper[36504]: I1203 22:27:58.764701 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-scripts\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.766629 master-0 kubenswrapper[36504]: I1203 22:27:58.764843 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.774863 master-0 kubenswrapper[36504]: I1203 22:27:58.771572 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.774863 master-0 kubenswrapper[36504]: I1203 22:27:58.772998 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-config-data\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.774863 master-0 kubenswrapper[36504]: I1203 22:27:58.773900 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-scripts\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.836188 master-0 kubenswrapper[36504]: I1203 22:27:58.790180 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc8pl\" (UniqueName: \"kubernetes.io/projected/f2bbce50-6964-456d-affb-e8b38e7311b4-kube-api-access-tc8pl\") pod \"aodh-0\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " pod="openstack/aodh-0" Dec 03 22:27:58.852963 master-0 kubenswrapper[36504]: I1203 22:27:58.852863 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6490dcf0-f493-4c1f-8a1c-38395f86f4fa","Type":"ContainerStarted","Data":"43c1a5eeadb83673a4c10c55d3907be3b11a2d08b50cc6cf87ee9c72533cd2e2"} Dec 03 22:27:58.852963 master-0 kubenswrapper[36504]: I1203 22:27:58.852943 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6490dcf0-f493-4c1f-8a1c-38395f86f4fa","Type":"ContainerStarted","Data":"e1481c0cce24ae41099a6a589d471b5bc5beb77269614807796be7f03b87f399"} Dec 03 22:27:58.855485 master-0 kubenswrapper[36504]: I1203 22:27:58.855291 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acad37ed-16b9-4f18-a27d-5951a2ac0a1b","Type":"ContainerDied","Data":"fc4f02586ec8ace22c9cbf2c9ced399e0e0bdafe768ddb4142fe5442bb9ab357"} Dec 03 22:27:58.855485 master-0 kubenswrapper[36504]: I1203 22:27:58.855336 36504 scope.go:117] "RemoveContainer" containerID="3e379be897a08685a4c595667f1f5f871e02f7b22d61737bedda10aaa8b41945" Dec 03 22:27:58.855569 master-0 
kubenswrapper[36504]: I1203 22:27:58.855542 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:58.930206 master-0 kubenswrapper[36504]: I1203 22:27:58.926752 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 22:27:58.948951 master-0 kubenswrapper[36504]: I1203 22:27:58.943830 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 22:27:59.102467 master-0 kubenswrapper[36504]: I1203 22:27:59.102393 36504 scope.go:117] "RemoveContainer" containerID="ab75f1e35642b67b8e1e2427d7a90348649e1d178ab717fe62e7dcf9862399c3" Dec 03 22:27:59.180441 master-0 kubenswrapper[36504]: I1203 22:27:59.178167 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c380b560-bf38-4838-8e7d-2d65092dc448" path="/var/lib/kubelet/pods/c380b560-bf38-4838-8e7d-2d65092dc448/volumes" Dec 03 22:27:59.180441 master-0 kubenswrapper[36504]: I1203 22:27:59.179407 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd801a2-0ddf-475b-81d5-8421a26d7011" path="/var/lib/kubelet/pods/edd801a2-0ddf-475b-81d5-8421a26d7011/volumes" Dec 03 22:27:59.228682 master-0 kubenswrapper[36504]: I1203 22:27:59.228483 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:59.264663 master-0 kubenswrapper[36504]: I1203 22:27:59.264585 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:59.332316 master-0 kubenswrapper[36504]: I1203 22:27:59.332100 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:59.336247 master-0 kubenswrapper[36504]: I1203 22:27:59.336197 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:59.340893 master-0 kubenswrapper[36504]: I1203 22:27:59.340844 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 22:27:59.341327 master-0 kubenswrapper[36504]: I1203 22:27:59.341159 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 22:27:59.351663 master-0 kubenswrapper[36504]: I1203 22:27:59.351463 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljn46\" (UniqueName: \"kubernetes.io/projected/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-kube-api-access-ljn46\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.352983 master-0 kubenswrapper[36504]: I1203 22:27:59.352927 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:27:59.353606 master-0 kubenswrapper[36504]: I1203 22:27:59.353549 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.353825 master-0 kubenswrapper[36504]: I1203 22:27:59.353696 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-config-data\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.353825 master-0 kubenswrapper[36504]: I1203 22:27:59.353780 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.353825 master-0 kubenswrapper[36504]: I1203 22:27:59.353806 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-logs\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.384305 master-0 kubenswrapper[36504]: I1203 22:27:59.384242 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:27:59.455424 master-0 kubenswrapper[36504]: I1203 22:27:59.455344 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-logs\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.455885 master-0 kubenswrapper[36504]: I1203 22:27:59.455543 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljn46\" (UniqueName: \"kubernetes.io/projected/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-kube-api-access-ljn46\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.455885 master-0 kubenswrapper[36504]: I1203 
22:27:59.455686 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.455885 master-0 kubenswrapper[36504]: I1203 22:27:59.455762 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-config-data\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.455885 master-0 kubenswrapper[36504]: I1203 22:27:59.455833 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.456331 master-0 kubenswrapper[36504]: I1203 22:27:59.456265 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-logs\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.461151 master-0 kubenswrapper[36504]: I1203 22:27:59.461052 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.462392 master-0 kubenswrapper[36504]: I1203 22:27:59.462329 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.462994 master-0 kubenswrapper[36504]: I1203 22:27:59.462935 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-config-data\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.508158 master-0 kubenswrapper[36504]: I1203 22:27:59.508103 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljn46\" (UniqueName: \"kubernetes.io/projected/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-kube-api-access-ljn46\") pod \"nova-metadata-0\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " pod="openstack/nova-metadata-0" Dec 03 22:27:59.616983 master-0 kubenswrapper[36504]: I1203 22:27:59.616014 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 22:27:59.698423 master-0 kubenswrapper[36504]: I1203 22:27:59.693434 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:27:59.953285 master-0 kubenswrapper[36504]: I1203 22:27:59.953016 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerStarted","Data":"8bc68641b7eb82f56a637086a9b85a4e219efb4f8063e7841135e7bad1ee1f12"} Dec 03 22:27:59.956721 master-0 kubenswrapper[36504]: I1203 22:27:59.956502 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"97fea920-59c1-439c-a559-e3d921900b47","Type":"ContainerStarted","Data":"907877cb3024ded67272f68e5a1b4287c22ba593dbca964e808f8e542582fad1"} Dec 03 22:27:59.956721 master-0 kubenswrapper[36504]: I1203 22:27:59.956548 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"97fea920-59c1-439c-a559-e3d921900b47","Type":"ContainerStarted","Data":"dc61e7997d15145cce92b3e5ac22e5859518c1e3e244a7438b780b95b1881690"} Dec 03 22:27:59.960476 master-0 kubenswrapper[36504]: I1203 22:27:59.959564 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 22:27:59.966989 master-0 kubenswrapper[36504]: I1203 22:27:59.966575 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8871882f-bcc5-4ba4-8abc-8e19ce250bf9","Type":"ContainerStarted","Data":"f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0"} Dec 03 22:27:59.966989 master-0 kubenswrapper[36504]: I1203 22:27:59.966665 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8871882f-bcc5-4ba4-8abc-8e19ce250bf9","Type":"ContainerStarted","Data":"06be330b3e02aac13d2c02eb891d3dca875cdd7231d8e8505224b7f8067571ce"} Dec 03 22:27:59.971752 master-0 kubenswrapper[36504]: I1203 22:27:59.970896 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6490dcf0-f493-4c1f-8a1c-38395f86f4fa","Type":"ContainerStarted","Data":"fe30345ae2400975eb7a58519365c17bdd6cc1822b87000eba394aa53e15d337"} Dec 03 22:28:00.029643 master-0 kubenswrapper[36504]: I1203 22:28:00.025586 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.025443434 podStartE2EDuration="3.025443434s" podCreationTimestamp="2025-12-03 22:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:27:59.985719518 +0000 UTC m=+1045.205491535" watchObservedRunningTime="2025-12-03 22:28:00.025443434 +0000 UTC m=+1045.245215461" Dec 03 22:28:00.077356 master-0 kubenswrapper[36504]: I1203 22:28:00.077228 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.077183669 podStartE2EDuration="4.077183669s" podCreationTimestamp="2025-12-03 22:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:00.013506635 +0000 UTC m=+1045.233278662" watchObservedRunningTime="2025-12-03 22:28:00.077183669 +0000 UTC m=+1045.296955676" Dec 03 22:28:00.126758 master-0 kubenswrapper[36504]: I1203 22:28:00.126552 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.12651609 podStartE2EDuration="3.12651609s" 
podCreationTimestamp="2025-12-03 22:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:00.046895455 +0000 UTC m=+1045.266667472" watchObservedRunningTime="2025-12-03 22:28:00.12651609 +0000 UTC m=+1045.346288097" Dec 03 22:28:00.307244 master-0 kubenswrapper[36504]: I1203 22:28:00.307044 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:00.993682 master-0 kubenswrapper[36504]: I1203 22:28:00.993501 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerStarted","Data":"f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea"} Dec 03 22:28:00.997058 master-0 kubenswrapper[36504]: I1203 22:28:00.997019 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df","Type":"ContainerStarted","Data":"38cb0943a7e5d3013c44a9450a646d8483eb910274e21b5318223c637bc23680"} Dec 03 22:28:00.997058 master-0 kubenswrapper[36504]: I1203 22:28:00.997055 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df","Type":"ContainerStarted","Data":"f3b283c1050484812c4f93280c67a1b74a265be98b7d70424b8a992d0b2b64d3"} Dec 03 22:28:01.143456 master-0 kubenswrapper[36504]: I1203 22:28:01.143364 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acad37ed-16b9-4f18-a27d-5951a2ac0a1b" path="/var/lib/kubelet/pods/acad37ed-16b9-4f18-a27d-5951a2ac0a1b/volumes" Dec 03 22:28:01.878826 master-0 kubenswrapper[36504]: I1203 22:28:01.878399 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:01.881173 master-0 kubenswrapper[36504]: I1203 22:28:01.881087 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="ceilometer-central-agent" containerID="cri-o://ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d" gracePeriod=30 Dec 03 22:28:01.881258 master-0 kubenswrapper[36504]: I1203 22:28:01.881165 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="ceilometer-notification-agent" containerID="cri-o://a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214" gracePeriod=30 Dec 03 22:28:01.881298 master-0 kubenswrapper[36504]: I1203 22:28:01.881217 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="proxy-httpd" containerID="cri-o://1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296" gracePeriod=30 Dec 03 22:28:01.881422 master-0 kubenswrapper[36504]: I1203 22:28:01.881388 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="sg-core" containerID="cri-o://511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18" gracePeriod=30 Dec 03 22:28:01.886391 master-0 kubenswrapper[36504]: I1203 22:28:01.886318 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 22:28:02.071392 master-0 kubenswrapper[36504]: I1203 
22:28:02.070453 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df","Type":"ContainerStarted","Data":"77a5d364d521b3672d4e1b7973c8c11b8d59e9103eb2ce07e5e595bd0e9613eb"} Dec 03 22:28:02.089705 master-0 kubenswrapper[36504]: I1203 22:28:02.088602 36504 generic.go:334] "Generic (PLEG): container finished" podID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerID="511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18" exitCode=2 Dec 03 22:28:02.089705 master-0 kubenswrapper[36504]: I1203 22:28:02.088677 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerDied","Data":"511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18"} Dec 03 22:28:02.110859 master-0 kubenswrapper[36504]: I1203 22:28:02.107952 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.107837045 podStartE2EDuration="3.107837045s" podCreationTimestamp="2025-12-03 22:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:02.104171862 +0000 UTC m=+1047.323943869" watchObservedRunningTime="2025-12-03 22:28:02.107837045 +0000 UTC m=+1047.327609052" Dec 03 22:28:02.230334 master-0 kubenswrapper[36504]: I1203 22:28:02.230108 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.128.1.26:3000/\": dial tcp 10.128.1.26:3000: connect: connection refused" Dec 03 22:28:03.110443 master-0 kubenswrapper[36504]: I1203 22:28:03.110270 36504 generic.go:334] "Generic (PLEG): container finished" podID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerID="1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296" exitCode=0 Dec 03 22:28:03.110443 master-0 kubenswrapper[36504]: I1203 22:28:03.110331 36504 generic.go:334] "Generic (PLEG): container finished" podID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerID="ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d" exitCode=0 Dec 03 22:28:03.118196 master-0 kubenswrapper[36504]: I1203 22:28:03.118129 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerDied","Data":"1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296"} Dec 03 22:28:03.118196 master-0 kubenswrapper[36504]: I1203 22:28:03.118197 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerDied","Data":"ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d"} Dec 03 22:28:03.119489 master-0 kubenswrapper[36504]: I1203 22:28:03.118211 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerStarted","Data":"4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499"} Dec 03 22:28:03.286972 master-0 kubenswrapper[36504]: I1203 22:28:03.286863 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 22:28:03.570873 master-0 kubenswrapper[36504]: I1203 22:28:03.570808 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Dec 03 22:28:03.881603 master-0 kubenswrapper[36504]: I1203 22:28:03.880730 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:03.944734 master-0 kubenswrapper[36504]: I1203 22:28:03.944659 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-combined-ca-bundle\") pod \"cea2d208-6be3-4b2f-8624-79136866f5b4\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " Dec 03 22:28:03.945141 master-0 kubenswrapper[36504]: I1203 22:28:03.944881 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-config-data\") pod \"cea2d208-6be3-4b2f-8624-79136866f5b4\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " Dec 03 22:28:03.945141 master-0 kubenswrapper[36504]: I1203 22:28:03.944917 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-run-httpd\") pod \"cea2d208-6be3-4b2f-8624-79136866f5b4\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " Dec 03 22:28:03.945141 master-0 kubenswrapper[36504]: I1203 22:28:03.945108 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbzf2\" (UniqueName: \"kubernetes.io/projected/cea2d208-6be3-4b2f-8624-79136866f5b4-kube-api-access-jbzf2\") pod \"cea2d208-6be3-4b2f-8624-79136866f5b4\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " Dec 03 22:28:03.945286 master-0 kubenswrapper[36504]: I1203 22:28:03.945234 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-sg-core-conf-yaml\") pod \"cea2d208-6be3-4b2f-8624-79136866f5b4\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " Dec 03 22:28:03.945341 master-0 kubenswrapper[36504]: I1203 22:28:03.945321 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-scripts\") pod \"cea2d208-6be3-4b2f-8624-79136866f5b4\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " Dec 03 22:28:03.945405 master-0 kubenswrapper[36504]: I1203 22:28:03.945352 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-log-httpd\") pod \"cea2d208-6be3-4b2f-8624-79136866f5b4\" (UID: \"cea2d208-6be3-4b2f-8624-79136866f5b4\") " Dec 03 22:28:03.947015 master-0 kubenswrapper[36504]: I1203 22:28:03.946940 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cea2d208-6be3-4b2f-8624-79136866f5b4" (UID: "cea2d208-6be3-4b2f-8624-79136866f5b4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:03.951182 master-0 kubenswrapper[36504]: I1203 22:28:03.951112 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cea2d208-6be3-4b2f-8624-79136866f5b4" (UID: "cea2d208-6be3-4b2f-8624-79136866f5b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:03.960751 master-0 kubenswrapper[36504]: I1203 22:28:03.960677 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-scripts" (OuterVolumeSpecName: "scripts") pod "cea2d208-6be3-4b2f-8624-79136866f5b4" (UID: "cea2d208-6be3-4b2f-8624-79136866f5b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:03.977554 master-0 kubenswrapper[36504]: I1203 22:28:03.977397 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea2d208-6be3-4b2f-8624-79136866f5b4-kube-api-access-jbzf2" (OuterVolumeSpecName: "kube-api-access-jbzf2") pod "cea2d208-6be3-4b2f-8624-79136866f5b4" (UID: "cea2d208-6be3-4b2f-8624-79136866f5b4"). InnerVolumeSpecName "kube-api-access-jbzf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:04.005663 master-0 kubenswrapper[36504]: I1203 22:28:04.005561 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cea2d208-6be3-4b2f-8624-79136866f5b4" (UID: "cea2d208-6be3-4b2f-8624-79136866f5b4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:04.056815 master-0 kubenswrapper[36504]: I1203 22:28:04.051553 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:04.056815 master-0 kubenswrapper[36504]: I1203 22:28:04.051618 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbzf2\" (UniqueName: \"kubernetes.io/projected/cea2d208-6be3-4b2f-8624-79136866f5b4-kube-api-access-jbzf2\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:04.056815 master-0 kubenswrapper[36504]: I1203 22:28:04.051634 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:04.056815 master-0 kubenswrapper[36504]: I1203 22:28:04.051646 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:04.056815 master-0 kubenswrapper[36504]: I1203 22:28:04.051656 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea2d208-6be3-4b2f-8624-79136866f5b4-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:04.121987 master-0 kubenswrapper[36504]: I1203 22:28:04.121915 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cea2d208-6be3-4b2f-8624-79136866f5b4" (UID: "cea2d208-6be3-4b2f-8624-79136866f5b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:04.138677 master-0 kubenswrapper[36504]: I1203 22:28:04.138595 36504 generic.go:334] "Generic (PLEG): container finished" podID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerID="a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214" exitCode=0 Dec 03 22:28:04.139029 master-0 kubenswrapper[36504]: I1203 22:28:04.138660 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerDied","Data":"a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214"} Dec 03 22:28:04.139029 master-0 kubenswrapper[36504]: I1203 22:28:04.138739 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cea2d208-6be3-4b2f-8624-79136866f5b4","Type":"ContainerDied","Data":"cad6f657558b3e3f15fab1f51e907b23de08fe072efff8ccb6d25b33d25c3633"} Dec 03 22:28:04.139029 master-0 kubenswrapper[36504]: I1203 22:28:04.138746 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:04.139551 master-0 kubenswrapper[36504]: I1203 22:28:04.138761 36504 scope.go:117] "RemoveContainer" containerID="1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296" Dec 03 22:28:04.156418 master-0 kubenswrapper[36504]: I1203 22:28:04.155837 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:04.189816 master-0 kubenswrapper[36504]: I1203 22:28:04.189721 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-config-data" (OuterVolumeSpecName: "config-data") pod "cea2d208-6be3-4b2f-8624-79136866f5b4" (UID: "cea2d208-6be3-4b2f-8624-79136866f5b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:04.260388 master-0 kubenswrapper[36504]: I1203 22:28:04.260196 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea2d208-6be3-4b2f-8624-79136866f5b4-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:04.489679 master-0 kubenswrapper[36504]: I1203 22:28:04.486146 36504 scope.go:117] "RemoveContainer" containerID="511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18" Dec 03 22:28:04.495650 master-0 kubenswrapper[36504]: I1203 22:28:04.495587 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:04.519130 master-0 kubenswrapper[36504]: I1203 22:28:04.517601 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:04.550491 master-0 kubenswrapper[36504]: I1203 22:28:04.549039 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: E1203 22:28:04.556332 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="ceilometer-central-agent" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: I1203 22:28:04.556381 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="ceilometer-central-agent" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: E1203 22:28:04.556455 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="ceilometer-notification-agent" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: I1203 22:28:04.556463 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="ceilometer-notification-agent" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: E1203 22:28:04.556522 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="sg-core" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: I1203 22:28:04.556538 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="sg-core" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: E1203 22:28:04.556564 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="proxy-httpd" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: I1203 22:28:04.556573 36504 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="proxy-httpd" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: I1203 22:28:04.556904 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="ceilometer-central-agent" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: I1203 22:28:04.556922 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="sg-core" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: I1203 22:28:04.557017 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="proxy-httpd" Dec 03 22:28:04.561051 master-0 kubenswrapper[36504]: I1203 22:28:04.557034 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" containerName="ceilometer-notification-agent" Dec 03 22:28:04.575654 master-0 kubenswrapper[36504]: I1203 22:28:04.575590 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:04.585423 master-0 kubenswrapper[36504]: I1203 22:28:04.585361 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:28:04.585657 master-0 kubenswrapper[36504]: I1203 22:28:04.585550 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:28:04.592480 master-0 kubenswrapper[36504]: I1203 22:28:04.592420 36504 scope.go:117] "RemoveContainer" containerID="a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214" Dec 03 22:28:04.623970 master-0 kubenswrapper[36504]: I1203 22:28:04.621090 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:04.676939 master-0 kubenswrapper[36504]: I1203 22:28:04.673658 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.676939 master-0 kubenswrapper[36504]: I1203 22:28:04.673736 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-scripts\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.676939 master-0 kubenswrapper[36504]: I1203 22:28:04.673815 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbv7j\" (UniqueName: \"kubernetes.io/projected/ffda3a7c-4c18-4e58-b7cb-a20168389521-kube-api-access-xbv7j\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.676939 master-0 kubenswrapper[36504]: I1203 22:28:04.673841 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-config-data\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.676939 master-0 kubenswrapper[36504]: I1203 22:28:04.673877 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-log-httpd\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.676939 master-0 kubenswrapper[36504]: I1203 22:28:04.673977 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.676939 master-0 kubenswrapper[36504]: I1203 22:28:04.674061 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-run-httpd\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.694841 master-0 kubenswrapper[36504]: I1203 22:28:04.694086 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 22:28:04.694841 master-0 kubenswrapper[36504]: I1203 22:28:04.694183 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 22:28:04.780060 master-0 kubenswrapper[36504]: I1203 22:28:04.779872 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-config-data\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.780060 master-0 kubenswrapper[36504]: I1203 22:28:04.780041 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-log-httpd\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.780995 master-0 kubenswrapper[36504]: I1203 22:28:04.780970 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-log-httpd\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.782075 master-0 kubenswrapper[36504]: I1203 22:28:04.782041 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.784460 master-0 kubenswrapper[36504]: I1203 22:28:04.783337 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-run-httpd\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.784460 master-0 kubenswrapper[36504]: I1203 22:28:04.783544 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.784460 master-0 kubenswrapper[36504]: I1203 22:28:04.783599 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-scripts\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.784460 master-0 kubenswrapper[36504]: I1203 22:28:04.783918 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbv7j\" (UniqueName: \"kubernetes.io/projected/ffda3a7c-4c18-4e58-b7cb-a20168389521-kube-api-access-xbv7j\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.804045 master-0 kubenswrapper[36504]: I1203 22:28:04.796873 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-config-data\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.804045 master-0 kubenswrapper[36504]: I1203 22:28:04.797933 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.804045 master-0 kubenswrapper[36504]: I1203 22:28:04.799840 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.804045 master-0 kubenswrapper[36504]: I1203 22:28:04.800216 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-run-httpd\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.804045 master-0 kubenswrapper[36504]: I1203 22:28:04.803043 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-scripts\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.811965 master-0 kubenswrapper[36504]: I1203 22:28:04.811891 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbv7j\" (UniqueName: \"kubernetes.io/projected/ffda3a7c-4c18-4e58-b7cb-a20168389521-kube-api-access-xbv7j\") pod \"ceilometer-0\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " pod="openstack/ceilometer-0" Dec 03 22:28:04.884887 master-0 kubenswrapper[36504]: I1203 22:28:04.884805 36504 scope.go:117] "RemoveContainer" containerID="ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d" Dec 03 22:28:04.913229 master-0 kubenswrapper[36504]: I1203 22:28:04.910148 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:05.066804 master-0 kubenswrapper[36504]: I1203 22:28:05.061050 36504 scope.go:117] "RemoveContainer" containerID="1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296" Dec 03 22:28:05.066804 master-0 kubenswrapper[36504]: E1203 22:28:05.065609 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296\": container with ID starting with 1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296 not found: ID does not exist" containerID="1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296" Dec 03 22:28:05.066804 master-0 kubenswrapper[36504]: I1203 22:28:05.065677 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296"} err="failed to get container status \"1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296\": rpc error: code = NotFound desc = could not find container \"1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296\": container with ID starting with 1ed408bc1bacd37a1e82fa202b3614b150ef4481c285008e5292b20c3f167296 not found: ID does not exist" Dec 03 22:28:05.066804 master-0 kubenswrapper[36504]: I1203 22:28:05.065721 36504 scope.go:117] "RemoveContainer" containerID="511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18" Dec 03 22:28:05.077027 master-0 kubenswrapper[36504]: E1203 22:28:05.075105 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18\": container with ID starting with 511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18 not found: ID does not exist" containerID="511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18" Dec 03 22:28:05.077027 master-0 kubenswrapper[36504]: I1203 22:28:05.075179 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18"} err="failed to get container status \"511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18\": rpc error: code = NotFound desc = could not find container \"511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18\": container with ID starting with 511967f7405bedb426f89b578eb331fcfd32bd3a288ec705c694cc437a894e18 not found: ID does not exist" Dec 03 22:28:05.077027 master-0 kubenswrapper[36504]: I1203 22:28:05.075219 36504 scope.go:117] "RemoveContainer" containerID="a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214" Dec 03 22:28:05.078299 master-0 kubenswrapper[36504]: E1203 22:28:05.078202 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214\": container with ID starting with a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214 not found: ID does not exist" containerID="a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214" Dec 03 22:28:05.078299 master-0 kubenswrapper[36504]: I1203 22:28:05.078263 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214"} err="failed to get 
container status \"a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214\": rpc error: code = NotFound desc = could not find container \"a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214\": container with ID starting with a17f19bcc5f9707de6f5416f38d363ae80acd58a12f263b2f4c8749238c08214 not found: ID does not exist" Dec 03 22:28:05.078299 master-0 kubenswrapper[36504]: I1203 22:28:05.078297 36504 scope.go:117] "RemoveContainer" containerID="ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d" Dec 03 22:28:05.083810 master-0 kubenswrapper[36504]: E1203 22:28:05.080359 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d\": container with ID starting with ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d not found: ID does not exist" containerID="ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d" Dec 03 22:28:05.083810 master-0 kubenswrapper[36504]: I1203 22:28:05.080414 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d"} err="failed to get container status \"ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d\": rpc error: code = NotFound desc = could not find container \"ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d\": container with ID starting with ca8b642b6800790b6372b620555442774b62711ad763a907c8df5b7ed7ed829d not found: ID does not exist" Dec 03 22:28:05.186805 master-0 kubenswrapper[36504]: I1203 22:28:05.185801 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea2d208-6be3-4b2f-8624-79136866f5b4" path="/var/lib/kubelet/pods/cea2d208-6be3-4b2f-8624-79136866f5b4/volumes" Dec 03 22:28:05.809393 master-0 kubenswrapper[36504]: I1203 22:28:05.809269 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:06.279951 master-0 kubenswrapper[36504]: I1203 22:28:06.278766 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:06.289886 master-0 kubenswrapper[36504]: I1203 22:28:06.289147 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerStarted","Data":"e4721d99fdfbd70c2aa89b60cdc49b3312b6a5bbdccc6e2fc7a3b3d512c358c1"} Dec 03 22:28:06.302354 master-0 kubenswrapper[36504]: I1203 22:28:06.302238 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerStarted","Data":"8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48"} Dec 03 22:28:07.287510 master-0 kubenswrapper[36504]: I1203 22:28:07.287327 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:28:07.287510 master-0 kubenswrapper[36504]: I1203 22:28:07.287415 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:28:07.320361 master-0 kubenswrapper[36504]: I1203 22:28:07.320292 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerStarted","Data":"f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa"} Dec 03 22:28:07.320936 master-0 kubenswrapper[36504]: 
I1203 22:28:07.320910 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-api" containerID="cri-o://f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea" gracePeriod=30 Dec 03 22:28:07.321896 master-0 kubenswrapper[36504]: I1203 22:28:07.321871 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-listener" containerID="cri-o://f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa" gracePeriod=30 Dec 03 22:28:07.322102 master-0 kubenswrapper[36504]: I1203 22:28:07.322083 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-notifier" containerID="cri-o://8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48" gracePeriod=30 Dec 03 22:28:07.322309 master-0 kubenswrapper[36504]: I1203 22:28:07.322293 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-evaluator" containerID="cri-o://4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499" gracePeriod=30 Dec 03 22:28:07.333086 master-0 kubenswrapper[36504]: I1203 22:28:07.332679 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerStarted","Data":"fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be"} Dec 03 22:28:07.374204 master-0 kubenswrapper[36504]: I1203 22:28:07.373909 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.261713931 podStartE2EDuration="9.373873621s" podCreationTimestamp="2025-12-03 22:27:58 +0000 UTC" firstStartedPulling="2025-12-03 22:27:59.613263932 +0000 UTC m=+1044.833035939" lastFinishedPulling="2025-12-03 22:28:06.725423622 +0000 UTC m=+1051.945195629" observedRunningTime="2025-12-03 22:28:07.354126032 +0000 UTC m=+1052.573898039" watchObservedRunningTime="2025-12-03 22:28:07.373873621 +0000 UTC m=+1052.593645628" Dec 03 22:28:08.255353 master-0 kubenswrapper[36504]: I1203 22:28:08.255285 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 22:28:08.363926 master-0 kubenswrapper[36504]: I1203 22:28:08.362611 36504 generic.go:334] "Generic (PLEG): container finished" podID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerID="8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48" exitCode=0 Dec 03 22:28:08.363926 master-0 kubenswrapper[36504]: I1203 22:28:08.362664 36504 generic.go:334] "Generic (PLEG): container finished" podID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerID="4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499" exitCode=0 Dec 03 22:28:08.363926 master-0 kubenswrapper[36504]: I1203 22:28:08.362677 36504 generic.go:334] "Generic (PLEG): container finished" podID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerID="f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea" exitCode=0 Dec 03 22:28:08.363926 master-0 kubenswrapper[36504]: I1203 22:28:08.362713 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerDied","Data":"8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48"} Dec 03 22:28:08.363926 master-0 kubenswrapper[36504]: I1203 22:28:08.362824 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerDied","Data":"4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499"} Dec 03 22:28:08.363926 master-0 kubenswrapper[36504]: I1203 22:28:08.362843 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerDied","Data":"f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea"} Dec 03 22:28:08.366099 master-0 kubenswrapper[36504]: I1203 22:28:08.366000 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerStarted","Data":"40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0"} Dec 03 22:28:08.370103 master-0 kubenswrapper[36504]: I1203 22:28:08.370000 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.36:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:08.370897 master-0 kubenswrapper[36504]: I1203 22:28:08.370011 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.36:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:08.570343 master-0 kubenswrapper[36504]: I1203 22:28:08.570293 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 22:28:08.621394 master-0 kubenswrapper[36504]: I1203 22:28:08.621308 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 22:28:09.406801 master-0 kubenswrapper[36504]: I1203 22:28:09.405919 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerStarted","Data":"2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6"} Dec 03 22:28:09.457801 master-0 kubenswrapper[36504]: I1203 22:28:09.457523 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 22:28:09.694862 master-0 kubenswrapper[36504]: I1203 22:28:09.694587 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 22:28:09.694862 master-0 kubenswrapper[36504]: I1203 22:28:09.694654 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 22:28:10.710336 master-0 kubenswrapper[36504]: I1203 22:28:10.710131 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.40:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:10.710336 master-0 kubenswrapper[36504]: I1203 22:28:10.710159 36504 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.40:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:11.446046 master-0 kubenswrapper[36504]: I1203 22:28:11.445980 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerStarted","Data":"3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98"} Dec 03 22:28:11.446739 master-0 kubenswrapper[36504]: I1203 22:28:11.446708 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="ceilometer-central-agent" containerID="cri-o://fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be" gracePeriod=30 Dec 03 22:28:11.447191 master-0 kubenswrapper[36504]: I1203 22:28:11.447155 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:28:11.447266 master-0 kubenswrapper[36504]: I1203 22:28:11.447166 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="sg-core" containerID="cri-o://2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6" gracePeriod=30 Dec 03 22:28:11.447266 master-0 kubenswrapper[36504]: I1203 22:28:11.447210 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="proxy-httpd" containerID="cri-o://3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98" gracePeriod=30 Dec 03 22:28:11.447373 master-0 kubenswrapper[36504]: I1203 22:28:11.447267 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="ceilometer-notification-agent" containerID="cri-o://40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0" gracePeriod=30 Dec 03 22:28:11.485859 master-0 kubenswrapper[36504]: I1203 22:28:11.485736 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.150270195 podStartE2EDuration="7.485707711s" podCreationTimestamp="2025-12-03 22:28:04 +0000 UTC" firstStartedPulling="2025-12-03 22:28:05.817214603 +0000 UTC m=+1051.036986610" lastFinishedPulling="2025-12-03 22:28:10.152652119 +0000 UTC m=+1055.372424126" observedRunningTime="2025-12-03 22:28:11.485161494 +0000 UTC m=+1056.704933501" watchObservedRunningTime="2025-12-03 22:28:11.485707711 +0000 UTC m=+1056.705479718" Dec 03 22:28:12.487619 master-0 kubenswrapper[36504]: I1203 22:28:12.487525 36504 generic.go:334] "Generic (PLEG): container finished" podID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerID="3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98" exitCode=0 Dec 03 22:28:12.487619 master-0 kubenswrapper[36504]: I1203 22:28:12.487581 36504 generic.go:334] "Generic (PLEG): container finished" podID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerID="2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6" exitCode=2 Dec 03 22:28:12.487619 master-0 kubenswrapper[36504]: I1203 22:28:12.487591 36504 generic.go:334] "Generic (PLEG): container finished" 
podID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerID="40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0" exitCode=0 Dec 03 22:28:12.487619 master-0 kubenswrapper[36504]: I1203 22:28:12.487619 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerDied","Data":"3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98"} Dec 03 22:28:12.488522 master-0 kubenswrapper[36504]: I1203 22:28:12.487659 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerDied","Data":"2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6"} Dec 03 22:28:12.488522 master-0 kubenswrapper[36504]: I1203 22:28:12.487672 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerDied","Data":"40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0"} Dec 03 22:28:15.153470 master-0 kubenswrapper[36504]: I1203 22:28:15.153382 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:15.256054 master-0 kubenswrapper[36504]: I1203 22:28:15.255968 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbv7j\" (UniqueName: \"kubernetes.io/projected/ffda3a7c-4c18-4e58-b7cb-a20168389521-kube-api-access-xbv7j\") pod \"ffda3a7c-4c18-4e58-b7cb-a20168389521\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " Dec 03 22:28:15.256403 master-0 kubenswrapper[36504]: I1203 22:28:15.256175 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-sg-core-conf-yaml\") pod \"ffda3a7c-4c18-4e58-b7cb-a20168389521\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " Dec 03 22:28:15.256403 master-0 kubenswrapper[36504]: I1203 22:28:15.256343 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-log-httpd\") pod \"ffda3a7c-4c18-4e58-b7cb-a20168389521\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " Dec 03 22:28:15.256519 master-0 kubenswrapper[36504]: I1203 22:28:15.256490 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-scripts\") pod \"ffda3a7c-4c18-4e58-b7cb-a20168389521\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " Dec 03 22:28:15.256580 master-0 kubenswrapper[36504]: I1203 22:28:15.256549 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-run-httpd\") pod \"ffda3a7c-4c18-4e58-b7cb-a20168389521\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " Dec 03 22:28:15.256679 master-0 kubenswrapper[36504]: I1203 22:28:15.256649 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-combined-ca-bundle\") pod \"ffda3a7c-4c18-4e58-b7cb-a20168389521\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " Dec 03 22:28:15.256938 master-0 kubenswrapper[36504]: I1203 22:28:15.256905 36504 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-config-data\") pod \"ffda3a7c-4c18-4e58-b7cb-a20168389521\" (UID: \"ffda3a7c-4c18-4e58-b7cb-a20168389521\") " Dec 03 22:28:15.257291 master-0 kubenswrapper[36504]: I1203 22:28:15.257230 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ffda3a7c-4c18-4e58-b7cb-a20168389521" (UID: "ffda3a7c-4c18-4e58-b7cb-a20168389521"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:15.257357 master-0 kubenswrapper[36504]: I1203 22:28:15.257248 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ffda3a7c-4c18-4e58-b7cb-a20168389521" (UID: "ffda3a7c-4c18-4e58-b7cb-a20168389521"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:15.257942 master-0 kubenswrapper[36504]: I1203 22:28:15.257906 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:15.257942 master-0 kubenswrapper[36504]: I1203 22:28:15.257936 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ffda3a7c-4c18-4e58-b7cb-a20168389521-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:15.260610 master-0 kubenswrapper[36504]: I1203 22:28:15.260429 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffda3a7c-4c18-4e58-b7cb-a20168389521-kube-api-access-xbv7j" (OuterVolumeSpecName: "kube-api-access-xbv7j") pod "ffda3a7c-4c18-4e58-b7cb-a20168389521" (UID: "ffda3a7c-4c18-4e58-b7cb-a20168389521"). InnerVolumeSpecName "kube-api-access-xbv7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:15.261641 master-0 kubenswrapper[36504]: I1203 22:28:15.261572 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-scripts" (OuterVolumeSpecName: "scripts") pod "ffda3a7c-4c18-4e58-b7cb-a20168389521" (UID: "ffda3a7c-4c18-4e58-b7cb-a20168389521"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:15.292242 master-0 kubenswrapper[36504]: I1203 22:28:15.292172 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ffda3a7c-4c18-4e58-b7cb-a20168389521" (UID: "ffda3a7c-4c18-4e58-b7cb-a20168389521"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:15.359849 master-0 kubenswrapper[36504]: I1203 22:28:15.359748 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbv7j\" (UniqueName: \"kubernetes.io/projected/ffda3a7c-4c18-4e58-b7cb-a20168389521-kube-api-access-xbv7j\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:15.359849 master-0 kubenswrapper[36504]: I1203 22:28:15.359828 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:15.359849 master-0 kubenswrapper[36504]: I1203 22:28:15.359842 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:15.378995 master-0 kubenswrapper[36504]: I1203 22:28:15.378889 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffda3a7c-4c18-4e58-b7cb-a20168389521" (UID: "ffda3a7c-4c18-4e58-b7cb-a20168389521"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:15.392348 master-0 kubenswrapper[36504]: I1203 22:28:15.392286 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-config-data" (OuterVolumeSpecName: "config-data") pod "ffda3a7c-4c18-4e58-b7cb-a20168389521" (UID: "ffda3a7c-4c18-4e58-b7cb-a20168389521"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:15.464262 master-0 kubenswrapper[36504]: I1203 22:28:15.464169 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:15.464262 master-0 kubenswrapper[36504]: I1203 22:28:15.464229 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffda3a7c-4c18-4e58-b7cb-a20168389521-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:15.540659 master-0 kubenswrapper[36504]: I1203 22:28:15.540479 36504 generic.go:334] "Generic (PLEG): container finished" podID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerID="fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be" exitCode=0 Dec 03 22:28:15.540659 master-0 kubenswrapper[36504]: I1203 22:28:15.540539 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerDied","Data":"fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be"} Dec 03 22:28:15.540659 master-0 kubenswrapper[36504]: I1203 22:28:15.540603 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ffda3a7c-4c18-4e58-b7cb-a20168389521","Type":"ContainerDied","Data":"e4721d99fdfbd70c2aa89b60cdc49b3312b6a5bbdccc6e2fc7a3b3d512c358c1"} Dec 03 22:28:15.540659 master-0 kubenswrapper[36504]: I1203 22:28:15.540628 36504 scope.go:117] "RemoveContainer" containerID="3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98" Dec 03 22:28:15.541087 master-0 kubenswrapper[36504]: I1203 22:28:15.540653 36504 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:15.572407 master-0 kubenswrapper[36504]: I1203 22:28:15.572354 36504 scope.go:117] "RemoveContainer" containerID="2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6" Dec 03 22:28:15.605025 master-0 kubenswrapper[36504]: I1203 22:28:15.604932 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:15.611069 master-0 kubenswrapper[36504]: I1203 22:28:15.610709 36504 scope.go:117] "RemoveContainer" containerID="40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0" Dec 03 22:28:15.628895 master-0 kubenswrapper[36504]: I1203 22:28:15.628377 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:15.654954 master-0 kubenswrapper[36504]: I1203 22:28:15.654866 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:15.657873 master-0 kubenswrapper[36504]: E1203 22:28:15.656870 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="ceilometer-notification-agent" Dec 03 22:28:15.657873 master-0 kubenswrapper[36504]: I1203 22:28:15.656906 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="ceilometer-notification-agent" Dec 03 22:28:15.657873 master-0 kubenswrapper[36504]: E1203 22:28:15.656975 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="sg-core" Dec 03 22:28:15.657873 master-0 kubenswrapper[36504]: I1203 22:28:15.656985 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="sg-core" Dec 03 22:28:15.657873 master-0 kubenswrapper[36504]: E1203 22:28:15.657052 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="proxy-httpd" Dec 03 22:28:15.657873 master-0 kubenswrapper[36504]: I1203 22:28:15.657061 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="proxy-httpd" Dec 03 22:28:15.657873 master-0 kubenswrapper[36504]: E1203 22:28:15.657088 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="ceilometer-central-agent" Dec 03 22:28:15.657873 master-0 kubenswrapper[36504]: I1203 22:28:15.657097 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="ceilometer-central-agent" Dec 03 22:28:15.658487 master-0 kubenswrapper[36504]: I1203 22:28:15.657897 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="sg-core" Dec 03 22:28:15.658487 master-0 kubenswrapper[36504]: I1203 22:28:15.658020 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="ceilometer-central-agent" Dec 03 22:28:15.658487 master-0 kubenswrapper[36504]: I1203 22:28:15.658039 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" containerName="ceilometer-notification-agent" Dec 03 22:28:15.658487 master-0 kubenswrapper[36504]: I1203 22:28:15.658077 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" 
containerName="proxy-httpd" Dec 03 22:28:15.694491 master-0 kubenswrapper[36504]: I1203 22:28:15.694409 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:15.697924 master-0 kubenswrapper[36504]: I1203 22:28:15.697876 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:28:15.698192 master-0 kubenswrapper[36504]: I1203 22:28:15.698059 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:28:15.711680 master-0 kubenswrapper[36504]: I1203 22:28:15.711591 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:15.730068 master-0 kubenswrapper[36504]: I1203 22:28:15.729850 36504 scope.go:117] "RemoveContainer" containerID="fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be" Dec 03 22:28:15.761479 master-0 kubenswrapper[36504]: I1203 22:28:15.761431 36504 scope.go:117] "RemoveContainer" containerID="3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98" Dec 03 22:28:15.762148 master-0 kubenswrapper[36504]: E1203 22:28:15.762111 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98\": container with ID starting with 3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98 not found: ID does not exist" containerID="3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98" Dec 03 22:28:15.762209 master-0 kubenswrapper[36504]: I1203 22:28:15.762154 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98"} err="failed to get container status \"3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98\": rpc error: code = NotFound desc = could not find container \"3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98\": container with ID starting with 3d23f167af64999ec11012fe6627c5e9cd8e0984a301a242b3c32f2dc4a3ec98 not found: ID does not exist" Dec 03 22:28:15.762209 master-0 kubenswrapper[36504]: I1203 22:28:15.762188 36504 scope.go:117] "RemoveContainer" containerID="2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6" Dec 03 22:28:15.762835 master-0 kubenswrapper[36504]: E1203 22:28:15.762788 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6\": container with ID starting with 2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6 not found: ID does not exist" containerID="2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6" Dec 03 22:28:15.762980 master-0 kubenswrapper[36504]: I1203 22:28:15.762941 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6"} err="failed to get container status \"2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6\": rpc error: code = NotFound desc = could not find container \"2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6\": container with ID starting with 2b1f84ed21e3916c35cf9290c7acbe9bc634c5d03e73c0b9559033c1a33b1db6 not found: ID does not exist" Dec 03 22:28:15.763060 master-0 kubenswrapper[36504]: I1203 
22:28:15.763047 36504 scope.go:117] "RemoveContainer" containerID="40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0" Dec 03 22:28:15.763539 master-0 kubenswrapper[36504]: E1203 22:28:15.763497 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0\": container with ID starting with 40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0 not found: ID does not exist" containerID="40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0" Dec 03 22:28:15.763597 master-0 kubenswrapper[36504]: I1203 22:28:15.763556 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0"} err="failed to get container status \"40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0\": rpc error: code = NotFound desc = could not find container \"40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0\": container with ID starting with 40121e0327e5ae83d26c50c127496650e7039f59ebacbf07a7ee55ea8bd16cc0 not found: ID does not exist" Dec 03 22:28:15.763597 master-0 kubenswrapper[36504]: I1203 22:28:15.763580 36504 scope.go:117] "RemoveContainer" containerID="fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be" Dec 03 22:28:15.763981 master-0 kubenswrapper[36504]: E1203 22:28:15.763936 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be\": container with ID starting with fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be not found: ID does not exist" containerID="fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be" Dec 03 22:28:15.763981 master-0 kubenswrapper[36504]: I1203 22:28:15.763971 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be"} err="failed to get container status \"fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be\": rpc error: code = NotFound desc = could not find container \"fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be\": container with ID starting with fc2d3c4eac2ab669eddb09ba6304b9fb9dea4d91b43e45bd96c6b408efeca3be not found: ID does not exist" Dec 03 22:28:15.881945 master-0 kubenswrapper[36504]: I1203 22:28:15.881852 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-scripts\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.882383 master-0 kubenswrapper[36504]: I1203 22:28:15.882108 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.882383 master-0 kubenswrapper[36504]: I1203 22:28:15.882176 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-log-httpd\") pod 
\"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.882383 master-0 kubenswrapper[36504]: I1203 22:28:15.882208 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-config-data\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.882383 master-0 kubenswrapper[36504]: I1203 22:28:15.882249 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.882383 master-0 kubenswrapper[36504]: I1203 22:28:15.882275 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-run-httpd\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.882383 master-0 kubenswrapper[36504]: I1203 22:28:15.882309 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlcjn\" (UniqueName: \"kubernetes.io/projected/0ecc77c8-1691-47bc-a507-30c1415d3488-kube-api-access-dlcjn\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.985336 master-0 kubenswrapper[36504]: I1203 22:28:15.985240 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.985696 master-0 kubenswrapper[36504]: I1203 22:28:15.985387 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-log-httpd\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.985696 master-0 kubenswrapper[36504]: I1203 22:28:15.985439 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-config-data\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.985696 master-0 kubenswrapper[36504]: I1203 22:28:15.985494 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.985696 master-0 kubenswrapper[36504]: I1203 22:28:15.985538 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-run-httpd\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.985696 master-0 kubenswrapper[36504]: I1203 
22:28:15.985573 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlcjn\" (UniqueName: \"kubernetes.io/projected/0ecc77c8-1691-47bc-a507-30c1415d3488-kube-api-access-dlcjn\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.985696 master-0 kubenswrapper[36504]: I1203 22:28:15.985594 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-scripts\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.987058 master-0 kubenswrapper[36504]: I1203 22:28:15.986990 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-run-httpd\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.987251 master-0 kubenswrapper[36504]: I1203 22:28:15.987176 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-log-httpd\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.990825 master-0 kubenswrapper[36504]: I1203 22:28:15.990762 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.991219 master-0 kubenswrapper[36504]: I1203 22:28:15.991164 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-scripts\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.991336 master-0 kubenswrapper[36504]: I1203 22:28:15.991301 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-config-data\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:15.991712 master-0 kubenswrapper[36504]: I1203 22:28:15.991675 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:16.005322 master-0 kubenswrapper[36504]: I1203 22:28:16.005258 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlcjn\" (UniqueName: \"kubernetes.io/projected/0ecc77c8-1691-47bc-a507-30c1415d3488-kube-api-access-dlcjn\") pod \"ceilometer-0\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " pod="openstack/ceilometer-0" Dec 03 22:28:16.027551 master-0 kubenswrapper[36504]: I1203 22:28:16.027465 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:16.558897 master-0 kubenswrapper[36504]: I1203 22:28:16.558559 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:16.568074 master-0 kubenswrapper[36504]: W1203 22:28:16.567977 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ecc77c8_1691_47bc_a507_30c1415d3488.slice/crio-e5971531ff7da38e1d6b974b7efd6b258b63f9f7b2594625e4f8c582623412cd WatchSource:0}: Error finding container e5971531ff7da38e1d6b974b7efd6b258b63f9f7b2594625e4f8c582623412cd: Status 404 returned error can't find the container with id e5971531ff7da38e1d6b974b7efd6b258b63f9f7b2594625e4f8c582623412cd Dec 03 22:28:17.118644 master-0 kubenswrapper[36504]: I1203 22:28:17.118572 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffda3a7c-4c18-4e58-b7cb-a20168389521" path="/var/lib/kubelet/pods/ffda3a7c-4c18-4e58-b7cb-a20168389521/volumes" Dec 03 22:28:17.293147 master-0 kubenswrapper[36504]: I1203 22:28:17.292926 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 22:28:17.293601 master-0 kubenswrapper[36504]: I1203 22:28:17.293268 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 22:28:17.294244 master-0 kubenswrapper[36504]: I1203 22:28:17.294218 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 22:28:17.294244 master-0 kubenswrapper[36504]: I1203 22:28:17.294240 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 22:28:17.297084 master-0 kubenswrapper[36504]: I1203 22:28:17.297051 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 22:28:17.297693 master-0 kubenswrapper[36504]: I1203 22:28:17.297660 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 22:28:17.857948 master-0 kubenswrapper[36504]: I1203 22:28:17.789862 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55b7c8d469-2kcr9"] Dec 03 22:28:17.857948 master-0 kubenswrapper[36504]: I1203 22:28:17.792728 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:17.857948 master-0 kubenswrapper[36504]: I1203 22:28:17.804233 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerStarted","Data":"514d4bb428ed1bc8562870409baee9eb4590227595332dc310b9ada9d52e2210"} Dec 03 22:28:17.857948 master-0 kubenswrapper[36504]: I1203 22:28:17.804310 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerStarted","Data":"e5971531ff7da38e1d6b974b7efd6b258b63f9f7b2594625e4f8c582623412cd"} Dec 03 22:28:17.868821 master-0 kubenswrapper[36504]: I1203 22:28:17.866257 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b7c8d469-2kcr9"] Dec 03 22:28:17.955442 master-0 kubenswrapper[36504]: I1203 22:28:17.951250 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-ovsdbserver-sb\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:17.955442 master-0 kubenswrapper[36504]: I1203 22:28:17.951363 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-config\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:17.955442 master-0 kubenswrapper[36504]: I1203 22:28:17.951616 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-dns-swift-storage-0\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:17.955442 master-0 kubenswrapper[36504]: I1203 22:28:17.951943 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-ovsdbserver-nb\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:17.955442 master-0 kubenswrapper[36504]: I1203 22:28:17.952003 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2ft\" (UniqueName: \"kubernetes.io/projected/a011963b-fb76-469e-bff2-9c9aa715ebc7-kube-api-access-pt2ft\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:17.955442 master-0 kubenswrapper[36504]: I1203 22:28:17.952274 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-dns-svc\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:17.955442 master-0 kubenswrapper[36504]: I1203 22:28:17.952409 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-edpm\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.091895 master-0 kubenswrapper[36504]: I1203 22:28:18.091358 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-ovsdbserver-nb\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.092956 master-0 kubenswrapper[36504]: I1203 22:28:18.092286 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2ft\" (UniqueName: \"kubernetes.io/projected/a011963b-fb76-469e-bff2-9c9aa715ebc7-kube-api-access-pt2ft\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.097697 master-0 kubenswrapper[36504]: I1203 22:28:18.097608 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-dns-svc\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.098129 master-0 kubenswrapper[36504]: I1203 22:28:18.098106 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-edpm\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.098348 master-0 kubenswrapper[36504]: I1203 22:28:18.098307 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-ovsdbserver-nb\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.098707 master-0 kubenswrapper[36504]: I1203 22:28:18.098687 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-ovsdbserver-sb\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.098834 master-0 kubenswrapper[36504]: I1203 22:28:18.098820 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-config\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.099740 master-0 kubenswrapper[36504]: I1203 22:28:18.099274 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-edpm\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.099740 master-0 kubenswrapper[36504]: I1203 22:28:18.099360 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-dns-swift-storage-0\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.099740 master-0 kubenswrapper[36504]: I1203 22:28:18.099626 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-dns-svc\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.100300 master-0 kubenswrapper[36504]: I1203 22:28:18.100266 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-dns-swift-storage-0\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.100931 master-0 kubenswrapper[36504]: I1203 22:28:18.100887 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-ovsdbserver-sb\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.101380 master-0 kubenswrapper[36504]: I1203 22:28:18.101356 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a011963b-fb76-469e-bff2-9c9aa715ebc7-config\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.115265 master-0 kubenswrapper[36504]: I1203 22:28:18.115224 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2ft\" (UniqueName: \"kubernetes.io/projected/a011963b-fb76-469e-bff2-9c9aa715ebc7-kube-api-access-pt2ft\") pod \"dnsmasq-dns-55b7c8d469-2kcr9\" (UID: \"a011963b-fb76-469e-bff2-9c9aa715ebc7\") " pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.179808 master-0 kubenswrapper[36504]: I1203 22:28:18.179705 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:18.794638 master-0 kubenswrapper[36504]: I1203 22:28:18.794509 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b7c8d469-2kcr9"] Dec 03 22:28:18.828103 master-0 kubenswrapper[36504]: I1203 22:28:18.827968 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" event={"ID":"a011963b-fb76-469e-bff2-9c9aa715ebc7","Type":"ContainerStarted","Data":"e055829e00292206658a7601bf94ccdbd64643c506b7caf6f531d3de37f6c338"} Dec 03 22:28:18.844455 master-0 kubenswrapper[36504]: I1203 22:28:18.844384 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerStarted","Data":"c84dd56063e6e24217b7bb562b8223389d0be60b42b1e90381fe11f821e62b9f"} Dec 03 22:28:19.700494 master-0 kubenswrapper[36504]: I1203 22:28:19.700416 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 22:28:19.701489 master-0 kubenswrapper[36504]: I1203 22:28:19.701396 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 22:28:19.706187 master-0 kubenswrapper[36504]: I1203 22:28:19.706123 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 22:28:19.873963 master-0 kubenswrapper[36504]: I1203 22:28:19.872890 36504 generic.go:334] "Generic (PLEG): container finished" podID="a011963b-fb76-469e-bff2-9c9aa715ebc7" containerID="cf34644bcb8408896f39d8d96888d5f7890ffe0234fc84927d1e57e175459b33" exitCode=0 Dec 03 22:28:19.876613 master-0 kubenswrapper[36504]: I1203 22:28:19.874903 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" event={"ID":"a011963b-fb76-469e-bff2-9c9aa715ebc7","Type":"ContainerDied","Data":"cf34644bcb8408896f39d8d96888d5f7890ffe0234fc84927d1e57e175459b33"} Dec 03 22:28:19.889878 master-0 kubenswrapper[36504]: I1203 22:28:19.889735 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerStarted","Data":"a46b7902b80998be3b8d804088550ce9bb88118b86b9f62957911df7f8b2839a"} Dec 03 22:28:19.956308 master-0 kubenswrapper[36504]: I1203 22:28:19.956130 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 22:28:20.666600 master-0 kubenswrapper[36504]: I1203 22:28:20.666506 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:20.918458 master-0 kubenswrapper[36504]: I1203 22:28:20.918292 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:20.920093 master-0 kubenswrapper[36504]: I1203 22:28:20.919802 36504 generic.go:334] "Generic (PLEG): container finished" podID="4f60ec5c-4977-4191-a3a1-399d0859e1f7" containerID="5ab7979fd89d941a5c3e5b09e4170705d1c54e3de7e98cfb055fb1fdaad5203c" exitCode=137 Dec 03 22:28:20.920371 master-0 kubenswrapper[36504]: I1203 22:28:20.919914 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f60ec5c-4977-4191-a3a1-399d0859e1f7","Type":"ContainerDied","Data":"5ab7979fd89d941a5c3e5b09e4170705d1c54e3de7e98cfb055fb1fdaad5203c"} Dec 03 22:28:20.920452 master-0 kubenswrapper[36504]: I1203 22:28:20.920419 36504 scope.go:117] "RemoveContainer" containerID="5ab7979fd89d941a5c3e5b09e4170705d1c54e3de7e98cfb055fb1fdaad5203c" Dec 03 22:28:20.931638 master-0 kubenswrapper[36504]: I1203 22:28:20.931588 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" event={"ID":"a011963b-fb76-469e-bff2-9c9aa715ebc7","Type":"ContainerStarted","Data":"da5e037cdb3e9e85156a6d7df18958498240e45048bfd132cd97ea3a7876f750"} Dec 03 22:28:21.014798 master-0 kubenswrapper[36504]: I1203 22:28:21.012395 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccx6\" (UniqueName: \"kubernetes.io/projected/4f60ec5c-4977-4191-a3a1-399d0859e1f7-kube-api-access-6ccx6\") pod \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " Dec 03 22:28:21.014798 master-0 kubenswrapper[36504]: I1203 22:28:21.012658 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-combined-ca-bundle\") pod \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " Dec 03 22:28:21.014798 master-0 kubenswrapper[36504]: I1203 22:28:21.012900 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-config-data\") pod \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\" (UID: \"4f60ec5c-4977-4191-a3a1-399d0859e1f7\") " Dec 03 22:28:21.035558 master-0 kubenswrapper[36504]: I1203 22:28:21.035444 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f60ec5c-4977-4191-a3a1-399d0859e1f7-kube-api-access-6ccx6" (OuterVolumeSpecName: "kube-api-access-6ccx6") pod "4f60ec5c-4977-4191-a3a1-399d0859e1f7" (UID: "4f60ec5c-4977-4191-a3a1-399d0859e1f7"). InnerVolumeSpecName "kube-api-access-6ccx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:21.090586 master-0 kubenswrapper[36504]: I1203 22:28:21.090463 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-config-data" (OuterVolumeSpecName: "config-data") pod "4f60ec5c-4977-4191-a3a1-399d0859e1f7" (UID: "4f60ec5c-4977-4191-a3a1-399d0859e1f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:21.095704 master-0 kubenswrapper[36504]: I1203 22:28:21.095281 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" podStartSLOduration=4.095249731 podStartE2EDuration="4.095249731s" podCreationTimestamp="2025-12-03 22:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:21.031287099 +0000 UTC m=+1066.251059106" watchObservedRunningTime="2025-12-03 22:28:21.095249731 +0000 UTC m=+1066.315021728" Dec 03 22:28:21.132815 master-0 kubenswrapper[36504]: I1203 22:28:21.132334 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:21.132815 master-0 kubenswrapper[36504]: I1203 22:28:21.132442 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccx6\" (UniqueName: \"kubernetes.io/projected/4f60ec5c-4977-4191-a3a1-399d0859e1f7-kube-api-access-6ccx6\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:21.172863 master-0 kubenswrapper[36504]: I1203 22:28:21.172658 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:21.173264 master-0 kubenswrapper[36504]: I1203 22:28:21.173189 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-log" containerID="cri-o://43c1a5eeadb83673a4c10c55d3907be3b11a2d08b50cc6cf87ee9c72533cd2e2" gracePeriod=30 Dec 03 22:28:21.174153 master-0 kubenswrapper[36504]: I1203 22:28:21.174126 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-api" containerID="cri-o://fe30345ae2400975eb7a58519365c17bdd6cc1822b87000eba394aa53e15d337" gracePeriod=30 Dec 03 22:28:21.242591 master-0 kubenswrapper[36504]: I1203 22:28:21.242044 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f60ec5c-4977-4191-a3a1-399d0859e1f7" (UID: "4f60ec5c-4977-4191-a3a1-399d0859e1f7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:21.339241 master-0 kubenswrapper[36504]: I1203 22:28:21.339159 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f60ec5c-4977-4191-a3a1-399d0859e1f7-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:21.948465 master-0 kubenswrapper[36504]: I1203 22:28:21.948373 36504 generic.go:334] "Generic (PLEG): container finished" podID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerID="43c1a5eeadb83673a4c10c55d3907be3b11a2d08b50cc6cf87ee9c72533cd2e2" exitCode=143 Dec 03 22:28:21.949375 master-0 kubenswrapper[36504]: I1203 22:28:21.948473 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6490dcf0-f493-4c1f-8a1c-38395f86f4fa","Type":"ContainerDied","Data":"43c1a5eeadb83673a4c10c55d3907be3b11a2d08b50cc6cf87ee9c72533cd2e2"} Dec 03 22:28:21.952256 master-0 kubenswrapper[36504]: I1203 22:28:21.952196 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4f60ec5c-4977-4191-a3a1-399d0859e1f7","Type":"ContainerDied","Data":"108286f14b2d345aabe20ed9c1d9ce8508ebdbed73c6f56f3ef0e27a91e0f456"} Dec 03 22:28:21.952393 master-0 kubenswrapper[36504]: I1203 22:28:21.952295 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:21.956987 master-0 kubenswrapper[36504]: I1203 22:28:21.956893 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerStarted","Data":"b4f4f210a7bdc9624d173465590457fe2a80d300256d8ba731b1a2f15399f128"} Dec 03 22:28:21.957129 master-0 kubenswrapper[36504]: I1203 22:28:21.957055 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="sg-core" containerID="cri-o://a46b7902b80998be3b8d804088550ce9bb88118b86b9f62957911df7f8b2839a" gracePeriod=30 Dec 03 22:28:21.957129 master-0 kubenswrapper[36504]: I1203 22:28:21.957060 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="proxy-httpd" containerID="cri-o://b4f4f210a7bdc9624d173465590457fe2a80d300256d8ba731b1a2f15399f128" gracePeriod=30 Dec 03 22:28:21.957216 master-0 kubenswrapper[36504]: I1203 22:28:21.957088 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="ceilometer-notification-agent" containerID="cri-o://c84dd56063e6e24217b7bb562b8223389d0be60b42b1e90381fe11f821e62b9f" gracePeriod=30 Dec 03 22:28:21.957216 master-0 kubenswrapper[36504]: I1203 22:28:21.957070 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="ceilometer-central-agent" containerID="cri-o://514d4bb428ed1bc8562870409baee9eb4590227595332dc310b9ada9d52e2210" gracePeriod=30 Dec 03 22:28:21.957296 master-0 kubenswrapper[36504]: I1203 22:28:21.957258 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:28:21.958864 master-0 kubenswrapper[36504]: I1203 22:28:21.958833 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:22.008411 master-0 kubenswrapper[36504]: I1203 22:28:22.008287 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.982775402 podStartE2EDuration="7.008252259s" podCreationTimestamp="2025-12-03 22:28:15 +0000 UTC" firstStartedPulling="2025-12-03 22:28:16.572370825 +0000 UTC m=+1061.792142832" lastFinishedPulling="2025-12-03 22:28:20.597847682 +0000 UTC m=+1065.817619689" observedRunningTime="2025-12-03 22:28:22.000077417 +0000 UTC m=+1067.219849434" watchObservedRunningTime="2025-12-03 22:28:22.008252259 +0000 UTC m=+1067.228024266" Dec 03 22:28:22.048846 master-0 kubenswrapper[36504]: I1203 22:28:22.041007 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:28:22.061928 master-0 kubenswrapper[36504]: I1203 22:28:22.060592 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:28:22.141279 master-0 kubenswrapper[36504]: I1203 22:28:22.141205 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:28:22.159321 master-0 kubenswrapper[36504]: E1203 22:28:22.159220 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f60ec5c-4977-4191-a3a1-399d0859e1f7" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 22:28:22.159321 master-0 kubenswrapper[36504]: I1203 22:28:22.159306 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f60ec5c-4977-4191-a3a1-399d0859e1f7" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 22:28:22.160168 master-0 kubenswrapper[36504]: I1203 22:28:22.160130 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f60ec5c-4977-4191-a3a1-399d0859e1f7" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 22:28:22.161691 master-0 kubenswrapper[36504]: I1203 22:28:22.161655 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.167546 master-0 kubenswrapper[36504]: I1203 22:28:22.167495 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 22:28:22.168021 master-0 kubenswrapper[36504]: I1203 22:28:22.167903 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 22:28:22.168277 master-0 kubenswrapper[36504]: I1203 22:28:22.168245 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 22:28:22.191343 master-0 kubenswrapper[36504]: I1203 22:28:22.191251 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:28:22.282478 master-0 kubenswrapper[36504]: I1203 22:28:22.278754 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.282478 master-0 kubenswrapper[36504]: I1203 22:28:22.279144 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.282478 master-0 kubenswrapper[36504]: I1203 22:28:22.279294 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.282478 master-0 kubenswrapper[36504]: I1203 22:28:22.279422 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg85r\" (UniqueName: \"kubernetes.io/projected/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-kube-api-access-xg85r\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.282478 master-0 kubenswrapper[36504]: I1203 22:28:22.279552 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.382920 master-0 kubenswrapper[36504]: I1203 22:28:22.382857 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.383302 master-0 kubenswrapper[36504]: I1203 22:28:22.383284 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.383429 master-0 kubenswrapper[36504]: I1203 22:28:22.383414 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.383533 master-0 kubenswrapper[36504]: I1203 22:28:22.383517 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.383691 master-0 kubenswrapper[36504]: I1203 22:28:22.383670 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg85r\" (UniqueName: \"kubernetes.io/projected/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-kube-api-access-xg85r\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.391541 master-0 kubenswrapper[36504]: I1203 22:28:22.391439 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.391541 master-0 kubenswrapper[36504]: I1203 22:28:22.391528 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.391912 master-0 kubenswrapper[36504]: I1203 22:28:22.391552 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.393197 master-0 kubenswrapper[36504]: I1203 22:28:22.393048 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.410685 master-0 kubenswrapper[36504]: I1203 22:28:22.410631 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg85r\" (UniqueName: \"kubernetes.io/projected/27c676a0-ccb5-4a1d-af3a-be2ef8155dc8-kube-api-access-xg85r\") pod \"nova-cell1-novncproxy-0\" (UID: \"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:22.567862 master-0 kubenswrapper[36504]: I1203 22:28:22.567804 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:23.006932 master-0 kubenswrapper[36504]: I1203 22:28:23.006841 36504 generic.go:334] "Generic (PLEG): container finished" podID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerID="b4f4f210a7bdc9624d173465590457fe2a80d300256d8ba731b1a2f15399f128" exitCode=0 Dec 03 22:28:23.006932 master-0 kubenswrapper[36504]: I1203 22:28:23.006912 36504 generic.go:334] "Generic (PLEG): container finished" podID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerID="a46b7902b80998be3b8d804088550ce9bb88118b86b9f62957911df7f8b2839a" exitCode=2 Dec 03 22:28:23.006932 master-0 kubenswrapper[36504]: I1203 22:28:23.006926 36504 generic.go:334] "Generic (PLEG): container finished" podID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerID="c84dd56063e6e24217b7bb562b8223389d0be60b42b1e90381fe11f821e62b9f" exitCode=0 Dec 03 22:28:23.009247 master-0 kubenswrapper[36504]: I1203 22:28:23.009174 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerDied","Data":"b4f4f210a7bdc9624d173465590457fe2a80d300256d8ba731b1a2f15399f128"} Dec 03 22:28:23.009247 master-0 kubenswrapper[36504]: I1203 22:28:23.009248 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerDied","Data":"a46b7902b80998be3b8d804088550ce9bb88118b86b9f62957911df7f8b2839a"} Dec 03 22:28:23.009391 master-0 kubenswrapper[36504]: I1203 22:28:23.009262 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerDied","Data":"c84dd56063e6e24217b7bb562b8223389d0be60b42b1e90381fe11f821e62b9f"} Dec 03 22:28:23.116271 master-0 kubenswrapper[36504]: I1203 22:28:23.115611 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f60ec5c-4977-4191-a3a1-399d0859e1f7" path="/var/lib/kubelet/pods/4f60ec5c-4977-4191-a3a1-399d0859e1f7/volumes" Dec 03 22:28:23.140834 master-0 kubenswrapper[36504]: I1203 22:28:23.140391 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 22:28:24.027594 master-0 kubenswrapper[36504]: I1203 22:28:24.025549 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8","Type":"ContainerStarted","Data":"849c4bf806877932c06a9d747cebf99c579e63c79e8a5aead8e6a329a21ec14e"} Dec 03 22:28:24.027594 master-0 kubenswrapper[36504]: I1203 22:28:24.025645 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"27c676a0-ccb5-4a1d-af3a-be2ef8155dc8","Type":"ContainerStarted","Data":"0e22e3140b43b7c11b4c30247799d538fc985e296a99347f564b74931c5dadcb"} Dec 03 22:28:24.054906 master-0 kubenswrapper[36504]: I1203 22:28:24.053922 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.053886427 podStartE2EDuration="2.053886427s" podCreationTimestamp="2025-12-03 22:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:24.050243664 +0000 UTC m=+1069.270015681" watchObservedRunningTime="2025-12-03 22:28:24.053886427 +0000 UTC m=+1069.273658434" Dec 03 22:28:25.051209 master-0 kubenswrapper[36504]: I1203 22:28:25.047699 36504 
generic.go:334] "Generic (PLEG): container finished" podID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerID="fe30345ae2400975eb7a58519365c17bdd6cc1822b87000eba394aa53e15d337" exitCode=0 Dec 03 22:28:25.051209 master-0 kubenswrapper[36504]: I1203 22:28:25.047797 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6490dcf0-f493-4c1f-8a1c-38395f86f4fa","Type":"ContainerDied","Data":"fe30345ae2400975eb7a58519365c17bdd6cc1822b87000eba394aa53e15d337"} Dec 03 22:28:25.280315 master-0 kubenswrapper[36504]: I1203 22:28:25.280198 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:25.377593 master-0 kubenswrapper[36504]: I1203 22:28:25.377529 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-logs\") pod \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " Dec 03 22:28:25.378381 master-0 kubenswrapper[36504]: I1203 22:28:25.378246 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-logs" (OuterVolumeSpecName: "logs") pod "6490dcf0-f493-4c1f-8a1c-38395f86f4fa" (UID: "6490dcf0-f493-4c1f-8a1c-38395f86f4fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:25.378598 master-0 kubenswrapper[36504]: I1203 22:28:25.378575 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98wsh\" (UniqueName: \"kubernetes.io/projected/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-kube-api-access-98wsh\") pod \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " Dec 03 22:28:25.378906 master-0 kubenswrapper[36504]: I1203 22:28:25.378881 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-config-data\") pod \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " Dec 03 22:28:25.380048 master-0 kubenswrapper[36504]: I1203 22:28:25.380017 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-combined-ca-bundle\") pod \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\" (UID: \"6490dcf0-f493-4c1f-8a1c-38395f86f4fa\") " Dec 03 22:28:25.392430 master-0 kubenswrapper[36504]: I1203 22:28:25.389588 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-kube-api-access-98wsh" (OuterVolumeSpecName: "kube-api-access-98wsh") pod "6490dcf0-f493-4c1f-8a1c-38395f86f4fa" (UID: "6490dcf0-f493-4c1f-8a1c-38395f86f4fa"). InnerVolumeSpecName "kube-api-access-98wsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:25.397453 master-0 kubenswrapper[36504]: I1203 22:28:25.396913 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98wsh\" (UniqueName: \"kubernetes.io/projected/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-kube-api-access-98wsh\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:25.397453 master-0 kubenswrapper[36504]: I1203 22:28:25.397290 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:25.419322 master-0 kubenswrapper[36504]: I1203 22:28:25.419237 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-config-data" (OuterVolumeSpecName: "config-data") pod "6490dcf0-f493-4c1f-8a1c-38395f86f4fa" (UID: "6490dcf0-f493-4c1f-8a1c-38395f86f4fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:25.425637 master-0 kubenswrapper[36504]: I1203 22:28:25.425591 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6490dcf0-f493-4c1f-8a1c-38395f86f4fa" (UID: "6490dcf0-f493-4c1f-8a1c-38395f86f4fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:25.502379 master-0 kubenswrapper[36504]: I1203 22:28:25.501340 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:25.502379 master-0 kubenswrapper[36504]: I1203 22:28:25.501406 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6490dcf0-f493-4c1f-8a1c-38395f86f4fa-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:26.073438 master-0 kubenswrapper[36504]: I1203 22:28:26.073336 36504 generic.go:334] "Generic (PLEG): container finished" podID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerID="514d4bb428ed1bc8562870409baee9eb4590227595332dc310b9ada9d52e2210" exitCode=0 Dec 03 22:28:26.074081 master-0 kubenswrapper[36504]: I1203 22:28:26.073427 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerDied","Data":"514d4bb428ed1bc8562870409baee9eb4590227595332dc310b9ada9d52e2210"} Dec 03 22:28:26.076462 master-0 kubenswrapper[36504]: I1203 22:28:26.076420 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6490dcf0-f493-4c1f-8a1c-38395f86f4fa","Type":"ContainerDied","Data":"e1481c0cce24ae41099a6a589d471b5bc5beb77269614807796be7f03b87f399"} Dec 03 22:28:26.076531 master-0 kubenswrapper[36504]: I1203 22:28:26.076473 36504 scope.go:117] "RemoveContainer" containerID="fe30345ae2400975eb7a58519365c17bdd6cc1822b87000eba394aa53e15d337" Dec 03 22:28:26.076531 master-0 kubenswrapper[36504]: I1203 22:28:26.076501 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:26.138217 master-0 kubenswrapper[36504]: I1203 22:28:26.134676 36504 scope.go:117] "RemoveContainer" containerID="43c1a5eeadb83673a4c10c55d3907be3b11a2d08b50cc6cf87ee9c72533cd2e2" Dec 03 22:28:26.162335 master-0 kubenswrapper[36504]: I1203 22:28:26.162074 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:26.207031 master-0 kubenswrapper[36504]: I1203 22:28:26.206197 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:26.227676 master-0 kubenswrapper[36504]: I1203 22:28:26.224739 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:26.227676 master-0 kubenswrapper[36504]: E1203 22:28:26.225697 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-api" Dec 03 22:28:26.227676 master-0 kubenswrapper[36504]: I1203 22:28:26.225720 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-api" Dec 03 22:28:26.227676 master-0 kubenswrapper[36504]: E1203 22:28:26.225748 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-log" Dec 03 22:28:26.227676 master-0 kubenswrapper[36504]: I1203 22:28:26.225754 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-log" Dec 03 22:28:26.227676 master-0 kubenswrapper[36504]: I1203 22:28:26.226089 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-api" Dec 03 22:28:26.227676 master-0 kubenswrapper[36504]: I1203 22:28:26.226126 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" containerName="nova-api-log" Dec 03 22:28:26.229880 master-0 kubenswrapper[36504]: I1203 22:28:26.229297 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:26.234785 master-0 kubenswrapper[36504]: I1203 22:28:26.233954 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 22:28:26.235839 master-0 kubenswrapper[36504]: I1203 22:28:26.235001 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 22:28:26.235839 master-0 kubenswrapper[36504]: I1203 22:28:26.235065 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 22:28:26.242426 master-0 kubenswrapper[36504]: I1203 22:28:26.239133 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:26.369009 master-0 kubenswrapper[36504]: I1203 22:28:26.368941 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.369280 master-0 kubenswrapper[36504]: I1203 22:28:26.369049 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjspj\" (UniqueName: \"kubernetes.io/projected/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-kube-api-access-kjspj\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.369339 master-0 kubenswrapper[36504]: I1203 22:28:26.369311 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-public-tls-certs\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.369386 master-0 kubenswrapper[36504]: I1203 22:28:26.369371 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-config-data\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.369547 master-0 kubenswrapper[36504]: I1203 22:28:26.369519 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.369968 master-0 kubenswrapper[36504]: I1203 22:28:26.369936 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-logs\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.474892 master-0 kubenswrapper[36504]: I1203 22:28:26.474377 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-public-tls-certs\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.474892 master-0 kubenswrapper[36504]: I1203 22:28:26.474454 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-config-data\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.474892 master-0 kubenswrapper[36504]: I1203 22:28:26.474526 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.474892 master-0 kubenswrapper[36504]: I1203 22:28:26.474648 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-logs\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.474892 master-0 kubenswrapper[36504]: I1203 22:28:26.474719 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.474892 master-0 kubenswrapper[36504]: I1203 22:28:26.474740 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjspj\" (UniqueName: \"kubernetes.io/projected/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-kube-api-access-kjspj\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.478788 master-0 kubenswrapper[36504]: I1203 22:28:26.475683 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-logs\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.478892 master-0 kubenswrapper[36504]: I1203 22:28:26.478829 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.482789 master-0 kubenswrapper[36504]: I1203 22:28:26.479267 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-public-tls-certs\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.482789 master-0 kubenswrapper[36504]: I1203 22:28:26.481716 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.488793 master-0 kubenswrapper[36504]: I1203 22:28:26.488444 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-config-data\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.505181 master-0 kubenswrapper[36504]: 
I1203 22:28:26.503941 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjspj\" (UniqueName: \"kubernetes.io/projected/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-kube-api-access-kjspj\") pod \"nova-api-0\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " pod="openstack/nova-api-0" Dec 03 22:28:26.556488 master-0 kubenswrapper[36504]: I1203 22:28:26.556412 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:26.706623 master-0 kubenswrapper[36504]: I1203 22:28:26.706558 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:26.802680 master-0 kubenswrapper[36504]: I1203 22:28:26.802042 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-config-data\") pod \"0ecc77c8-1691-47bc-a507-30c1415d3488\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " Dec 03 22:28:26.802680 master-0 kubenswrapper[36504]: I1203 22:28:26.802192 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-combined-ca-bundle\") pod \"0ecc77c8-1691-47bc-a507-30c1415d3488\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " Dec 03 22:28:26.802680 master-0 kubenswrapper[36504]: I1203 22:28:26.802614 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlcjn\" (UniqueName: \"kubernetes.io/projected/0ecc77c8-1691-47bc-a507-30c1415d3488-kube-api-access-dlcjn\") pod \"0ecc77c8-1691-47bc-a507-30c1415d3488\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " Dec 03 22:28:26.803114 master-0 kubenswrapper[36504]: I1203 22:28:26.802799 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-sg-core-conf-yaml\") pod \"0ecc77c8-1691-47bc-a507-30c1415d3488\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " Dec 03 22:28:26.803114 master-0 kubenswrapper[36504]: I1203 22:28:26.802882 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-log-httpd\") pod \"0ecc77c8-1691-47bc-a507-30c1415d3488\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " Dec 03 22:28:26.803114 master-0 kubenswrapper[36504]: I1203 22:28:26.802984 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-run-httpd\") pod \"0ecc77c8-1691-47bc-a507-30c1415d3488\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " Dec 03 22:28:26.803114 master-0 kubenswrapper[36504]: I1203 22:28:26.803046 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-scripts\") pod \"0ecc77c8-1691-47bc-a507-30c1415d3488\" (UID: \"0ecc77c8-1691-47bc-a507-30c1415d3488\") " Dec 03 22:28:26.813232 master-0 kubenswrapper[36504]: I1203 22:28:26.813084 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0ecc77c8-1691-47bc-a507-30c1415d3488" (UID: 
"0ecc77c8-1691-47bc-a507-30c1415d3488"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:26.815866 master-0 kubenswrapper[36504]: I1203 22:28:26.813959 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:26.815866 master-0 kubenswrapper[36504]: I1203 22:28:26.814528 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0ecc77c8-1691-47bc-a507-30c1415d3488" (UID: "0ecc77c8-1691-47bc-a507-30c1415d3488"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:26.818935 master-0 kubenswrapper[36504]: I1203 22:28:26.818573 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-scripts" (OuterVolumeSpecName: "scripts") pod "0ecc77c8-1691-47bc-a507-30c1415d3488" (UID: "0ecc77c8-1691-47bc-a507-30c1415d3488"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26.832206 master-0 kubenswrapper[36504]: I1203 22:28:26.832045 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecc77c8-1691-47bc-a507-30c1415d3488-kube-api-access-dlcjn" (OuterVolumeSpecName: "kube-api-access-dlcjn") pod "0ecc77c8-1691-47bc-a507-30c1415d3488" (UID: "0ecc77c8-1691-47bc-a507-30c1415d3488"). InnerVolumeSpecName "kube-api-access-dlcjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:26.874116 master-0 kubenswrapper[36504]: I1203 22:28:26.874042 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0ecc77c8-1691-47bc-a507-30c1415d3488" (UID: "0ecc77c8-1691-47bc-a507-30c1415d3488"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:26.916634 master-0 kubenswrapper[36504]: I1203 22:28:26.916532 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:26.916634 master-0 kubenswrapper[36504]: I1203 22:28:26.916602 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlcjn\" (UniqueName: \"kubernetes.io/projected/0ecc77c8-1691-47bc-a507-30c1415d3488-kube-api-access-dlcjn\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:26.916634 master-0 kubenswrapper[36504]: I1203 22:28:26.916622 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:26.916634 master-0 kubenswrapper[36504]: I1203 22:28:26.916636 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0ecc77c8-1691-47bc-a507-30c1415d3488-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:26.991942 master-0 kubenswrapper[36504]: I1203 22:28:26.990634 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ecc77c8-1691-47bc-a507-30c1415d3488" (UID: "0ecc77c8-1691-47bc-a507-30c1415d3488"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:27.008309 master-0 kubenswrapper[36504]: I1203 22:28:27.008237 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-config-data" (OuterVolumeSpecName: "config-data") pod "0ecc77c8-1691-47bc-a507-30c1415d3488" (UID: "0ecc77c8-1691-47bc-a507-30c1415d3488"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:27.019487 master-0 kubenswrapper[36504]: I1203 22:28:27.019429 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:27.019487 master-0 kubenswrapper[36504]: I1203 22:28:27.019482 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecc77c8-1691-47bc-a507-30c1415d3488-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:27.143804 master-0 kubenswrapper[36504]: I1203 22:28:27.143721 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:27.148927 master-0 kubenswrapper[36504]: I1203 22:28:27.148846 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6490dcf0-f493-4c1f-8a1c-38395f86f4fa" path="/var/lib/kubelet/pods/6490dcf0-f493-4c1f-8a1c-38395f86f4fa/volumes" Dec 03 22:28:27.156937 master-0 kubenswrapper[36504]: I1203 22:28:27.156828 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0ecc77c8-1691-47bc-a507-30c1415d3488","Type":"ContainerDied","Data":"e5971531ff7da38e1d6b974b7efd6b258b63f9f7b2594625e4f8c582623412cd"} Dec 03 22:28:27.157191 master-0 kubenswrapper[36504]: I1203 22:28:27.156953 36504 scope.go:117] "RemoveContainer" containerID="b4f4f210a7bdc9624d173465590457fe2a80d300256d8ba731b1a2f15399f128" Dec 03 22:28:27.187876 master-0 kubenswrapper[36504]: I1203 22:28:27.187815 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:27.198438 master-0 kubenswrapper[36504]: W1203 22:28:27.198381 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4ad88f6_c451_44f9_b8be_52f9acbe5b82.slice/crio-2572c27522d30d2673352e936b3c20c078ae1f2233077242ced7f5d20abf02a2 WatchSource:0}: Error finding container 2572c27522d30d2673352e936b3c20c078ae1f2233077242ced7f5d20abf02a2: Status 404 returned error can't find the container with id 2572c27522d30d2673352e936b3c20c078ae1f2233077242ced7f5d20abf02a2 Dec 03 22:28:27.209148 master-0 kubenswrapper[36504]: I1203 22:28:27.209088 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:27.220321 master-0 kubenswrapper[36504]: I1203 22:28:27.219987 36504 scope.go:117] "RemoveContainer" containerID="a46b7902b80998be3b8d804088550ce9bb88118b86b9f62957911df7f8b2839a" Dec 03 22:28:27.293732 master-0 kubenswrapper[36504]: I1203 22:28:27.292139 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:27.322962 master-0 kubenswrapper[36504]: I1203 22:28:27.322866 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:27.323920 master-0 kubenswrapper[36504]: E1203 22:28:27.323882 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="ceilometer-central-agent" Dec 03 22:28:27.323920 master-0 kubenswrapper[36504]: I1203 22:28:27.323918 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="ceilometer-central-agent" Dec 03 22:28:27.324022 master-0 kubenswrapper[36504]: E1203 22:28:27.323943 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="sg-core" Dec 03 22:28:27.324022 master-0 kubenswrapper[36504]: I1203 22:28:27.323953 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="sg-core" Dec 03 22:28:27.324022 master-0 kubenswrapper[36504]: E1203 22:28:27.323991 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="ceilometer-notification-agent" Dec 03 22:28:27.324022 master-0 kubenswrapper[36504]: I1203 22:28:27.324001 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="ceilometer-notification-agent" Dec 03 22:28:27.324207 master-0 
kubenswrapper[36504]: E1203 22:28:27.324070 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="proxy-httpd" Dec 03 22:28:27.324207 master-0 kubenswrapper[36504]: I1203 22:28:27.324082 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="proxy-httpd" Dec 03 22:28:27.324507 master-0 kubenswrapper[36504]: I1203 22:28:27.324458 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="proxy-httpd" Dec 03 22:28:27.324563 master-0 kubenswrapper[36504]: I1203 22:28:27.324511 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="ceilometer-central-agent" Dec 03 22:28:27.324563 master-0 kubenswrapper[36504]: I1203 22:28:27.324538 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="ceilometer-notification-agent" Dec 03 22:28:27.324563 master-0 kubenswrapper[36504]: I1203 22:28:27.324556 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" containerName="sg-core" Dec 03 22:28:27.329046 master-0 kubenswrapper[36504]: I1203 22:28:27.328991 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:27.333151 master-0 kubenswrapper[36504]: I1203 22:28:27.333111 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:28:27.335551 master-0 kubenswrapper[36504]: I1203 22:28:27.335508 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:28:27.337525 master-0 kubenswrapper[36504]: I1203 22:28:27.337438 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdps5\" (UniqueName: \"kubernetes.io/projected/8090e570-85c4-4562-8305-e48cc2b59936-kube-api-access-mdps5\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.337684 master-0 kubenswrapper[36504]: I1203 22:28:27.337557 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.337684 master-0 kubenswrapper[36504]: I1203 22:28:27.337603 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-log-httpd\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.337762 master-0 kubenswrapper[36504]: I1203 22:28:27.337723 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-run-httpd\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.337762 master-0 kubenswrapper[36504]: I1203 22:28:27.337748 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.338825 master-0 kubenswrapper[36504]: I1203 22:28:27.338755 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-scripts\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.338895 master-0 kubenswrapper[36504]: I1203 22:28:27.338829 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-config-data\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.348415 master-0 kubenswrapper[36504]: I1203 22:28:27.348326 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:27.360434 master-0 kubenswrapper[36504]: I1203 22:28:27.360391 36504 scope.go:117] "RemoveContainer" containerID="c84dd56063e6e24217b7bb562b8223389d0be60b42b1e90381fe11f821e62b9f" Dec 03 22:28:27.396972 master-0 kubenswrapper[36504]: I1203 22:28:27.396806 36504 scope.go:117] "RemoveContainer" containerID="514d4bb428ed1bc8562870409baee9eb4590227595332dc310b9ada9d52e2210" Dec 03 22:28:27.443175 master-0 kubenswrapper[36504]: I1203 22:28:27.443068 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.443400 master-0 kubenswrapper[36504]: I1203 22:28:27.443207 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-log-httpd\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.443609 master-0 kubenswrapper[36504]: I1203 22:28:27.443582 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-run-httpd\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.443691 master-0 kubenswrapper[36504]: I1203 22:28:27.443628 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.443839 master-0 kubenswrapper[36504]: I1203 22:28:27.443808 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-scripts\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.443903 master-0 kubenswrapper[36504]: I1203 22:28:27.443884 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-config-data\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.444038 master-0 kubenswrapper[36504]: I1203 22:28:27.443998 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdps5\" (UniqueName: \"kubernetes.io/projected/8090e570-85c4-4562-8305-e48cc2b59936-kube-api-access-mdps5\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.444154 master-0 kubenswrapper[36504]: I1203 22:28:27.443993 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-log-httpd\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.444563 master-0 kubenswrapper[36504]: I1203 22:28:27.444495 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-run-httpd\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.447991 master-0 kubenswrapper[36504]: I1203 22:28:27.447938 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.449414 master-0 kubenswrapper[36504]: I1203 22:28:27.449368 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-config-data\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.451894 master-0 kubenswrapper[36504]: I1203 22:28:27.451847 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-scripts\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.452453 master-0 kubenswrapper[36504]: I1203 22:28:27.452392 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.465713 master-0 kubenswrapper[36504]: I1203 22:28:27.465624 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdps5\" (UniqueName: \"kubernetes.io/projected/8090e570-85c4-4562-8305-e48cc2b59936-kube-api-access-mdps5\") pod \"ceilometer-0\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " pod="openstack/ceilometer-0" Dec 03 22:28:27.572534 master-0 kubenswrapper[36504]: I1203 22:28:27.568982 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:27.654800 master-0 kubenswrapper[36504]: I1203 22:28:27.654725 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:28:28.171802 master-0 kubenswrapper[36504]: I1203 22:28:28.169650 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4ad88f6-c451-44f9-b8be-52f9acbe5b82","Type":"ContainerStarted","Data":"f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4"} Dec 03 22:28:28.171802 master-0 kubenswrapper[36504]: I1203 22:28:28.169746 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4ad88f6-c451-44f9-b8be-52f9acbe5b82","Type":"ContainerStarted","Data":"3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3"} Dec 03 22:28:28.171802 master-0 kubenswrapper[36504]: I1203 22:28:28.169759 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4ad88f6-c451-44f9-b8be-52f9acbe5b82","Type":"ContainerStarted","Data":"2572c27522d30d2673352e936b3c20c078ae1f2233077242ced7f5d20abf02a2"} Dec 03 22:28:28.185803 master-0 kubenswrapper[36504]: I1203 22:28:28.183101 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55b7c8d469-2kcr9" Dec 03 22:28:28.205090 master-0 kubenswrapper[36504]: I1203 22:28:28.205000 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:28:28.213899 master-0 kubenswrapper[36504]: I1203 22:28:28.210344 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.2103091519999998 podStartE2EDuration="2.210309152s" podCreationTimestamp="2025-12-03 22:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:28.199662114 +0000 UTC m=+1073.419434131" watchObservedRunningTime="2025-12-03 22:28:28.210309152 +0000 UTC m=+1073.430081169" Dec 03 22:28:28.216094 master-0 kubenswrapper[36504]: W1203 22:28:28.216026 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8090e570_85c4_4562_8305_e48cc2b59936.slice/crio-e339ce26f1ffe4f3a8107499a3bf9b5a15eecc65d3872c6e38cde3af561ceff9 WatchSource:0}: Error finding container e339ce26f1ffe4f3a8107499a3bf9b5a15eecc65d3872c6e38cde3af561ceff9: Status 404 returned error can't find the container with id e339ce26f1ffe4f3a8107499a3bf9b5a15eecc65d3872c6e38cde3af561ceff9 Dec 03 22:28:28.300863 master-0 kubenswrapper[36504]: I1203 22:28:28.298670 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b5b7f9cc-hv7jl"] Dec 03 22:28:28.300863 master-0 kubenswrapper[36504]: I1203 22:28:28.299071 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" podUID="13feaa47-c881-462d-b78d-1716d9552aeb" containerName="dnsmasq-dns" containerID="cri-o://99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9" gracePeriod=10 Dec 03 22:28:29.047616 master-0 kubenswrapper[36504]: I1203 22:28:29.047529 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:28:29.120894 master-0 kubenswrapper[36504]: I1203 22:28:29.120764 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecc77c8-1691-47bc-a507-30c1415d3488" path="/var/lib/kubelet/pods/0ecc77c8-1691-47bc-a507-30c1415d3488/volumes" Dec 03 22:28:29.193829 master-0 kubenswrapper[36504]: I1203 22:28:29.193734 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" event={"ID":"13feaa47-c881-462d-b78d-1716d9552aeb","Type":"ContainerDied","Data":"99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9"} Dec 03 22:28:29.194503 master-0 kubenswrapper[36504]: I1203 22:28:29.193737 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" Dec 03 22:28:29.194503 master-0 kubenswrapper[36504]: I1203 22:28:29.193885 36504 scope.go:117] "RemoveContainer" containerID="99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9" Dec 03 22:28:29.194503 master-0 kubenswrapper[36504]: I1203 22:28:29.193654 36504 generic.go:334] "Generic (PLEG): container finished" podID="13feaa47-c881-462d-b78d-1716d9552aeb" containerID="99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9" exitCode=0 Dec 03 22:28:29.194673 master-0 kubenswrapper[36504]: I1203 22:28:29.194436 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b5b7f9cc-hv7jl" event={"ID":"13feaa47-c881-462d-b78d-1716d9552aeb","Type":"ContainerDied","Data":"86c68dd2c14edd4fbb942b4946db74406df09d3a121089b692c3c2d93892730b"} Dec 03 22:28:29.199686 master-0 kubenswrapper[36504]: I1203 22:28:29.199647 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerStarted","Data":"06378f7a35baeb277d82156dd96ffe06496d39394eed645f16cf5f7848c9200e"} Dec 03 22:28:29.199928 master-0 kubenswrapper[36504]: I1203 22:28:29.199697 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerStarted","Data":"e339ce26f1ffe4f3a8107499a3bf9b5a15eecc65d3872c6e38cde3af561ceff9"} Dec 03 22:28:29.250526 master-0 kubenswrapper[36504]: I1203 22:28:29.238271 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-swift-storage-0\") pod \"13feaa47-c881-462d-b78d-1716d9552aeb\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " Dec 03 22:28:29.250526 master-0 kubenswrapper[36504]: I1203 22:28:29.238402 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-sb\") pod \"13feaa47-c881-462d-b78d-1716d9552aeb\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " Dec 03 22:28:29.250526 master-0 kubenswrapper[36504]: I1203 22:28:29.238501 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-svc\") pod \"13feaa47-c881-462d-b78d-1716d9552aeb\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " Dec 03 22:28:29.250526 master-0 kubenswrapper[36504]: I1203 22:28:29.238595 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lwnwf\" (UniqueName: \"kubernetes.io/projected/13feaa47-c881-462d-b78d-1716d9552aeb-kube-api-access-lwnwf\") pod \"13feaa47-c881-462d-b78d-1716d9552aeb\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " Dec 03 22:28:29.250526 master-0 kubenswrapper[36504]: I1203 22:28:29.238665 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-config\") pod \"13feaa47-c881-462d-b78d-1716d9552aeb\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " Dec 03 22:28:29.250526 master-0 kubenswrapper[36504]: I1203 22:28:29.238815 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-edpm\") pod \"13feaa47-c881-462d-b78d-1716d9552aeb\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " Dec 03 22:28:29.276800 master-0 kubenswrapper[36504]: I1203 22:28:29.271686 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13feaa47-c881-462d-b78d-1716d9552aeb-kube-api-access-lwnwf" (OuterVolumeSpecName: "kube-api-access-lwnwf") pod "13feaa47-c881-462d-b78d-1716d9552aeb" (UID: "13feaa47-c881-462d-b78d-1716d9552aeb"). InnerVolumeSpecName "kube-api-access-lwnwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:29.298507 master-0 kubenswrapper[36504]: I1203 22:28:29.295501 36504 scope.go:117] "RemoveContainer" containerID="2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db" Dec 03 22:28:29.298507 master-0 kubenswrapper[36504]: I1203 22:28:29.297415 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-nb\") pod \"13feaa47-c881-462d-b78d-1716d9552aeb\" (UID: \"13feaa47-c881-462d-b78d-1716d9552aeb\") " Dec 03 22:28:29.309810 master-0 kubenswrapper[36504]: I1203 22:28:29.298804 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwnwf\" (UniqueName: \"kubernetes.io/projected/13feaa47-c881-462d-b78d-1716d9552aeb-kube-api-access-lwnwf\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:29.388550 master-0 kubenswrapper[36504]: I1203 22:28:29.388426 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "13feaa47-c881-462d-b78d-1716d9552aeb" (UID: "13feaa47-c881-462d-b78d-1716d9552aeb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:29.402850 master-0 kubenswrapper[36504]: I1203 22:28:29.402753 36504 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:29.414053 master-0 kubenswrapper[36504]: I1203 22:28:29.409722 36504 scope.go:117] "RemoveContainer" containerID="99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9" Dec 03 22:28:29.414053 master-0 kubenswrapper[36504]: E1203 22:28:29.410301 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9\": container with ID starting with 99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9 not found: ID does not exist" containerID="99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9" Dec 03 22:28:29.414053 master-0 kubenswrapper[36504]: I1203 22:28:29.410373 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9"} err="failed to get container status \"99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9\": rpc error: code = NotFound desc = could not find container \"99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9\": container with ID starting with 99db4f1da8ce70f24fe38a6711b7748f229e1f73eeb1f9a38ecb00c0feb671a9 not found: ID does not exist" Dec 03 22:28:29.414053 master-0 kubenswrapper[36504]: I1203 22:28:29.410475 36504 scope.go:117] "RemoveContainer" containerID="2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db" Dec 03 22:28:29.414053 master-0 kubenswrapper[36504]: E1203 22:28:29.411357 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db\": container with ID starting with 2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db not found: ID does not exist" containerID="2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db" Dec 03 22:28:29.414053 master-0 kubenswrapper[36504]: I1203 22:28:29.411380 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db"} err="failed to get container status \"2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db\": rpc error: code = NotFound desc = could not find container \"2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db\": container with ID starting with 2be75bbce47b014c3fcf32c503f10825b7e1d6540e00f94982912f70fcf9d7db not found: ID does not exist" Dec 03 22:28:29.433910 master-0 kubenswrapper[36504]: I1203 22:28:29.433408 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-edpm" (OuterVolumeSpecName: "edpm") pod "13feaa47-c881-462d-b78d-1716d9552aeb" (UID: "13feaa47-c881-462d-b78d-1716d9552aeb"). InnerVolumeSpecName "edpm". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:29.441698 master-0 kubenswrapper[36504]: I1203 22:28:29.441609 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13feaa47-c881-462d-b78d-1716d9552aeb" (UID: "13feaa47-c881-462d-b78d-1716d9552aeb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:29.461512 master-0 kubenswrapper[36504]: I1203 22:28:29.461430 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13feaa47-c881-462d-b78d-1716d9552aeb" (UID: "13feaa47-c881-462d-b78d-1716d9552aeb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:29.472495 master-0 kubenswrapper[36504]: I1203 22:28:29.472423 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13feaa47-c881-462d-b78d-1716d9552aeb" (UID: "13feaa47-c881-462d-b78d-1716d9552aeb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:29.493919 master-0 kubenswrapper[36504]: I1203 22:28:29.479268 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-config" (OuterVolumeSpecName: "config") pod "13feaa47-c881-462d-b78d-1716d9552aeb" (UID: "13feaa47-c881-462d-b78d-1716d9552aeb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:28:29.506906 master-0 kubenswrapper[36504]: I1203 22:28:29.506408 36504 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:29.506906 master-0 kubenswrapper[36504]: I1203 22:28:29.506463 36504 reconciler_common.go:293] "Volume detached for volume \"edpm\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:29.506906 master-0 kubenswrapper[36504]: I1203 22:28:29.506475 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:29.506906 master-0 kubenswrapper[36504]: I1203 22:28:29.506486 36504 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:29.506906 master-0 kubenswrapper[36504]: I1203 22:28:29.506499 36504 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13feaa47-c881-462d-b78d-1716d9552aeb-dns-svc\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:29.618445 master-0 kubenswrapper[36504]: I1203 22:28:29.618350 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b5b7f9cc-hv7jl"] Dec 03 22:28:29.634942 master-0 kubenswrapper[36504]: I1203 22:28:29.634842 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b5b7f9cc-hv7jl"] Dec 03 22:28:30.228871 master-0 kubenswrapper[36504]: I1203 22:28:30.227816 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerStarted","Data":"2e1195f7c14306b10e00e53907ae64b41ec55a6cfecbb723f31437fbc1228799"} Dec 03 22:28:31.112538 master-0 kubenswrapper[36504]: I1203 22:28:31.112293 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13feaa47-c881-462d-b78d-1716d9552aeb" path="/var/lib/kubelet/pods/13feaa47-c881-462d-b78d-1716d9552aeb/volumes" Dec 03 22:28:31.250992 master-0 kubenswrapper[36504]: I1203 22:28:31.250894 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerStarted","Data":"410682d4622016aedc9341f9ee150e7a656ba0a58d7a6865d5e7fac7cd751f8b"} Dec 03 22:28:32.096025 master-0 kubenswrapper[36504]: I1203 22:28:32.095954 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:28:32.568319 master-0 kubenswrapper[36504]: I1203 22:28:32.568255 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:32.592983 master-0 kubenswrapper[36504]: I1203 22:28:32.592892 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:33.288169 master-0 kubenswrapper[36504]: I1203 22:28:33.288061 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerStarted","Data":"1240ac558d6fad0b18f98df9311dcfd6c5752f54920f5c1b317ddc02cbe0ca26"} Dec 03 22:28:33.310716 master-0 kubenswrapper[36504]: I1203 22:28:33.310648 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 22:28:33.346280 master-0 kubenswrapper[36504]: I1203 22:28:33.346136 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.514584196 podStartE2EDuration="6.346095301s" podCreationTimestamp="2025-12-03 22:28:27 +0000 UTC" firstStartedPulling="2025-12-03 22:28:28.219557367 +0000 UTC m=+1073.439329384" lastFinishedPulling="2025-12-03 22:28:32.051068482 +0000 UTC m=+1077.270840489" observedRunningTime="2025-12-03 22:28:33.311879186 +0000 UTC m=+1078.531651213" watchObservedRunningTime="2025-12-03 22:28:33.346095301 +0000 UTC m=+1078.565867318" Dec 03 22:28:33.561011 master-0 kubenswrapper[36504]: I1203 22:28:33.560836 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fzdd5"] Dec 03 22:28:33.561475 master-0 kubenswrapper[36504]: E1203 22:28:33.561451 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13feaa47-c881-462d-b78d-1716d9552aeb" containerName="dnsmasq-dns" Dec 03 22:28:33.561475 master-0 kubenswrapper[36504]: I1203 22:28:33.561474 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="13feaa47-c881-462d-b78d-1716d9552aeb" containerName="dnsmasq-dns" Dec 03 22:28:33.561611 master-0 kubenswrapper[36504]: E1203 22:28:33.561488 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13feaa47-c881-462d-b78d-1716d9552aeb" containerName="init" Dec 03 22:28:33.561611 master-0 kubenswrapper[36504]: I1203 22:28:33.561496 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="13feaa47-c881-462d-b78d-1716d9552aeb" containerName="init" Dec 03 22:28:33.561860 master-0 kubenswrapper[36504]: I1203 22:28:33.561838 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="13feaa47-c881-462d-b78d-1716d9552aeb" containerName="dnsmasq-dns" Dec 03 22:28:33.562713 master-0 kubenswrapper[36504]: I1203 22:28:33.562677 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.566711 master-0 kubenswrapper[36504]: I1203 22:28:33.566631 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 22:28:33.567431 master-0 kubenswrapper[36504]: I1203 22:28:33.567382 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 22:28:33.587802 master-0 kubenswrapper[36504]: I1203 22:28:33.587547 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzdd5"] Dec 03 22:28:33.688284 master-0 kubenswrapper[36504]: I1203 22:28:33.688204 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8h7t\" (UniqueName: \"kubernetes.io/projected/b524cbe9-1597-4642-a543-a29a92375c0d-kube-api-access-q8h7t\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.688742 master-0 kubenswrapper[36504]: I1203 22:28:33.688725 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.688908 master-0 kubenswrapper[36504]: I1203 22:28:33.688888 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-config-data\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.689411 master-0 kubenswrapper[36504]: I1203 22:28:33.689353 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-scripts\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.793948 master-0 kubenswrapper[36504]: I1203 22:28:33.793880 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-scripts\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.794329 master-0 kubenswrapper[36504]: I1203 22:28:33.794217 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8h7t\" (UniqueName: \"kubernetes.io/projected/b524cbe9-1597-4642-a543-a29a92375c0d-kube-api-access-q8h7t\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.794329 master-0 kubenswrapper[36504]: I1203 22:28:33.794287 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.797091 
master-0 kubenswrapper[36504]: I1203 22:28:33.795049 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-config-data\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.798517 master-0 kubenswrapper[36504]: I1203 22:28:33.798491 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.799205 master-0 kubenswrapper[36504]: I1203 22:28:33.799181 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-scripts\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.808870 master-0 kubenswrapper[36504]: I1203 22:28:33.804440 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-config-data\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.816546 master-0 kubenswrapper[36504]: I1203 22:28:33.816429 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8h7t\" (UniqueName: \"kubernetes.io/projected/b524cbe9-1597-4642-a543-a29a92375c0d-kube-api-access-q8h7t\") pod \"nova-cell1-cell-mapping-fzdd5\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:33.920026 master-0 kubenswrapper[36504]: I1203 22:28:33.919940 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:34.305175 master-0 kubenswrapper[36504]: I1203 22:28:34.305121 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:28:34.471810 master-0 kubenswrapper[36504]: I1203 22:28:34.470427 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzdd5"] Dec 03 22:28:35.328021 master-0 kubenswrapper[36504]: I1203 22:28:35.327925 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzdd5" event={"ID":"b524cbe9-1597-4642-a543-a29a92375c0d","Type":"ContainerStarted","Data":"31d50adcee68d6ae036c3da047ab3a89d7d449701ffbbf10d419bc89bae5d3b0"} Dec 03 22:28:35.328021 master-0 kubenswrapper[36504]: I1203 22:28:35.328000 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzdd5" event={"ID":"b524cbe9-1597-4642-a543-a29a92375c0d","Type":"ContainerStarted","Data":"afe492814df87bf479a68a72ed77078ce4d03ecdae5cf1434f1246e4c16322f3"} Dec 03 22:28:35.367479 master-0 kubenswrapper[36504]: I1203 22:28:35.367222 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fzdd5" podStartSLOduration=2.367182272 podStartE2EDuration="2.367182272s" podCreationTimestamp="2025-12-03 22:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:35.353339815 +0000 UTC m=+1080.573111822" watchObservedRunningTime="2025-12-03 22:28:35.367182272 +0000 UTC m=+1080.586954279" Dec 03 22:28:35.743748 master-0 kubenswrapper[36504]: E1203 22:28:35.743648 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:28:36.557655 master-0 kubenswrapper[36504]: I1203 22:28:36.557552 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:28:36.557655 master-0 kubenswrapper[36504]: I1203 22:28:36.557642 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:28:37.576314 master-0 kubenswrapper[36504]: I1203 22:28:37.575940 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.45:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:37.576314 master-0 kubenswrapper[36504]: I1203 22:28:37.576027 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.45:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:38.378544 master-0 kubenswrapper[36504]: I1203 22:28:38.378472 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 22:28:38.378544 master-0 kubenswrapper[36504]: I1203 22:28:38.378494 36504 generic.go:334] "Generic (PLEG): container finished" podID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerID="f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa" exitCode=137 Dec 03 22:28:38.378941 master-0 kubenswrapper[36504]: I1203 22:28:38.378559 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerDied","Data":"f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa"} Dec 03 22:28:38.378941 master-0 kubenswrapper[36504]: I1203 22:28:38.378637 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f2bbce50-6964-456d-affb-e8b38e7311b4","Type":"ContainerDied","Data":"8bc68641b7eb82f56a637086a9b85a4e219efb4f8063e7841135e7bad1ee1f12"} Dec 03 22:28:38.378941 master-0 kubenswrapper[36504]: I1203 22:28:38.378659 36504 scope.go:117] "RemoveContainer" containerID="f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa" Dec 03 22:28:38.476458 master-0 kubenswrapper[36504]: I1203 22:28:38.476398 36504 scope.go:117] "RemoveContainer" containerID="8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48" Dec 03 22:28:38.477790 master-0 kubenswrapper[36504]: I1203 22:28:38.477718 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-config-data\") pod \"f2bbce50-6964-456d-affb-e8b38e7311b4\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " Dec 03 22:28:38.478032 master-0 kubenswrapper[36504]: I1203 22:28:38.478009 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-scripts\") pod \"f2bbce50-6964-456d-affb-e8b38e7311b4\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " Dec 03 22:28:38.478201 master-0 kubenswrapper[36504]: I1203 22:28:38.478164 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-combined-ca-bundle\") pod \"f2bbce50-6964-456d-affb-e8b38e7311b4\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " Dec 03 22:28:38.478319 master-0 kubenswrapper[36504]: I1203 22:28:38.478304 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc8pl\" (UniqueName: \"kubernetes.io/projected/f2bbce50-6964-456d-affb-e8b38e7311b4-kube-api-access-tc8pl\") pod \"f2bbce50-6964-456d-affb-e8b38e7311b4\" (UID: \"f2bbce50-6964-456d-affb-e8b38e7311b4\") " Dec 03 22:28:38.482521 master-0 kubenswrapper[36504]: I1203 22:28:38.482466 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-scripts" (OuterVolumeSpecName: "scripts") pod "f2bbce50-6964-456d-affb-e8b38e7311b4" (UID: "f2bbce50-6964-456d-affb-e8b38e7311b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:38.486055 master-0 kubenswrapper[36504]: I1203 22:28:38.486020 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2bbce50-6964-456d-affb-e8b38e7311b4-kube-api-access-tc8pl" (OuterVolumeSpecName: "kube-api-access-tc8pl") pod "f2bbce50-6964-456d-affb-e8b38e7311b4" (UID: "f2bbce50-6964-456d-affb-e8b38e7311b4"). InnerVolumeSpecName "kube-api-access-tc8pl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:38.583176 master-0 kubenswrapper[36504]: I1203 22:28:38.582571 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc8pl\" (UniqueName: \"kubernetes.io/projected/f2bbce50-6964-456d-affb-e8b38e7311b4-kube-api-access-tc8pl\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:38.583176 master-0 kubenswrapper[36504]: I1203 22:28:38.582615 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:38.619369 master-0 kubenswrapper[36504]: I1203 22:28:38.619296 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-config-data" (OuterVolumeSpecName: "config-data") pod "f2bbce50-6964-456d-affb-e8b38e7311b4" (UID: "f2bbce50-6964-456d-affb-e8b38e7311b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:38.657403 master-0 kubenswrapper[36504]: I1203 22:28:38.656362 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2bbce50-6964-456d-affb-e8b38e7311b4" (UID: "f2bbce50-6964-456d-affb-e8b38e7311b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:38.686083 master-0 kubenswrapper[36504]: I1203 22:28:38.686012 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:38.686083 master-0 kubenswrapper[36504]: I1203 22:28:38.686075 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2bbce50-6964-456d-affb-e8b38e7311b4-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:38.791155 master-0 kubenswrapper[36504]: I1203 22:28:38.789049 36504 scope.go:117] "RemoveContainer" containerID="4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499" Dec 03 22:28:38.844143 master-0 kubenswrapper[36504]: I1203 22:28:38.844102 36504 scope.go:117] "RemoveContainer" containerID="f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea" Dec 03 22:28:38.874990 master-0 kubenswrapper[36504]: I1203 22:28:38.874510 36504 scope.go:117] "RemoveContainer" containerID="f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa" Dec 03 22:28:38.875143 master-0 kubenswrapper[36504]: E1203 22:28:38.875113 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa\": container with ID starting with f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa not found: ID does not exist" containerID="f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa" Dec 03 22:28:38.875197 master-0 kubenswrapper[36504]: I1203 22:28:38.875157 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa"} err="failed to get container status \"f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa\": rpc error: code = NotFound desc = could not find container \"f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa\": container with ID starting with f2edb38c8bf44410b5a1b650d2ef1e7ca9bfa2cfbacbe5cf7afadf5755d523fa not found: ID does not exist" Dec 03 22:28:38.875264 master-0 kubenswrapper[36504]: I1203 22:28:38.875198 36504 scope.go:117] "RemoveContainer" containerID="8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48" Dec 03 22:28:38.875668 master-0 kubenswrapper[36504]: E1203 22:28:38.875557 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48\": container with ID starting with 8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48 not found: ID does not exist" containerID="8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48" Dec 03 22:28:38.875668 master-0 kubenswrapper[36504]: I1203 22:28:38.875591 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48"} err="failed to get container status \"8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48\": rpc error: code = NotFound desc = could not find container \"8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48\": container with ID starting with 8286b1c9ad35405d97ba16c99d4e8bb72625e4d93cb6b45d3606a76665d5cb48 not found: ID 
does not exist" Dec 03 22:28:38.875668 master-0 kubenswrapper[36504]: I1203 22:28:38.875608 36504 scope.go:117] "RemoveContainer" containerID="4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499" Dec 03 22:28:38.876236 master-0 kubenswrapper[36504]: E1203 22:28:38.876178 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499\": container with ID starting with 4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499 not found: ID does not exist" containerID="4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499" Dec 03 22:28:38.876298 master-0 kubenswrapper[36504]: I1203 22:28:38.876257 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499"} err="failed to get container status \"4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499\": rpc error: code = NotFound desc = could not find container \"4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499\": container with ID starting with 4dae22d6d5ae05c5f1ebda9d040222888c4e784413794b31984c61c815ce9499 not found: ID does not exist" Dec 03 22:28:38.876335 master-0 kubenswrapper[36504]: I1203 22:28:38.876310 36504 scope.go:117] "RemoveContainer" containerID="f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea" Dec 03 22:28:38.876757 master-0 kubenswrapper[36504]: E1203 22:28:38.876720 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea\": container with ID starting with f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea not found: ID does not exist" containerID="f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea" Dec 03 22:28:38.876835 master-0 kubenswrapper[36504]: I1203 22:28:38.876761 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea"} err="failed to get container status \"f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea\": rpc error: code = NotFound desc = could not find container \"f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea\": container with ID starting with f634a47aa89ea4da4e483bca03d318faa96683aee4c345d3807989981fcbd8ea not found: ID does not exist" Dec 03 22:28:39.404059 master-0 kubenswrapper[36504]: I1203 22:28:39.403895 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 22:28:39.501503 master-0 kubenswrapper[36504]: I1203 22:28:39.501427 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 22:28:39.520675 master-0 kubenswrapper[36504]: I1203 22:28:39.520595 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 03 22:28:39.600941 master-0 kubenswrapper[36504]: I1203 22:28:39.600849 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: E1203 22:28:39.601786 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-api" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: I1203 22:28:39.601812 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-api" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: E1203 22:28:39.601852 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-evaluator" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: I1203 22:28:39.601860 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-evaluator" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: E1203 22:28:39.601888 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-notifier" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: I1203 22:28:39.601895 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-notifier" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: E1203 22:28:39.601913 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-listener" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: I1203 22:28:39.601919 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-listener" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: I1203 22:28:39.602260 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-evaluator" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: I1203 22:28:39.602282 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-notifier" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: I1203 22:28:39.602317 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-listener" Dec 03 22:28:39.604795 master-0 kubenswrapper[36504]: I1203 22:28:39.602364 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" containerName="aodh-api" Dec 03 22:28:39.605444 master-0 kubenswrapper[36504]: I1203 22:28:39.605386 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 22:28:39.611542 master-0 kubenswrapper[36504]: I1203 22:28:39.611454 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 03 22:28:39.611852 master-0 kubenswrapper[36504]: I1203 22:28:39.611803 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 22:28:39.612124 master-0 kubenswrapper[36504]: I1203 22:28:39.612102 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 22:28:39.617142 master-0 kubenswrapper[36504]: I1203 22:28:39.617097 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 03 22:28:39.622748 master-0 kubenswrapper[36504]: I1203 22:28:39.622682 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 22:28:39.725802 master-0 kubenswrapper[36504]: I1203 22:28:39.722185 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-config-data\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.725802 master-0 kubenswrapper[36504]: I1203 22:28:39.722278 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfr7t\" (UniqueName: \"kubernetes.io/projected/787b4cf8-9144-4a47-991f-1fca25b680f5-kube-api-access-cfr7t\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.725802 master-0 kubenswrapper[36504]: I1203 22:28:39.722318 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-scripts\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.725802 master-0 kubenswrapper[36504]: I1203 22:28:39.722405 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-public-tls-certs\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.725802 master-0 kubenswrapper[36504]: I1203 22:28:39.722487 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.725802 master-0 kubenswrapper[36504]: I1203 22:28:39.722536 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-internal-tls-certs\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.827447 master-0 kubenswrapper[36504]: I1203 22:28:39.827368 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 
22:28:39.827855 master-0 kubenswrapper[36504]: I1203 22:28:39.827498 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-internal-tls-certs\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.827855 master-0 kubenswrapper[36504]: I1203 22:28:39.827703 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-config-data\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.827855 master-0 kubenswrapper[36504]: I1203 22:28:39.827751 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfr7t\" (UniqueName: \"kubernetes.io/projected/787b4cf8-9144-4a47-991f-1fca25b680f5-kube-api-access-cfr7t\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.827855 master-0 kubenswrapper[36504]: I1203 22:28:39.827822 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-scripts\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.828069 master-0 kubenswrapper[36504]: I1203 22:28:39.828001 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-public-tls-certs\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.837672 master-0 kubenswrapper[36504]: I1203 22:28:39.837333 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-public-tls-certs\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.838869 master-0 kubenswrapper[36504]: I1203 22:28:39.838823 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-combined-ca-bundle\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.839186 master-0 kubenswrapper[36504]: I1203 22:28:39.839159 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-config-data\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.840526 master-0 kubenswrapper[36504]: I1203 22:28:39.840474 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-internal-tls-certs\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.840838 master-0 kubenswrapper[36504]: I1203 22:28:39.840821 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/787b4cf8-9144-4a47-991f-1fca25b680f5-scripts\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.847515 master-0 
kubenswrapper[36504]: I1203 22:28:39.847479 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfr7t\" (UniqueName: \"kubernetes.io/projected/787b4cf8-9144-4a47-991f-1fca25b680f5-kube-api-access-cfr7t\") pod \"aodh-0\" (UID: \"787b4cf8-9144-4a47-991f-1fca25b680f5\") " pod="openstack/aodh-0" Dec 03 22:28:39.996082 master-0 kubenswrapper[36504]: I1203 22:28:39.995912 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 22:28:40.499415 master-0 kubenswrapper[36504]: W1203 22:28:40.499361 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod787b4cf8_9144_4a47_991f_1fca25b680f5.slice/crio-22317be8bd206404359ca7adc1cf619507bfe8ab7cdba4396a5914d2fc304a8e WatchSource:0}: Error finding container 22317be8bd206404359ca7adc1cf619507bfe8ab7cdba4396a5914d2fc304a8e: Status 404 returned error can't find the container with id 22317be8bd206404359ca7adc1cf619507bfe8ab7cdba4396a5914d2fc304a8e Dec 03 22:28:40.504222 master-0 kubenswrapper[36504]: I1203 22:28:40.504149 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 22:28:41.111902 master-0 kubenswrapper[36504]: I1203 22:28:41.111735 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2bbce50-6964-456d-affb-e8b38e7311b4" path="/var/lib/kubelet/pods/f2bbce50-6964-456d-affb-e8b38e7311b4/volumes" Dec 03 22:28:41.449545 master-0 kubenswrapper[36504]: I1203 22:28:41.449460 36504 generic.go:334] "Generic (PLEG): container finished" podID="b524cbe9-1597-4642-a543-a29a92375c0d" containerID="31d50adcee68d6ae036c3da047ab3a89d7d449701ffbbf10d419bc89bae5d3b0" exitCode=0 Dec 03 22:28:41.449860 master-0 kubenswrapper[36504]: I1203 22:28:41.449605 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzdd5" event={"ID":"b524cbe9-1597-4642-a543-a29a92375c0d","Type":"ContainerDied","Data":"31d50adcee68d6ae036c3da047ab3a89d7d449701ffbbf10d419bc89bae5d3b0"} Dec 03 22:28:41.453543 master-0 kubenswrapper[36504]: I1203 22:28:41.453467 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"787b4cf8-9144-4a47-991f-1fca25b680f5","Type":"ContainerStarted","Data":"f1b2981491601efcdd767a0db32e3b413ad6ef20e40db47fe6582e922dee6081"} Dec 03 22:28:41.453543 master-0 kubenswrapper[36504]: I1203 22:28:41.453546 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"787b4cf8-9144-4a47-991f-1fca25b680f5","Type":"ContainerStarted","Data":"22317be8bd206404359ca7adc1cf619507bfe8ab7cdba4396a5914d2fc304a8e"} Dec 03 22:28:42.475378 master-0 kubenswrapper[36504]: I1203 22:28:42.475122 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"787b4cf8-9144-4a47-991f-1fca25b680f5","Type":"ContainerStarted","Data":"ec4cfac6e1f0378cdd45642a6d952189fb23d1a5beac56c830e634079a9df2a1"} Dec 03 22:28:43.033521 master-0 kubenswrapper[36504]: I1203 22:28:43.033462 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:43.155596 master-0 kubenswrapper[36504]: I1203 22:28:43.155530 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-config-data\") pod \"b524cbe9-1597-4642-a543-a29a92375c0d\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " Dec 03 22:28:43.155907 master-0 kubenswrapper[36504]: I1203 22:28:43.155860 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-combined-ca-bundle\") pod \"b524cbe9-1597-4642-a543-a29a92375c0d\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " Dec 03 22:28:43.156053 master-0 kubenswrapper[36504]: I1203 22:28:43.156023 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8h7t\" (UniqueName: \"kubernetes.io/projected/b524cbe9-1597-4642-a543-a29a92375c0d-kube-api-access-q8h7t\") pod \"b524cbe9-1597-4642-a543-a29a92375c0d\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " Dec 03 22:28:43.156138 master-0 kubenswrapper[36504]: I1203 22:28:43.156112 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-scripts\") pod \"b524cbe9-1597-4642-a543-a29a92375c0d\" (UID: \"b524cbe9-1597-4642-a543-a29a92375c0d\") " Dec 03 22:28:43.160102 master-0 kubenswrapper[36504]: I1203 22:28:43.160009 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b524cbe9-1597-4642-a543-a29a92375c0d-kube-api-access-q8h7t" (OuterVolumeSpecName: "kube-api-access-q8h7t") pod "b524cbe9-1597-4642-a543-a29a92375c0d" (UID: "b524cbe9-1597-4642-a543-a29a92375c0d"). InnerVolumeSpecName "kube-api-access-q8h7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:43.162038 master-0 kubenswrapper[36504]: I1203 22:28:43.161994 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-scripts" (OuterVolumeSpecName: "scripts") pod "b524cbe9-1597-4642-a543-a29a92375c0d" (UID: "b524cbe9-1597-4642-a543-a29a92375c0d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:43.192485 master-0 kubenswrapper[36504]: I1203 22:28:43.192424 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-config-data" (OuterVolumeSpecName: "config-data") pod "b524cbe9-1597-4642-a543-a29a92375c0d" (UID: "b524cbe9-1597-4642-a543-a29a92375c0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:43.197584 master-0 kubenswrapper[36504]: I1203 22:28:43.197502 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b524cbe9-1597-4642-a543-a29a92375c0d" (UID: "b524cbe9-1597-4642-a543-a29a92375c0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:43.260372 master-0 kubenswrapper[36504]: I1203 22:28:43.260292 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8h7t\" (UniqueName: \"kubernetes.io/projected/b524cbe9-1597-4642-a543-a29a92375c0d-kube-api-access-q8h7t\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:43.260372 master-0 kubenswrapper[36504]: I1203 22:28:43.260359 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:43.260372 master-0 kubenswrapper[36504]: I1203 22:28:43.260380 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:43.260372 master-0 kubenswrapper[36504]: I1203 22:28:43.260391 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b524cbe9-1597-4642-a543-a29a92375c0d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:43.510337 master-0 kubenswrapper[36504]: I1203 22:28:43.510254 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fzdd5" event={"ID":"b524cbe9-1597-4642-a543-a29a92375c0d","Type":"ContainerDied","Data":"afe492814df87bf479a68a72ed77078ce4d03ecdae5cf1434f1246e4c16322f3"} Dec 03 22:28:43.510337 master-0 kubenswrapper[36504]: I1203 22:28:43.510327 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afe492814df87bf479a68a72ed77078ce4d03ecdae5cf1434f1246e4c16322f3" Dec 03 22:28:43.511464 master-0 kubenswrapper[36504]: I1203 22:28:43.511414 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fzdd5" Dec 03 22:28:43.512641 master-0 kubenswrapper[36504]: I1203 22:28:43.512594 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"787b4cf8-9144-4a47-991f-1fca25b680f5","Type":"ContainerStarted","Data":"bc0ad6e4321291849f59d56cf62b6e0374af27f3b2a366fb046a67f3202ef1da"} Dec 03 22:28:43.727971 master-0 kubenswrapper[36504]: I1203 22:28:43.725644 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:43.727971 master-0 kubenswrapper[36504]: I1203 22:28:43.726052 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-log" containerID="cri-o://3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3" gracePeriod=30 Dec 03 22:28:43.727971 master-0 kubenswrapper[36504]: I1203 22:28:43.726052 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-api" containerID="cri-o://f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4" gracePeriod=30 Dec 03 22:28:43.843852 master-0 kubenswrapper[36504]: I1203 22:28:43.843044 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:28:43.843852 master-0 kubenswrapper[36504]: I1203 22:28:43.843312 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8871882f-bcc5-4ba4-8abc-8e19ce250bf9" containerName="nova-scheduler-scheduler" containerID="cri-o://f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0" gracePeriod=30 Dec 03 22:28:43.871961 master-0 kubenswrapper[36504]: I1203 22:28:43.870687 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:43.871961 master-0 kubenswrapper[36504]: I1203 22:28:43.871494 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-metadata" containerID="cri-o://77a5d364d521b3672d4e1b7973c8c11b8d59e9103eb2ce07e5e595bd0e9613eb" gracePeriod=30 Dec 03 22:28:43.872893 master-0 kubenswrapper[36504]: I1203 22:28:43.871195 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-log" containerID="cri-o://38cb0943a7e5d3013c44a9450a646d8483eb910274e21b5318223c637bc23680" gracePeriod=30 Dec 03 22:28:44.553368 master-0 kubenswrapper[36504]: I1203 22:28:44.553219 36504 generic.go:334] "Generic (PLEG): container finished" podID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerID="38cb0943a7e5d3013c44a9450a646d8483eb910274e21b5318223c637bc23680" exitCode=143 Dec 03 22:28:44.554138 master-0 kubenswrapper[36504]: I1203 22:28:44.553640 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df","Type":"ContainerDied","Data":"38cb0943a7e5d3013c44a9450a646d8483eb910274e21b5318223c637bc23680"} Dec 03 22:28:44.560933 master-0 kubenswrapper[36504]: I1203 22:28:44.560815 36504 generic.go:334] "Generic (PLEG): container finished" podID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerID="3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3" exitCode=143 Dec 
03 22:28:44.561224 master-0 kubenswrapper[36504]: I1203 22:28:44.561080 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4ad88f6-c451-44f9-b8be-52f9acbe5b82","Type":"ContainerDied","Data":"3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3"} Dec 03 22:28:44.573427 master-0 kubenswrapper[36504]: I1203 22:28:44.573311 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"787b4cf8-9144-4a47-991f-1fca25b680f5","Type":"ContainerStarted","Data":"873f4b877498cc512de499b39434d9362a4e9f01b632fca2a35f370921f492e1"} Dec 03 22:28:44.628798 master-0 kubenswrapper[36504]: I1203 22:28:44.625683 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.575932171 podStartE2EDuration="5.625640995s" podCreationTimestamp="2025-12-03 22:28:39 +0000 UTC" firstStartedPulling="2025-12-03 22:28:40.502861237 +0000 UTC m=+1085.722633244" lastFinishedPulling="2025-12-03 22:28:43.552570061 +0000 UTC m=+1088.772342068" observedRunningTime="2025-12-03 22:28:44.612792909 +0000 UTC m=+1089.832565116" watchObservedRunningTime="2025-12-03 22:28:44.625640995 +0000 UTC m=+1089.845413002" Dec 03 22:28:47.017390 master-0 kubenswrapper[36504]: I1203 22:28:47.017302 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.40:8775/\": read tcp 10.128.0.2:45090->10.128.1.40:8775: read: connection reset by peer" Dec 03 22:28:47.018195 master-0 kubenswrapper[36504]: I1203 22:28:47.017318 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.40:8775/\": read tcp 10.128.0.2:45080->10.128.1.40:8775: read: connection reset by peer" Dec 03 22:28:47.553481 master-0 kubenswrapper[36504]: I1203 22:28:47.553408 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:47.645797 master-0 kubenswrapper[36504]: I1203 22:28:47.645224 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df","Type":"ContainerDied","Data":"77a5d364d521b3672d4e1b7973c8c11b8d59e9103eb2ce07e5e595bd0e9613eb"} Dec 03 22:28:47.645797 master-0 kubenswrapper[36504]: I1203 22:28:47.645451 36504 generic.go:334] "Generic (PLEG): container finished" podID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerID="77a5d364d521b3672d4e1b7973c8c11b8d59e9103eb2ce07e5e595bd0e9613eb" exitCode=0 Dec 03 22:28:47.645797 master-0 kubenswrapper[36504]: I1203 22:28:47.645547 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df","Type":"ContainerDied","Data":"f3b283c1050484812c4f93280c67a1b74a265be98b7d70424b8a992d0b2b64d3"} Dec 03 22:28:47.645797 master-0 kubenswrapper[36504]: I1203 22:28:47.645571 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3b283c1050484812c4f93280c67a1b74a265be98b7d70424b8a992d0b2b64d3" Dec 03 22:28:47.650850 master-0 kubenswrapper[36504]: I1203 22:28:47.650356 36504 generic.go:334] "Generic (PLEG): container finished" podID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerID="f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4" exitCode=0 Dec 03 22:28:47.650850 master-0 kubenswrapper[36504]: I1203 22:28:47.650448 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4ad88f6-c451-44f9-b8be-52f9acbe5b82","Type":"ContainerDied","Data":"f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4"} Dec 03 22:28:47.650850 master-0 kubenswrapper[36504]: I1203 22:28:47.650517 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4ad88f6-c451-44f9-b8be-52f9acbe5b82","Type":"ContainerDied","Data":"2572c27522d30d2673352e936b3c20c078ae1f2233077242ced7f5d20abf02a2"} Dec 03 22:28:47.650850 master-0 kubenswrapper[36504]: I1203 22:28:47.650546 36504 scope.go:117] "RemoveContainer" containerID="f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4" Dec 03 22:28:47.650850 master-0 kubenswrapper[36504]: I1203 22:28:47.650681 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:47.680451 master-0 kubenswrapper[36504]: I1203 22:28:47.679368 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:47.701613 master-0 kubenswrapper[36504]: I1203 22:28:47.701446 36504 scope.go:117] "RemoveContainer" containerID="3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3" Dec 03 22:28:47.707179 master-0 kubenswrapper[36504]: I1203 22:28:47.707125 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-config-data\") pod \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " Dec 03 22:28:47.711491 master-0 kubenswrapper[36504]: I1203 22:28:47.707249 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-combined-ca-bundle\") pod \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " Dec 03 22:28:47.711491 master-0 kubenswrapper[36504]: I1203 22:28:47.707405 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-internal-tls-certs\") pod \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " Dec 03 22:28:47.711491 master-0 kubenswrapper[36504]: I1203 22:28:47.707637 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-public-tls-certs\") pod \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " Dec 03 22:28:47.711491 master-0 kubenswrapper[36504]: I1203 22:28:47.707710 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-logs\") pod \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " Dec 03 22:28:47.711491 master-0 kubenswrapper[36504]: I1203 22:28:47.707928 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjspj\" (UniqueName: \"kubernetes.io/projected/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-kube-api-access-kjspj\") pod \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\" (UID: \"f4ad88f6-c451-44f9-b8be-52f9acbe5b82\") " Dec 03 22:28:47.718167 master-0 kubenswrapper[36504]: I1203 22:28:47.718102 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-kube-api-access-kjspj" (OuterVolumeSpecName: "kube-api-access-kjspj") pod "f4ad88f6-c451-44f9-b8be-52f9acbe5b82" (UID: "f4ad88f6-c451-44f9-b8be-52f9acbe5b82"). InnerVolumeSpecName "kube-api-access-kjspj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:47.718933 master-0 kubenswrapper[36504]: I1203 22:28:47.718861 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-logs" (OuterVolumeSpecName: "logs") pod "f4ad88f6-c451-44f9-b8be-52f9acbe5b82" (UID: "f4ad88f6-c451-44f9-b8be-52f9acbe5b82"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:47.751269 master-0 kubenswrapper[36504]: I1203 22:28:47.751134 36504 scope.go:117] "RemoveContainer" containerID="f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4" Dec 03 22:28:47.754571 master-0 kubenswrapper[36504]: I1203 22:28:47.754417 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4ad88f6-c451-44f9-b8be-52f9acbe5b82" (UID: "f4ad88f6-c451-44f9-b8be-52f9acbe5b82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:47.756092 master-0 kubenswrapper[36504]: E1203 22:28:47.756036 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4\": container with ID starting with f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4 not found: ID does not exist" containerID="f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4" Dec 03 22:28:47.756160 master-0 kubenswrapper[36504]: I1203 22:28:47.756112 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4"} err="failed to get container status \"f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4\": rpc error: code = NotFound desc = could not find container \"f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4\": container with ID starting with f9aad72f4deaaf014a4d96b8413f8ed6835144e445f9d6c47af86070040a21d4 not found: ID does not exist" Dec 03 22:28:47.756220 master-0 kubenswrapper[36504]: I1203 22:28:47.756161 36504 scope.go:117] "RemoveContainer" containerID="3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3" Dec 03 22:28:47.758613 master-0 kubenswrapper[36504]: E1203 22:28:47.758575 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3\": container with ID starting with 3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3 not found: ID does not exist" containerID="3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3" Dec 03 22:28:47.758672 master-0 kubenswrapper[36504]: I1203 22:28:47.758625 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3"} err="failed to get container status \"3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3\": rpc error: code = NotFound desc = could not find container \"3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3\": container with ID starting with 3c8a7d98a28e8c26983c66f850183adbeac30ed39eb0ad97e9c264a6a7de5ee3 not found: ID does not exist" Dec 03 22:28:47.768102 master-0 kubenswrapper[36504]: I1203 22:28:47.768010 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-config-data" (OuterVolumeSpecName: "config-data") pod "f4ad88f6-c451-44f9-b8be-52f9acbe5b82" (UID: "f4ad88f6-c451-44f9-b8be-52f9acbe5b82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:47.799589 master-0 kubenswrapper[36504]: I1203 22:28:47.799294 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f4ad88f6-c451-44f9-b8be-52f9acbe5b82" (UID: "f4ad88f6-c451-44f9-b8be-52f9acbe5b82"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:47.811894 master-0 kubenswrapper[36504]: I1203 22:28:47.811826 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-combined-ca-bundle\") pod \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " Dec 03 22:28:47.812172 master-0 kubenswrapper[36504]: I1203 22:28:47.812031 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-logs\") pod \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " Dec 03 22:28:47.812172 master-0 kubenswrapper[36504]: I1203 22:28:47.812082 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-nova-metadata-tls-certs\") pod \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " Dec 03 22:28:47.812291 master-0 kubenswrapper[36504]: I1203 22:28:47.812271 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-config-data\") pod \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " Dec 03 22:28:47.812702 master-0 kubenswrapper[36504]: I1203 22:28:47.812378 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljn46\" (UniqueName: \"kubernetes.io/projected/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-kube-api-access-ljn46\") pod \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\" (UID: \"71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df\") " Dec 03 22:28:47.812804 master-0 kubenswrapper[36504]: I1203 22:28:47.812706 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-logs" (OuterVolumeSpecName: "logs") pod "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" (UID: "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:28:47.814005 master-0 kubenswrapper[36504]: I1203 22:28:47.813857 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.814005 master-0 kubenswrapper[36504]: I1203 22:28:47.813885 36504 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.814005 master-0 kubenswrapper[36504]: I1203 22:28:47.813897 36504 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-logs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.814005 master-0 kubenswrapper[36504]: I1203 22:28:47.813964 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjspj\" (UniqueName: \"kubernetes.io/projected/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-kube-api-access-kjspj\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.814005 master-0 kubenswrapper[36504]: I1203 22:28:47.813975 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.814344 master-0 kubenswrapper[36504]: I1203 22:28:47.813985 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.818007 master-0 kubenswrapper[36504]: I1203 22:28:47.817048 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-kube-api-access-ljn46" (OuterVolumeSpecName: "kube-api-access-ljn46") pod "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" (UID: "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df"). InnerVolumeSpecName "kube-api-access-ljn46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:47.852394 master-0 kubenswrapper[36504]: I1203 22:28:47.852256 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f4ad88f6-c451-44f9-b8be-52f9acbe5b82" (UID: "f4ad88f6-c451-44f9-b8be-52f9acbe5b82"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:47.879884 master-0 kubenswrapper[36504]: I1203 22:28:47.878964 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" (UID: "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:47.883135 master-0 kubenswrapper[36504]: I1203 22:28:47.883073 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-config-data" (OuterVolumeSpecName: "config-data") pod "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" (UID: "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:47.934556 master-0 kubenswrapper[36504]: I1203 22:28:47.918646 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.934556 master-0 kubenswrapper[36504]: I1203 22:28:47.918695 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljn46\" (UniqueName: \"kubernetes.io/projected/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-kube-api-access-ljn46\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.934556 master-0 kubenswrapper[36504]: I1203 22:28:47.918711 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.934556 master-0 kubenswrapper[36504]: I1203 22:28:47.918725 36504 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4ad88f6-c451-44f9-b8be-52f9acbe5b82-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:47.972821 master-0 kubenswrapper[36504]: I1203 22:28:47.972646 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" (UID: "71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:48.021866 master-0 kubenswrapper[36504]: I1203 22:28:48.021792 36504 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:48.150383 master-0 kubenswrapper[36504]: I1203 22:28:48.150287 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:48.169613 master-0 kubenswrapper[36504]: I1203 22:28:48.169267 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:48.191288 master-0 kubenswrapper[36504]: I1203 22:28:48.190926 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:48.192185 master-0 kubenswrapper[36504]: E1203 22:28:48.192053 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b524cbe9-1597-4642-a543-a29a92375c0d" containerName="nova-manage" Dec 03 22:28:48.192185 master-0 kubenswrapper[36504]: I1203 22:28:48.192090 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b524cbe9-1597-4642-a543-a29a92375c0d" containerName="nova-manage" Dec 03 22:28:48.192185 master-0 kubenswrapper[36504]: E1203 22:28:48.192106 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-log" Dec 03 22:28:48.192185 master-0 kubenswrapper[36504]: I1203 22:28:48.192117 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-log" Dec 03 22:28:48.192185 master-0 kubenswrapper[36504]: E1203 22:28:48.192162 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-api" Dec 03 22:28:48.192185 
master-0 kubenswrapper[36504]: I1203 22:28:48.192173 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-api" Dec 03 22:28:48.192436 master-0 kubenswrapper[36504]: E1203 22:28:48.192246 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-metadata" Dec 03 22:28:48.192436 master-0 kubenswrapper[36504]: I1203 22:28:48.192258 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-metadata" Dec 03 22:28:48.192436 master-0 kubenswrapper[36504]: E1203 22:28:48.192281 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-log" Dec 03 22:28:48.192436 master-0 kubenswrapper[36504]: I1203 22:28:48.192290 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-log" Dec 03 22:28:48.192659 master-0 kubenswrapper[36504]: I1203 22:28:48.192613 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-api" Dec 03 22:28:48.192659 master-0 kubenswrapper[36504]: I1203 22:28:48.192639 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-log" Dec 03 22:28:48.192757 master-0 kubenswrapper[36504]: I1203 22:28:48.192694 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" containerName="nova-api-log" Dec 03 22:28:48.192757 master-0 kubenswrapper[36504]: I1203 22:28:48.192724 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" containerName="nova-metadata-metadata" Dec 03 22:28:48.192853 master-0 kubenswrapper[36504]: I1203 22:28:48.192763 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="b524cbe9-1597-4642-a543-a29a92375c0d" containerName="nova-manage" Dec 03 22:28:48.195265 master-0 kubenswrapper[36504]: I1203 22:28:48.195125 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:48.197462 master-0 kubenswrapper[36504]: I1203 22:28:48.197408 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 22:28:48.197671 master-0 kubenswrapper[36504]: I1203 22:28:48.197639 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 22:28:48.199323 master-0 kubenswrapper[36504]: I1203 22:28:48.199307 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 22:28:48.234884 master-0 kubenswrapper[36504]: I1203 22:28:48.229799 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:48.336803 master-0 kubenswrapper[36504]: I1203 22:28:48.336294 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-public-tls-certs\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.336803 master-0 kubenswrapper[36504]: I1203 22:28:48.336472 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxcj7\" (UniqueName: \"kubernetes.io/projected/04e2befe-d763-4801-9570-8ecd14c02dc7-kube-api-access-sxcj7\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.336803 master-0 kubenswrapper[36504]: I1203 22:28:48.336588 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.336803 master-0 kubenswrapper[36504]: I1203 22:28:48.336791 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-config-data\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.337269 master-0 kubenswrapper[36504]: I1203 22:28:48.336842 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2befe-d763-4801-9570-8ecd14c02dc7-logs\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.337269 master-0 kubenswrapper[36504]: I1203 22:28:48.336932 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.440040 master-0 kubenswrapper[36504]: I1203 22:28:48.439937 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.440417 master-0 kubenswrapper[36504]: I1203 22:28:48.440154 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-public-tls-certs\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.440417 master-0 kubenswrapper[36504]: I1203 22:28:48.440216 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxcj7\" (UniqueName: \"kubernetes.io/projected/04e2befe-d763-4801-9570-8ecd14c02dc7-kube-api-access-sxcj7\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.440417 master-0 kubenswrapper[36504]: I1203 22:28:48.440273 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.440417 master-0 kubenswrapper[36504]: I1203 22:28:48.440374 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-config-data\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.440562 master-0 kubenswrapper[36504]: I1203 22:28:48.440423 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2befe-d763-4801-9570-8ecd14c02dc7-logs\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.441301 master-0 kubenswrapper[36504]: I1203 22:28:48.441271 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2befe-d763-4801-9570-8ecd14c02dc7-logs\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.445648 master-0 kubenswrapper[36504]: I1203 22:28:48.445583 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-public-tls-certs\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.446634 master-0 kubenswrapper[36504]: I1203 22:28:48.446598 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.448911 master-0 kubenswrapper[36504]: I1203 22:28:48.447334 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-config-data\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.448911 master-0 kubenswrapper[36504]: I1203 22:28:48.447376 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2befe-d763-4801-9570-8ecd14c02dc7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.461727 master-0 kubenswrapper[36504]: I1203 
22:28:48.461658 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxcj7\" (UniqueName: \"kubernetes.io/projected/04e2befe-d763-4801-9570-8ecd14c02dc7-kube-api-access-sxcj7\") pod \"nova-api-0\" (UID: \"04e2befe-d763-4801-9570-8ecd14c02dc7\") " pod="openstack/nova-api-0" Dec 03 22:28:48.558255 master-0 kubenswrapper[36504]: I1203 22:28:48.558087 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 22:28:48.574797 master-0 kubenswrapper[36504]: E1203 22:28:48.574657 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 22:28:48.580209 master-0 kubenswrapper[36504]: E1203 22:28:48.580060 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 22:28:48.586108 master-0 kubenswrapper[36504]: E1203 22:28:48.585948 36504 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 22:28:48.586108 master-0 kubenswrapper[36504]: E1203 22:28:48.586034 36504 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8871882f-bcc5-4ba4-8abc-8e19ce250bf9" containerName="nova-scheduler-scheduler" Dec 03 22:28:48.675432 master-0 kubenswrapper[36504]: I1203 22:28:48.675343 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:48.747183 master-0 kubenswrapper[36504]: I1203 22:28:48.747105 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:48.775797 master-0 kubenswrapper[36504]: I1203 22:28:48.771883 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:48.794957 master-0 kubenswrapper[36504]: I1203 22:28:48.794880 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:48.798729 master-0 kubenswrapper[36504]: I1203 22:28:48.798669 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:48.809548 master-0 kubenswrapper[36504]: I1203 22:28:48.809405 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 22:28:48.809813 master-0 kubenswrapper[36504]: I1203 22:28:48.809751 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 22:28:48.848815 master-0 kubenswrapper[36504]: I1203 22:28:48.848724 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:48.896988 master-0 kubenswrapper[36504]: I1203 22:28:48.896897 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/deebbed4-ddab-4913-b1a6-86a9e540d588-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48.897240 master-0 kubenswrapper[36504]: I1203 22:28:48.897091 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deebbed4-ddab-4913-b1a6-86a9e540d588-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48.897240 master-0 kubenswrapper[36504]: I1203 22:28:48.897163 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc6b9\" (UniqueName: \"kubernetes.io/projected/deebbed4-ddab-4913-b1a6-86a9e540d588-kube-api-access-kc6b9\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48.897404 master-0 kubenswrapper[36504]: I1203 22:28:48.897289 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deebbed4-ddab-4913-b1a6-86a9e540d588-logs\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:48.897820 master-0 kubenswrapper[36504]: I1203 22:28:48.897733 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deebbed4-ddab-4913-b1a6-86a9e540d588-config-data\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.002885 master-0 kubenswrapper[36504]: I1203 22:28:49.000905 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deebbed4-ddab-4913-b1a6-86a9e540d588-config-data\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.002885 master-0 kubenswrapper[36504]: I1203 22:28:49.001123 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/deebbed4-ddab-4913-b1a6-86a9e540d588-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.002885 master-0 kubenswrapper[36504]: I1203 22:28:49.001203 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/deebbed4-ddab-4913-b1a6-86a9e540d588-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.002885 master-0 kubenswrapper[36504]: I1203 22:28:49.001236 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc6b9\" (UniqueName: \"kubernetes.io/projected/deebbed4-ddab-4913-b1a6-86a9e540d588-kube-api-access-kc6b9\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.002885 master-0 kubenswrapper[36504]: I1203 22:28:49.001299 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deebbed4-ddab-4913-b1a6-86a9e540d588-logs\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.002885 master-0 kubenswrapper[36504]: I1203 22:28:49.001992 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deebbed4-ddab-4913-b1a6-86a9e540d588-logs\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.007059 master-0 kubenswrapper[36504]: I1203 22:28:49.007009 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/deebbed4-ddab-4913-b1a6-86a9e540d588-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.007923 master-0 kubenswrapper[36504]: I1203 22:28:49.007865 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deebbed4-ddab-4913-b1a6-86a9e540d588-config-data\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.009667 master-0 kubenswrapper[36504]: I1203 22:28:49.009610 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deebbed4-ddab-4913-b1a6-86a9e540d588-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.026137 master-0 kubenswrapper[36504]: I1203 22:28:49.026078 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc6b9\" (UniqueName: \"kubernetes.io/projected/deebbed4-ddab-4913-b1a6-86a9e540d588-kube-api-access-kc6b9\") pod \"nova-metadata-0\" (UID: \"deebbed4-ddab-4913-b1a6-86a9e540d588\") " pod="openstack/nova-metadata-0" Dec 03 22:28:49.120275 master-0 kubenswrapper[36504]: I1203 22:28:49.120201 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df" path="/var/lib/kubelet/pods/71a883a0-3fa9-40a9-9aa7-c0aaaa0de3df/volumes" Dec 03 22:28:49.121216 master-0 kubenswrapper[36504]: I1203 22:28:49.121193 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4ad88f6-c451-44f9-b8be-52f9acbe5b82" path="/var/lib/kubelet/pods/f4ad88f6-c451-44f9-b8be-52f9acbe5b82/volumes" Dec 03 22:28:49.184435 master-0 kubenswrapper[36504]: I1203 22:28:49.184348 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 22:28:49.204064 master-0 kubenswrapper[36504]: I1203 22:28:49.203994 
36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 22:28:49.224355 master-0 kubenswrapper[36504]: W1203 22:28:49.224277 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04e2befe_d763_4801_9570_8ecd14c02dc7.slice/crio-ee25c938ea08d1bfb396ef11ddefa3cfceb50649fbacfdb9b108ef2cdf69816b WatchSource:0}: Error finding container ee25c938ea08d1bfb396ef11ddefa3cfceb50649fbacfdb9b108ef2cdf69816b: Status 404 returned error can't find the container with id ee25c938ea08d1bfb396ef11ddefa3cfceb50649fbacfdb9b108ef2cdf69816b Dec 03 22:28:49.400904 master-0 kubenswrapper[36504]: I1203 22:28:49.400854 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:28:49.519822 master-0 kubenswrapper[36504]: I1203 22:28:49.519696 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hp6v\" (UniqueName: \"kubernetes.io/projected/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-kube-api-access-7hp6v\") pod \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " Dec 03 22:28:49.520117 master-0 kubenswrapper[36504]: I1203 22:28:49.520063 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-config-data\") pod \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " Dec 03 22:28:49.520268 master-0 kubenswrapper[36504]: I1203 22:28:49.520241 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-combined-ca-bundle\") pod \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\" (UID: \"8871882f-bcc5-4ba4-8abc-8e19ce250bf9\") " Dec 03 22:28:49.524289 master-0 kubenswrapper[36504]: I1203 22:28:49.524225 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-kube-api-access-7hp6v" (OuterVolumeSpecName: "kube-api-access-7hp6v") pod "8871882f-bcc5-4ba4-8abc-8e19ce250bf9" (UID: "8871882f-bcc5-4ba4-8abc-8e19ce250bf9"). InnerVolumeSpecName "kube-api-access-7hp6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:28:49.562436 master-0 kubenswrapper[36504]: I1203 22:28:49.562233 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-config-data" (OuterVolumeSpecName: "config-data") pod "8871882f-bcc5-4ba4-8abc-8e19ce250bf9" (UID: "8871882f-bcc5-4ba4-8abc-8e19ce250bf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:49.575887 master-0 kubenswrapper[36504]: I1203 22:28:49.575758 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8871882f-bcc5-4ba4-8abc-8e19ce250bf9" (UID: "8871882f-bcc5-4ba4-8abc-8e19ce250bf9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:28:49.628147 master-0 kubenswrapper[36504]: I1203 22:28:49.627120 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:49.628147 master-0 kubenswrapper[36504]: I1203 22:28:49.627187 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hp6v\" (UniqueName: \"kubernetes.io/projected/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-kube-api-access-7hp6v\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:49.628147 master-0 kubenswrapper[36504]: I1203 22:28:49.627202 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8871882f-bcc5-4ba4-8abc-8e19ce250bf9-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:28:49.703093 master-0 kubenswrapper[36504]: I1203 22:28:49.702950 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"04e2befe-d763-4801-9570-8ecd14c02dc7","Type":"ContainerStarted","Data":"88ea2f0ef04cbd4e4864e6b7f2f454c4520483efaf92c10ca1299c56127ac763"} Dec 03 22:28:49.703093 master-0 kubenswrapper[36504]: I1203 22:28:49.703031 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"04e2befe-d763-4801-9570-8ecd14c02dc7","Type":"ContainerStarted","Data":"ee25c938ea08d1bfb396ef11ddefa3cfceb50649fbacfdb9b108ef2cdf69816b"} Dec 03 22:28:49.706403 master-0 kubenswrapper[36504]: I1203 22:28:49.706311 36504 generic.go:334] "Generic (PLEG): container finished" podID="8871882f-bcc5-4ba4-8abc-8e19ce250bf9" containerID="f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0" exitCode=0 Dec 03 22:28:49.706403 master-0 kubenswrapper[36504]: I1203 22:28:49.706374 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:28:49.706403 master-0 kubenswrapper[36504]: I1203 22:28:49.706393 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8871882f-bcc5-4ba4-8abc-8e19ce250bf9","Type":"ContainerDied","Data":"f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0"} Dec 03 22:28:49.706719 master-0 kubenswrapper[36504]: I1203 22:28:49.706447 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8871882f-bcc5-4ba4-8abc-8e19ce250bf9","Type":"ContainerDied","Data":"06be330b3e02aac13d2c02eb891d3dca875cdd7231d8e8505224b7f8067571ce"} Dec 03 22:28:49.706719 master-0 kubenswrapper[36504]: I1203 22:28:49.706473 36504 scope.go:117] "RemoveContainer" containerID="f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0" Dec 03 22:28:49.752921 master-0 kubenswrapper[36504]: I1203 22:28:49.752576 36504 scope.go:117] "RemoveContainer" containerID="f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0" Dec 03 22:28:49.761202 master-0 kubenswrapper[36504]: E1203 22:28:49.760547 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0\": container with ID starting with f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0 not found: ID does not exist" containerID="f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0" Dec 03 22:28:49.761202 master-0 kubenswrapper[36504]: I1203 22:28:49.760631 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0"} err="failed to get container status \"f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0\": rpc error: code = NotFound desc = could not find container \"f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0\": container with ID starting with f60d10abfc7495dacf9c6991707c69beaf8d9e0a9db63443919317ce811546a0 not found: ID does not exist" Dec 03 22:28:49.778012 master-0 kubenswrapper[36504]: I1203 22:28:49.777943 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:28:49.790589 master-0 kubenswrapper[36504]: W1203 22:28:49.790454 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeebbed4_ddab_4913_b1a6_86a9e540d588.slice/crio-03e179eb12df6dedc60d64b3a8aa9bbec6b78d435f781d308d3af579cabe8ec5 WatchSource:0}: Error finding container 03e179eb12df6dedc60d64b3a8aa9bbec6b78d435f781d308d3af579cabe8ec5: Status 404 returned error can't find the container with id 03e179eb12df6dedc60d64b3a8aa9bbec6b78d435f781d308d3af579cabe8ec5 Dec 03 22:28:49.804197 master-0 kubenswrapper[36504]: I1203 22:28:49.804092 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:28:49.829799 master-0 kubenswrapper[36504]: I1203 22:28:49.826226 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 22:28:49.848804 master-0 kubenswrapper[36504]: I1203 22:28:49.846991 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:28:49.848804 master-0 kubenswrapper[36504]: E1203 22:28:49.848000 36504 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8871882f-bcc5-4ba4-8abc-8e19ce250bf9" containerName="nova-scheduler-scheduler" Dec 03 22:28:49.848804 master-0 kubenswrapper[36504]: I1203 22:28:49.848022 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8871882f-bcc5-4ba4-8abc-8e19ce250bf9" containerName="nova-scheduler-scheduler" Dec 03 22:28:49.848804 master-0 kubenswrapper[36504]: I1203 22:28:49.848320 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8871882f-bcc5-4ba4-8abc-8e19ce250bf9" containerName="nova-scheduler-scheduler" Dec 03 22:28:49.852804 master-0 kubenswrapper[36504]: I1203 22:28:49.849980 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:28:49.856808 master-0 kubenswrapper[36504]: I1203 22:28:49.855149 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 22:28:49.907753 master-0 kubenswrapper[36504]: I1203 22:28:49.904428 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:28:49.948489 master-0 kubenswrapper[36504]: I1203 22:28:49.948429 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a15ba1-48a1-43e2-9e3c-c566eaaab557-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6a15ba1-48a1-43e2-9e3c-c566eaaab557\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:49.949293 master-0 kubenswrapper[36504]: I1203 22:28:49.949135 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdsqb\" (UniqueName: \"kubernetes.io/projected/f6a15ba1-48a1-43e2-9e3c-c566eaaab557-kube-api-access-qdsqb\") pod \"nova-scheduler-0\" (UID: \"f6a15ba1-48a1-43e2-9e3c-c566eaaab557\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:49.949514 master-0 kubenswrapper[36504]: I1203 22:28:49.949481 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a15ba1-48a1-43e2-9e3c-c566eaaab557-config-data\") pod \"nova-scheduler-0\" (UID: \"f6a15ba1-48a1-43e2-9e3c-c566eaaab557\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:50.052965 master-0 kubenswrapper[36504]: I1203 22:28:50.052877 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdsqb\" (UniqueName: \"kubernetes.io/projected/f6a15ba1-48a1-43e2-9e3c-c566eaaab557-kube-api-access-qdsqb\") pod \"nova-scheduler-0\" (UID: \"f6a15ba1-48a1-43e2-9e3c-c566eaaab557\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:50.100179 master-0 kubenswrapper[36504]: I1203 22:28:50.053067 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a15ba1-48a1-43e2-9e3c-c566eaaab557-config-data\") pod \"nova-scheduler-0\" (UID: \"f6a15ba1-48a1-43e2-9e3c-c566eaaab557\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:50.100179 master-0 kubenswrapper[36504]: I1203 22:28:50.053243 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a15ba1-48a1-43e2-9e3c-c566eaaab557-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6a15ba1-48a1-43e2-9e3c-c566eaaab557\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:50.100179 master-0 kubenswrapper[36504]: I1203 22:28:50.057662 36504 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a15ba1-48a1-43e2-9e3c-c566eaaab557-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f6a15ba1-48a1-43e2-9e3c-c566eaaab557\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:50.100179 master-0 kubenswrapper[36504]: I1203 22:28:50.057685 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a15ba1-48a1-43e2-9e3c-c566eaaab557-config-data\") pod \"nova-scheduler-0\" (UID: \"f6a15ba1-48a1-43e2-9e3c-c566eaaab557\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:50.100179 master-0 kubenswrapper[36504]: I1203 22:28:50.077849 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdsqb\" (UniqueName: \"kubernetes.io/projected/f6a15ba1-48a1-43e2-9e3c-c566eaaab557-kube-api-access-qdsqb\") pod \"nova-scheduler-0\" (UID: \"f6a15ba1-48a1-43e2-9e3c-c566eaaab557\") " pod="openstack/nova-scheduler-0" Dec 03 22:28:50.250488 master-0 kubenswrapper[36504]: I1203 22:28:50.250408 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 22:28:50.726511 master-0 kubenswrapper[36504]: I1203 22:28:50.726175 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"deebbed4-ddab-4913-b1a6-86a9e540d588","Type":"ContainerStarted","Data":"692549e3c5ee053ee5a1c71cacf870ff08a0772d8a9b024f4f449126b1fed4f9"} Dec 03 22:28:50.726511 master-0 kubenswrapper[36504]: I1203 22:28:50.726249 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"deebbed4-ddab-4913-b1a6-86a9e540d588","Type":"ContainerStarted","Data":"b47830ac974f2e3dbfef28cb9ce90b32ce849f539c057602aa5afb489fb13e89"} Dec 03 22:28:50.726511 master-0 kubenswrapper[36504]: I1203 22:28:50.726261 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"deebbed4-ddab-4913-b1a6-86a9e540d588","Type":"ContainerStarted","Data":"03e179eb12df6dedc60d64b3a8aa9bbec6b78d435f781d308d3af579cabe8ec5"} Dec 03 22:28:50.730945 master-0 kubenswrapper[36504]: I1203 22:28:50.730482 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"04e2befe-d763-4801-9570-8ecd14c02dc7","Type":"ContainerStarted","Data":"f41602f3360f5ad73dff28013bcf1ebe1722422a037342b7d24bd9870ad75192"} Dec 03 22:28:50.767790 master-0 kubenswrapper[36504]: I1203 22:28:50.766830 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.76679544 podStartE2EDuration="2.76679544s" podCreationTimestamp="2025-12-03 22:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:50.752015984 +0000 UTC m=+1095.971788001" watchObservedRunningTime="2025-12-03 22:28:50.76679544 +0000 UTC m=+1095.986567447" Dec 03 22:28:50.795857 master-0 kubenswrapper[36504]: W1203 22:28:50.795740 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6a15ba1_48a1_43e2_9e3c_c566eaaab557.slice/crio-2ac265673ab48e606aa1da1d7c06fbde71664ada20d72c570f4ce5285ca007ef WatchSource:0}: Error finding container 2ac265673ab48e606aa1da1d7c06fbde71664ada20d72c570f4ce5285ca007ef: Status 404 returned error can't find the container with id 2ac265673ab48e606aa1da1d7c06fbde71664ada20d72c570f4ce5285ca007ef 
Dec 03 22:28:50.807916 master-0 kubenswrapper[36504]: I1203 22:28:50.806879 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 22:28:50.819078 master-0 kubenswrapper[36504]: I1203 22:28:50.818907 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.818873686 podStartE2EDuration="2.818873686s" podCreationTimestamp="2025-12-03 22:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:50.788288993 +0000 UTC m=+1096.008061010" watchObservedRunningTime="2025-12-03 22:28:50.818873686 +0000 UTC m=+1096.038645693" Dec 03 22:28:51.112889 master-0 kubenswrapper[36504]: I1203 22:28:51.112817 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8871882f-bcc5-4ba4-8abc-8e19ce250bf9" path="/var/lib/kubelet/pods/8871882f-bcc5-4ba4-8abc-8e19ce250bf9/volumes" Dec 03 22:28:51.750247 master-0 kubenswrapper[36504]: I1203 22:28:51.750159 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6a15ba1-48a1-43e2-9e3c-c566eaaab557","Type":"ContainerStarted","Data":"7403cadbf374704aed0e37aafa79f6f19a5c5ed2427c66aeaac9c86c09f9ad25"} Dec 03 22:28:51.750247 master-0 kubenswrapper[36504]: I1203 22:28:51.750244 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f6a15ba1-48a1-43e2-9e3c-c566eaaab557","Type":"ContainerStarted","Data":"2ac265673ab48e606aa1da1d7c06fbde71664ada20d72c570f4ce5285ca007ef"} Dec 03 22:28:51.802639 master-0 kubenswrapper[36504]: I1203 22:28:51.802495 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.80246455 podStartE2EDuration="2.80246455s" podCreationTimestamp="2025-12-03 22:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:28:51.775663234 +0000 UTC m=+1096.995435251" watchObservedRunningTime="2025-12-03 22:28:51.80246455 +0000 UTC m=+1097.022236557" Dec 03 22:28:54.204611 master-0 kubenswrapper[36504]: I1203 22:28:54.204511 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 22:28:54.204611 master-0 kubenswrapper[36504]: I1203 22:28:54.204623 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 22:28:55.112638 master-0 kubenswrapper[36504]: I1203 22:28:55.112556 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:28:55.251207 master-0 kubenswrapper[36504]: I1203 22:28:55.251131 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 22:28:57.661937 master-0 kubenswrapper[36504]: I1203 22:28:57.661849 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 22:28:58.559332 master-0 kubenswrapper[36504]: I1203 22:28:58.559256 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:28:58.559332 master-0 kubenswrapper[36504]: I1203 22:28:58.559339 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 22:28:59.205909 master-0 kubenswrapper[36504]: I1203 22:28:59.205750 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 22:28:59.205909 master-0 kubenswrapper[36504]: I1203 22:28:59.205889 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 22:28:59.580234 master-0 kubenswrapper[36504]: I1203 22:28:59.580039 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="04e2befe-d763-4801-9570-8ecd14c02dc7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.49:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:28:59.580234 master-0 kubenswrapper[36504]: I1203 22:28:59.580107 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="04e2befe-d763-4801-9570-8ecd14c02dc7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.49:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:00.230172 master-0 kubenswrapper[36504]: I1203 22:29:00.230071 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="deebbed4-ddab-4913-b1a6-86a9e540d588" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.50:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:00.231897 master-0 kubenswrapper[36504]: I1203 22:29:00.230587 36504 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="deebbed4-ddab-4913-b1a6-86a9e540d588" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.50:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 22:29:00.252185 master-0 kubenswrapper[36504]: I1203 22:29:00.252084 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 22:29:00.291267 master-0 kubenswrapper[36504]: I1203 22:29:00.291188 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 22:29:00.975793 master-0 kubenswrapper[36504]: I1203 22:29:00.975599 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 22:29:01.874256 master-0 kubenswrapper[36504]: I1203 22:29:01.874183 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:01.875154 master-0 kubenswrapper[36504]: I1203 22:29:01.874515 36504 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1284fac2-9956-456c-9781-135e637e85bd" containerName="kube-state-metrics" containerID="cri-o://8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3" gracePeriod=30 Dec 03 22:29:02.451567 master-0 kubenswrapper[36504]: I1203 22:29:02.451499 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:29:02.568561 master-0 kubenswrapper[36504]: I1203 22:29:02.568357 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ksr5\" (UniqueName: \"kubernetes.io/projected/1284fac2-9956-456c-9781-135e637e85bd-kube-api-access-4ksr5\") pod \"1284fac2-9956-456c-9781-135e637e85bd\" (UID: \"1284fac2-9956-456c-9781-135e637e85bd\") " Dec 03 22:29:02.572962 master-0 kubenswrapper[36504]: I1203 22:29:02.572864 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1284fac2-9956-456c-9781-135e637e85bd-kube-api-access-4ksr5" (OuterVolumeSpecName: "kube-api-access-4ksr5") pod "1284fac2-9956-456c-9781-135e637e85bd" (UID: "1284fac2-9956-456c-9781-135e637e85bd"). InnerVolumeSpecName "kube-api-access-4ksr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:02.674053 master-0 kubenswrapper[36504]: I1203 22:29:02.673974 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ksr5\" (UniqueName: \"kubernetes.io/projected/1284fac2-9956-456c-9781-135e637e85bd-kube-api-access-4ksr5\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:02.957552 master-0 kubenswrapper[36504]: I1203 22:29:02.957476 36504 generic.go:334] "Generic (PLEG): container finished" podID="1284fac2-9956-456c-9781-135e637e85bd" containerID="8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3" exitCode=2 Dec 03 22:29:02.957552 master-0 kubenswrapper[36504]: I1203 22:29:02.957553 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:29:02.958422 master-0 kubenswrapper[36504]: I1203 22:29:02.957551 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1284fac2-9956-456c-9781-135e637e85bd","Type":"ContainerDied","Data":"8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3"} Dec 03 22:29:02.958422 master-0 kubenswrapper[36504]: I1203 22:29:02.957678 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1284fac2-9956-456c-9781-135e637e85bd","Type":"ContainerDied","Data":"754d83a0e0adc7afafd87a3e7e658d96fe36ba2927baa6e1c4dc55a4f13e4606"} Dec 03 22:29:02.958422 master-0 kubenswrapper[36504]: I1203 22:29:02.957705 36504 scope.go:117] "RemoveContainer" containerID="8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3" Dec 03 22:29:02.988526 master-0 kubenswrapper[36504]: I1203 22:29:02.986917 36504 scope.go:117] "RemoveContainer" containerID="8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3" Dec 03 22:29:02.988526 master-0 kubenswrapper[36504]: E1203 22:29:02.987593 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3\": container with ID starting with 8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3 not found: ID does not exist" containerID="8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3" Dec 03 22:29:02.988526 master-0 kubenswrapper[36504]: I1203 22:29:02.987664 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3"} err="failed to get container status \"8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3\": rpc error: code = NotFound desc = could not find container \"8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3\": container with ID starting with 8947bf9f168e49e934b3a1d0e27c6a1e8ae1dd9550b34737690d587fbc430fd3 not found: ID does not exist" Dec 03 22:29:03.038801 master-0 kubenswrapper[36504]: I1203 22:29:03.038527 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:03.061425 master-0 kubenswrapper[36504]: I1203 22:29:03.061354 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:03.084108 master-0 kubenswrapper[36504]: I1203 22:29:03.084022 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:03.085498 master-0 kubenswrapper[36504]: E1203 22:29:03.085233 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1284fac2-9956-456c-9781-135e637e85bd" containerName="kube-state-metrics" Dec 03 22:29:03.085498 master-0 kubenswrapper[36504]: I1203 22:29:03.085270 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1284fac2-9956-456c-9781-135e637e85bd" containerName="kube-state-metrics" Dec 03 22:29:03.086169 master-0 kubenswrapper[36504]: I1203 22:29:03.085825 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="1284fac2-9956-456c-9781-135e637e85bd" containerName="kube-state-metrics" Dec 03 22:29:03.088668 master-0 kubenswrapper[36504]: I1203 22:29:03.088555 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.095228 master-0 kubenswrapper[36504]: I1203 22:29:03.095114 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 22:29:03.095980 master-0 kubenswrapper[36504]: I1203 22:29:03.095934 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 22:29:03.130079 master-0 kubenswrapper[36504]: I1203 22:29:03.129988 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7e5313-e5e6-47e9-940d-c3fcb014502b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.130401 master-0 kubenswrapper[36504]: I1203 22:29:03.130111 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b7e5313-e5e6-47e9-940d-c3fcb014502b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.130401 master-0 kubenswrapper[36504]: I1203 22:29:03.130138 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46qgd\" (UniqueName: \"kubernetes.io/projected/5b7e5313-e5e6-47e9-940d-c3fcb014502b-kube-api-access-46qgd\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.130401 master-0 kubenswrapper[36504]: I1203 22:29:03.129983 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1284fac2-9956-456c-9781-135e637e85bd" path="/var/lib/kubelet/pods/1284fac2-9956-456c-9781-135e637e85bd/volumes" Dec 03 22:29:03.130401 master-0 kubenswrapper[36504]: I1203 22:29:03.130216 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5b7e5313-e5e6-47e9-940d-c3fcb014502b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.150250 master-0 kubenswrapper[36504]: I1203 22:29:03.149423 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:03.234073 master-0 kubenswrapper[36504]: I1203 22:29:03.233218 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7e5313-e5e6-47e9-940d-c3fcb014502b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.234073 master-0 kubenswrapper[36504]: I1203 22:29:03.233305 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b7e5313-e5e6-47e9-940d-c3fcb014502b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.234073 master-0 kubenswrapper[36504]: I1203 22:29:03.233329 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46qgd\" 
(UniqueName: \"kubernetes.io/projected/5b7e5313-e5e6-47e9-940d-c3fcb014502b-kube-api-access-46qgd\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.234073 master-0 kubenswrapper[36504]: I1203 22:29:03.233364 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5b7e5313-e5e6-47e9-940d-c3fcb014502b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.239805 master-0 kubenswrapper[36504]: I1203 22:29:03.238619 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b7e5313-e5e6-47e9-940d-c3fcb014502b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.239805 master-0 kubenswrapper[36504]: I1203 22:29:03.238629 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b7e5313-e5e6-47e9-940d-c3fcb014502b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.239805 master-0 kubenswrapper[36504]: I1203 22:29:03.238714 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5b7e5313-e5e6-47e9-940d-c3fcb014502b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.257222 master-0 kubenswrapper[36504]: I1203 22:29:03.257157 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46qgd\" (UniqueName: \"kubernetes.io/projected/5b7e5313-e5e6-47e9-940d-c3fcb014502b-kube-api-access-46qgd\") pod \"kube-state-metrics-0\" (UID: \"5b7e5313-e5e6-47e9-940d-c3fcb014502b\") " pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.449655 master-0 kubenswrapper[36504]: I1203 22:29:03.449569 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 22:29:03.865030 master-0 kubenswrapper[36504]: I1203 22:29:03.864800 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:03.865422 master-0 kubenswrapper[36504]: I1203 22:29:03.865220 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="ceilometer-central-agent" containerID="cri-o://06378f7a35baeb277d82156dd96ffe06496d39394eed645f16cf5f7848c9200e" gracePeriod=30 Dec 03 22:29:03.865422 master-0 kubenswrapper[36504]: I1203 22:29:03.865398 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="ceilometer-notification-agent" containerID="cri-o://2e1195f7c14306b10e00e53907ae64b41ec55a6cfecbb723f31437fbc1228799" gracePeriod=30 Dec 03 22:29:03.865576 master-0 kubenswrapper[36504]: I1203 22:29:03.865390 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="sg-core" containerID="cri-o://410682d4622016aedc9341f9ee150e7a656ba0a58d7a6865d5e7fac7cd751f8b" gracePeriod=30 Dec 03 22:29:03.871478 master-0 kubenswrapper[36504]: I1203 22:29:03.867182 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="proxy-httpd" containerID="cri-o://1240ac558d6fad0b18f98df9311dcfd6c5752f54920f5c1b317ddc02cbe0ca26" gracePeriod=30 Dec 03 22:29:04.042124 master-0 kubenswrapper[36504]: I1203 22:29:04.042063 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 22:29:05.030816 master-0 kubenswrapper[36504]: I1203 22:29:05.030707 36504 generic.go:334] "Generic (PLEG): container finished" podID="8090e570-85c4-4562-8305-e48cc2b59936" containerID="1240ac558d6fad0b18f98df9311dcfd6c5752f54920f5c1b317ddc02cbe0ca26" exitCode=0 Dec 03 22:29:05.030816 master-0 kubenswrapper[36504]: I1203 22:29:05.030764 36504 generic.go:334] "Generic (PLEG): container finished" podID="8090e570-85c4-4562-8305-e48cc2b59936" containerID="410682d4622016aedc9341f9ee150e7a656ba0a58d7a6865d5e7fac7cd751f8b" exitCode=2 Dec 03 22:29:05.030816 master-0 kubenswrapper[36504]: I1203 22:29:05.030790 36504 generic.go:334] "Generic (PLEG): container finished" podID="8090e570-85c4-4562-8305-e48cc2b59936" containerID="06378f7a35baeb277d82156dd96ffe06496d39394eed645f16cf5f7848c9200e" exitCode=0 Dec 03 22:29:05.031214 master-0 kubenswrapper[36504]: I1203 22:29:05.030866 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerDied","Data":"1240ac558d6fad0b18f98df9311dcfd6c5752f54920f5c1b317ddc02cbe0ca26"} Dec 03 22:29:05.031214 master-0 kubenswrapper[36504]: I1203 22:29:05.030973 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerDied","Data":"410682d4622016aedc9341f9ee150e7a656ba0a58d7a6865d5e7fac7cd751f8b"} Dec 03 22:29:05.031214 master-0 kubenswrapper[36504]: I1203 22:29:05.031022 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerDied","Data":"06378f7a35baeb277d82156dd96ffe06496d39394eed645f16cf5f7848c9200e"} Dec 03 22:29:05.034668 master-0 kubenswrapper[36504]: I1203 22:29:05.034578 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5b7e5313-e5e6-47e9-940d-c3fcb014502b","Type":"ContainerStarted","Data":"d7d2a74916dfa3a8cdb0208b363f661cfd3f4f3baf48337edb7a4aac6b11b29b"} Dec 03 22:29:05.034753 master-0 kubenswrapper[36504]: I1203 22:29:05.034698 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 22:29:05.034753 master-0 kubenswrapper[36504]: I1203 22:29:05.034717 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5b7e5313-e5e6-47e9-940d-c3fcb014502b","Type":"ContainerStarted","Data":"b700207c046068820604a29eced9c621b43293aaedfa4dc04668b5f6112d2cae"} Dec 03 22:29:05.069704 master-0 kubenswrapper[36504]: I1203 22:29:05.069583 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.627551628 podStartE2EDuration="2.06955506s" podCreationTimestamp="2025-12-03 22:29:03 +0000 UTC" firstStartedPulling="2025-12-03 22:29:04.064091331 +0000 UTC m=+1109.283863338" lastFinishedPulling="2025-12-03 22:29:04.506094763 +0000 UTC m=+1109.725866770" observedRunningTime="2025-12-03 22:29:05.064540506 +0000 UTC m=+1110.284312513" watchObservedRunningTime="2025-12-03 22:29:05.06955506 +0000 UTC m=+1110.289327067" Dec 03 22:29:07.070557 master-0 kubenswrapper[36504]: I1203 22:29:07.070472 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerDied","Data":"2e1195f7c14306b10e00e53907ae64b41ec55a6cfecbb723f31437fbc1228799"} Dec 03 22:29:07.071245 master-0 kubenswrapper[36504]: I1203 22:29:07.070399 36504 generic.go:334] "Generic (PLEG): container finished" podID="8090e570-85c4-4562-8305-e48cc2b59936" containerID="2e1195f7c14306b10e00e53907ae64b41ec55a6cfecbb723f31437fbc1228799" exitCode=0 Dec 03 22:29:07.071245 master-0 kubenswrapper[36504]: I1203 22:29:07.070988 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8090e570-85c4-4562-8305-e48cc2b59936","Type":"ContainerDied","Data":"e339ce26f1ffe4f3a8107499a3bf9b5a15eecc65d3872c6e38cde3af561ceff9"} Dec 03 22:29:07.071245 master-0 kubenswrapper[36504]: I1203 22:29:07.071033 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e339ce26f1ffe4f3a8107499a3bf9b5a15eecc65d3872c6e38cde3af561ceff9" Dec 03 22:29:07.169183 master-0 kubenswrapper[36504]: I1203 22:29:07.169124 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:29:07.340908 master-0 kubenswrapper[36504]: I1203 22:29:07.334324 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-config-data\") pod \"8090e570-85c4-4562-8305-e48cc2b59936\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " Dec 03 22:29:07.340908 master-0 kubenswrapper[36504]: I1203 22:29:07.334528 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-sg-core-conf-yaml\") pod \"8090e570-85c4-4562-8305-e48cc2b59936\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " Dec 03 22:29:07.340908 master-0 kubenswrapper[36504]: I1203 22:29:07.334584 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-scripts\") pod \"8090e570-85c4-4562-8305-e48cc2b59936\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " Dec 03 22:29:07.340908 master-0 kubenswrapper[36504]: I1203 22:29:07.334654 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-log-httpd\") pod \"8090e570-85c4-4562-8305-e48cc2b59936\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " Dec 03 22:29:07.340908 master-0 kubenswrapper[36504]: I1203 22:29:07.334710 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdps5\" (UniqueName: \"kubernetes.io/projected/8090e570-85c4-4562-8305-e48cc2b59936-kube-api-access-mdps5\") pod \"8090e570-85c4-4562-8305-e48cc2b59936\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " Dec 03 22:29:07.340908 master-0 kubenswrapper[36504]: I1203 22:29:07.334740 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-run-httpd\") pod \"8090e570-85c4-4562-8305-e48cc2b59936\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " Dec 03 22:29:07.340908 master-0 kubenswrapper[36504]: I1203 22:29:07.334891 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-combined-ca-bundle\") pod \"8090e570-85c4-4562-8305-e48cc2b59936\" (UID: \"8090e570-85c4-4562-8305-e48cc2b59936\") " Dec 03 22:29:07.350810 master-0 kubenswrapper[36504]: I1203 22:29:07.347402 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8090e570-85c4-4562-8305-e48cc2b59936" (UID: "8090e570-85c4-4562-8305-e48cc2b59936"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:07.350810 master-0 kubenswrapper[36504]: I1203 22:29:07.350435 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8090e570-85c4-4562-8305-e48cc2b59936" (UID: "8090e570-85c4-4562-8305-e48cc2b59936"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:29:07.368822 master-0 kubenswrapper[36504]: I1203 22:29:07.360947 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-scripts" (OuterVolumeSpecName: "scripts") pod "8090e570-85c4-4562-8305-e48cc2b59936" (UID: "8090e570-85c4-4562-8305-e48cc2b59936"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:07.376812 master-0 kubenswrapper[36504]: I1203 22:29:07.374373 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8090e570-85c4-4562-8305-e48cc2b59936-kube-api-access-mdps5" (OuterVolumeSpecName: "kube-api-access-mdps5") pod "8090e570-85c4-4562-8305-e48cc2b59936" (UID: "8090e570-85c4-4562-8305-e48cc2b59936"). InnerVolumeSpecName "kube-api-access-mdps5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:07.450932 master-0 kubenswrapper[36504]: I1203 22:29:07.450841 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:07.450932 master-0 kubenswrapper[36504]: I1203 22:29:07.450899 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:07.450932 master-0 kubenswrapper[36504]: I1203 22:29:07.450919 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdps5\" (UniqueName: \"kubernetes.io/projected/8090e570-85c4-4562-8305-e48cc2b59936-kube-api-access-mdps5\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:07.450932 master-0 kubenswrapper[36504]: I1203 22:29:07.450931 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8090e570-85c4-4562-8305-e48cc2b59936-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:07.515750 master-0 kubenswrapper[36504]: I1203 22:29:07.515651 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8090e570-85c4-4562-8305-e48cc2b59936" (UID: "8090e570-85c4-4562-8305-e48cc2b59936"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:07.519667 master-0 kubenswrapper[36504]: I1203 22:29:07.519594 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8090e570-85c4-4562-8305-e48cc2b59936" (UID: "8090e570-85c4-4562-8305-e48cc2b59936"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:07.554858 master-0 kubenswrapper[36504]: I1203 22:29:07.554641 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:07.554858 master-0 kubenswrapper[36504]: I1203 22:29:07.554708 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:07.607391 master-0 kubenswrapper[36504]: I1203 22:29:07.607213 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-config-data" (OuterVolumeSpecName: "config-data") pod "8090e570-85c4-4562-8305-e48cc2b59936" (UID: "8090e570-85c4-4562-8305-e48cc2b59936"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:29:07.657639 master-0 kubenswrapper[36504]: I1203 22:29:07.657564 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8090e570-85c4-4562-8305-e48cc2b59936-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:08.085852 master-0 kubenswrapper[36504]: I1203 22:29:08.085784 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:29:08.147964 master-0 kubenswrapper[36504]: I1203 22:29:08.147876 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:08.177091 master-0 kubenswrapper[36504]: I1203 22:29:08.175069 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:08.211195 master-0 kubenswrapper[36504]: I1203 22:29:08.211111 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:08.212261 master-0 kubenswrapper[36504]: E1203 22:29:08.212221 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="proxy-httpd" Dec 03 22:29:08.212375 master-0 kubenswrapper[36504]: I1203 22:29:08.212335 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="proxy-httpd" Dec 03 22:29:08.212429 master-0 kubenswrapper[36504]: E1203 22:29:08.212407 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="ceilometer-central-agent" Dec 03 22:29:08.212429 master-0 kubenswrapper[36504]: I1203 22:29:08.212418 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="ceilometer-central-agent" Dec 03 22:29:08.212524 master-0 kubenswrapper[36504]: E1203 22:29:08.212450 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="ceilometer-notification-agent" Dec 03 22:29:08.212524 master-0 kubenswrapper[36504]: I1203 22:29:08.212458 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="ceilometer-notification-agent" Dec 03 22:29:08.212524 master-0 kubenswrapper[36504]: E1203 22:29:08.212498 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="sg-core" Dec 
03 22:29:08.212524 master-0 kubenswrapper[36504]: I1203 22:29:08.212506 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="sg-core" Dec 03 22:29:08.213072 master-0 kubenswrapper[36504]: I1203 22:29:08.213040 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="sg-core" Dec 03 22:29:08.213221 master-0 kubenswrapper[36504]: I1203 22:29:08.213076 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="ceilometer-central-agent" Dec 03 22:29:08.213221 master-0 kubenswrapper[36504]: I1203 22:29:08.213184 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="ceilometer-notification-agent" Dec 03 22:29:08.213221 master-0 kubenswrapper[36504]: I1203 22:29:08.213206 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8090e570-85c4-4562-8305-e48cc2b59936" containerName="proxy-httpd" Dec 03 22:29:08.218038 master-0 kubenswrapper[36504]: I1203 22:29:08.217981 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:29:08.221077 master-0 kubenswrapper[36504]: I1203 22:29:08.221026 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 22:29:08.221222 master-0 kubenswrapper[36504]: I1203 22:29:08.221161 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:29:08.223158 master-0 kubenswrapper[36504]: I1203 22:29:08.222882 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:29:08.229700 master-0 kubenswrapper[36504]: I1203 22:29:08.229641 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:08.394998 master-0 kubenswrapper[36504]: I1203 22:29:08.394917 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-scripts\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.395360 master-0 kubenswrapper[36504]: I1203 22:29:08.395022 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.395360 master-0 kubenswrapper[36504]: I1203 22:29:08.395144 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bzr\" (UniqueName: \"kubernetes.io/projected/ef9588fc-d81b-4470-b1be-7687ce7438aa-kube-api-access-l4bzr\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.395360 master-0 kubenswrapper[36504]: I1203 22:29:08.395244 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-config-data\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.395360 master-0 
kubenswrapper[36504]: I1203 22:29:08.395313 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-run-httpd\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.395687 master-0 kubenswrapper[36504]: I1203 22:29:08.395626 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.395879 master-0 kubenswrapper[36504]: I1203 22:29:08.395853 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-log-httpd\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.395980 master-0 kubenswrapper[36504]: I1203 22:29:08.395959 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.499562 master-0 kubenswrapper[36504]: I1203 22:29:08.499477 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-run-httpd\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.499562 master-0 kubenswrapper[36504]: I1203 22:29:08.499557 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-config-data\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.499953 master-0 kubenswrapper[36504]: I1203 22:29:08.499680 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.499953 master-0 kubenswrapper[36504]: I1203 22:29:08.499730 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-log-httpd\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.499953 master-0 kubenswrapper[36504]: I1203 22:29:08.499780 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.499953 master-0 kubenswrapper[36504]: I1203 22:29:08.499881 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-scripts\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.499953 master-0 kubenswrapper[36504]: I1203 22:29:08.499937 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.500176 master-0 kubenswrapper[36504]: I1203 22:29:08.499989 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4bzr\" (UniqueName: \"kubernetes.io/projected/ef9588fc-d81b-4470-b1be-7687ce7438aa-kube-api-access-l4bzr\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.500985 master-0 kubenswrapper[36504]: I1203 22:29:08.500947 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-log-httpd\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.501641 master-0 kubenswrapper[36504]: I1203 22:29:08.501621 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-run-httpd\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.505877 master-0 kubenswrapper[36504]: I1203 22:29:08.504825 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.505877 master-0 kubenswrapper[36504]: I1203 22:29:08.505811 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.506056 master-0 kubenswrapper[36504]: I1203 22:29:08.505986 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.506159 master-0 kubenswrapper[36504]: I1203 22:29:08.506101 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-config-data\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.507289 master-0 kubenswrapper[36504]: I1203 22:29:08.507242 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-scripts\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.520043 master-0 kubenswrapper[36504]: I1203 22:29:08.519802 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bzr\" (UniqueName: \"kubernetes.io/projected/ef9588fc-d81b-4470-b1be-7687ce7438aa-kube-api-access-l4bzr\") pod \"ceilometer-0\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " pod="openstack/ceilometer-0" Dec 03 22:29:08.554013 master-0 kubenswrapper[36504]: I1203 22:29:08.553946 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:29:08.579174 master-0 kubenswrapper[36504]: I1203 22:29:08.576038 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 22:29:08.579174 master-0 kubenswrapper[36504]: I1203 22:29:08.578208 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 22:29:08.590068 master-0 kubenswrapper[36504]: I1203 22:29:08.590018 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 22:29:08.597444 master-0 kubenswrapper[36504]: I1203 22:29:08.597373 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 22:29:09.116427 master-0 kubenswrapper[36504]: I1203 22:29:09.116350 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8090e570-85c4-4562-8305-e48cc2b59936" path="/var/lib/kubelet/pods/8090e570-85c4-4562-8305-e48cc2b59936/volumes" Dec 03 22:29:09.117499 master-0 kubenswrapper[36504]: I1203 22:29:09.117464 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 22:29:09.117563 master-0 kubenswrapper[36504]: I1203 22:29:09.117529 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 22:29:09.182068 master-0 kubenswrapper[36504]: W1203 22:29:09.181535 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef9588fc_d81b_4470_b1be_7687ce7438aa.slice/crio-ff085e4171eeca6ae03c12f8dc861070e970e63d493195da70ac7109209676b9 WatchSource:0}: Error finding container ff085e4171eeca6ae03c12f8dc861070e970e63d493195da70ac7109209676b9: Status 404 returned error can't find the container with id ff085e4171eeca6ae03c12f8dc861070e970e63d493195da70ac7109209676b9 Dec 03 22:29:09.206384 master-0 kubenswrapper[36504]: I1203 22:29:09.205350 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:29:09.222208 master-0 kubenswrapper[36504]: I1203 22:29:09.220583 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 22:29:09.231840 master-0 kubenswrapper[36504]: I1203 22:29:09.231677 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 22:29:09.247468 master-0 kubenswrapper[36504]: I1203 22:29:09.246422 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 22:29:10.125932 master-0 kubenswrapper[36504]: I1203 22:29:10.125829 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerStarted","Data":"7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb"} Dec 03 22:29:10.125932 master-0 kubenswrapper[36504]: I1203 22:29:10.125915 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerStarted","Data":"ff085e4171eeca6ae03c12f8dc861070e970e63d493195da70ac7109209676b9"} Dec 03 22:29:10.135726 master-0 kubenswrapper[36504]: I1203 22:29:10.135668 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 22:29:11.159070 master-0 kubenswrapper[36504]: I1203 22:29:11.158974 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerStarted","Data":"6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544"} Dec 03 22:29:12.183718 master-0 kubenswrapper[36504]: I1203 22:29:12.183654 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerStarted","Data":"3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883"} Dec 03 22:29:13.205631 master-0 kubenswrapper[36504]: I1203 22:29:13.205549 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerStarted","Data":"0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10"} Dec 03 22:29:13.207668 master-0 kubenswrapper[36504]: I1203 22:29:13.207303 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:29:13.242291 master-0 kubenswrapper[36504]: I1203 22:29:13.237251 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.646128902 podStartE2EDuration="5.237227453s" podCreationTimestamp="2025-12-03 22:29:08 +0000 UTC" firstStartedPulling="2025-12-03 22:29:09.186848868 +0000 UTC m=+1114.406620875" lastFinishedPulling="2025-12-03 22:29:12.777947419 +0000 UTC m=+1117.997719426" observedRunningTime="2025-12-03 22:29:13.234417216 +0000 UTC m=+1118.454189243" watchObservedRunningTime="2025-12-03 22:29:13.237227453 +0000 UTC m=+1118.456999460" Dec 03 22:29:13.465578 master-0 kubenswrapper[36504]: I1203 22:29:13.465522 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 22:29:30.905628 master-0 kubenswrapper[36504]: I1203 22:29:30.905534 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-5sqwd"] Dec 03 22:29:30.909204 master-0 kubenswrapper[36504]: I1203 22:29:30.909138 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:30.935736 master-0 kubenswrapper[36504]: I1203 22:29:30.935655 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-5sqwd"] Dec 03 22:29:31.087410 master-0 kubenswrapper[36504]: I1203 22:29:31.087328 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23d55dcc-3278-423b-b8a5-f0677d010e28-operator-scripts\") pod \"octavia-db-create-5sqwd\" (UID: \"23d55dcc-3278-423b-b8a5-f0677d010e28\") " pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:31.087792 master-0 kubenswrapper[36504]: I1203 22:29:31.087539 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdddj\" (UniqueName: \"kubernetes.io/projected/23d55dcc-3278-423b-b8a5-f0677d010e28-kube-api-access-bdddj\") pod \"octavia-db-create-5sqwd\" (UID: \"23d55dcc-3278-423b-b8a5-f0677d010e28\") " pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:31.191120 master-0 kubenswrapper[36504]: I1203 22:29:31.190956 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdddj\" (UniqueName: \"kubernetes.io/projected/23d55dcc-3278-423b-b8a5-f0677d010e28-kube-api-access-bdddj\") pod \"octavia-db-create-5sqwd\" (UID: \"23d55dcc-3278-423b-b8a5-f0677d010e28\") " pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:31.191401 master-0 kubenswrapper[36504]: I1203 22:29:31.191193 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23d55dcc-3278-423b-b8a5-f0677d010e28-operator-scripts\") pod \"octavia-db-create-5sqwd\" (UID: \"23d55dcc-3278-423b-b8a5-f0677d010e28\") " pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:31.193282 master-0 kubenswrapper[36504]: I1203 22:29:31.193246 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23d55dcc-3278-423b-b8a5-f0677d010e28-operator-scripts\") pod \"octavia-db-create-5sqwd\" (UID: \"23d55dcc-3278-423b-b8a5-f0677d010e28\") " pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:31.226803 master-0 kubenswrapper[36504]: I1203 22:29:31.225748 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdddj\" (UniqueName: \"kubernetes.io/projected/23d55dcc-3278-423b-b8a5-f0677d010e28-kube-api-access-bdddj\") pod \"octavia-db-create-5sqwd\" (UID: \"23d55dcc-3278-423b-b8a5-f0677d010e28\") " pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:31.255802 master-0 kubenswrapper[36504]: I1203 22:29:31.255520 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:31.961106 master-0 kubenswrapper[36504]: I1203 22:29:31.961000 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-5sqwd"] Dec 03 22:29:32.285208 master-0 kubenswrapper[36504]: I1203 22:29:32.284172 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-ce39-account-create-update-ngk2l"] Dec 03 22:29:32.287656 master-0 kubenswrapper[36504]: I1203 22:29:32.286594 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:32.290115 master-0 kubenswrapper[36504]: I1203 22:29:32.289968 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Dec 03 22:29:32.305676 master-0 kubenswrapper[36504]: I1203 22:29:32.304965 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-ce39-account-create-update-ngk2l"] Dec 03 22:29:32.460352 master-0 kubenswrapper[36504]: I1203 22:29:32.460262 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eef8451d-cd60-4d12-8a93-231274fbccf2-operator-scripts\") pod \"octavia-ce39-account-create-update-ngk2l\" (UID: \"eef8451d-cd60-4d12-8a93-231274fbccf2\") " pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:32.461266 master-0 kubenswrapper[36504]: I1203 22:29:32.461198 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mm8g\" (UniqueName: \"kubernetes.io/projected/eef8451d-cd60-4d12-8a93-231274fbccf2-kube-api-access-2mm8g\") pod \"octavia-ce39-account-create-update-ngk2l\" (UID: \"eef8451d-cd60-4d12-8a93-231274fbccf2\") " pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:32.563675 master-0 kubenswrapper[36504]: I1203 22:29:32.563583 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mm8g\" (UniqueName: \"kubernetes.io/projected/eef8451d-cd60-4d12-8a93-231274fbccf2-kube-api-access-2mm8g\") pod \"octavia-ce39-account-create-update-ngk2l\" (UID: \"eef8451d-cd60-4d12-8a93-231274fbccf2\") " pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:32.564026 master-0 kubenswrapper[36504]: I1203 22:29:32.563863 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eef8451d-cd60-4d12-8a93-231274fbccf2-operator-scripts\") pod \"octavia-ce39-account-create-update-ngk2l\" (UID: \"eef8451d-cd60-4d12-8a93-231274fbccf2\") " pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:32.564669 master-0 kubenswrapper[36504]: I1203 22:29:32.564631 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eef8451d-cd60-4d12-8a93-231274fbccf2-operator-scripts\") pod \"octavia-ce39-account-create-update-ngk2l\" (UID: \"eef8451d-cd60-4d12-8a93-231274fbccf2\") " pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:32.583331 master-0 kubenswrapper[36504]: I1203 22:29:32.583227 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mm8g\" (UniqueName: \"kubernetes.io/projected/eef8451d-cd60-4d12-8a93-231274fbccf2-kube-api-access-2mm8g\") pod \"octavia-ce39-account-create-update-ngk2l\" (UID: \"eef8451d-cd60-4d12-8a93-231274fbccf2\") " pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:32.611685 master-0 kubenswrapper[36504]: I1203 22:29:32.611588 36504 generic.go:334] "Generic (PLEG): container finished" podID="23d55dcc-3278-423b-b8a5-f0677d010e28" containerID="13147e5ac3b7e4b019bb8f8e41a53efa49011505dce1c173e6002716576cf6c2" exitCode=0 Dec 03 22:29:32.612005 master-0 kubenswrapper[36504]: I1203 22:29:32.611688 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-5sqwd" 
event={"ID":"23d55dcc-3278-423b-b8a5-f0677d010e28","Type":"ContainerDied","Data":"13147e5ac3b7e4b019bb8f8e41a53efa49011505dce1c173e6002716576cf6c2"} Dec 03 22:29:32.612005 master-0 kubenswrapper[36504]: I1203 22:29:32.611742 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-5sqwd" event={"ID":"23d55dcc-3278-423b-b8a5-f0677d010e28","Type":"ContainerStarted","Data":"f0b47128fe7cbf6dbf39f9c3d45847e9c765468ae9810d37570715fb28a87c12"} Dec 03 22:29:32.714123 master-0 kubenswrapper[36504]: I1203 22:29:32.714062 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:33.284114 master-0 kubenswrapper[36504]: I1203 22:29:33.283659 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-ce39-account-create-update-ngk2l"] Dec 03 22:29:33.632311 master-0 kubenswrapper[36504]: I1203 22:29:33.632237 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-ce39-account-create-update-ngk2l" event={"ID":"eef8451d-cd60-4d12-8a93-231274fbccf2","Type":"ContainerStarted","Data":"50beeb2af275a2e15964ddcb061f7b5a62d020dfd606ad3b38bb7e294190dcc6"} Dec 03 22:29:34.230020 master-0 kubenswrapper[36504]: I1203 22:29:34.229961 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:34.384252 master-0 kubenswrapper[36504]: I1203 22:29:34.384190 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23d55dcc-3278-423b-b8a5-f0677d010e28-operator-scripts\") pod \"23d55dcc-3278-423b-b8a5-f0677d010e28\" (UID: \"23d55dcc-3278-423b-b8a5-f0677d010e28\") " Dec 03 22:29:34.385123 master-0 kubenswrapper[36504]: I1203 22:29:34.385099 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdddj\" (UniqueName: \"kubernetes.io/projected/23d55dcc-3278-423b-b8a5-f0677d010e28-kube-api-access-bdddj\") pod \"23d55dcc-3278-423b-b8a5-f0677d010e28\" (UID: \"23d55dcc-3278-423b-b8a5-f0677d010e28\") " Dec 03 22:29:34.385339 master-0 kubenswrapper[36504]: I1203 22:29:34.384876 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23d55dcc-3278-423b-b8a5-f0677d010e28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23d55dcc-3278-423b-b8a5-f0677d010e28" (UID: "23d55dcc-3278-423b-b8a5-f0677d010e28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:34.386273 master-0 kubenswrapper[36504]: I1203 22:29:34.386246 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23d55dcc-3278-423b-b8a5-f0677d010e28-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:34.389069 master-0 kubenswrapper[36504]: I1203 22:29:34.388620 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d55dcc-3278-423b-b8a5-f0677d010e28-kube-api-access-bdddj" (OuterVolumeSpecName: "kube-api-access-bdddj") pod "23d55dcc-3278-423b-b8a5-f0677d010e28" (UID: "23d55dcc-3278-423b-b8a5-f0677d010e28"). InnerVolumeSpecName "kube-api-access-bdddj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:34.489585 master-0 kubenswrapper[36504]: I1203 22:29:34.489505 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdddj\" (UniqueName: \"kubernetes.io/projected/23d55dcc-3278-423b-b8a5-f0677d010e28-kube-api-access-bdddj\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:34.656515 master-0 kubenswrapper[36504]: I1203 22:29:34.656329 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-5sqwd" event={"ID":"23d55dcc-3278-423b-b8a5-f0677d010e28","Type":"ContainerDied","Data":"f0b47128fe7cbf6dbf39f9c3d45847e9c765468ae9810d37570715fb28a87c12"} Dec 03 22:29:34.656515 master-0 kubenswrapper[36504]: I1203 22:29:34.656394 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0b47128fe7cbf6dbf39f9c3d45847e9c765468ae9810d37570715fb28a87c12" Dec 03 22:29:34.656515 master-0 kubenswrapper[36504]: I1203 22:29:34.656388 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-5sqwd" Dec 03 22:29:34.659743 master-0 kubenswrapper[36504]: I1203 22:29:34.659689 36504 generic.go:334] "Generic (PLEG): container finished" podID="eef8451d-cd60-4d12-8a93-231274fbccf2" containerID="6ff51e950cdbb5d7a1e4aed3edc76b4bfb56d4e6eab93bf801e2e53f4e5b6510" exitCode=0 Dec 03 22:29:34.659873 master-0 kubenswrapper[36504]: I1203 22:29:34.659753 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-ce39-account-create-update-ngk2l" event={"ID":"eef8451d-cd60-4d12-8a93-231274fbccf2","Type":"ContainerDied","Data":"6ff51e950cdbb5d7a1e4aed3edc76b4bfb56d4e6eab93bf801e2e53f4e5b6510"} Dec 03 22:29:35.729467 master-0 kubenswrapper[36504]: E1203 22:29:35.729381 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:29:36.155910 master-0 kubenswrapper[36504]: I1203 22:29:36.155861 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:36.261796 master-0 kubenswrapper[36504]: I1203 22:29:36.258959 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mm8g\" (UniqueName: \"kubernetes.io/projected/eef8451d-cd60-4d12-8a93-231274fbccf2-kube-api-access-2mm8g\") pod \"eef8451d-cd60-4d12-8a93-231274fbccf2\" (UID: \"eef8451d-cd60-4d12-8a93-231274fbccf2\") " Dec 03 22:29:36.261796 master-0 kubenswrapper[36504]: I1203 22:29:36.259561 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eef8451d-cd60-4d12-8a93-231274fbccf2-operator-scripts\") pod \"eef8451d-cd60-4d12-8a93-231274fbccf2\" (UID: \"eef8451d-cd60-4d12-8a93-231274fbccf2\") " Dec 03 22:29:36.261796 master-0 kubenswrapper[36504]: I1203 22:29:36.260933 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eef8451d-cd60-4d12-8a93-231274fbccf2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eef8451d-cd60-4d12-8a93-231274fbccf2" (UID: "eef8451d-cd60-4d12-8a93-231274fbccf2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:36.262347 master-0 kubenswrapper[36504]: I1203 22:29:36.262207 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eef8451d-cd60-4d12-8a93-231274fbccf2-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:36.264451 master-0 kubenswrapper[36504]: I1203 22:29:36.263847 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef8451d-cd60-4d12-8a93-231274fbccf2-kube-api-access-2mm8g" (OuterVolumeSpecName: "kube-api-access-2mm8g") pod "eef8451d-cd60-4d12-8a93-231274fbccf2" (UID: "eef8451d-cd60-4d12-8a93-231274fbccf2"). InnerVolumeSpecName "kube-api-access-2mm8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:36.366467 master-0 kubenswrapper[36504]: I1203 22:29:36.366385 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mm8g\" (UniqueName: \"kubernetes.io/projected/eef8451d-cd60-4d12-8a93-231274fbccf2-kube-api-access-2mm8g\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:36.691191 master-0 kubenswrapper[36504]: I1203 22:29:36.691118 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-ce39-account-create-update-ngk2l" event={"ID":"eef8451d-cd60-4d12-8a93-231274fbccf2","Type":"ContainerDied","Data":"50beeb2af275a2e15964ddcb061f7b5a62d020dfd606ad3b38bb7e294190dcc6"} Dec 03 22:29:36.691191 master-0 kubenswrapper[36504]: I1203 22:29:36.691179 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50beeb2af275a2e15964ddcb061f7b5a62d020dfd606ad3b38bb7e294190dcc6" Dec 03 22:29:36.691531 master-0 kubenswrapper[36504]: I1203 22:29:36.691201 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-ce39-account-create-update-ngk2l" Dec 03 22:29:38.139676 master-0 kubenswrapper[36504]: I1203 22:29:38.139586 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-c4sh9"] Dec 03 22:29:38.140684 master-0 kubenswrapper[36504]: E1203 22:29:38.140651 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d55dcc-3278-423b-b8a5-f0677d010e28" containerName="mariadb-database-create" Dec 03 22:29:38.140684 master-0 kubenswrapper[36504]: I1203 22:29:38.140683 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d55dcc-3278-423b-b8a5-f0677d010e28" containerName="mariadb-database-create" Dec 03 22:29:38.140805 master-0 kubenswrapper[36504]: E1203 22:29:38.140728 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef8451d-cd60-4d12-8a93-231274fbccf2" containerName="mariadb-account-create-update" Dec 03 22:29:38.140805 master-0 kubenswrapper[36504]: I1203 22:29:38.140736 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef8451d-cd60-4d12-8a93-231274fbccf2" containerName="mariadb-account-create-update" Dec 03 22:29:38.141203 master-0 kubenswrapper[36504]: I1203 22:29:38.141171 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef8451d-cd60-4d12-8a93-231274fbccf2" containerName="mariadb-account-create-update" Dec 03 22:29:38.141203 master-0 kubenswrapper[36504]: I1203 22:29:38.141196 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d55dcc-3278-423b-b8a5-f0677d010e28" containerName="mariadb-database-create" Dec 03 22:29:38.142349 master-0 kubenswrapper[36504]: I1203 22:29:38.142319 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:38.156422 master-0 kubenswrapper[36504]: I1203 22:29:38.155895 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-c4sh9"] Dec 03 22:29:38.235017 master-0 kubenswrapper[36504]: I1203 22:29:38.232390 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2682f06-2124-466c-8041-403c8a864821-operator-scripts\") pod \"octavia-persistence-db-create-c4sh9\" (UID: \"f2682f06-2124-466c-8041-403c8a864821\") " pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:38.235017 master-0 kubenswrapper[36504]: I1203 22:29:38.232478 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wbbl\" (UniqueName: \"kubernetes.io/projected/f2682f06-2124-466c-8041-403c8a864821-kube-api-access-9wbbl\") pod \"octavia-persistence-db-create-c4sh9\" (UID: \"f2682f06-2124-466c-8041-403c8a864821\") " pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:38.335855 master-0 kubenswrapper[36504]: I1203 22:29:38.335718 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2682f06-2124-466c-8041-403c8a864821-operator-scripts\") pod \"octavia-persistence-db-create-c4sh9\" (UID: \"f2682f06-2124-466c-8041-403c8a864821\") " pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:38.336153 master-0 kubenswrapper[36504]: I1203 22:29:38.335910 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wbbl\" (UniqueName: 
\"kubernetes.io/projected/f2682f06-2124-466c-8041-403c8a864821-kube-api-access-9wbbl\") pod \"octavia-persistence-db-create-c4sh9\" (UID: \"f2682f06-2124-466c-8041-403c8a864821\") " pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:38.336670 master-0 kubenswrapper[36504]: I1203 22:29:38.336623 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2682f06-2124-466c-8041-403c8a864821-operator-scripts\") pod \"octavia-persistence-db-create-c4sh9\" (UID: \"f2682f06-2124-466c-8041-403c8a864821\") " pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:38.356474 master-0 kubenswrapper[36504]: I1203 22:29:38.356408 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wbbl\" (UniqueName: \"kubernetes.io/projected/f2682f06-2124-466c-8041-403c8a864821-kube-api-access-9wbbl\") pod \"octavia-persistence-db-create-c4sh9\" (UID: \"f2682f06-2124-466c-8041-403c8a864821\") " pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:38.464904 master-0 kubenswrapper[36504]: I1203 22:29:38.464714 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:38.565526 master-0 kubenswrapper[36504]: I1203 22:29:38.565477 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 22:29:39.016924 master-0 kubenswrapper[36504]: I1203 22:29:39.016484 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-c4sh9"] Dec 03 22:29:39.250107 master-0 kubenswrapper[36504]: I1203 22:29:39.250034 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-c786-account-create-update-x6vz9"] Dec 03 22:29:39.254680 master-0 kubenswrapper[36504]: I1203 22:29:39.252383 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:39.255512 master-0 kubenswrapper[36504]: I1203 22:29:39.255472 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Dec 03 22:29:39.295385 master-0 kubenswrapper[36504]: I1203 22:29:39.295305 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c786-account-create-update-x6vz9"] Dec 03 22:29:39.378038 master-0 kubenswrapper[36504]: I1203 22:29:39.377966 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8e10bb0-61c3-423b-b9b8-086b32e604ec-operator-scripts\") pod \"octavia-c786-account-create-update-x6vz9\" (UID: \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\") " pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:39.378327 master-0 kubenswrapper[36504]: I1203 22:29:39.378061 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5kr8\" (UniqueName: \"kubernetes.io/projected/f8e10bb0-61c3-423b-b9b8-086b32e604ec-kube-api-access-v5kr8\") pod \"octavia-c786-account-create-update-x6vz9\" (UID: \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\") " pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:39.481998 master-0 kubenswrapper[36504]: I1203 22:29:39.481889 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8e10bb0-61c3-423b-b9b8-086b32e604ec-operator-scripts\") pod \"octavia-c786-account-create-update-x6vz9\" (UID: \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\") " pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:39.482314 master-0 kubenswrapper[36504]: I1203 22:29:39.482045 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5kr8\" (UniqueName: \"kubernetes.io/projected/f8e10bb0-61c3-423b-b9b8-086b32e604ec-kube-api-access-v5kr8\") pod \"octavia-c786-account-create-update-x6vz9\" (UID: \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\") " pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:39.483237 master-0 kubenswrapper[36504]: I1203 22:29:39.483194 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8e10bb0-61c3-423b-b9b8-086b32e604ec-operator-scripts\") pod \"octavia-c786-account-create-update-x6vz9\" (UID: \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\") " pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:39.745105 master-0 kubenswrapper[36504]: I1203 22:29:39.744932 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-c4sh9" event={"ID":"f2682f06-2124-466c-8041-403c8a864821","Type":"ContainerStarted","Data":"326cbd01f9177e1cf955f6213088368abe126c73f427f0652daddd93673ad43f"} Dec 03 22:29:39.903727 master-0 kubenswrapper[36504]: I1203 22:29:39.903667 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5kr8\" (UniqueName: \"kubernetes.io/projected/f8e10bb0-61c3-423b-b9b8-086b32e604ec-kube-api-access-v5kr8\") pod \"octavia-c786-account-create-update-x6vz9\" (UID: \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\") " pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:40.181319 master-0 kubenswrapper[36504]: I1203 22:29:40.181235 36504 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:40.727346 master-0 kubenswrapper[36504]: I1203 22:29:40.727253 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c786-account-create-update-x6vz9"] Dec 03 22:29:40.771659 master-0 kubenswrapper[36504]: I1203 22:29:40.771575 36504 generic.go:334] "Generic (PLEG): container finished" podID="f2682f06-2124-466c-8041-403c8a864821" containerID="5da860a370d517d8bd16379dab0a91f8ac71bf4cff2427aacff9c3fe72f1bb30" exitCode=0 Dec 03 22:29:40.772573 master-0 kubenswrapper[36504]: I1203 22:29:40.771655 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-c4sh9" event={"ID":"f2682f06-2124-466c-8041-403c8a864821","Type":"ContainerDied","Data":"5da860a370d517d8bd16379dab0a91f8ac71bf4cff2427aacff9c3fe72f1bb30"} Dec 03 22:29:40.772987 master-0 kubenswrapper[36504]: I1203 22:29:40.772947 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c786-account-create-update-x6vz9" event={"ID":"f8e10bb0-61c3-423b-b9b8-086b32e604ec","Type":"ContainerStarted","Data":"4647ee3eee69c052ca8c444e2ed55a91fe73fc817ae1ab25f950808b69954a8b"} Dec 03 22:29:41.790756 master-0 kubenswrapper[36504]: I1203 22:29:41.790689 36504 generic.go:334] "Generic (PLEG): container finished" podID="f8e10bb0-61c3-423b-b9b8-086b32e604ec" containerID="5cdc24b245f4fe12309797c09b0449423c3401d8e3176e5d6eace3d1bd4afae6" exitCode=0 Dec 03 22:29:41.791559 master-0 kubenswrapper[36504]: I1203 22:29:41.790869 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c786-account-create-update-x6vz9" event={"ID":"f8e10bb0-61c3-423b-b9b8-086b32e604ec","Type":"ContainerDied","Data":"5cdc24b245f4fe12309797c09b0449423c3401d8e3176e5d6eace3d1bd4afae6"} Dec 03 22:29:42.359889 master-0 kubenswrapper[36504]: I1203 22:29:42.358315 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:42.518507 master-0 kubenswrapper[36504]: I1203 22:29:42.518335 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2682f06-2124-466c-8041-403c8a864821-operator-scripts\") pod \"f2682f06-2124-466c-8041-403c8a864821\" (UID: \"f2682f06-2124-466c-8041-403c8a864821\") " Dec 03 22:29:42.518840 master-0 kubenswrapper[36504]: I1203 22:29:42.518514 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wbbl\" (UniqueName: \"kubernetes.io/projected/f2682f06-2124-466c-8041-403c8a864821-kube-api-access-9wbbl\") pod \"f2682f06-2124-466c-8041-403c8a864821\" (UID: \"f2682f06-2124-466c-8041-403c8a864821\") " Dec 03 22:29:42.519615 master-0 kubenswrapper[36504]: I1203 22:29:42.519532 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2682f06-2124-466c-8041-403c8a864821-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2682f06-2124-466c-8041-403c8a864821" (UID: "f2682f06-2124-466c-8041-403c8a864821"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:42.526315 master-0 kubenswrapper[36504]: I1203 22:29:42.526242 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2682f06-2124-466c-8041-403c8a864821-kube-api-access-9wbbl" (OuterVolumeSpecName: "kube-api-access-9wbbl") pod "f2682f06-2124-466c-8041-403c8a864821" (UID: "f2682f06-2124-466c-8041-403c8a864821"). InnerVolumeSpecName "kube-api-access-9wbbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:42.623019 master-0 kubenswrapper[36504]: I1203 22:29:42.622953 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2682f06-2124-466c-8041-403c8a864821-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:42.623019 master-0 kubenswrapper[36504]: I1203 22:29:42.623004 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wbbl\" (UniqueName: \"kubernetes.io/projected/f2682f06-2124-466c-8041-403c8a864821-kube-api-access-9wbbl\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:42.846096 master-0 kubenswrapper[36504]: I1203 22:29:42.845936 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-c4sh9" Dec 03 22:29:42.846096 master-0 kubenswrapper[36504]: I1203 22:29:42.845958 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-c4sh9" event={"ID":"f2682f06-2124-466c-8041-403c8a864821","Type":"ContainerDied","Data":"326cbd01f9177e1cf955f6213088368abe126c73f427f0652daddd93673ad43f"} Dec 03 22:29:42.846096 master-0 kubenswrapper[36504]: I1203 22:29:42.846042 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="326cbd01f9177e1cf955f6213088368abe126c73f427f0652daddd93673ad43f" Dec 03 22:29:43.399598 master-0 kubenswrapper[36504]: I1203 22:29:43.399535 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:43.459794 master-0 kubenswrapper[36504]: I1203 22:29:43.459520 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8e10bb0-61c3-423b-b9b8-086b32e604ec-operator-scripts\") pod \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\" (UID: \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\") " Dec 03 22:29:43.459794 master-0 kubenswrapper[36504]: I1203 22:29:43.459672 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5kr8\" (UniqueName: \"kubernetes.io/projected/f8e10bb0-61c3-423b-b9b8-086b32e604ec-kube-api-access-v5kr8\") pod \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\" (UID: \"f8e10bb0-61c3-423b-b9b8-086b32e604ec\") " Dec 03 22:29:43.466124 master-0 kubenswrapper[36504]: I1203 22:29:43.462106 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8e10bb0-61c3-423b-b9b8-086b32e604ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8e10bb0-61c3-423b-b9b8-086b32e604ec" (UID: "f8e10bb0-61c3-423b-b9b8-086b32e604ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:29:43.474178 master-0 kubenswrapper[36504]: I1203 22:29:43.474085 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e10bb0-61c3-423b-b9b8-086b32e604ec-kube-api-access-v5kr8" (OuterVolumeSpecName: "kube-api-access-v5kr8") pod "f8e10bb0-61c3-423b-b9b8-086b32e604ec" (UID: "f8e10bb0-61c3-423b-b9b8-086b32e604ec"). InnerVolumeSpecName "kube-api-access-v5kr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:29:43.564462 master-0 kubenswrapper[36504]: I1203 22:29:43.564386 36504 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8e10bb0-61c3-423b-b9b8-086b32e604ec-operator-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:43.564462 master-0 kubenswrapper[36504]: I1203 22:29:43.564445 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5kr8\" (UniqueName: \"kubernetes.io/projected/f8e10bb0-61c3-423b-b9b8-086b32e604ec-kube-api-access-v5kr8\") on node \"master-0\" DevicePath \"\"" Dec 03 22:29:43.864056 master-0 kubenswrapper[36504]: I1203 22:29:43.863893 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c786-account-create-update-x6vz9" event={"ID":"f8e10bb0-61c3-423b-b9b8-086b32e604ec","Type":"ContainerDied","Data":"4647ee3eee69c052ca8c444e2ed55a91fe73fc817ae1ab25f950808b69954a8b"} Dec 03 22:29:43.864056 master-0 kubenswrapper[36504]: I1203 22:29:43.863962 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4647ee3eee69c052ca8c444e2ed55a91fe73fc817ae1ab25f950808b69954a8b" Dec 03 22:29:43.864056 master-0 kubenswrapper[36504]: I1203 22:29:43.863977 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-c786-account-create-update-x6vz9" Dec 03 22:29:50.303681 master-0 kubenswrapper[36504]: I1203 22:29:50.302984 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-6cd4ddb7f5-cj2qw"] Dec 03 22:29:50.313688 master-0 kubenswrapper[36504]: E1203 22:29:50.313564 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2682f06-2124-466c-8041-403c8a864821" containerName="mariadb-database-create" Dec 03 22:29:50.313688 master-0 kubenswrapper[36504]: I1203 22:29:50.313623 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2682f06-2124-466c-8041-403c8a864821" containerName="mariadb-database-create" Dec 03 22:29:50.314200 master-0 kubenswrapper[36504]: E1203 22:29:50.313726 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e10bb0-61c3-423b-b9b8-086b32e604ec" containerName="mariadb-account-create-update" Dec 03 22:29:50.314200 master-0 kubenswrapper[36504]: I1203 22:29:50.313733 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e10bb0-61c3-423b-b9b8-086b32e604ec" containerName="mariadb-account-create-update" Dec 03 22:29:50.314676 master-0 kubenswrapper[36504]: I1203 22:29:50.314571 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2682f06-2124-466c-8041-403c8a864821" containerName="mariadb-database-create" Dec 03 22:29:50.314676 master-0 kubenswrapper[36504]: I1203 22:29:50.314601 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e10bb0-61c3-423b-b9b8-086b32e604ec" containerName="mariadb-account-create-update" Dec 03 22:29:50.318476 master-0 kubenswrapper[36504]: I1203 22:29:50.318183 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.322229 master-0 kubenswrapper[36504]: I1203 22:29:50.322164 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-ovndbs" Dec 03 22:29:50.324485 master-0 kubenswrapper[36504]: I1203 22:29:50.322606 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Dec 03 22:29:50.324485 master-0 kubenswrapper[36504]: I1203 22:29:50.322824 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Dec 03 22:29:50.324485 master-0 kubenswrapper[36504]: I1203 22:29:50.324248 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6cd4ddb7f5-cj2qw"] Dec 03 22:29:50.410490 master-0 kubenswrapper[36504]: I1203 22:29:50.410224 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-octavia-run\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.410490 master-0 kubenswrapper[36504]: I1203 22:29:50.410311 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-ovndb-tls-certs\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.410490 master-0 kubenswrapper[36504]: I1203 22:29:50.410360 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-combined-ca-bundle\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.410490 master-0 kubenswrapper[36504]: I1203 22:29:50.410383 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.411071 master-0 kubenswrapper[36504]: I1203 22:29:50.410606 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-scripts\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.411071 master-0 kubenswrapper[36504]: I1203 22:29:50.410749 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data-merged\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.514677 master-0 kubenswrapper[36504]: I1203 22:29:50.514517 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data-merged\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.514994 master-0 kubenswrapper[36504]: I1203 22:29:50.514737 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-octavia-run\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.514994 master-0 kubenswrapper[36504]: I1203 22:29:50.514801 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-ovndb-tls-certs\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.514994 master-0 kubenswrapper[36504]: I1203 22:29:50.514850 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-combined-ca-bundle\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.514994 master-0 kubenswrapper[36504]: I1203 22:29:50.514887 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.515130 master-0 
kubenswrapper[36504]: I1203 22:29:50.515108 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-scripts\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.515562 master-0 kubenswrapper[36504]: I1203 22:29:50.515513 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-octavia-run\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.515882 master-0 kubenswrapper[36504]: I1203 22:29:50.515853 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data-merged\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.522928 master-0 kubenswrapper[36504]: I1203 22:29:50.519592 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-ovndb-tls-certs\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.522928 master-0 kubenswrapper[36504]: I1203 22:29:50.519804 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-combined-ca-bundle\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.524574 master-0 kubenswrapper[36504]: I1203 22:29:50.523478 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-scripts\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.534254 master-0 kubenswrapper[36504]: I1203 22:29:50.534193 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data\") pod \"octavia-api-6cd4ddb7f5-cj2qw\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:50.659633 master-0 kubenswrapper[36504]: I1203 22:29:50.659560 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:29:51.383516 master-0 kubenswrapper[36504]: W1203 22:29:51.383433 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd64b02c4_e815_4791_8b74_09ba3d96339f.slice/crio-74f8edeb3a562304e5ff7b52c76f9cea483e8481bfe8414516114b72b56618f8 WatchSource:0}: Error finding container 74f8edeb3a562304e5ff7b52c76f9cea483e8481bfe8414516114b72b56618f8: Status 404 returned error can't find the container with id 74f8edeb3a562304e5ff7b52c76f9cea483e8481bfe8414516114b72b56618f8 Dec 03 22:29:51.389407 master-0 kubenswrapper[36504]: I1203 22:29:51.388667 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6cd4ddb7f5-cj2qw"] Dec 03 22:29:52.033081 master-0 kubenswrapper[36504]: I1203 22:29:52.032989 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" event={"ID":"d64b02c4-e815-4791-8b74-09ba3d96339f","Type":"ContainerStarted","Data":"74f8edeb3a562304e5ff7b52c76f9cea483e8481bfe8414516114b72b56618f8"} Dec 03 22:29:53.097910 master-0 kubenswrapper[36504]: I1203 22:29:53.096480 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:29:56.097210 master-0 kubenswrapper[36504]: I1203 22:29:56.096509 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:30:00.318839 master-0 kubenswrapper[36504]: I1203 22:30:00.318747 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt"] Dec 03 22:30:00.322754 master-0 kubenswrapper[36504]: I1203 22:30:00.322708 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.328423 master-0 kubenswrapper[36504]: I1203 22:30:00.328350 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-nwpql" Dec 03 22:30:00.328734 master-0 kubenswrapper[36504]: I1203 22:30:00.328594 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 22:30:00.338011 master-0 kubenswrapper[36504]: I1203 22:30:00.337892 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt"] Dec 03 22:30:00.459900 master-0 kubenswrapper[36504]: I1203 22:30:00.459164 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hzjk\" (UniqueName: \"kubernetes.io/projected/5803fb46-dff1-46a8-b7b8-e52a1261e409-kube-api-access-2hzjk\") pod \"collect-profiles-29413350-djxjt\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.459900 master-0 kubenswrapper[36504]: I1203 22:30:00.459675 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5803fb46-dff1-46a8-b7b8-e52a1261e409-config-volume\") pod \"collect-profiles-29413350-djxjt\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.460327 master-0 kubenswrapper[36504]: I1203 22:30:00.459991 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5803fb46-dff1-46a8-b7b8-e52a1261e409-secret-volume\") pod \"collect-profiles-29413350-djxjt\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.566806 master-0 kubenswrapper[36504]: I1203 22:30:00.564550 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hzjk\" (UniqueName: \"kubernetes.io/projected/5803fb46-dff1-46a8-b7b8-e52a1261e409-kube-api-access-2hzjk\") pod \"collect-profiles-29413350-djxjt\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.566806 master-0 kubenswrapper[36504]: I1203 22:30:00.564757 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5803fb46-dff1-46a8-b7b8-e52a1261e409-config-volume\") pod \"collect-profiles-29413350-djxjt\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.566806 master-0 kubenswrapper[36504]: I1203 22:30:00.564901 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5803fb46-dff1-46a8-b7b8-e52a1261e409-secret-volume\") pod \"collect-profiles-29413350-djxjt\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.566806 master-0 kubenswrapper[36504]: I1203 22:30:00.565672 36504 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5803fb46-dff1-46a8-b7b8-e52a1261e409-config-volume\") pod \"collect-profiles-29413350-djxjt\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.569188 master-0 kubenswrapper[36504]: I1203 22:30:00.569100 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5803fb46-dff1-46a8-b7b8-e52a1261e409-secret-volume\") pod \"collect-profiles-29413350-djxjt\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.584171 master-0 kubenswrapper[36504]: I1203 22:30:00.584111 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hzjk\" (UniqueName: \"kubernetes.io/projected/5803fb46-dff1-46a8-b7b8-e52a1261e409-kube-api-access-2hzjk\") pod \"collect-profiles-29413350-djxjt\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:00.680695 master-0 kubenswrapper[36504]: I1203 22:30:00.680603 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:01.984123 master-0 kubenswrapper[36504]: W1203 22:30:01.984045 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5803fb46_dff1_46a8_b7b8_e52a1261e409.slice/crio-899f6b9fd95772069965efff0656e52a0e6c37c630c01966952882a1d0c4e12c WatchSource:0}: Error finding container 899f6b9fd95772069965efff0656e52a0e6c37c630c01966952882a1d0c4e12c: Status 404 returned error can't find the container with id 899f6b9fd95772069965efff0656e52a0e6c37c630c01966952882a1d0c4e12c Dec 03 22:30:01.984895 master-0 kubenswrapper[36504]: E1203 22:30:01.984324 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4c57b50_c8ce_423b_9d25_b659e26c03f8.slice/crio-1bb45df3e8027d6c5023023017b2646aeff498f99c7be70e6e07bdb8168e17de.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4c57b50_c8ce_423b_9d25_b659e26c03f8.slice/crio-conmon-1bb45df3e8027d6c5023023017b2646aeff498f99c7be70e6e07bdb8168e17de.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:30:01.999897 master-0 kubenswrapper[36504]: I1203 22:30:01.999831 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt"] Dec 03 22:30:02.229443 master-0 kubenswrapper[36504]: I1203 22:30:02.229386 36504 generic.go:334] "Generic (PLEG): container finished" podID="e4c57b50-c8ce-423b-9d25-b659e26c03f8" containerID="1bb45df3e8027d6c5023023017b2646aeff498f99c7be70e6e07bdb8168e17de" exitCode=0 Dec 03 22:30:02.229925 master-0 kubenswrapper[36504]: I1203 22:30:02.229590 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" event={"ID":"e4c57b50-c8ce-423b-9d25-b659e26c03f8","Type":"ContainerDied","Data":"1bb45df3e8027d6c5023023017b2646aeff498f99c7be70e6e07bdb8168e17de"} Dec 03 22:30:02.235349 master-0 kubenswrapper[36504]: I1203 22:30:02.235286 36504 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" event={"ID":"5803fb46-dff1-46a8-b7b8-e52a1261e409","Type":"ContainerStarted","Data":"899f6b9fd95772069965efff0656e52a0e6c37c630c01966952882a1d0c4e12c"} Dec 03 22:30:02.264421 master-0 kubenswrapper[36504]: I1203 22:30:02.264345 36504 generic.go:334] "Generic (PLEG): container finished" podID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerID="876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb" exitCode=0 Dec 03 22:30:02.264621 master-0 kubenswrapper[36504]: I1203 22:30:02.264445 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" event={"ID":"d64b02c4-e815-4791-8b74-09ba3d96339f","Type":"ContainerDied","Data":"876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb"} Dec 03 22:30:03.296080 master-0 kubenswrapper[36504]: I1203 22:30:03.295913 36504 generic.go:334] "Generic (PLEG): container finished" podID="5803fb46-dff1-46a8-b7b8-e52a1261e409" containerID="19d2313995bd1b110e25023c35fca32bf181a6c40bd844c6e1adeb019b0c4c5a" exitCode=0 Dec 03 22:30:03.296080 master-0 kubenswrapper[36504]: I1203 22:30:03.295974 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" event={"ID":"5803fb46-dff1-46a8-b7b8-e52a1261e409","Type":"ContainerDied","Data":"19d2313995bd1b110e25023c35fca32bf181a6c40bd844c6e1adeb019b0c4c5a"} Dec 03 22:30:03.301105 master-0 kubenswrapper[36504]: I1203 22:30:03.300963 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" event={"ID":"d64b02c4-e815-4791-8b74-09ba3d96339f","Type":"ContainerStarted","Data":"47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239"} Dec 03 22:30:03.301105 master-0 kubenswrapper[36504]: I1203 22:30:03.301047 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" event={"ID":"d64b02c4-e815-4791-8b74-09ba3d96339f","Type":"ContainerStarted","Data":"8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b"} Dec 03 22:30:03.302900 master-0 kubenswrapper[36504]: I1203 22:30:03.302818 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:30:03.302900 master-0 kubenswrapper[36504]: I1203 22:30:03.302873 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:30:03.371433 master-0 kubenswrapper[36504]: I1203 22:30:03.371270 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" podStartSLOduration=3.254282261 podStartE2EDuration="13.371236358s" podCreationTimestamp="2025-12-03 22:29:50 +0000 UTC" firstStartedPulling="2025-12-03 22:29:51.389443618 +0000 UTC m=+1156.609215625" lastFinishedPulling="2025-12-03 22:30:01.506397715 +0000 UTC m=+1166.726169722" observedRunningTime="2025-12-03 22:30:03.350981883 +0000 UTC m=+1168.570753920" watchObservedRunningTime="2025-12-03 22:30:03.371236358 +0000 UTC m=+1168.591008365" Dec 03 22:30:04.035568 master-0 kubenswrapper[36504]: I1203 22:30:04.035516 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:30:04.116954 master-0 kubenswrapper[36504]: I1203 22:30:04.116887 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:04.117352 master-0 kubenswrapper[36504]: I1203 22:30:04.117249 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="ceilometer-central-agent" containerID="cri-o://7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb" gracePeriod=30 Dec 03 22:30:04.117352 master-0 kubenswrapper[36504]: I1203 22:30:04.117290 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="ceilometer-notification-agent" containerID="cri-o://6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544" gracePeriod=30 Dec 03 22:30:04.117352 master-0 kubenswrapper[36504]: I1203 22:30:04.117306 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="sg-core" containerID="cri-o://3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883" gracePeriod=30 Dec 03 22:30:04.117519 master-0 kubenswrapper[36504]: I1203 22:30:04.117259 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="proxy-httpd" containerID="cri-o://0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10" gracePeriod=30 Dec 03 22:30:04.162814 master-0 kubenswrapper[36504]: I1203 22:30:04.161025 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-ssh-key\") pod \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " Dec 03 22:30:04.162814 master-0 kubenswrapper[36504]: I1203 22:30:04.161982 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-inventory\") pod \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " Dec 03 22:30:04.162814 master-0 kubenswrapper[36504]: I1203 22:30:04.162189 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bflp5\" (UniqueName: \"kubernetes.io/projected/e4c57b50-c8ce-423b-9d25-b659e26c03f8-kube-api-access-bflp5\") pod \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " Dec 03 22:30:04.162814 master-0 kubenswrapper[36504]: I1203 22:30:04.162236 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-bootstrap-combined-ca-bundle\") pod \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\" (UID: \"e4c57b50-c8ce-423b-9d25-b659e26c03f8\") " Dec 03 22:30:04.173371 master-0 kubenswrapper[36504]: I1203 22:30:04.173297 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e4c57b50-c8ce-423b-9d25-b659e26c03f8" (UID: 
"e4c57b50-c8ce-423b-9d25-b659e26c03f8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:04.173665 master-0 kubenswrapper[36504]: I1203 22:30:04.173364 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4c57b50-c8ce-423b-9d25-b659e26c03f8-kube-api-access-bflp5" (OuterVolumeSpecName: "kube-api-access-bflp5") pod "e4c57b50-c8ce-423b-9d25-b659e26c03f8" (UID: "e4c57b50-c8ce-423b-9d25-b659e26c03f8"). InnerVolumeSpecName "kube-api-access-bflp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:30:04.201602 master-0 kubenswrapper[36504]: I1203 22:30:04.200634 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e4c57b50-c8ce-423b-9d25-b659e26c03f8" (UID: "e4c57b50-c8ce-423b-9d25-b659e26c03f8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:04.202623 master-0 kubenswrapper[36504]: I1203 22:30:04.202564 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-inventory" (OuterVolumeSpecName: "inventory") pod "e4c57b50-c8ce-423b-9d25-b659e26c03f8" (UID: "e4c57b50-c8ce-423b-9d25-b659e26c03f8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:04.275861 master-0 kubenswrapper[36504]: I1203 22:30:04.274753 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:04.275861 master-0 kubenswrapper[36504]: I1203 22:30:04.274827 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bflp5\" (UniqueName: \"kubernetes.io/projected/e4c57b50-c8ce-423b-9d25-b659e26c03f8-kube-api-access-bflp5\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:04.275861 master-0 kubenswrapper[36504]: I1203 22:30:04.274843 36504 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-bootstrap-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:04.275861 master-0 kubenswrapper[36504]: I1203 22:30:04.274854 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e4c57b50-c8ce-423b-9d25-b659e26c03f8-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:04.329982 master-0 kubenswrapper[36504]: I1203 22:30:04.329795 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" Dec 03 22:30:04.331324 master-0 kubenswrapper[36504]: I1203 22:30:04.330050 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-dataplane-os-edpm-s7gzj" event={"ID":"e4c57b50-c8ce-423b-9d25-b659e26c03f8","Type":"ContainerDied","Data":"4cc2a7218ef4d430152d156a688ad6b6df852634d54176ccb20094d11162927c"} Dec 03 22:30:04.331324 master-0 kubenswrapper[36504]: I1203 22:30:04.330159 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cc2a7218ef4d430152d156a688ad6b6df852634d54176ccb20094d11162927c" Dec 03 22:30:04.337032 master-0 kubenswrapper[36504]: I1203 22:30:04.336981 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerID="0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10" exitCode=0 Dec 03 22:30:04.337148 master-0 kubenswrapper[36504]: I1203 22:30:04.337025 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerID="3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883" exitCode=2 Dec 03 22:30:04.337148 master-0 kubenswrapper[36504]: I1203 22:30:04.337056 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerDied","Data":"0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10"} Dec 03 22:30:04.337148 master-0 kubenswrapper[36504]: I1203 22:30:04.337128 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerDied","Data":"3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883"} Dec 03 22:30:04.450986 master-0 kubenswrapper[36504]: I1203 22:30:04.450834 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-dataplane-os-edpm-kmq5s"] Dec 03 22:30:04.452856 master-0 kubenswrapper[36504]: E1203 22:30:04.451710 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4c57b50-c8ce-423b-9d25-b659e26c03f8" containerName="bootstrap-dataplane-os-edpm" Dec 03 22:30:04.452856 master-0 kubenswrapper[36504]: I1203 22:30:04.451732 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4c57b50-c8ce-423b-9d25-b659e26c03f8" containerName="bootstrap-dataplane-os-edpm" Dec 03 22:30:04.452856 master-0 kubenswrapper[36504]: I1203 22:30:04.452213 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4c57b50-c8ce-423b-9d25-b659e26c03f8" containerName="bootstrap-dataplane-os-edpm" Dec 03 22:30:04.453542 master-0 kubenswrapper[36504]: I1203 22:30:04.453424 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.464537 master-0 kubenswrapper[36504]: I1203 22:30:04.456134 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:30:04.464537 master-0 kubenswrapper[36504]: I1203 22:30:04.456511 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:30:04.464537 master-0 kubenswrapper[36504]: I1203 22:30:04.456915 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:30:04.487804 master-0 kubenswrapper[36504]: I1203 22:30:04.487693 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7gb\" (UniqueName: \"kubernetes.io/projected/269654aa-a42a-43d3-a328-45f7c6f7169e-kube-api-access-sh7gb\") pod \"configure-network-dataplane-os-edpm-kmq5s\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.488175 master-0 kubenswrapper[36504]: I1203 22:30:04.488135 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-ssh-key\") pod \"configure-network-dataplane-os-edpm-kmq5s\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.488264 master-0 kubenswrapper[36504]: I1203 22:30:04.488230 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-inventory\") pod \"configure-network-dataplane-os-edpm-kmq5s\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.503879 master-0 kubenswrapper[36504]: I1203 22:30:04.503783 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-dataplane-os-edpm-kmq5s"] Dec 03 22:30:04.589479 master-0 kubenswrapper[36504]: I1203 22:30:04.589372 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-ssh-key\") pod \"configure-network-dataplane-os-edpm-kmq5s\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.589479 master-0 kubenswrapper[36504]: I1203 22:30:04.589490 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-inventory\") pod \"configure-network-dataplane-os-edpm-kmq5s\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.589904 master-0 kubenswrapper[36504]: I1203 22:30:04.589616 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh7gb\" (UniqueName: \"kubernetes.io/projected/269654aa-a42a-43d3-a328-45f7c6f7169e-kube-api-access-sh7gb\") pod \"configure-network-dataplane-os-edpm-kmq5s\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.606803 master-0 kubenswrapper[36504]: I1203 22:30:04.594644 36504 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-inventory\") pod \"configure-network-dataplane-os-edpm-kmq5s\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.611814 master-0 kubenswrapper[36504]: I1203 22:30:04.610181 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-ssh-key\") pod \"configure-network-dataplane-os-edpm-kmq5s\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.615052 master-0 kubenswrapper[36504]: I1203 22:30:04.614999 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh7gb\" (UniqueName: \"kubernetes.io/projected/269654aa-a42a-43d3-a328-45f7c6f7169e-kube-api-access-sh7gb\") pod \"configure-network-dataplane-os-edpm-kmq5s\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:04.806155 master-0 kubenswrapper[36504]: I1203 22:30:04.806113 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:04.814934 master-0 kubenswrapper[36504]: I1203 22:30:04.814888 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:30:05.013435 master-0 kubenswrapper[36504]: I1203 22:30:05.013278 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hzjk\" (UniqueName: \"kubernetes.io/projected/5803fb46-dff1-46a8-b7b8-e52a1261e409-kube-api-access-2hzjk\") pod \"5803fb46-dff1-46a8-b7b8-e52a1261e409\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " Dec 03 22:30:05.014915 master-0 kubenswrapper[36504]: I1203 22:30:05.013808 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5803fb46-dff1-46a8-b7b8-e52a1261e409-secret-volume\") pod \"5803fb46-dff1-46a8-b7b8-e52a1261e409\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " Dec 03 22:30:05.014915 master-0 kubenswrapper[36504]: I1203 22:30:05.014328 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5803fb46-dff1-46a8-b7b8-e52a1261e409-config-volume\") pod \"5803fb46-dff1-46a8-b7b8-e52a1261e409\" (UID: \"5803fb46-dff1-46a8-b7b8-e52a1261e409\") " Dec 03 22:30:05.016338 master-0 kubenswrapper[36504]: I1203 22:30:05.016297 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5803fb46-dff1-46a8-b7b8-e52a1261e409-config-volume" (OuterVolumeSpecName: "config-volume") pod "5803fb46-dff1-46a8-b7b8-e52a1261e409" (UID: "5803fb46-dff1-46a8-b7b8-e52a1261e409"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:30:05.019308 master-0 kubenswrapper[36504]: I1203 22:30:05.019061 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5803fb46-dff1-46a8-b7b8-e52a1261e409-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5803fb46-dff1-46a8-b7b8-e52a1261e409" (UID: "5803fb46-dff1-46a8-b7b8-e52a1261e409"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:05.019391 master-0 kubenswrapper[36504]: I1203 22:30:05.019290 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5803fb46-dff1-46a8-b7b8-e52a1261e409-kube-api-access-2hzjk" (OuterVolumeSpecName: "kube-api-access-2hzjk") pod "5803fb46-dff1-46a8-b7b8-e52a1261e409" (UID: "5803fb46-dff1-46a8-b7b8-e52a1261e409"). InnerVolumeSpecName "kube-api-access-2hzjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:30:05.122033 master-0 kubenswrapper[36504]: I1203 22:30:05.121966 36504 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5803fb46-dff1-46a8-b7b8-e52a1261e409-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:05.122033 master-0 kubenswrapper[36504]: I1203 22:30:05.122024 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hzjk\" (UniqueName: \"kubernetes.io/projected/5803fb46-dff1-46a8-b7b8-e52a1261e409-kube-api-access-2hzjk\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:05.122033 master-0 kubenswrapper[36504]: I1203 22:30:05.122040 36504 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5803fb46-dff1-46a8-b7b8-e52a1261e409-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:05.361545 master-0 kubenswrapper[36504]: I1203 22:30:05.361460 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerID="7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb" exitCode=0 Dec 03 22:30:05.362530 master-0 kubenswrapper[36504]: I1203 22:30:05.361580 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerDied","Data":"7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb"} Dec 03 22:30:05.366002 master-0 kubenswrapper[36504]: I1203 22:30:05.365955 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" event={"ID":"5803fb46-dff1-46a8-b7b8-e52a1261e409","Type":"ContainerDied","Data":"899f6b9fd95772069965efff0656e52a0e6c37c630c01966952882a1d0c4e12c"} Dec 03 22:30:05.366002 master-0 kubenswrapper[36504]: I1203 22:30:05.365998 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="899f6b9fd95772069965efff0656e52a0e6c37c630c01966952882a1d0c4e12c" Dec 03 22:30:05.366175 master-0 kubenswrapper[36504]: I1203 22:30:05.366070 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt" Dec 03 22:30:05.463797 master-0 kubenswrapper[36504]: I1203 22:30:05.461047 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-dataplane-os-edpm-kmq5s"] Dec 03 22:30:05.914730 master-0 kubenswrapper[36504]: I1203 22:30:05.914619 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv"] Dec 03 22:30:05.928606 master-0 kubenswrapper[36504]: I1203 22:30:05.928494 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413305-sztmv"] Dec 03 22:30:06.385926 master-0 kubenswrapper[36504]: I1203 22:30:06.385843 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-os-edpm-kmq5s" event={"ID":"269654aa-a42a-43d3-a328-45f7c6f7169e","Type":"ContainerStarted","Data":"7600e7320b4ff19915af290fa5314e82ee6b08fe9c983ca80f63f53b8fd31fe4"} Dec 03 22:30:07.114415 master-0 kubenswrapper[36504]: I1203 22:30:07.114330 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dcf886d-2028-4acd-83ac-850c4b278810" path="/var/lib/kubelet/pods/2dcf886d-2028-4acd-83ac-850c4b278810/volumes" Dec 03 22:30:08.420345 master-0 kubenswrapper[36504]: I1203 22:30:08.420275 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-os-edpm-kmq5s" event={"ID":"269654aa-a42a-43d3-a328-45f7c6f7169e","Type":"ContainerStarted","Data":"68a357078b2461f00e4c1361e591041403f854a8729f298ba29b6cab4aa8d26b"} Dec 03 22:30:08.449548 master-0 kubenswrapper[36504]: I1203 22:30:08.449446 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-dataplane-os-edpm-kmq5s" podStartSLOduration=2.63684705 podStartE2EDuration="4.44941404s" podCreationTimestamp="2025-12-03 22:30:04 +0000 UTC" firstStartedPulling="2025-12-03 22:30:05.448666326 +0000 UTC m=+1170.668438333" lastFinishedPulling="2025-12-03 22:30:07.261233316 +0000 UTC m=+1172.481005323" observedRunningTime="2025-12-03 22:30:08.445110947 +0000 UTC m=+1173.664882974" watchObservedRunningTime="2025-12-03 22:30:08.44941404 +0000 UTC m=+1173.669186057" Dec 03 22:30:08.558588 master-0 kubenswrapper[36504]: I1203 22:30:08.558500 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.128.1.53:3000/\": dial tcp 10.128.1.53:3000: connect: connection refused" Dec 03 22:30:10.021721 master-0 kubenswrapper[36504]: I1203 22:30:10.021651 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:10.203833 master-0 kubenswrapper[36504]: I1203 22:30:10.203746 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-sg-core-conf-yaml\") pod \"ef9588fc-d81b-4470-b1be-7687ce7438aa\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " Dec 03 22:30:10.204110 master-0 kubenswrapper[36504]: I1203 22:30:10.204004 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-log-httpd\") pod \"ef9588fc-d81b-4470-b1be-7687ce7438aa\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " Dec 03 22:30:10.204255 master-0 kubenswrapper[36504]: I1203 22:30:10.204125 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-combined-ca-bundle\") pod \"ef9588fc-d81b-4470-b1be-7687ce7438aa\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " Dec 03 22:30:10.204361 master-0 kubenswrapper[36504]: I1203 22:30:10.204334 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-config-data\") pod \"ef9588fc-d81b-4470-b1be-7687ce7438aa\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " Dec 03 22:30:10.204421 master-0 kubenswrapper[36504]: I1203 22:30:10.204370 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-ceilometer-tls-certs\") pod \"ef9588fc-d81b-4470-b1be-7687ce7438aa\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " Dec 03 22:30:10.205010 master-0 kubenswrapper[36504]: I1203 22:30:10.204812 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ef9588fc-d81b-4470-b1be-7687ce7438aa" (UID: "ef9588fc-d81b-4470-b1be-7687ce7438aa"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:10.205010 master-0 kubenswrapper[36504]: I1203 22:30:10.204846 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-run-httpd\") pod \"ef9588fc-d81b-4470-b1be-7687ce7438aa\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " Dec 03 22:30:10.205112 master-0 kubenswrapper[36504]: I1203 22:30:10.205062 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4bzr\" (UniqueName: \"kubernetes.io/projected/ef9588fc-d81b-4470-b1be-7687ce7438aa-kube-api-access-l4bzr\") pod \"ef9588fc-d81b-4470-b1be-7687ce7438aa\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " Dec 03 22:30:10.205302 master-0 kubenswrapper[36504]: I1203 22:30:10.205265 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ef9588fc-d81b-4470-b1be-7687ce7438aa" (UID: "ef9588fc-d81b-4470-b1be-7687ce7438aa"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:10.205400 master-0 kubenswrapper[36504]: I1203 22:30:10.205370 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-scripts\") pod \"ef9588fc-d81b-4470-b1be-7687ce7438aa\" (UID: \"ef9588fc-d81b-4470-b1be-7687ce7438aa\") " Dec 03 22:30:10.208293 master-0 kubenswrapper[36504]: I1203 22:30:10.208247 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:10.208385 master-0 kubenswrapper[36504]: I1203 22:30:10.208295 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef9588fc-d81b-4470-b1be-7687ce7438aa-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:10.209877 master-0 kubenswrapper[36504]: I1203 22:30:10.209824 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-scripts" (OuterVolumeSpecName: "scripts") pod "ef9588fc-d81b-4470-b1be-7687ce7438aa" (UID: "ef9588fc-d81b-4470-b1be-7687ce7438aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:10.226290 master-0 kubenswrapper[36504]: I1203 22:30:10.226199 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef9588fc-d81b-4470-b1be-7687ce7438aa-kube-api-access-l4bzr" (OuterVolumeSpecName: "kube-api-access-l4bzr") pod "ef9588fc-d81b-4470-b1be-7687ce7438aa" (UID: "ef9588fc-d81b-4470-b1be-7687ce7438aa"). InnerVolumeSpecName "kube-api-access-l4bzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:30:10.257339 master-0 kubenswrapper[36504]: I1203 22:30:10.255882 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ef9588fc-d81b-4470-b1be-7687ce7438aa" (UID: "ef9588fc-d81b-4470-b1be-7687ce7438aa"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:10.315407 master-0 kubenswrapper[36504]: I1203 22:30:10.311299 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4bzr\" (UniqueName: \"kubernetes.io/projected/ef9588fc-d81b-4470-b1be-7687ce7438aa-kube-api-access-l4bzr\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:10.315407 master-0 kubenswrapper[36504]: I1203 22:30:10.311336 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:10.315407 master-0 kubenswrapper[36504]: I1203 22:30:10.311349 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:10.315407 master-0 kubenswrapper[36504]: I1203 22:30:10.310849 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ef9588fc-d81b-4470-b1be-7687ce7438aa" (UID: "ef9588fc-d81b-4470-b1be-7687ce7438aa"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:10.331590 master-0 kubenswrapper[36504]: I1203 22:30:10.331485 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef9588fc-d81b-4470-b1be-7687ce7438aa" (UID: "ef9588fc-d81b-4470-b1be-7687ce7438aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:10.359297 master-0 kubenswrapper[36504]: I1203 22:30:10.358558 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-config-data" (OuterVolumeSpecName: "config-data") pod "ef9588fc-d81b-4470-b1be-7687ce7438aa" (UID: "ef9588fc-d81b-4470-b1be-7687ce7438aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:10.414632 master-0 kubenswrapper[36504]: I1203 22:30:10.414482 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:10.414632 master-0 kubenswrapper[36504]: I1203 22:30:10.414527 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:10.414632 master-0 kubenswrapper[36504]: I1203 22:30:10.414538 36504 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef9588fc-d81b-4470-b1be-7687ce7438aa-ceilometer-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:10.462201 master-0 kubenswrapper[36504]: I1203 22:30:10.458422 36504 generic.go:334] "Generic (PLEG): container finished" podID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerID="6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544" exitCode=0 Dec 03 22:30:10.462201 master-0 kubenswrapper[36504]: I1203 22:30:10.458496 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerDied","Data":"6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544"} Dec 03 22:30:10.462201 master-0 kubenswrapper[36504]: I1203 22:30:10.458545 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef9588fc-d81b-4470-b1be-7687ce7438aa","Type":"ContainerDied","Data":"ff085e4171eeca6ae03c12f8dc861070e970e63d493195da70ac7109209676b9"} Dec 03 22:30:10.462201 master-0 kubenswrapper[36504]: I1203 22:30:10.458569 36504 scope.go:117] "RemoveContainer" containerID="0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10" Dec 03 22:30:10.462201 master-0 kubenswrapper[36504]: I1203 22:30:10.458817 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:10.553983 master-0 kubenswrapper[36504]: I1203 22:30:10.553284 36504 scope.go:117] "RemoveContainer" containerID="3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883" Dec 03 22:30:10.562267 master-0 kubenswrapper[36504]: I1203 22:30:10.562192 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:10.581400 master-0 kubenswrapper[36504]: I1203 22:30:10.581316 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:10.597731 master-0 kubenswrapper[36504]: I1203 22:30:10.597511 36504 scope.go:117] "RemoveContainer" containerID="6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.612951 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: E1203 22:30:10.613760 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5803fb46-dff1-46a8-b7b8-e52a1261e409" containerName="collect-profiles" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.613856 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="5803fb46-dff1-46a8-b7b8-e52a1261e409" containerName="collect-profiles" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: E1203 22:30:10.613888 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="ceilometer-central-agent" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.613900 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="ceilometer-central-agent" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: E1203 22:30:10.613920 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="ceilometer-notification-agent" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.613928 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="ceilometer-notification-agent" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: E1203 22:30:10.613989 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="proxy-httpd" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.613997 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="proxy-httpd" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: E1203 22:30:10.614046 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="sg-core" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.614054 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="sg-core" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.614379 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="proxy-httpd" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.614424 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="5803fb46-dff1-46a8-b7b8-e52a1261e409" containerName="collect-profiles" Dec 03 22:30:10.615800 master-0 
kubenswrapper[36504]: I1203 22:30:10.614478 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="ceilometer-central-agent" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.614498 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="sg-core" Dec 03 22:30:10.615800 master-0 kubenswrapper[36504]: I1203 22:30:10.615014 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" containerName="ceilometer-notification-agent" Dec 03 22:30:10.623302 master-0 kubenswrapper[36504]: I1203 22:30:10.621467 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:10.623928 master-0 kubenswrapper[36504]: I1203 22:30:10.623879 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:30:10.650993 master-0 kubenswrapper[36504]: I1203 22:30:10.633192 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:10.654933 master-0 kubenswrapper[36504]: I1203 22:30:10.654748 36504 scope.go:117] "RemoveContainer" containerID="7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb" Dec 03 22:30:10.655557 master-0 kubenswrapper[36504]: I1203 22:30:10.655457 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:30:10.655914 master-0 kubenswrapper[36504]: I1203 22:30:10.655743 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 22:30:10.706547 master-0 kubenswrapper[36504]: I1203 22:30:10.706479 36504 scope.go:117] "RemoveContainer" containerID="0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10" Dec 03 22:30:10.706981 master-0 kubenswrapper[36504]: E1203 22:30:10.706949 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10\": container with ID starting with 0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10 not found: ID does not exist" containerID="0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10" Dec 03 22:30:10.707031 master-0 kubenswrapper[36504]: I1203 22:30:10.706987 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10"} err="failed to get container status \"0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10\": rpc error: code = NotFound desc = could not find container \"0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10\": container with ID starting with 0d2865671c298e3b2adc069c1166c53ab06b798396a9a036a40c70f14e93bc10 not found: ID does not exist" Dec 03 22:30:10.707031 master-0 kubenswrapper[36504]: I1203 22:30:10.707019 36504 scope.go:117] "RemoveContainer" containerID="3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883" Dec 03 22:30:10.707472 master-0 kubenswrapper[36504]: E1203 22:30:10.707423 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883\": container with ID starting with 
3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883 not found: ID does not exist" containerID="3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883" Dec 03 22:30:10.707526 master-0 kubenswrapper[36504]: I1203 22:30:10.707492 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883"} err="failed to get container status \"3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883\": rpc error: code = NotFound desc = could not find container \"3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883\": container with ID starting with 3aa62c55977120f1d4d8b0540e42e5e5fa9c142fcc57e6205468dc0c57a98883 not found: ID does not exist" Dec 03 22:30:10.707564 master-0 kubenswrapper[36504]: I1203 22:30:10.707535 36504 scope.go:117] "RemoveContainer" containerID="6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544" Dec 03 22:30:10.708454 master-0 kubenswrapper[36504]: E1203 22:30:10.708366 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544\": container with ID starting with 6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544 not found: ID does not exist" containerID="6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544" Dec 03 22:30:10.708530 master-0 kubenswrapper[36504]: I1203 22:30:10.708465 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544"} err="failed to get container status \"6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544\": rpc error: code = NotFound desc = could not find container \"6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544\": container with ID starting with 6c38397b934b3d97e2df2b384ab4b734fe267523e8e3b0de142e97953791e544 not found: ID does not exist" Dec 03 22:30:10.708587 master-0 kubenswrapper[36504]: I1203 22:30:10.708531 36504 scope.go:117] "RemoveContainer" containerID="7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb" Dec 03 22:30:10.708906 master-0 kubenswrapper[36504]: E1203 22:30:10.708879 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb\": container with ID starting with 7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb not found: ID does not exist" containerID="7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb" Dec 03 22:30:10.708960 master-0 kubenswrapper[36504]: I1203 22:30:10.708906 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb"} err="failed to get container status \"7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb\": rpc error: code = NotFound desc = could not find container \"7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb\": container with ID starting with 7f2dd1dea63364bc458a7db14b26f7e661c1cb8548fce259f29a35a2fa5b32bb not found: ID does not exist" Dec 03 22:30:10.723524 master-0 kubenswrapper[36504]: I1203 22:30:10.723436 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.723524 master-0 kubenswrapper[36504]: I1203 22:30:10.723526 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-config-data\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.724070 master-0 kubenswrapper[36504]: I1203 22:30:10.723605 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-log-httpd\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.724070 master-0 kubenswrapper[36504]: I1203 22:30:10.723734 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.724070 master-0 kubenswrapper[36504]: I1203 22:30:10.723806 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-scripts\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.724070 master-0 kubenswrapper[36504]: I1203 22:30:10.723833 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.724070 master-0 kubenswrapper[36504]: I1203 22:30:10.723854 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbrhq\" (UniqueName: \"kubernetes.io/projected/34bc643f-a29d-4e07-8ac8-af56fdbe5671-kube-api-access-mbrhq\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.724070 master-0 kubenswrapper[36504]: I1203 22:30:10.723927 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-run-httpd\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.826563 master-0 kubenswrapper[36504]: I1203 22:30:10.826434 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.826563 master-0 kubenswrapper[36504]: I1203 22:30:10.826536 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-config-data\") pod \"ceilometer-0\" 
(UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.826928 master-0 kubenswrapper[36504]: I1203 22:30:10.826649 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-log-httpd\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.826928 master-0 kubenswrapper[36504]: I1203 22:30:10.826689 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.826928 master-0 kubenswrapper[36504]: I1203 22:30:10.826714 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-scripts\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.826928 master-0 kubenswrapper[36504]: I1203 22:30:10.826735 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.826928 master-0 kubenswrapper[36504]: I1203 22:30:10.826754 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbrhq\" (UniqueName: \"kubernetes.io/projected/34bc643f-a29d-4e07-8ac8-af56fdbe5671-kube-api-access-mbrhq\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.826928 master-0 kubenswrapper[36504]: I1203 22:30:10.826886 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-run-httpd\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.827931 master-0 kubenswrapper[36504]: I1203 22:30:10.827418 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-log-httpd\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.827931 master-0 kubenswrapper[36504]: I1203 22:30:10.827645 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-run-httpd\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.830792 master-0 kubenswrapper[36504]: I1203 22:30:10.830712 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.831487 master-0 kubenswrapper[36504]: I1203 22:30:10.830906 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.833510 master-0 kubenswrapper[36504]: I1203 22:30:10.833204 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.833510 master-0 kubenswrapper[36504]: I1203 22:30:10.833427 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-scripts\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.834759 master-0 kubenswrapper[36504]: I1203 22:30:10.834563 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-config-data\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.852548 master-0 kubenswrapper[36504]: I1203 22:30:10.852485 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbrhq\" (UniqueName: \"kubernetes.io/projected/34bc643f-a29d-4e07-8ac8-af56fdbe5671-kube-api-access-mbrhq\") pod \"ceilometer-0\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " pod="openstack/ceilometer-0" Dec 03 22:30:10.988291 master-0 kubenswrapper[36504]: I1203 22:30:10.988130 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:11.125910 master-0 kubenswrapper[36504]: I1203 22:30:11.124891 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef9588fc-d81b-4470-b1be-7687ce7438aa" path="/var/lib/kubelet/pods/ef9588fc-d81b-4470-b1be-7687ce7438aa/volumes" Dec 03 22:30:11.564665 master-0 kubenswrapper[36504]: I1203 22:30:11.564453 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:11.568525 master-0 kubenswrapper[36504]: W1203 22:30:11.568480 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34bc643f_a29d_4e07_8ac8_af56fdbe5671.slice/crio-b7847a4b396511ff745fc2e3316f1dea50b7385aea9df96b4142f3befda75030 WatchSource:0}: Error finding container b7847a4b396511ff745fc2e3316f1dea50b7385aea9df96b4142f3befda75030: Status 404 returned error can't find the container with id b7847a4b396511ff745fc2e3316f1dea50b7385aea9df96b4142f3befda75030 Dec 03 22:30:12.498863 master-0 kubenswrapper[36504]: I1203 22:30:12.498783 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerStarted","Data":"5c36507108428f62aa57a70f371f97e561a039db9dd72a2773970fa738c67826"} Dec 03 22:30:12.498863 master-0 kubenswrapper[36504]: I1203 22:30:12.498863 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerStarted","Data":"b7847a4b396511ff745fc2e3316f1dea50b7385aea9df96b4142f3befda75030"} Dec 03 22:30:13.522909 master-0 kubenswrapper[36504]: I1203 22:30:13.522693 36504 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerStarted","Data":"bd413b0458e7126572ba84a93d545dde695eb5108f570ee704d17cfab7944d1d"} Dec 03 22:30:14.540914 master-0 kubenswrapper[36504]: I1203 22:30:14.540832 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerStarted","Data":"c0360480d1510e9828042cd238f5b4c0b09a119c48cf141b9e6bafec39fa03d7"} Dec 03 22:30:16.574839 master-0 kubenswrapper[36504]: I1203 22:30:16.574743 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerStarted","Data":"4a810500045a0c37acd01e0cf8e51cd02adcb8f7c16455ee2cbf011e4203373e"} Dec 03 22:30:16.575496 master-0 kubenswrapper[36504]: I1203 22:30:16.574974 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:30:16.634808 master-0 kubenswrapper[36504]: I1203 22:30:16.626942 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.953158787 podStartE2EDuration="6.626913756s" podCreationTimestamp="2025-12-03 22:30:10 +0000 UTC" firstStartedPulling="2025-12-03 22:30:11.574285472 +0000 UTC m=+1176.794057479" lastFinishedPulling="2025-12-03 22:30:15.248040441 +0000 UTC m=+1180.467812448" observedRunningTime="2025-12-03 22:30:16.608274371 +0000 UTC m=+1181.828046388" watchObservedRunningTime="2025-12-03 22:30:16.626913756 +0000 UTC m=+1181.846685783" Dec 03 22:30:25.590790 master-0 kubenswrapper[36504]: I1203 22:30:25.588423 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-flg69"] Dec 03 22:30:25.596798 master-0 kubenswrapper[36504]: I1203 22:30:25.591464 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.596798 master-0 kubenswrapper[36504]: I1203 22:30:25.594296 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Dec 03 22:30:25.596798 master-0 kubenswrapper[36504]: I1203 22:30:25.594639 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Dec 03 22:30:25.596798 master-0 kubenswrapper[36504]: I1203 22:30:25.594832 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Dec 03 22:30:25.611798 master-0 kubenswrapper[36504]: I1203 22:30:25.604488 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-flg69"] Dec 03 22:30:25.654714 master-0 kubenswrapper[36504]: I1203 22:30:25.654637 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:30:25.657911 master-0 kubenswrapper[36504]: I1203 22:30:25.657861 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/1b4d21c0-99d8-4d30-80d0-0b558644643a-hm-ports\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.660054 master-0 kubenswrapper[36504]: I1203 22:30:25.657988 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b4d21c0-99d8-4d30-80d0-0b558644643a-scripts\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.660054 master-0 kubenswrapper[36504]: I1203 22:30:25.658049 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1b4d21c0-99d8-4d30-80d0-0b558644643a-config-data-merged\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.660054 master-0 kubenswrapper[36504]: I1203 22:30:25.659416 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4d21c0-99d8-4d30-80d0-0b558644643a-config-data\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.752364 master-0 kubenswrapper[36504]: I1203 22:30:25.752076 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:30:25.761852 master-0 kubenswrapper[36504]: I1203 22:30:25.761729 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/1b4d21c0-99d8-4d30-80d0-0b558644643a-hm-ports\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.761852 master-0 kubenswrapper[36504]: I1203 22:30:25.761836 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b4d21c0-99d8-4d30-80d0-0b558644643a-scripts\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.762158 
master-0 kubenswrapper[36504]: I1203 22:30:25.761880 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1b4d21c0-99d8-4d30-80d0-0b558644643a-config-data-merged\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.762158 master-0 kubenswrapper[36504]: I1203 22:30:25.762046 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4d21c0-99d8-4d30-80d0-0b558644643a-config-data\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.762910 master-0 kubenswrapper[36504]: I1203 22:30:25.762797 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1b4d21c0-99d8-4d30-80d0-0b558644643a-config-data-merged\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.765385 master-0 kubenswrapper[36504]: I1203 22:30:25.765270 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/1b4d21c0-99d8-4d30-80d0-0b558644643a-hm-ports\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.765385 master-0 kubenswrapper[36504]: I1203 22:30:25.765371 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b4d21c0-99d8-4d30-80d0-0b558644643a-config-data\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.767413 master-0 kubenswrapper[36504]: I1203 22:30:25.767378 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b4d21c0-99d8-4d30-80d0-0b558644643a-scripts\") pod \"octavia-rsyslog-flg69\" (UID: \"1b4d21c0-99d8-4d30-80d0-0b558644643a\") " pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:25.952632 master-0 kubenswrapper[36504]: I1203 22:30:25.952563 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:26.440341 master-0 kubenswrapper[36504]: I1203 22:30:26.440222 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-fjctx"] Dec 03 22:30:26.446747 master-0 kubenswrapper[36504]: I1203 22:30:26.446214 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:30:26.452652 master-0 kubenswrapper[36504]: I1203 22:30:26.452090 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 03 22:30:26.482803 master-0 kubenswrapper[36504]: I1203 22:30:26.480256 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-fjctx"] Dec 03 22:30:26.511585 master-0 kubenswrapper[36504]: I1203 22:30:26.511309 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80423a1b-a97b-45ae-ba50-7b25fb2f2007-httpd-config\") pod \"octavia-image-upload-56c9f55b99-fjctx\" (UID: \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\") " pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:30:26.511585 master-0 kubenswrapper[36504]: I1203 22:30:26.511456 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/80423a1b-a97b-45ae-ba50-7b25fb2f2007-amphora-image\") pod \"octavia-image-upload-56c9f55b99-fjctx\" (UID: \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\") " pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:30:26.615677 master-0 kubenswrapper[36504]: I1203 22:30:26.615581 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80423a1b-a97b-45ae-ba50-7b25fb2f2007-httpd-config\") pod \"octavia-image-upload-56c9f55b99-fjctx\" (UID: \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\") " pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:30:26.616330 master-0 kubenswrapper[36504]: I1203 22:30:26.615901 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/80423a1b-a97b-45ae-ba50-7b25fb2f2007-amphora-image\") pod \"octavia-image-upload-56c9f55b99-fjctx\" (UID: \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\") " pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:30:26.621427 master-0 kubenswrapper[36504]: I1203 22:30:26.621365 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/80423a1b-a97b-45ae-ba50-7b25fb2f2007-amphora-image\") pod \"octavia-image-upload-56c9f55b99-fjctx\" (UID: \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\") " pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:30:26.633880 master-0 kubenswrapper[36504]: I1203 22:30:26.628962 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80423a1b-a97b-45ae-ba50-7b25fb2f2007-httpd-config\") pod \"octavia-image-upload-56c9f55b99-fjctx\" (UID: \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\") " pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:30:26.667289 master-0 kubenswrapper[36504]: I1203 22:30:26.665503 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-flg69"] Dec 03 22:30:26.728361 master-0 kubenswrapper[36504]: I1203 22:30:26.728276 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-flg69" event={"ID":"1b4d21c0-99d8-4d30-80d0-0b558644643a","Type":"ContainerStarted","Data":"154b7e1106aaa1e455ae357f69514355be242faaf156e58d8baaeb80c25cc13a"} Dec 03 22:30:26.803505 master-0 kubenswrapper[36504]: I1203 22:30:26.803420 36504 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:30:27.972335 master-0 kubenswrapper[36504]: I1203 22:30:27.972232 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-fjctx"] Dec 03 22:30:28.712801 master-0 kubenswrapper[36504]: I1203 22:30:28.708375 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-79b888857f-grgwt"] Dec 03 22:30:28.730237 master-0 kubenswrapper[36504]: I1203 22:30:28.718380 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.730237 master-0 kubenswrapper[36504]: I1203 22:30:28.725003 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-public-svc" Dec 03 22:30:28.730237 master-0 kubenswrapper[36504]: I1203 22:30:28.725292 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-internal-svc" Dec 03 22:30:28.760204 master-0 kubenswrapper[36504]: I1203 22:30:28.758596 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-79b888857f-grgwt"] Dec 03 22:30:28.801939 master-0 kubenswrapper[36504]: I1203 22:30:28.800681 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-public-tls-certs\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.801939 master-0 kubenswrapper[36504]: I1203 22:30:28.800748 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac652e9-13ae-4990-844d-93c97bc0d43c-octavia-run\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.801939 master-0 kubenswrapper[36504]: I1203 22:30:28.801009 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" event={"ID":"80423a1b-a97b-45ae-ba50-7b25fb2f2007","Type":"ContainerStarted","Data":"55e4e9d6c6420bb0f0071998a44b9f445f39c94986a8115b0277db8567d0810f"} Dec 03 22:30:28.801939 master-0 kubenswrapper[36504]: I1203 22:30:28.801055 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4ac652e9-13ae-4990-844d-93c97bc0d43c-config-data-merged\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.801939 master-0 kubenswrapper[36504]: I1203 22:30:28.801088 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-config-data\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.801939 master-0 kubenswrapper[36504]: I1203 22:30:28.801137 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-internal-tls-certs\") pod 
\"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.801939 master-0 kubenswrapper[36504]: I1203 22:30:28.801209 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-ovndb-tls-certs\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.801939 master-0 kubenswrapper[36504]: I1203 22:30:28.801320 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-combined-ca-bundle\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.801939 master-0 kubenswrapper[36504]: I1203 22:30:28.801394 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-scripts\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.905422 master-0 kubenswrapper[36504]: I1203 22:30:28.905321 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-public-tls-certs\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.905739 master-0 kubenswrapper[36504]: I1203 22:30:28.905437 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac652e9-13ae-4990-844d-93c97bc0d43c-octavia-run\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.906542 master-0 kubenswrapper[36504]: I1203 22:30:28.906491 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4ac652e9-13ae-4990-844d-93c97bc0d43c-config-data-merged\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.906622 master-0 kubenswrapper[36504]: I1203 22:30:28.906567 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-config-data\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.906721 master-0 kubenswrapper[36504]: I1203 22:30:28.906701 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-internal-tls-certs\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.906919 master-0 kubenswrapper[36504]: I1203 22:30:28.906880 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/4ac652e9-13ae-4990-844d-93c97bc0d43c-octavia-run\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.907040 master-0 kubenswrapper[36504]: I1203 22:30:28.906997 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-ovndb-tls-certs\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.907295 master-0 kubenswrapper[36504]: I1203 22:30:28.907262 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-combined-ca-bundle\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.907637 master-0 kubenswrapper[36504]: I1203 22:30:28.907584 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4ac652e9-13ae-4990-844d-93c97bc0d43c-config-data-merged\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.911167 master-0 kubenswrapper[36504]: I1203 22:30:28.911135 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-scripts\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.912494 master-0 kubenswrapper[36504]: I1203 22:30:28.912456 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-internal-tls-certs\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.914364 master-0 kubenswrapper[36504]: I1203 22:30:28.914324 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-ovndb-tls-certs\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.915579 master-0 kubenswrapper[36504]: I1203 22:30:28.915525 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-public-tls-certs\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.920318 master-0 kubenswrapper[36504]: I1203 22:30:28.920243 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-scripts\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.921502 master-0 kubenswrapper[36504]: I1203 22:30:28.921190 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-combined-ca-bundle\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:28.957395 master-0 kubenswrapper[36504]: I1203 22:30:28.957327 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac652e9-13ae-4990-844d-93c97bc0d43c-config-data\") pod \"octavia-api-79b888857f-grgwt\" (UID: \"4ac652e9-13ae-4990-844d-93c97bc0d43c\") " pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:29.079029 master-0 kubenswrapper[36504]: I1203 22:30:29.078850 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:30.108422 master-0 kubenswrapper[36504]: I1203 22:30:30.108363 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-79b888857f-grgwt"] Dec 03 22:30:30.896830 master-0 kubenswrapper[36504]: I1203 22:30:30.895678 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-79b888857f-grgwt" event={"ID":"4ac652e9-13ae-4990-844d-93c97bc0d43c","Type":"ContainerStarted","Data":"7a16cfff6385d0a443a1673074c3c48e876985456f64e387348b6a4412af3169"} Dec 03 22:30:30.899283 master-0 kubenswrapper[36504]: I1203 22:30:30.898725 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-flg69" event={"ID":"1b4d21c0-99d8-4d30-80d0-0b558644643a","Type":"ContainerStarted","Data":"a6da9fc178518feebccd4d531d9d16b0b7b528e5433e3b1037914bad6f7af584"} Dec 03 22:30:31.701502 master-0 kubenswrapper[36504]: I1203 22:30:31.701412 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:31.702310 master-0 kubenswrapper[36504]: I1203 22:30:31.701825 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="ceilometer-central-agent" containerID="cri-o://5c36507108428f62aa57a70f371f97e561a039db9dd72a2773970fa738c67826" gracePeriod=30 Dec 03 22:30:31.702310 master-0 kubenswrapper[36504]: I1203 22:30:31.702012 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="ceilometer-notification-agent" containerID="cri-o://bd413b0458e7126572ba84a93d545dde695eb5108f570ee704d17cfab7944d1d" gracePeriod=30 Dec 03 22:30:31.702310 master-0 kubenswrapper[36504]: I1203 22:30:31.701977 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="sg-core" containerID="cri-o://c0360480d1510e9828042cd238f5b4c0b09a119c48cf141b9e6bafec39fa03d7" gracePeriod=30 Dec 03 22:30:31.702310 master-0 kubenswrapper[36504]: I1203 22:30:31.702221 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="proxy-httpd" containerID="cri-o://4a810500045a0c37acd01e0cf8e51cd02adcb8f7c16455ee2cbf011e4203373e" gracePeriod=30 Dec 03 22:30:31.732937 master-0 kubenswrapper[36504]: I1203 22:30:31.728177 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="proxy-httpd" probeResult="failure" 
output="HTTP probe failed with statuscode: 502" Dec 03 22:30:31.933094 master-0 kubenswrapper[36504]: I1203 22:30:31.932946 36504 generic.go:334] "Generic (PLEG): container finished" podID="4ac652e9-13ae-4990-844d-93c97bc0d43c" containerID="3f40bd6a0d951131894ca3f02d1bc45e99dd58a2aaa99b9591808d6fc3f9daba" exitCode=0 Dec 03 22:30:31.933094 master-0 kubenswrapper[36504]: I1203 22:30:31.933016 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-79b888857f-grgwt" event={"ID":"4ac652e9-13ae-4990-844d-93c97bc0d43c","Type":"ContainerDied","Data":"3f40bd6a0d951131894ca3f02d1bc45e99dd58a2aaa99b9591808d6fc3f9daba"} Dec 03 22:30:31.953608 master-0 kubenswrapper[36504]: I1203 22:30:31.953540 36504 generic.go:334] "Generic (PLEG): container finished" podID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerID="c0360480d1510e9828042cd238f5b4c0b09a119c48cf141b9e6bafec39fa03d7" exitCode=2 Dec 03 22:30:31.954804 master-0 kubenswrapper[36504]: I1203 22:30:31.954719 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerDied","Data":"c0360480d1510e9828042cd238f5b4c0b09a119c48cf141b9e6bafec39fa03d7"} Dec 03 22:30:31.989321 master-0 kubenswrapper[36504]: E1203 22:30:31.989237 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34bc643f_a29d_4e07_8ac8_af56fdbe5671.slice/crio-conmon-c0360480d1510e9828042cd238f5b4c0b09a119c48cf141b9e6bafec39fa03d7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34bc643f_a29d_4e07_8ac8_af56fdbe5671.slice/crio-4a810500045a0c37acd01e0cf8e51cd02adcb8f7c16455ee2cbf011e4203373e.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:30:32.996701 master-0 kubenswrapper[36504]: I1203 22:30:32.996570 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-79b888857f-grgwt" event={"ID":"4ac652e9-13ae-4990-844d-93c97bc0d43c","Type":"ContainerStarted","Data":"d33aedbc5414822b6d909893e79b7eefc00700a4f7d732b3cbf8d32717903a48"} Dec 03 22:30:32.996701 master-0 kubenswrapper[36504]: I1203 22:30:32.996681 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-79b888857f-grgwt" event={"ID":"4ac652e9-13ae-4990-844d-93c97bc0d43c","Type":"ContainerStarted","Data":"951fbf98de676470643e965ce768d8b99cc7ca5c762b3430167ed708e966e63b"} Dec 03 22:30:32.996701 master-0 kubenswrapper[36504]: I1203 22:30:32.996701 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:32.997499 master-0 kubenswrapper[36504]: I1203 22:30:32.996782 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:33.004609 master-0 kubenswrapper[36504]: I1203 22:30:33.004541 36504 generic.go:334] "Generic (PLEG): container finished" podID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerID="4a810500045a0c37acd01e0cf8e51cd02adcb8f7c16455ee2cbf011e4203373e" exitCode=0 Dec 03 22:30:33.004609 master-0 kubenswrapper[36504]: I1203 22:30:33.004593 36504 generic.go:334] "Generic (PLEG): container finished" podID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerID="5c36507108428f62aa57a70f371f97e561a039db9dd72a2773970fa738c67826" exitCode=0 Dec 03 22:30:33.005052 master-0 kubenswrapper[36504]: I1203 
22:30:33.004603 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerDied","Data":"4a810500045a0c37acd01e0cf8e51cd02adcb8f7c16455ee2cbf011e4203373e"} Dec 03 22:30:33.005052 master-0 kubenswrapper[36504]: I1203 22:30:33.004672 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerDied","Data":"5c36507108428f62aa57a70f371f97e561a039db9dd72a2773970fa738c67826"} Dec 03 22:30:33.008478 master-0 kubenswrapper[36504]: I1203 22:30:33.008435 36504 generic.go:334] "Generic (PLEG): container finished" podID="1b4d21c0-99d8-4d30-80d0-0b558644643a" containerID="a6da9fc178518feebccd4d531d9d16b0b7b528e5433e3b1037914bad6f7af584" exitCode=0 Dec 03 22:30:33.008478 master-0 kubenswrapper[36504]: I1203 22:30:33.008480 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-flg69" event={"ID":"1b4d21c0-99d8-4d30-80d0-0b558644643a","Type":"ContainerDied","Data":"a6da9fc178518feebccd4d531d9d16b0b7b528e5433e3b1037914bad6f7af584"} Dec 03 22:30:33.047661 master-0 kubenswrapper[36504]: I1203 22:30:33.047462 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-79b888857f-grgwt" podStartSLOduration=5.047428708 podStartE2EDuration="5.047428708s" podCreationTimestamp="2025-12-03 22:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:30:33.031752365 +0000 UTC m=+1198.251524372" watchObservedRunningTime="2025-12-03 22:30:33.047428708 +0000 UTC m=+1198.267200725" Dec 03 22:30:35.062794 master-0 kubenswrapper[36504]: I1203 22:30:35.061973 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-flg69" event={"ID":"1b4d21c0-99d8-4d30-80d0-0b558644643a","Type":"ContainerStarted","Data":"70bf7a17b23b6413cd700bf234bfdee0c2b4988b428dbda5715e0c1eb3e743f2"} Dec 03 22:30:35.063444 master-0 kubenswrapper[36504]: I1203 22:30:35.063194 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:35.076561 master-0 kubenswrapper[36504]: I1203 22:30:35.076491 36504 generic.go:334] "Generic (PLEG): container finished" podID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerID="bd413b0458e7126572ba84a93d545dde695eb5108f570ee704d17cfab7944d1d" exitCode=0 Dec 03 22:30:35.076876 master-0 kubenswrapper[36504]: I1203 22:30:35.076575 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerDied","Data":"bd413b0458e7126572ba84a93d545dde695eb5108f570ee704d17cfab7944d1d"} Dec 03 22:30:35.154120 master-0 kubenswrapper[36504]: I1203 22:30:35.154016 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-flg69" podStartSLOduration=2.698065874 podStartE2EDuration="10.153991115s" podCreationTimestamp="2025-12-03 22:30:25 +0000 UTC" firstStartedPulling="2025-12-03 22:30:26.683892786 +0000 UTC m=+1191.903664793" lastFinishedPulling="2025-12-03 22:30:34.139818027 +0000 UTC m=+1199.359590034" observedRunningTime="2025-12-03 22:30:35.08703229 +0000 UTC m=+1200.306804327" watchObservedRunningTime="2025-12-03 22:30:35.153991115 +0000 UTC m=+1200.373763122" Dec 03 22:30:35.533469 master-0 kubenswrapper[36504]: I1203 22:30:35.533416 36504 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:35.616247 master-0 kubenswrapper[36504]: I1203 22:30:35.616137 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-combined-ca-bundle\") pod \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " Dec 03 22:30:35.616549 master-0 kubenswrapper[36504]: I1203 22:30:35.616266 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-config-data\") pod \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " Dec 03 22:30:35.616549 master-0 kubenswrapper[36504]: I1203 22:30:35.616327 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-log-httpd\") pod \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " Dec 03 22:30:35.616633 master-0 kubenswrapper[36504]: I1203 22:30:35.616579 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-scripts\") pod \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " Dec 03 22:30:35.616684 master-0 kubenswrapper[36504]: I1203 22:30:35.616641 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-sg-core-conf-yaml\") pod \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " Dec 03 22:30:35.616717 master-0 kubenswrapper[36504]: I1203 22:30:35.616691 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-run-httpd\") pod \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " Dec 03 22:30:35.616757 master-0 kubenswrapper[36504]: I1203 22:30:35.616725 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-ceilometer-tls-certs\") pod \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " Dec 03 22:30:35.616902 master-0 kubenswrapper[36504]: I1203 22:30:35.616855 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbrhq\" (UniqueName: \"kubernetes.io/projected/34bc643f-a29d-4e07-8ac8-af56fdbe5671-kube-api-access-mbrhq\") pod \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\" (UID: \"34bc643f-a29d-4e07-8ac8-af56fdbe5671\") " Dec 03 22:30:35.617285 master-0 kubenswrapper[36504]: I1203 22:30:35.617220 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "34bc643f-a29d-4e07-8ac8-af56fdbe5671" (UID: "34bc643f-a29d-4e07-8ac8-af56fdbe5671"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:35.620090 master-0 kubenswrapper[36504]: I1203 22:30:35.619619 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "34bc643f-a29d-4e07-8ac8-af56fdbe5671" (UID: "34bc643f-a29d-4e07-8ac8-af56fdbe5671"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:35.621486 master-0 kubenswrapper[36504]: I1203 22:30:35.621293 36504 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-run-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:35.621486 master-0 kubenswrapper[36504]: I1203 22:30:35.621342 36504 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34bc643f-a29d-4e07-8ac8-af56fdbe5671-log-httpd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:35.648634 master-0 kubenswrapper[36504]: I1203 22:30:35.647550 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-scripts" (OuterVolumeSpecName: "scripts") pod "34bc643f-a29d-4e07-8ac8-af56fdbe5671" (UID: "34bc643f-a29d-4e07-8ac8-af56fdbe5671"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:35.669661 master-0 kubenswrapper[36504]: I1203 22:30:35.669486 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34bc643f-a29d-4e07-8ac8-af56fdbe5671-kube-api-access-mbrhq" (OuterVolumeSpecName: "kube-api-access-mbrhq") pod "34bc643f-a29d-4e07-8ac8-af56fdbe5671" (UID: "34bc643f-a29d-4e07-8ac8-af56fdbe5671"). InnerVolumeSpecName "kube-api-access-mbrhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:30:35.691465 master-0 kubenswrapper[36504]: I1203 22:30:35.690217 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "34bc643f-a29d-4e07-8ac8-af56fdbe5671" (UID: "34bc643f-a29d-4e07-8ac8-af56fdbe5671"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:35.724530 master-0 kubenswrapper[36504]: I1203 22:30:35.724451 36504 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-sg-core-conf-yaml\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:35.724530 master-0 kubenswrapper[36504]: I1203 22:30:35.724510 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbrhq\" (UniqueName: \"kubernetes.io/projected/34bc643f-a29d-4e07-8ac8-af56fdbe5671-kube-api-access-mbrhq\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:35.724530 master-0 kubenswrapper[36504]: I1203 22:30:35.724524 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:35.735713 master-0 kubenswrapper[36504]: I1203 22:30:35.735630 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "34bc643f-a29d-4e07-8ac8-af56fdbe5671" (UID: "34bc643f-a29d-4e07-8ac8-af56fdbe5671"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:35.754703 master-0 kubenswrapper[36504]: I1203 22:30:35.754615 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34bc643f-a29d-4e07-8ac8-af56fdbe5671" (UID: "34bc643f-a29d-4e07-8ac8-af56fdbe5671"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:35.756072 master-0 kubenswrapper[36504]: E1203 22:30:35.755965 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:30:35.811506 master-0 kubenswrapper[36504]: I1203 22:30:35.811431 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-config-data" (OuterVolumeSpecName: "config-data") pod "34bc643f-a29d-4e07-8ac8-af56fdbe5671" (UID: "34bc643f-a29d-4e07-8ac8-af56fdbe5671"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:35.830059 master-0 kubenswrapper[36504]: I1203 22:30:35.829209 36504 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-ceilometer-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:35.830059 master-0 kubenswrapper[36504]: I1203 22:30:35.829410 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:35.830059 master-0 kubenswrapper[36504]: I1203 22:30:35.829429 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bc643f-a29d-4e07-8ac8-af56fdbe5671-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:36.106898 master-0 kubenswrapper[36504]: I1203 22:30:36.106699 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34bc643f-a29d-4e07-8ac8-af56fdbe5671","Type":"ContainerDied","Data":"b7847a4b396511ff745fc2e3316f1dea50b7385aea9df96b4142f3befda75030"} Dec 03 22:30:36.106898 master-0 kubenswrapper[36504]: I1203 22:30:36.106845 36504 scope.go:117] "RemoveContainer" containerID="4a810500045a0c37acd01e0cf8e51cd02adcb8f7c16455ee2cbf011e4203373e" Dec 03 22:30:36.107565 master-0 kubenswrapper[36504]: I1203 22:30:36.106874 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:36.149947 master-0 kubenswrapper[36504]: I1203 22:30:36.149895 36504 scope.go:117] "RemoveContainer" containerID="c0360480d1510e9828042cd238f5b4c0b09a119c48cf141b9e6bafec39fa03d7" Dec 03 22:30:36.182356 master-0 kubenswrapper[36504]: I1203 22:30:36.181874 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:36.208173 master-0 kubenswrapper[36504]: I1203 22:30:36.208077 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:36.215382 master-0 kubenswrapper[36504]: I1203 22:30:36.215294 36504 scope.go:117] "RemoveContainer" containerID="bd413b0458e7126572ba84a93d545dde695eb5108f570ee704d17cfab7944d1d" Dec 03 22:30:36.229691 master-0 kubenswrapper[36504]: I1203 22:30:36.229605 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:36.230897 master-0 kubenswrapper[36504]: E1203 22:30:36.230879 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="proxy-httpd" Dec 03 22:30:36.230989 master-0 kubenswrapper[36504]: I1203 22:30:36.230977 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="proxy-httpd" Dec 03 22:30:36.231096 master-0 kubenswrapper[36504]: E1203 22:30:36.231085 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="sg-core" Dec 03 22:30:36.231161 master-0 kubenswrapper[36504]: I1203 22:30:36.231148 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="sg-core" Dec 03 22:30:36.231328 master-0 kubenswrapper[36504]: E1203 22:30:36.231288 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="ceilometer-notification-agent" Dec 03 
22:30:36.231397 master-0 kubenswrapper[36504]: I1203 22:30:36.231387 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="ceilometer-notification-agent" Dec 03 22:30:36.231506 master-0 kubenswrapper[36504]: E1203 22:30:36.231492 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="ceilometer-central-agent" Dec 03 22:30:36.231568 master-0 kubenswrapper[36504]: I1203 22:30:36.231559 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="ceilometer-central-agent" Dec 03 22:30:36.231947 master-0 kubenswrapper[36504]: I1203 22:30:36.231930 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="ceilometer-notification-agent" Dec 03 22:30:36.232039 master-0 kubenswrapper[36504]: I1203 22:30:36.232028 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="proxy-httpd" Dec 03 22:30:36.232117 master-0 kubenswrapper[36504]: I1203 22:30:36.232107 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="ceilometer-central-agent" Dec 03 22:30:36.232206 master-0 kubenswrapper[36504]: I1203 22:30:36.232196 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" containerName="sg-core" Dec 03 22:30:36.237144 master-0 kubenswrapper[36504]: I1203 22:30:36.237113 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:36.242430 master-0 kubenswrapper[36504]: I1203 22:30:36.242360 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 22:30:36.243460 master-0 kubenswrapper[36504]: I1203 22:30:36.242792 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 22:30:36.243460 master-0 kubenswrapper[36504]: I1203 22:30:36.242995 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 22:30:36.250203 master-0 kubenswrapper[36504]: I1203 22:30:36.250123 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:36.255353 master-0 kubenswrapper[36504]: I1203 22:30:36.255259 36504 scope.go:117] "RemoveContainer" containerID="5c36507108428f62aa57a70f371f97e561a039db9dd72a2773970fa738c67826" Dec 03 22:30:36.349794 master-0 kubenswrapper[36504]: I1203 22:30:36.347369 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-scripts\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.349794 master-0 kubenswrapper[36504]: I1203 22:30:36.347478 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/687d2809-5fbb-4943-bde7-593ce33dc945-log-httpd\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.349794 master-0 kubenswrapper[36504]: I1203 22:30:36.347557 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/687d2809-5fbb-4943-bde7-593ce33dc945-run-httpd\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.352144 master-0 kubenswrapper[36504]: I1203 22:30:36.351978 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.352308 master-0 kubenswrapper[36504]: I1203 22:30:36.352273 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.352371 master-0 kubenswrapper[36504]: I1203 22:30:36.352353 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.352480 master-0 kubenswrapper[36504]: I1203 22:30:36.352401 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-config-data\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.352916 master-0 kubenswrapper[36504]: I1203 22:30:36.352879 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4csmk\" (UniqueName: \"kubernetes.io/projected/687d2809-5fbb-4943-bde7-593ce33dc945-kube-api-access-4csmk\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.456309 master-0 kubenswrapper[36504]: I1203 22:30:36.456231 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4csmk\" (UniqueName: \"kubernetes.io/projected/687d2809-5fbb-4943-bde7-593ce33dc945-kube-api-access-4csmk\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.456698 master-0 kubenswrapper[36504]: I1203 22:30:36.456360 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-scripts\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.456698 master-0 kubenswrapper[36504]: I1203 22:30:36.456392 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/687d2809-5fbb-4943-bde7-593ce33dc945-log-httpd\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.456698 master-0 kubenswrapper[36504]: I1203 22:30:36.456437 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/687d2809-5fbb-4943-bde7-593ce33dc945-run-httpd\") pod 
\"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.456698 master-0 kubenswrapper[36504]: I1203 22:30:36.456491 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.456698 master-0 kubenswrapper[36504]: I1203 22:30:36.456545 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.456698 master-0 kubenswrapper[36504]: I1203 22:30:36.456577 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.456698 master-0 kubenswrapper[36504]: I1203 22:30:36.456599 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-config-data\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.457286 master-0 kubenswrapper[36504]: I1203 22:30:36.457191 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/687d2809-5fbb-4943-bde7-593ce33dc945-log-httpd\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.457593 master-0 kubenswrapper[36504]: I1203 22:30:36.457565 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/687d2809-5fbb-4943-bde7-593ce33dc945-run-httpd\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.462711 master-0 kubenswrapper[36504]: I1203 22:30:36.462622 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.462913 master-0 kubenswrapper[36504]: I1203 22:30:36.462879 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.463025 master-0 kubenswrapper[36504]: I1203 22:30:36.462966 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-scripts\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.465215 master-0 kubenswrapper[36504]: I1203 22:30:36.465171 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-config-data\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.469601 master-0 kubenswrapper[36504]: I1203 22:30:36.469540 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687d2809-5fbb-4943-bde7-593ce33dc945-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.477493 master-0 kubenswrapper[36504]: I1203 22:30:36.477430 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4csmk\" (UniqueName: \"kubernetes.io/projected/687d2809-5fbb-4943-bde7-593ce33dc945-kube-api-access-4csmk\") pod \"ceilometer-0\" (UID: \"687d2809-5fbb-4943-bde7-593ce33dc945\") " pod="openstack/ceilometer-0" Dec 03 22:30:36.587514 master-0 kubenswrapper[36504]: I1203 22:30:36.584028 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 22:30:37.134185 master-0 kubenswrapper[36504]: I1203 22:30:37.134110 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34bc643f-a29d-4e07-8ac8-af56fdbe5671" path="/var/lib/kubelet/pods/34bc643f-a29d-4e07-8ac8-af56fdbe5671/volumes" Dec 03 22:30:37.141693 master-0 kubenswrapper[36504]: I1203 22:30:37.141631 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 22:30:38.161555 master-0 kubenswrapper[36504]: I1203 22:30:38.161402 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"687d2809-5fbb-4943-bde7-593ce33dc945","Type":"ContainerStarted","Data":"4bb6d46650c426476753388068fe6d7adde08347a9ad6ef0e08e646ef3dc62ff"} Dec 03 22:30:41.002961 master-0 kubenswrapper[36504]: I1203 22:30:40.992500 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-nbld9"] Dec 03 22:30:41.002961 master-0 kubenswrapper[36504]: I1203 22:30:40.998623 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.010750 master-0 kubenswrapper[36504]: I1203 22:30:41.008720 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-nbld9"] Dec 03 22:30:41.010750 master-0 kubenswrapper[36504]: I1203 22:30:41.009254 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Dec 03 22:30:41.018280 master-0 kubenswrapper[36504]: I1203 22:30:41.018218 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-flg69" Dec 03 22:30:41.168309 master-0 kubenswrapper[36504]: I1203 22:30:41.168168 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-combined-ca-bundle\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.168309 master-0 kubenswrapper[36504]: I1203 22:30:41.168274 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.168621 master-0 kubenswrapper[36504]: I1203 22:30:41.168563 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data-merged\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.169602 master-0 kubenswrapper[36504]: I1203 22:30:41.169522 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-scripts\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.277016 master-0 kubenswrapper[36504]: I1203 22:30:41.276935 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data-merged\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.277624 master-0 kubenswrapper[36504]: I1203 22:30:41.277590 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data-merged\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.277718 master-0 kubenswrapper[36504]: I1203 22:30:41.277630 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-scripts\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.278076 master-0 kubenswrapper[36504]: I1203 22:30:41.278048 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-combined-ca-bundle\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.278156 master-0 kubenswrapper[36504]: I1203 22:30:41.278110 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.282836 master-0 kubenswrapper[36504]: I1203 22:30:41.282749 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-scripts\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.284267 master-0 kubenswrapper[36504]: I1203 22:30:41.283874 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-combined-ca-bundle\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.285223 master-0 kubenswrapper[36504]: I1203 22:30:41.285164 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data\") pod \"octavia-db-sync-nbld9\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.346902 master-0 kubenswrapper[36504]: I1203 22:30:41.346754 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:41.354417 master-0 kubenswrapper[36504]: I1203 22:30:41.354054 36504 scope.go:117] "RemoveContainer" containerID="1e44be388b71c86ea10e47695909fce01dcbdba8ee0d5078a421414b16fc0a44" Dec 03 22:30:41.573102 master-0 kubenswrapper[36504]: I1203 22:30:41.573050 36504 scope.go:117] "RemoveContainer" containerID="8e98bd7632e10184a3905755491b0bf5b7f9601daae4ea74b3820228d382dac8" Dec 03 22:30:41.649331 master-0 kubenswrapper[36504]: I1203 22:30:41.648324 36504 scope.go:117] "RemoveContainer" containerID="cb58d2e435b9c236b89f108b32f2d859f2790d96b310dbf4a6308adc863a4a21" Dec 03 22:30:41.833653 master-0 kubenswrapper[36504]: I1203 22:30:41.832815 36504 scope.go:117] "RemoveContainer" containerID="7eabf8c6a3ff450e30776a0cedb3a616a85913114f66d573623b566160209bbe" Dec 03 22:30:41.897612 master-0 kubenswrapper[36504]: I1203 22:30:41.897415 36504 scope.go:117] "RemoveContainer" containerID="4d490f766d003ce3f7cdde5e3f2db16774798f6827797743b163c51406309f61" Dec 03 22:30:42.265176 master-0 kubenswrapper[36504]: I1203 22:30:42.262181 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" event={"ID":"80423a1b-a97b-45ae-ba50-7b25fb2f2007","Type":"ContainerStarted","Data":"86f17e310a919976c92f883add3ac6f07734ad581882099f93f3043013e6560e"} Dec 03 22:30:42.269802 master-0 kubenswrapper[36504]: I1203 22:30:42.267144 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"687d2809-5fbb-4943-bde7-593ce33dc945","Type":"ContainerStarted","Data":"4e3c0723f8fbf91e9aeffa507c7d805a74bc668f886f05b482b406a831ad5619"} Dec 03 22:30:42.280796 master-0 kubenswrapper[36504]: I1203 22:30:42.278789 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-nbld9"] Dec 03 22:30:43.293158 master-0 kubenswrapper[36504]: I1203 22:30:43.293079 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"687d2809-5fbb-4943-bde7-593ce33dc945","Type":"ContainerStarted","Data":"bc235b0702551710a548f3d8d8e938db3f978562fc7fce4bfec904a3418c5956"} Dec 03 22:30:43.296928 master-0 kubenswrapper[36504]: I1203 22:30:43.296867 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nbld9" event={"ID":"a6c78c17-5e9b-4583-ab1c-6067e305c3de","Type":"ContainerStarted","Data":"bb44caac1b06e75cd223259e13c1684297a0475c7dc2eb3ebba39a5d8ededb79"} Dec 03 22:30:43.296928 master-0 kubenswrapper[36504]: I1203 22:30:43.296907 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nbld9" event={"ID":"a6c78c17-5e9b-4583-ab1c-6067e305c3de","Type":"ContainerStarted","Data":"e799b2034033303830905f0eaddf6aba3782252a63cf94fa93a905a8fb9901b7"} Dec 03 22:30:43.299913 master-0 kubenswrapper[36504]: I1203 22:30:43.299783 36504 generic.go:334] "Generic (PLEG): container finished" podID="80423a1b-a97b-45ae-ba50-7b25fb2f2007" containerID="86f17e310a919976c92f883add3ac6f07734ad581882099f93f3043013e6560e" exitCode=0 Dec 03 22:30:43.299913 master-0 kubenswrapper[36504]: I1203 22:30:43.299850 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" event={"ID":"80423a1b-a97b-45ae-ba50-7b25fb2f2007","Type":"ContainerDied","Data":"86f17e310a919976c92f883add3ac6f07734ad581882099f93f3043013e6560e"} Dec 03 22:30:44.332249 master-0 kubenswrapper[36504]: I1203 22:30:44.332164 36504 generic.go:334] "Generic (PLEG): container finished" 
podID="a6c78c17-5e9b-4583-ab1c-6067e305c3de" containerID="bb44caac1b06e75cd223259e13c1684297a0475c7dc2eb3ebba39a5d8ededb79" exitCode=0 Dec 03 22:30:44.332249 master-0 kubenswrapper[36504]: I1203 22:30:44.332232 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nbld9" event={"ID":"a6c78c17-5e9b-4583-ab1c-6067e305c3de","Type":"ContainerDied","Data":"bb44caac1b06e75cd223259e13c1684297a0475c7dc2eb3ebba39a5d8ededb79"} Dec 03 22:30:45.356498 master-0 kubenswrapper[36504]: I1203 22:30:45.356307 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"687d2809-5fbb-4943-bde7-593ce33dc945","Type":"ContainerStarted","Data":"a5edb232472478ee94e74fd20dc60120c78a00fc0706f622901ccc06b8d64ab2"} Dec 03 22:30:45.363170 master-0 kubenswrapper[36504]: I1203 22:30:45.361277 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nbld9" event={"ID":"a6c78c17-5e9b-4583-ab1c-6067e305c3de","Type":"ContainerStarted","Data":"2dbd9e1c9414c257cc93f177b87fc808d48e40e22cfe21c24b9d5cbf30251c3f"} Dec 03 22:30:45.375843 master-0 kubenswrapper[36504]: I1203 22:30:45.375751 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" event={"ID":"80423a1b-a97b-45ae-ba50-7b25fb2f2007","Type":"ContainerStarted","Data":"71cba74237c6f5a2b82462c55d752ec1f2515e1e4420072a3384d63d2b6e162a"} Dec 03 22:30:45.585916 master-0 kubenswrapper[36504]: I1203 22:30:45.585823 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-nbld9" podStartSLOduration=5.585753533 podStartE2EDuration="5.585753533s" podCreationTimestamp="2025-12-03 22:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:30:45.567044876 +0000 UTC m=+1210.786816883" watchObservedRunningTime="2025-12-03 22:30:45.585753533 +0000 UTC m=+1210.805525540" Dec 03 22:30:46.426269 master-0 kubenswrapper[36504]: I1203 22:30:46.426165 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"687d2809-5fbb-4943-bde7-593ce33dc945","Type":"ContainerStarted","Data":"e6ce3a8e06630eeed0a1b5c7f9c34d22aaf7f4666fe7489f197b26328ab86850"} Dec 03 22:30:46.515222 master-0 kubenswrapper[36504]: I1203 22:30:46.514838 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.670610488 podStartE2EDuration="10.514804095s" podCreationTimestamp="2025-12-03 22:30:36 +0000 UTC" firstStartedPulling="2025-12-03 22:30:37.139091056 +0000 UTC m=+1202.358863063" lastFinishedPulling="2025-12-03 22:30:45.983284663 +0000 UTC m=+1211.203056670" observedRunningTime="2025-12-03 22:30:46.479920029 +0000 UTC m=+1211.699692046" watchObservedRunningTime="2025-12-03 22:30:46.514804095 +0000 UTC m=+1211.734576122" Dec 03 22:30:46.558171 master-0 kubenswrapper[36504]: I1203 22:30:46.541565 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" podStartSLOduration=6.79069276 podStartE2EDuration="20.541523999s" podCreationTimestamp="2025-12-03 22:30:26 +0000 UTC" firstStartedPulling="2025-12-03 22:30:27.949751805 +0000 UTC m=+1193.169523812" lastFinishedPulling="2025-12-03 22:30:41.700583044 +0000 UTC m=+1206.920355051" observedRunningTime="2025-12-03 22:30:46.505734626 +0000 UTC m=+1211.725506633" watchObservedRunningTime="2025-12-03 
22:30:46.541523999 +0000 UTC m=+1211.761295996" Dec 03 22:30:47.441800 master-0 kubenswrapper[36504]: I1203 22:30:47.441732 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 22:30:49.128541 master-0 kubenswrapper[36504]: I1203 22:30:49.128451 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:49.181284 master-0 kubenswrapper[36504]: I1203 22:30:49.181217 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-79b888857f-grgwt" Dec 03 22:30:49.334827 master-0 kubenswrapper[36504]: I1203 22:30:49.328868 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-6cd4ddb7f5-cj2qw"] Dec 03 22:30:49.334827 master-0 kubenswrapper[36504]: I1203 22:30:49.329214 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api" containerID="cri-o://8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b" gracePeriod=30 Dec 03 22:30:49.334827 master-0 kubenswrapper[36504]: I1203 22:30:49.329721 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api-provider-agent" containerID="cri-o://47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239" gracePeriod=30 Dec 03 22:30:50.505471 master-0 kubenswrapper[36504]: I1203 22:30:50.505383 36504 generic.go:334] "Generic (PLEG): container finished" podID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerID="47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239" exitCode=0 Dec 03 22:30:50.505471 master-0 kubenswrapper[36504]: I1203 22:30:50.505452 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" event={"ID":"d64b02c4-e815-4791-8b74-09ba3d96339f","Type":"ContainerDied","Data":"47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239"} Dec 03 22:30:52.522219 master-0 kubenswrapper[36504]: I1203 22:30:52.521946 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api" probeResult="failure" output="Get \"http://10.128.1.58:9876/healthcheck\": read tcp 10.128.0.2:37860->10.128.1.58:9876: read: connection reset by peer" Dec 03 22:30:52.523157 master-0 kubenswrapper[36504]: I1203 22:30:52.521972 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api-provider-agent" probeResult="failure" output="Get \"http://10.128.1.58:9876/healthcheck\": read tcp 10.128.0.2:37876->10.128.1.58:9876: read: connection reset by peer" Dec 03 22:30:53.315651 master-0 kubenswrapper[36504]: I1203 22:30:53.315611 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:30:53.426223 master-0 kubenswrapper[36504]: I1203 22:30:53.426167 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-combined-ca-bundle\") pod \"d64b02c4-e815-4791-8b74-09ba3d96339f\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " Dec 03 22:30:53.426767 master-0 kubenswrapper[36504]: I1203 22:30:53.426751 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data\") pod \"d64b02c4-e815-4791-8b74-09ba3d96339f\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " Dec 03 22:30:53.426947 master-0 kubenswrapper[36504]: I1203 22:30:53.426930 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-scripts\") pod \"d64b02c4-e815-4791-8b74-09ba3d96339f\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " Dec 03 22:30:53.427108 master-0 kubenswrapper[36504]: I1203 22:30:53.427090 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-ovndb-tls-certs\") pod \"d64b02c4-e815-4791-8b74-09ba3d96339f\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " Dec 03 22:30:53.427275 master-0 kubenswrapper[36504]: I1203 22:30:53.427252 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data-merged\") pod \"d64b02c4-e815-4791-8b74-09ba3d96339f\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " Dec 03 22:30:53.427462 master-0 kubenswrapper[36504]: I1203 22:30:53.427444 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-octavia-run\") pod \"d64b02c4-e815-4791-8b74-09ba3d96339f\" (UID: \"d64b02c4-e815-4791-8b74-09ba3d96339f\") " Dec 03 22:30:53.428096 master-0 kubenswrapper[36504]: I1203 22:30:53.428040 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-octavia-run" (OuterVolumeSpecName: "octavia-run") pod "d64b02c4-e815-4791-8b74-09ba3d96339f" (UID: "d64b02c4-e815-4791-8b74-09ba3d96339f"). InnerVolumeSpecName "octavia-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:53.428997 master-0 kubenswrapper[36504]: I1203 22:30:53.428972 36504 reconciler_common.go:293] "Volume detached for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-octavia-run\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:53.430624 master-0 kubenswrapper[36504]: I1203 22:30:53.430569 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data" (OuterVolumeSpecName: "config-data") pod "d64b02c4-e815-4791-8b74-09ba3d96339f" (UID: "d64b02c4-e815-4791-8b74-09ba3d96339f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:53.435045 master-0 kubenswrapper[36504]: I1203 22:30:53.434984 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-scripts" (OuterVolumeSpecName: "scripts") pod "d64b02c4-e815-4791-8b74-09ba3d96339f" (UID: "d64b02c4-e815-4791-8b74-09ba3d96339f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:53.496858 master-0 kubenswrapper[36504]: I1203 22:30:53.496451 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "d64b02c4-e815-4791-8b74-09ba3d96339f" (UID: "d64b02c4-e815-4791-8b74-09ba3d96339f"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:53.503255 master-0 kubenswrapper[36504]: I1203 22:30:53.503142 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d64b02c4-e815-4791-8b74-09ba3d96339f" (UID: "d64b02c4-e815-4791-8b74-09ba3d96339f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:53.533672 master-0 kubenswrapper[36504]: I1203 22:30:53.533196 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:53.533672 master-0 kubenswrapper[36504]: I1203 22:30:53.533258 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:53.533672 master-0 kubenswrapper[36504]: I1203 22:30:53.533271 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:53.533672 master-0 kubenswrapper[36504]: I1203 22:30:53.533286 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d64b02c4-e815-4791-8b74-09ba3d96339f-config-data-merged\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:53.556568 master-0 kubenswrapper[36504]: I1203 22:30:53.556454 36504 generic.go:334] "Generic (PLEG): container finished" podID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerID="8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b" exitCode=0 Dec 03 22:30:53.556883 master-0 kubenswrapper[36504]: I1203 22:30:53.556573 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" Dec 03 22:30:53.557001 master-0 kubenswrapper[36504]: I1203 22:30:53.556965 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" event={"ID":"d64b02c4-e815-4791-8b74-09ba3d96339f","Type":"ContainerDied","Data":"8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b"} Dec 03 22:30:53.557104 master-0 kubenswrapper[36504]: I1203 22:30:53.557086 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6cd4ddb7f5-cj2qw" event={"ID":"d64b02c4-e815-4791-8b74-09ba3d96339f","Type":"ContainerDied","Data":"74f8edeb3a562304e5ff7b52c76f9cea483e8481bfe8414516114b72b56618f8"} Dec 03 22:30:53.557192 master-0 kubenswrapper[36504]: I1203 22:30:53.557167 36504 scope.go:117] "RemoveContainer" containerID="47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239" Dec 03 22:30:53.600988 master-0 kubenswrapper[36504]: I1203 22:30:53.600266 36504 scope.go:117] "RemoveContainer" containerID="8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b" Dec 03 22:30:53.636006 master-0 kubenswrapper[36504]: I1203 22:30:53.635838 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d64b02c4-e815-4791-8b74-09ba3d96339f" (UID: "d64b02c4-e815-4791-8b74-09ba3d96339f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:53.637376 master-0 kubenswrapper[36504]: I1203 22:30:53.637319 36504 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d64b02c4-e815-4791-8b74-09ba3d96339f-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:53.743911 master-0 kubenswrapper[36504]: I1203 22:30:53.743859 36504 scope.go:117] "RemoveContainer" containerID="876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb" Dec 03 22:30:53.779832 master-0 kubenswrapper[36504]: I1203 22:30:53.779754 36504 scope.go:117] "RemoveContainer" containerID="47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239" Dec 03 22:30:53.780549 master-0 kubenswrapper[36504]: E1203 22:30:53.780488 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239\": container with ID starting with 47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239 not found: ID does not exist" containerID="47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239" Dec 03 22:30:53.780671 master-0 kubenswrapper[36504]: I1203 22:30:53.780564 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239"} err="failed to get container status \"47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239\": rpc error: code = NotFound desc = could not find container \"47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239\": container with ID starting with 47f4221d4529dc5d6707dad9273ef345f72da187a2627b704516846bb2e13239 not found: ID does not exist" Dec 03 22:30:53.780671 master-0 kubenswrapper[36504]: I1203 22:30:53.780627 36504 scope.go:117] "RemoveContainer" containerID="8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b" Dec 03 22:30:53.782217 master-0 
kubenswrapper[36504]: E1203 22:30:53.782177 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b\": container with ID starting with 8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b not found: ID does not exist" containerID="8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b" Dec 03 22:30:53.782290 master-0 kubenswrapper[36504]: I1203 22:30:53.782225 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b"} err="failed to get container status \"8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b\": rpc error: code = NotFound desc = could not find container \"8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b\": container with ID starting with 8b451e298ff3c877bc3c3268a0627e4f182b6677907f28c5e9d19cbc63b29b4b not found: ID does not exist" Dec 03 22:30:53.782290 master-0 kubenswrapper[36504]: I1203 22:30:53.782259 36504 scope.go:117] "RemoveContainer" containerID="876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb" Dec 03 22:30:53.782653 master-0 kubenswrapper[36504]: E1203 22:30:53.782619 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb\": container with ID starting with 876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb not found: ID does not exist" containerID="876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb" Dec 03 22:30:53.782723 master-0 kubenswrapper[36504]: I1203 22:30:53.782652 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb"} err="failed to get container status \"876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb\": rpc error: code = NotFound desc = could not find container \"876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb\": container with ID starting with 876931273b60b1594279b98e6b8e22917795fb42be3487a2e0b3b4ffee5270fb not found: ID does not exist" Dec 03 22:30:53.919616 master-0 kubenswrapper[36504]: I1203 22:30:53.919537 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-6cd4ddb7f5-cj2qw"] Dec 03 22:30:53.941976 master-0 kubenswrapper[36504]: I1203 22:30:53.941844 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-api-6cd4ddb7f5-cj2qw"] Dec 03 22:30:54.124097 master-0 kubenswrapper[36504]: E1203 22:30:54.124017 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd64b02c4_e815_4791_8b74_09ba3d96339f.slice/crio-74f8edeb3a562304e5ff7b52c76f9cea483e8481bfe8414516114b72b56618f8\": RecentStats: unable to find data in memory cache]" Dec 03 22:30:55.117349 master-0 kubenswrapper[36504]: I1203 22:30:55.117277 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" path="/var/lib/kubelet/pods/d64b02c4-e815-4791-8b74-09ba3d96339f/volumes" Dec 03 22:30:56.607694 master-0 kubenswrapper[36504]: I1203 22:30:56.607542 36504 generic.go:334] "Generic (PLEG): container finished" 
podID="a6c78c17-5e9b-4583-ab1c-6067e305c3de" containerID="2dbd9e1c9414c257cc93f177b87fc808d48e40e22cfe21c24b9d5cbf30251c3f" exitCode=0 Dec 03 22:30:56.608442 master-0 kubenswrapper[36504]: I1203 22:30:56.607637 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nbld9" event={"ID":"a6c78c17-5e9b-4583-ab1c-6067e305c3de","Type":"ContainerDied","Data":"2dbd9e1c9414c257cc93f177b87fc808d48e40e22cfe21c24b9d5cbf30251c3f"} Dec 03 22:30:58.269908 master-0 kubenswrapper[36504]: I1203 22:30:58.269839 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:58.413482 master-0 kubenswrapper[36504]: I1203 22:30:58.412738 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-scripts\") pod \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " Dec 03 22:30:58.414031 master-0 kubenswrapper[36504]: I1203 22:30:58.414014 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data-merged\") pod \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " Dec 03 22:30:58.414605 master-0 kubenswrapper[36504]: I1203 22:30:58.414583 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data\") pod \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " Dec 03 22:30:58.414856 master-0 kubenswrapper[36504]: I1203 22:30:58.414835 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-combined-ca-bundle\") pod \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\" (UID: \"a6c78c17-5e9b-4583-ab1c-6067e305c3de\") " Dec 03 22:30:58.416832 master-0 kubenswrapper[36504]: I1203 22:30:58.416791 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-scripts" (OuterVolumeSpecName: "scripts") pod "a6c78c17-5e9b-4583-ab1c-6067e305c3de" (UID: "a6c78c17-5e9b-4583-ab1c-6067e305c3de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:58.418760 master-0 kubenswrapper[36504]: I1203 22:30:58.418731 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data" (OuterVolumeSpecName: "config-data") pod "a6c78c17-5e9b-4583-ab1c-6067e305c3de" (UID: "a6c78c17-5e9b-4583-ab1c-6067e305c3de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:58.445745 master-0 kubenswrapper[36504]: I1203 22:30:58.445522 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "a6c78c17-5e9b-4583-ab1c-6067e305c3de" (UID: "a6c78c17-5e9b-4583-ab1c-6067e305c3de"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:30:58.446608 master-0 kubenswrapper[36504]: I1203 22:30:58.446542 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6c78c17-5e9b-4583-ab1c-6067e305c3de" (UID: "a6c78c17-5e9b-4583-ab1c-6067e305c3de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:30:58.518860 master-0 kubenswrapper[36504]: I1203 22:30:58.518763 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:58.518860 master-0 kubenswrapper[36504]: I1203 22:30:58.518841 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:58.518860 master-0 kubenswrapper[36504]: I1203 22:30:58.518852 36504 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6c78c17-5e9b-4583-ab1c-6067e305c3de-scripts\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:58.518860 master-0 kubenswrapper[36504]: I1203 22:30:58.518871 36504 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a6c78c17-5e9b-4583-ab1c-6067e305c3de-config-data-merged\") on node \"master-0\" DevicePath \"\"" Dec 03 22:30:58.640100 master-0 kubenswrapper[36504]: I1203 22:30:58.640050 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-nbld9" Dec 03 22:30:58.640391 master-0 kubenswrapper[36504]: I1203 22:30:58.640057 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-nbld9" event={"ID":"a6c78c17-5e9b-4583-ab1c-6067e305c3de","Type":"ContainerDied","Data":"e799b2034033303830905f0eaddf6aba3782252a63cf94fa93a905a8fb9901b7"} Dec 03 22:30:58.640391 master-0 kubenswrapper[36504]: I1203 22:30:58.640197 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e799b2034033303830905f0eaddf6aba3782252a63cf94fa93a905a8fb9901b7" Dec 03 22:31:06.596374 master-0 kubenswrapper[36504]: I1203 22:31:06.596297 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 22:31:15.096731 master-0 kubenswrapper[36504]: I1203 22:31:15.096645 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:31:17.581691 master-0 kubenswrapper[36504]: I1203 22:31:17.581599 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-fjctx"] Dec 03 22:31:17.582531 master-0 kubenswrapper[36504]: I1203 22:31:17.581950 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" podUID="80423a1b-a97b-45ae-ba50-7b25fb2f2007" containerName="octavia-amphora-httpd" containerID="cri-o://71cba74237c6f5a2b82462c55d752ec1f2515e1e4420072a3384d63d2b6e162a" gracePeriod=30 Dec 03 22:31:17.989637 master-0 kubenswrapper[36504]: I1203 22:31:17.989576 36504 generic.go:334] "Generic (PLEG): container finished" podID="80423a1b-a97b-45ae-ba50-7b25fb2f2007" containerID="71cba74237c6f5a2b82462c55d752ec1f2515e1e4420072a3384d63d2b6e162a" exitCode=0 Dec 03 22:31:17.990044 master-0 kubenswrapper[36504]: I1203 22:31:17.990011 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" event={"ID":"80423a1b-a97b-45ae-ba50-7b25fb2f2007","Type":"ContainerDied","Data":"71cba74237c6f5a2b82462c55d752ec1f2515e1e4420072a3384d63d2b6e162a"} Dec 03 22:31:18.420586 master-0 kubenswrapper[36504]: I1203 22:31:18.420532 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:31:18.562274 master-0 kubenswrapper[36504]: I1203 22:31:18.562119 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80423a1b-a97b-45ae-ba50-7b25fb2f2007-httpd-config\") pod \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\" (UID: \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\") " Dec 03 22:31:18.562274 master-0 kubenswrapper[36504]: I1203 22:31:18.562246 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/80423a1b-a97b-45ae-ba50-7b25fb2f2007-amphora-image\") pod \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\" (UID: \"80423a1b-a97b-45ae-ba50-7b25fb2f2007\") " Dec 03 22:31:18.603216 master-0 kubenswrapper[36504]: I1203 22:31:18.603110 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80423a1b-a97b-45ae-ba50-7b25fb2f2007-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "80423a1b-a97b-45ae-ba50-7b25fb2f2007" (UID: "80423a1b-a97b-45ae-ba50-7b25fb2f2007"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:18.667159 master-0 kubenswrapper[36504]: I1203 22:31:18.667083 36504 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/80423a1b-a97b-45ae-ba50-7b25fb2f2007-httpd-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:31:18.677168 master-0 kubenswrapper[36504]: I1203 22:31:18.677085 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80423a1b-a97b-45ae-ba50-7b25fb2f2007-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "80423a1b-a97b-45ae-ba50-7b25fb2f2007" (UID: "80423a1b-a97b-45ae-ba50-7b25fb2f2007"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:31:18.769998 master-0 kubenswrapper[36504]: I1203 22:31:18.769911 36504 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/80423a1b-a97b-45ae-ba50-7b25fb2f2007-amphora-image\") on node \"master-0\" DevicePath \"\"" Dec 03 22:31:19.015199 master-0 kubenswrapper[36504]: I1203 22:31:19.015124 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" event={"ID":"80423a1b-a97b-45ae-ba50-7b25fb2f2007","Type":"ContainerDied","Data":"55e4e9d6c6420bb0f0071998a44b9f445f39c94986a8115b0277db8567d0810f"} Dec 03 22:31:19.015566 master-0 kubenswrapper[36504]: I1203 22:31:19.015218 36504 scope.go:117] "RemoveContainer" containerID="71cba74237c6f5a2b82462c55d752ec1f2515e1e4420072a3384d63d2b6e162a" Dec 03 22:31:19.015566 master-0 kubenswrapper[36504]: I1203 22:31:19.015285 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-fjctx" Dec 03 22:31:19.054380 master-0 kubenswrapper[36504]: I1203 22:31:19.053998 36504 scope.go:117] "RemoveContainer" containerID="86f17e310a919976c92f883add3ac6f07734ad581882099f93f3043013e6560e" Dec 03 22:31:19.069872 master-0 kubenswrapper[36504]: I1203 22:31:19.069750 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-fjctx"] Dec 03 22:31:19.087899 master-0 kubenswrapper[36504]: I1203 22:31:19.087703 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-fjctx"] Dec 03 22:31:19.117618 master-0 kubenswrapper[36504]: I1203 22:31:19.117528 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80423a1b-a97b-45ae-ba50-7b25fb2f2007" path="/var/lib/kubelet/pods/80423a1b-a97b-45ae-ba50-7b25fb2f2007/volumes" Dec 03 22:31:23.143036 master-0 kubenswrapper[36504]: I1203 22:31:23.142624 36504 generic.go:334] "Generic (PLEG): container finished" podID="269654aa-a42a-43d3-a328-45f7c6f7169e" containerID="68a357078b2461f00e4c1361e591041403f854a8729f298ba29b6cab4aa8d26b" exitCode=0 Dec 03 22:31:23.143036 master-0 kubenswrapper[36504]: I1203 22:31:23.142705 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-os-edpm-kmq5s" event={"ID":"269654aa-a42a-43d3-a328-45f7c6f7169e","Type":"ContainerDied","Data":"68a357078b2461f00e4c1361e591041403f854a8729f298ba29b6cab4aa8d26b"} Dec 03 22:31:24.862493 master-0 kubenswrapper[36504]: I1203 22:31:24.862433 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:31:24.937461 master-0 kubenswrapper[36504]: I1203 22:31:24.937294 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-inventory\") pod \"269654aa-a42a-43d3-a328-45f7c6f7169e\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " Dec 03 22:31:24.938017 master-0 kubenswrapper[36504]: I1203 22:31:24.937985 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-ssh-key\") pod \"269654aa-a42a-43d3-a328-45f7c6f7169e\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " Dec 03 22:31:24.938090 master-0 kubenswrapper[36504]: I1203 22:31:24.938054 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh7gb\" (UniqueName: \"kubernetes.io/projected/269654aa-a42a-43d3-a328-45f7c6f7169e-kube-api-access-sh7gb\") pod \"269654aa-a42a-43d3-a328-45f7c6f7169e\" (UID: \"269654aa-a42a-43d3-a328-45f7c6f7169e\") " Dec 03 22:31:24.951000 master-0 kubenswrapper[36504]: I1203 22:31:24.950934 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269654aa-a42a-43d3-a328-45f7c6f7169e-kube-api-access-sh7gb" (OuterVolumeSpecName: "kube-api-access-sh7gb") pod "269654aa-a42a-43d3-a328-45f7c6f7169e" (UID: "269654aa-a42a-43d3-a328-45f7c6f7169e"). InnerVolumeSpecName "kube-api-access-sh7gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:24.982174 master-0 kubenswrapper[36504]: I1203 22:31:24.982053 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "269654aa-a42a-43d3-a328-45f7c6f7169e" (UID: "269654aa-a42a-43d3-a328-45f7c6f7169e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:25.002142 master-0 kubenswrapper[36504]: I1203 22:31:25.002024 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-inventory" (OuterVolumeSpecName: "inventory") pod "269654aa-a42a-43d3-a328-45f7c6f7169e" (UID: "269654aa-a42a-43d3-a328-45f7c6f7169e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:25.051762 master-0 kubenswrapper[36504]: I1203 22:31:25.051635 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:31:25.051762 master-0 kubenswrapper[36504]: I1203 22:31:25.051709 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh7gb\" (UniqueName: \"kubernetes.io/projected/269654aa-a42a-43d3-a328-45f7c6f7169e-kube-api-access-sh7gb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:31:25.051762 master-0 kubenswrapper[36504]: I1203 22:31:25.051729 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/269654aa-a42a-43d3-a328-45f7c6f7169e-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:31:25.115662 master-0 kubenswrapper[36504]: I1203 22:31:25.115604 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:31:25.197252 master-0 kubenswrapper[36504]: I1203 22:31:25.197051 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-dataplane-os-edpm-kmq5s" event={"ID":"269654aa-a42a-43d3-a328-45f7c6f7169e","Type":"ContainerDied","Data":"7600e7320b4ff19915af290fa5314e82ee6b08fe9c983ca80f63f53b8fd31fe4"} Dec 03 22:31:25.197252 master-0 kubenswrapper[36504]: I1203 22:31:25.197131 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7600e7320b4ff19915af290fa5314e82ee6b08fe9c983ca80f63f53b8fd31fe4" Dec 03 22:31:25.197252 master-0 kubenswrapper[36504]: I1203 22:31:25.197237 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-dataplane-os-edpm-kmq5s" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.214550 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-fnbx4"] Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: E1203 22:31:25.215509 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80423a1b-a97b-45ae-ba50-7b25fb2f2007" containerName="init" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.215526 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="80423a1b-a97b-45ae-ba50-7b25fb2f2007" containerName="init" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: E1203 22:31:25.215555 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.215561 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: E1203 22:31:25.215570 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c78c17-5e9b-4583-ab1c-6067e305c3de" containerName="octavia-db-sync" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.215577 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c78c17-5e9b-4583-ab1c-6067e305c3de" containerName="octavia-db-sync" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: E1203 22:31:25.215594 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="init" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.215600 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="init" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: E1203 22:31:25.215621 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c78c17-5e9b-4583-ab1c-6067e305c3de" containerName="init" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.215627 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c78c17-5e9b-4583-ab1c-6067e305c3de" containerName="init" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: E1203 22:31:25.215648 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80423a1b-a97b-45ae-ba50-7b25fb2f2007" containerName="octavia-amphora-httpd" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.215657 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="80423a1b-a97b-45ae-ba50-7b25fb2f2007" 
containerName="octavia-amphora-httpd" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: E1203 22:31:25.215673 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api-provider-agent" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.215679 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api-provider-agent" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: E1203 22:31:25.215692 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269654aa-a42a-43d3-a328-45f7c6f7169e" containerName="configure-network-dataplane-os-edpm" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.215698 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="269654aa-a42a-43d3-a328-45f7c6f7169e" containerName="configure-network-dataplane-os-edpm" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.216139 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c78c17-5e9b-4583-ab1c-6067e305c3de" containerName="octavia-db-sync" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.216172 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api-provider-agent" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.216188 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="269654aa-a42a-43d3-a328-45f7c6f7169e" containerName="configure-network-dataplane-os-edpm" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.216232 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="80423a1b-a97b-45ae-ba50-7b25fb2f2007" containerName="octavia-amphora-httpd" Dec 03 22:31:25.217569 master-0 kubenswrapper[36504]: I1203 22:31:25.216248 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64b02c4-e815-4791-8b74-09ba3d96339f" containerName="octavia-api" Dec 03 22:31:25.221226 master-0 kubenswrapper[36504]: I1203 22:31:25.219223 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" Dec 03 22:31:25.234931 master-0 kubenswrapper[36504]: I1203 22:31:25.230188 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 03 22:31:25.245947 master-0 kubenswrapper[36504]: I1203 22:31:25.245864 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-fnbx4"] Dec 03 22:31:25.342454 master-0 kubenswrapper[36504]: I1203 22:31:25.334814 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-dataplane-os-edpm-87gbb"] Dec 03 22:31:25.342454 master-0 kubenswrapper[36504]: I1203 22:31:25.338226 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.342454 master-0 kubenswrapper[36504]: I1203 22:31:25.341533 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:31:25.363643 master-0 kubenswrapper[36504]: I1203 22:31:25.342985 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:31:25.363643 master-0 kubenswrapper[36504]: I1203 22:31:25.347189 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:31:25.363643 master-0 kubenswrapper[36504]: I1203 22:31:25.354596 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-dataplane-os-edpm-87gbb"] Dec 03 22:31:25.364745 master-0 kubenswrapper[36504]: I1203 22:31:25.364039 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2fb504a3-a8c8-49a6-9374-f62a888561b0-httpd-config\") pod \"octavia-image-upload-56c9f55b99-fnbx4\" (UID: \"2fb504a3-a8c8-49a6-9374-f62a888561b0\") " pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" Dec 03 22:31:25.370001 master-0 kubenswrapper[36504]: I1203 22:31:25.365249 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2fb504a3-a8c8-49a6-9374-f62a888561b0-amphora-image\") pod \"octavia-image-upload-56c9f55b99-fnbx4\" (UID: \"2fb504a3-a8c8-49a6-9374-f62a888561b0\") " pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" Dec 03 22:31:25.469648 master-0 kubenswrapper[36504]: I1203 22:31:25.469392 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-ssh-key\") pod \"validate-network-dataplane-os-edpm-87gbb\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.469648 master-0 kubenswrapper[36504]: I1203 22:31:25.469504 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2fb504a3-a8c8-49a6-9374-f62a888561b0-httpd-config\") pod \"octavia-image-upload-56c9f55b99-fnbx4\" (UID: \"2fb504a3-a8c8-49a6-9374-f62a888561b0\") " pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" Dec 03 22:31:25.469648 master-0 kubenswrapper[36504]: I1203 22:31:25.469597 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s49p\" (UniqueName: \"kubernetes.io/projected/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-kube-api-access-2s49p\") pod \"validate-network-dataplane-os-edpm-87gbb\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.469648 master-0 kubenswrapper[36504]: I1203 22:31:25.469649 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2fb504a3-a8c8-49a6-9374-f62a888561b0-amphora-image\") pod \"octavia-image-upload-56c9f55b99-fnbx4\" (UID: \"2fb504a3-a8c8-49a6-9374-f62a888561b0\") " pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" Dec 03 22:31:25.470102 master-0 kubenswrapper[36504]: I1203 22:31:25.469733 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-inventory\") pod \"validate-network-dataplane-os-edpm-87gbb\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.470517 master-0 kubenswrapper[36504]: I1203 22:31:25.470463 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/2fb504a3-a8c8-49a6-9374-f62a888561b0-amphora-image\") pod \"octavia-image-upload-56c9f55b99-fnbx4\" (UID: \"2fb504a3-a8c8-49a6-9374-f62a888561b0\") " pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" Dec 03 22:31:25.478827 master-0 kubenswrapper[36504]: I1203 22:31:25.473711 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2fb504a3-a8c8-49a6-9374-f62a888561b0-httpd-config\") pod \"octavia-image-upload-56c9f55b99-fnbx4\" (UID: \"2fb504a3-a8c8-49a6-9374-f62a888561b0\") " pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" Dec 03 22:31:25.572923 master-0 kubenswrapper[36504]: I1203 22:31:25.572722 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-ssh-key\") pod \"validate-network-dataplane-os-edpm-87gbb\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.572923 master-0 kubenswrapper[36504]: I1203 22:31:25.572888 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s49p\" (UniqueName: \"kubernetes.io/projected/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-kube-api-access-2s49p\") pod \"validate-network-dataplane-os-edpm-87gbb\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.573288 master-0 kubenswrapper[36504]: I1203 22:31:25.573004 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-inventory\") pod \"validate-network-dataplane-os-edpm-87gbb\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.576704 master-0 kubenswrapper[36504]: I1203 22:31:25.576654 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-ssh-key\") pod \"validate-network-dataplane-os-edpm-87gbb\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.577199 master-0 kubenswrapper[36504]: I1203 22:31:25.577159 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-inventory\") pod \"validate-network-dataplane-os-edpm-87gbb\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.595863 master-0 kubenswrapper[36504]: I1203 22:31:25.595789 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" Dec 03 22:31:25.596168 master-0 kubenswrapper[36504]: I1203 22:31:25.596100 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s49p\" (UniqueName: \"kubernetes.io/projected/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-kube-api-access-2s49p\") pod \"validate-network-dataplane-os-edpm-87gbb\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:25.685808 master-0 kubenswrapper[36504]: I1203 22:31:25.681264 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:26.175862 master-0 kubenswrapper[36504]: I1203 22:31:26.175804 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-fnbx4"] Dec 03 22:31:26.177562 master-0 kubenswrapper[36504]: W1203 22:31:26.177522 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fb504a3_a8c8_49a6_9374_f62a888561b0.slice/crio-283328e2e71d542484bdc093e5b82acec42e255130acc868e952bf07c3faea83 WatchSource:0}: Error finding container 283328e2e71d542484bdc093e5b82acec42e255130acc868e952bf07c3faea83: Status 404 returned error can't find the container with id 283328e2e71d542484bdc093e5b82acec42e255130acc868e952bf07c3faea83 Dec 03 22:31:26.198463 master-0 kubenswrapper[36504]: I1203 22:31:26.198428 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:31:26.215359 master-0 kubenswrapper[36504]: I1203 22:31:26.215278 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" event={"ID":"2fb504a3-a8c8-49a6-9374-f62a888561b0","Type":"ContainerStarted","Data":"283328e2e71d542484bdc093e5b82acec42e255130acc868e952bf07c3faea83"} Dec 03 22:31:26.399906 master-0 kubenswrapper[36504]: I1203 22:31:26.399834 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-dataplane-os-edpm-87gbb"] Dec 03 22:31:27.236449 master-0 kubenswrapper[36504]: I1203 22:31:27.236299 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-os-edpm-87gbb" event={"ID":"c634dafd-5f2a-4e31-9a87-bd6ea37498ea","Type":"ContainerStarted","Data":"c2b6bd72ed8f8f48f7044040e4c23be3275fe430401f6d888b808e43dc1f7cdb"} Dec 03 22:31:27.237791 master-0 kubenswrapper[36504]: I1203 22:31:27.237719 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" event={"ID":"2fb504a3-a8c8-49a6-9374-f62a888561b0","Type":"ContainerStarted","Data":"73fae75f549097c682f7af0308e9eca31a0165e53d89b3c4047d3c3939eb4f19"} Dec 03 22:31:27.282604 master-0 kubenswrapper[36504]: I1203 22:31:27.282372 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-dataplane-os-edpm-87gbb" podStartSLOduration=1.819984815 podStartE2EDuration="2.281247601s" podCreationTimestamp="2025-12-03 22:31:25 +0000 UTC" firstStartedPulling="2025-12-03 22:31:26.44366249 +0000 UTC m=+1251.663434497" lastFinishedPulling="2025-12-03 22:31:26.904925276 +0000 UTC m=+1252.124697283" observedRunningTime="2025-12-03 22:31:27.266653491 +0000 UTC m=+1252.486425498" watchObservedRunningTime="2025-12-03 22:31:27.281247601 +0000 UTC m=+1252.501019618" Dec 03 22:31:28.258611 master-0 kubenswrapper[36504]: 
I1203 22:31:28.258494 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-os-edpm-87gbb" event={"ID":"c634dafd-5f2a-4e31-9a87-bd6ea37498ea","Type":"ContainerStarted","Data":"53d0f956098bc5790a273d15ec65ebdafd1e84d40339810c8809b77e0b9e06e8"} Dec 03 22:31:28.261382 master-0 kubenswrapper[36504]: I1203 22:31:28.261322 36504 generic.go:334] "Generic (PLEG): container finished" podID="2fb504a3-a8c8-49a6-9374-f62a888561b0" containerID="73fae75f549097c682f7af0308e9eca31a0165e53d89b3c4047d3c3939eb4f19" exitCode=0 Dec 03 22:31:28.261491 master-0 kubenswrapper[36504]: I1203 22:31:28.261391 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" event={"ID":"2fb504a3-a8c8-49a6-9374-f62a888561b0","Type":"ContainerDied","Data":"73fae75f549097c682f7af0308e9eca31a0165e53d89b3c4047d3c3939eb4f19"} Dec 03 22:31:29.300528 master-0 kubenswrapper[36504]: I1203 22:31:29.300455 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" event={"ID":"2fb504a3-a8c8-49a6-9374-f62a888561b0","Type":"ContainerStarted","Data":"347fe7c1279207e022aa478d3bc865fbab6785623e3a0c24b8da64214b2241ea"} Dec 03 22:31:29.368273 master-0 kubenswrapper[36504]: I1203 22:31:29.368163 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-56c9f55b99-fnbx4" podStartSLOduration=3.95921795 podStartE2EDuration="4.368124581s" podCreationTimestamp="2025-12-03 22:31:25 +0000 UTC" firstStartedPulling="2025-12-03 22:31:26.198363915 +0000 UTC m=+1251.418135912" lastFinishedPulling="2025-12-03 22:31:26.607270536 +0000 UTC m=+1251.827042543" observedRunningTime="2025-12-03 22:31:29.334222476 +0000 UTC m=+1254.553994483" watchObservedRunningTime="2025-12-03 22:31:29.368124581 +0000 UTC m=+1254.587896588" Dec 03 22:31:30.965983 master-0 kubenswrapper[36504]: I1203 22:31:30.962664 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-74hkl"] Dec 03 22:31:30.966992 master-0 kubenswrapper[36504]: I1203 22:31:30.966491 36504 util.go:30] "No sandbox for pod can be found. 
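The two "Observed pod startup duration" records above carry the raw inputs for the kubelet's startup bookkeeping: podCreationTimestamp, firstStartedPulling, lastFinishedPulling, observedRunningTime and watchObservedRunningTime. For the records in this log the figures are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling), e.g. 2.281247601s - 0.461262786s = 1.819984815s for validate-network-dataplane-os-edpm-87gbb. A small sketch that re-derives the pull window from these records (same one-entry-per-line assumption; the Go zero timestamp 0001-01-01 simply means no pull was recorded for that pod):

```python
import re
import sys
from datetime import datetime

# Matches the pod_startup_latency_tracker "Observed pod startup duration" records
# in the field order they appear in this journal.
ENTRY = re.compile(
    r'Observed pod startup duration.*?pod="(?P<pod>[^"]+)"'
    r'.*?podStartSLOduration=(?P<slo>[\d.]+)'
    r'.*?podStartE2EDuration="(?P<e2e>[\d.]+)s"'
    r'.*?firstStartedPulling="(?P<pull0>[^"]+)"'
    r'.*?lastFinishedPulling="(?P<pull1>[^"]+)"'
)

def parse_ts(raw):
    """Parse '2025-12-03 22:31:26.44366249 +0000 UTC m=+...' (fraction truncated to microseconds)."""
    date, clock = raw.split()[:2]
    if '.' in clock:
        whole, frac = clock.split('.')
        return datetime.strptime(f'{date} {whole}.{frac[:6]}', '%Y-%m-%d %H:%M:%S.%f')
    return datetime.strptime(f'{date} {clock}', '%Y-%m-%d %H:%M:%S')

if __name__ == '__main__':
    text = open(sys.argv[1]).read()
    for m in ENTRY.finditer(text):
        # A pull window of 0.000s (or zero-valued timestamps) means no image pull was needed.
        pull = (parse_ts(m['pull1']) - parse_ts(m['pull0'])).total_seconds()
        print(f"{m['pod']}: e2e={m['e2e']}s slo={m['slo']}s pull={pull:.3f}s")
```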
Need to start a new one" pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:30.969560 master-0 kubenswrapper[36504]: I1203 22:31:30.969483 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Dec 03 22:31:30.969560 master-0 kubenswrapper[36504]: I1203 22:31:30.969548 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Dec 03 22:31:30.969765 master-0 kubenswrapper[36504]: I1203 22:31:30.969735 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Dec 03 22:31:31.011380 master-0 kubenswrapper[36504]: I1203 22:31:31.011265 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-74hkl"] Dec 03 22:31:31.040601 master-0 kubenswrapper[36504]: I1203 22:31:31.040505 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-scripts\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.040904 master-0 kubenswrapper[36504]: I1203 22:31:31.040628 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2d9f6b72-51f1-436d-8836-2745d82faddc-config-data-merged\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.040979 master-0 kubenswrapper[36504]: I1203 22:31:31.040949 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-combined-ca-bundle\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.041096 master-0 kubenswrapper[36504]: I1203 22:31:31.041060 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2d9f6b72-51f1-436d-8836-2745d82faddc-hm-ports\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.041303 master-0 kubenswrapper[36504]: I1203 22:31:31.041267 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-config-data\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.041374 master-0 kubenswrapper[36504]: I1203 22:31:31.041350 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-amphora-certs\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.144895 master-0 kubenswrapper[36504]: I1203 22:31:31.144758 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-scripts\") pod 
\"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.145663 master-0 kubenswrapper[36504]: I1203 22:31:31.145598 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2d9f6b72-51f1-436d-8836-2745d82faddc-config-data-merged\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.146289 master-0 kubenswrapper[36504]: I1203 22:31:31.146264 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-combined-ca-bundle\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.146589 master-0 kubenswrapper[36504]: I1203 22:31:31.146565 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2d9f6b72-51f1-436d-8836-2745d82faddc-hm-ports\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.147013 master-0 kubenswrapper[36504]: I1203 22:31:31.146989 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-config-data\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.147167 master-0 kubenswrapper[36504]: I1203 22:31:31.146988 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2d9f6b72-51f1-436d-8836-2745d82faddc-config-data-merged\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.147444 master-0 kubenswrapper[36504]: I1203 22:31:31.147423 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-amphora-certs\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.147631 master-0 kubenswrapper[36504]: I1203 22:31:31.147587 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2d9f6b72-51f1-436d-8836-2745d82faddc-hm-ports\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.150140 master-0 kubenswrapper[36504]: I1203 22:31:31.150088 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-combined-ca-bundle\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.151015 master-0 kubenswrapper[36504]: I1203 22:31:31.150875 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-amphora-certs\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.151235 master-0 kubenswrapper[36504]: I1203 22:31:31.151174 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-scripts\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.151474 master-0 kubenswrapper[36504]: I1203 22:31:31.151423 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d9f6b72-51f1-436d-8836-2745d82faddc-config-data\") pod \"octavia-healthmanager-74hkl\" (UID: \"2d9f6b72-51f1-436d-8836-2745d82faddc\") " pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.308689 master-0 kubenswrapper[36504]: I1203 22:31:31.308510 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:31.944337 master-0 kubenswrapper[36504]: W1203 22:31:31.944260 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d9f6b72_51f1_436d_8836_2745d82faddc.slice/crio-e0e92e4f9267abf87f6b83e0b97d89a6788cc29daf686d45c877e63188fd6f69 WatchSource:0}: Error finding container e0e92e4f9267abf87f6b83e0b97d89a6788cc29daf686d45c877e63188fd6f69: Status 404 returned error can't find the container with id e0e92e4f9267abf87f6b83e0b97d89a6788cc29daf686d45c877e63188fd6f69 Dec 03 22:31:31.966519 master-0 kubenswrapper[36504]: I1203 22:31:31.966453 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-74hkl"] Dec 03 22:31:32.376271 master-0 kubenswrapper[36504]: I1203 22:31:32.376163 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-74hkl" event={"ID":"2d9f6b72-51f1-436d-8836-2745d82faddc","Type":"ContainerStarted","Data":"e0e92e4f9267abf87f6b83e0b97d89a6788cc29daf686d45c877e63188fd6f69"} Dec 03 22:31:33.112453 master-0 kubenswrapper[36504]: I1203 22:31:33.112382 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-2vjcj"] Dec 03 22:31:33.116028 master-0 kubenswrapper[36504]: I1203 22:31:33.115980 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.118799 master-0 kubenswrapper[36504]: I1203 22:31:33.118736 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Dec 03 22:31:33.118934 master-0 kubenswrapper[36504]: I1203 22:31:33.118873 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Dec 03 22:31:33.133797 master-0 kubenswrapper[36504]: I1203 22:31:33.131539 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-2vjcj"] Dec 03 22:31:33.219084 master-0 kubenswrapper[36504]: I1203 22:31:33.218948 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-config-data-merged\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.219084 master-0 kubenswrapper[36504]: I1203 22:31:33.219071 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-config-data\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.219477 master-0 kubenswrapper[36504]: I1203 22:31:33.219131 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-hm-ports\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.219477 master-0 kubenswrapper[36504]: I1203 22:31:33.219203 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-combined-ca-bundle\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.219560 master-0 kubenswrapper[36504]: I1203 22:31:33.219451 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-scripts\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.219783 master-0 kubenswrapper[36504]: I1203 22:31:33.219689 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-amphora-certs\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.321968 master-0 kubenswrapper[36504]: I1203 22:31:33.321868 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-combined-ca-bundle\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.321968 master-0 
kubenswrapper[36504]: I1203 22:31:33.321987 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-scripts\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.322388 master-0 kubenswrapper[36504]: I1203 22:31:33.322014 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-amphora-certs\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.323044 master-0 kubenswrapper[36504]: I1203 22:31:33.322992 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-config-data-merged\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.323266 master-0 kubenswrapper[36504]: I1203 22:31:33.323228 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-config-data\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.323344 master-0 kubenswrapper[36504]: I1203 22:31:33.323312 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-hm-ports\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.323915 master-0 kubenswrapper[36504]: I1203 22:31:33.323853 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-config-data-merged\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.325035 master-0 kubenswrapper[36504]: I1203 22:31:33.324986 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-hm-ports\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.327552 master-0 kubenswrapper[36504]: I1203 22:31:33.327466 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-combined-ca-bundle\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.330743 master-0 kubenswrapper[36504]: I1203 22:31:33.330690 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-amphora-certs\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.334726 master-0 kubenswrapper[36504]: I1203 
22:31:33.334308 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-config-data\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.353138 master-0 kubenswrapper[36504]: I1203 22:31:33.353036 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3-scripts\") pod \"octavia-housekeeping-2vjcj\" (UID: \"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3\") " pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:33.397560 master-0 kubenswrapper[36504]: I1203 22:31:33.397482 36504 generic.go:334] "Generic (PLEG): container finished" podID="c634dafd-5f2a-4e31-9a87-bd6ea37498ea" containerID="53d0f956098bc5790a273d15ec65ebdafd1e84d40339810c8809b77e0b9e06e8" exitCode=0 Dec 03 22:31:33.397862 master-0 kubenswrapper[36504]: I1203 22:31:33.397576 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-os-edpm-87gbb" event={"ID":"c634dafd-5f2a-4e31-9a87-bd6ea37498ea","Type":"ContainerDied","Data":"53d0f956098bc5790a273d15ec65ebdafd1e84d40339810c8809b77e0b9e06e8"} Dec 03 22:31:33.403134 master-0 kubenswrapper[36504]: I1203 22:31:33.403071 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-74hkl" event={"ID":"2d9f6b72-51f1-436d-8836-2745d82faddc","Type":"ContainerStarted","Data":"35a097ec782b398b7ec186d1440698723bd96145f883ec933d7c8bf711c6f122"} Dec 03 22:31:33.451152 master-0 kubenswrapper[36504]: I1203 22:31:33.451084 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:34.055126 master-0 kubenswrapper[36504]: I1203 22:31:34.055049 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-2vjcj"] Dec 03 22:31:34.422237 master-0 kubenswrapper[36504]: I1203 22:31:34.422186 36504 generic.go:334] "Generic (PLEG): container finished" podID="2d9f6b72-51f1-436d-8836-2745d82faddc" containerID="35a097ec782b398b7ec186d1440698723bd96145f883ec933d7c8bf711c6f122" exitCode=0 Dec 03 22:31:34.422951 master-0 kubenswrapper[36504]: I1203 22:31:34.422927 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-74hkl" event={"ID":"2d9f6b72-51f1-436d-8836-2745d82faddc","Type":"ContainerDied","Data":"35a097ec782b398b7ec186d1440698723bd96145f883ec933d7c8bf711c6f122"} Dec 03 22:31:34.429432 master-0 kubenswrapper[36504]: I1203 22:31:34.429335 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-2vjcj" event={"ID":"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3","Type":"ContainerStarted","Data":"307b099af6d15f0b7df2719d64edfae60ee14fa11135d823f676f178b759cddc"} Dec 03 22:31:34.722276 master-0 kubenswrapper[36504]: I1203 22:31:34.722180 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-js9sr"] Dec 03 22:31:34.727751 master-0 kubenswrapper[36504]: I1203 22:31:34.727698 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.730876 master-0 kubenswrapper[36504]: I1203 22:31:34.730815 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Dec 03 22:31:34.732314 master-0 kubenswrapper[36504]: I1203 22:31:34.731200 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Dec 03 22:31:34.767130 master-0 kubenswrapper[36504]: I1203 22:31:34.767040 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-js9sr"] Dec 03 22:31:34.806444 master-0 kubenswrapper[36504]: I1203 22:31:34.806368 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-scripts\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.807808 master-0 kubenswrapper[36504]: I1203 22:31:34.806490 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-config-data\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.807808 master-0 kubenswrapper[36504]: I1203 22:31:34.806650 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-combined-ca-bundle\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.807808 master-0 kubenswrapper[36504]: I1203 22:31:34.806712 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-config-data-merged\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.807808 master-0 kubenswrapper[36504]: I1203 22:31:34.807234 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-amphora-certs\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.807808 master-0 kubenswrapper[36504]: I1203 22:31:34.807410 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-hm-ports\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.931095 master-0 kubenswrapper[36504]: I1203 22:31:34.930260 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-amphora-certs\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.931095 master-0 kubenswrapper[36504]: I1203 22:31:34.930391 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"hm-ports\" (UniqueName: \"kubernetes.io/configmap/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-hm-ports\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.931095 master-0 kubenswrapper[36504]: I1203 22:31:34.930528 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-scripts\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.931095 master-0 kubenswrapper[36504]: I1203 22:31:34.930623 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-config-data\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.931095 master-0 kubenswrapper[36504]: I1203 22:31:34.930788 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-combined-ca-bundle\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.931095 master-0 kubenswrapper[36504]: I1203 22:31:34.930863 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-config-data-merged\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.932334 master-0 kubenswrapper[36504]: I1203 22:31:34.932287 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-hm-ports\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.932989 master-0 kubenswrapper[36504]: I1203 22:31:34.932960 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-config-data-merged\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.936958 master-0 kubenswrapper[36504]: I1203 22:31:34.936601 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-combined-ca-bundle\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.938065 master-0 kubenswrapper[36504]: I1203 22:31:34.938025 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-amphora-certs\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.940482 master-0 kubenswrapper[36504]: I1203 22:31:34.940221 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-scripts\") pod \"octavia-worker-js9sr\" (UID: 
\"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:34.941113 master-0 kubenswrapper[36504]: I1203 22:31:34.941059 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95de3c6d-6ec9-4ec8-8285-6cb03f8f992a-config-data\") pod \"octavia-worker-js9sr\" (UID: \"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a\") " pod="openstack/octavia-worker-js9sr" Dec 03 22:31:35.066973 master-0 kubenswrapper[36504]: I1203 22:31:35.063927 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-js9sr" Dec 03 22:31:35.220690 master-0 kubenswrapper[36504]: I1203 22:31:35.220627 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:35.248732 master-0 kubenswrapper[36504]: I1203 22:31:35.248632 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-ssh-key\") pod \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " Dec 03 22:31:35.249108 master-0 kubenswrapper[36504]: I1203 22:31:35.249038 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-inventory\") pod \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " Dec 03 22:31:35.250084 master-0 kubenswrapper[36504]: I1203 22:31:35.249487 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s49p\" (UniqueName: \"kubernetes.io/projected/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-kube-api-access-2s49p\") pod \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\" (UID: \"c634dafd-5f2a-4e31-9a87-bd6ea37498ea\") " Dec 03 22:31:35.258726 master-0 kubenswrapper[36504]: I1203 22:31:35.258648 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-kube-api-access-2s49p" (OuterVolumeSpecName: "kube-api-access-2s49p") pod "c634dafd-5f2a-4e31-9a87-bd6ea37498ea" (UID: "c634dafd-5f2a-4e31-9a87-bd6ea37498ea"). InnerVolumeSpecName "kube-api-access-2s49p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:31:35.335905 master-0 kubenswrapper[36504]: I1203 22:31:35.334571 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-inventory" (OuterVolumeSpecName: "inventory") pod "c634dafd-5f2a-4e31-9a87-bd6ea37498ea" (UID: "c634dafd-5f2a-4e31-9a87-bd6ea37498ea"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:35.356864 master-0 kubenswrapper[36504]: I1203 22:31:35.356227 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:31:35.356864 master-0 kubenswrapper[36504]: I1203 22:31:35.356281 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s49p\" (UniqueName: \"kubernetes.io/projected/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-kube-api-access-2s49p\") on node \"master-0\" DevicePath \"\"" Dec 03 22:31:35.377920 master-0 kubenswrapper[36504]: I1203 22:31:35.376216 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c634dafd-5f2a-4e31-9a87-bd6ea37498ea" (UID: "c634dafd-5f2a-4e31-9a87-bd6ea37498ea"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:31:35.468566 master-0 kubenswrapper[36504]: I1203 22:31:35.468071 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c634dafd-5f2a-4e31-9a87-bd6ea37498ea-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:31:35.535996 master-0 kubenswrapper[36504]: I1203 22:31:35.534354 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-74hkl" event={"ID":"2d9f6b72-51f1-436d-8836-2745d82faddc","Type":"ContainerStarted","Data":"cf85f4b09c5797dab49e972d0b0aa10b8eb4f22332b57ac53baf0d84a6e6fb7f"} Dec 03 22:31:35.535996 master-0 kubenswrapper[36504]: I1203 22:31:35.535650 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:35.559500 master-0 kubenswrapper[36504]: I1203 22:31:35.559436 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-dataplane-os-edpm-87gbb" event={"ID":"c634dafd-5f2a-4e31-9a87-bd6ea37498ea","Type":"ContainerDied","Data":"c2b6bd72ed8f8f48f7044040e4c23be3275fe430401f6d888b808e43dc1f7cdb"} Dec 03 22:31:35.559500 master-0 kubenswrapper[36504]: I1203 22:31:35.559500 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b6bd72ed8f8f48f7044040e4c23be3275fe430401f6d888b808e43dc1f7cdb" Dec 03 22:31:35.559969 master-0 kubenswrapper[36504]: I1203 22:31:35.559584 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-dataplane-os-edpm-87gbb" Dec 03 22:31:35.595831 master-0 kubenswrapper[36504]: I1203 22:31:35.595718 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-dataplane-os-edpm-9fg6m"] Dec 03 22:31:35.598263 master-0 kubenswrapper[36504]: E1203 22:31:35.598204 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c634dafd-5f2a-4e31-9a87-bd6ea37498ea" containerName="validate-network-dataplane-os-edpm" Dec 03 22:31:35.598263 master-0 kubenswrapper[36504]: I1203 22:31:35.598262 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c634dafd-5f2a-4e31-9a87-bd6ea37498ea" containerName="validate-network-dataplane-os-edpm" Dec 03 22:31:35.598978 master-0 kubenswrapper[36504]: I1203 22:31:35.598947 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="c634dafd-5f2a-4e31-9a87-bd6ea37498ea" containerName="validate-network-dataplane-os-edpm" Dec 03 22:31:35.601360 master-0 kubenswrapper[36504]: I1203 22:31:35.601327 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:35.604186 master-0 kubenswrapper[36504]: I1203 22:31:35.604146 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:31:35.604432 master-0 kubenswrapper[36504]: I1203 22:31:35.604379 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:31:35.612004 master-0 kubenswrapper[36504]: I1203 22:31:35.611872 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:31:35.766896 master-0 kubenswrapper[36504]: I1203 22:31:35.760893 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-dataplane-os-edpm-9fg6m"] Dec 03 22:31:35.780211 master-0 kubenswrapper[36504]: I1203 22:31:35.780041 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-ssh-key\") pod \"install-os-dataplane-os-edpm-9fg6m\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:35.780211 master-0 kubenswrapper[36504]: I1203 22:31:35.780158 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s6jh\" (UniqueName: \"kubernetes.io/projected/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-kube-api-access-9s6jh\") pod \"install-os-dataplane-os-edpm-9fg6m\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:35.780613 master-0 kubenswrapper[36504]: I1203 22:31:35.780351 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-inventory\") pod \"install-os-dataplane-os-edpm-9fg6m\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:35.838147 master-0 kubenswrapper[36504]: I1203 22:31:35.826614 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-74hkl" podStartSLOduration=5.826582062 podStartE2EDuration="5.826582062s" podCreationTimestamp="2025-12-03 22:31:30 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:31:35.588330304 +0000 UTC m=+1260.808102321" watchObservedRunningTime="2025-12-03 22:31:35.826582062 +0000 UTC m=+1261.046354069" Dec 03 22:31:35.920019 master-0 kubenswrapper[36504]: I1203 22:31:35.918047 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-ssh-key\") pod \"install-os-dataplane-os-edpm-9fg6m\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:35.920019 master-0 kubenswrapper[36504]: I1203 22:31:35.918365 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s6jh\" (UniqueName: \"kubernetes.io/projected/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-kube-api-access-9s6jh\") pod \"install-os-dataplane-os-edpm-9fg6m\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:35.942804 master-0 kubenswrapper[36504]: I1203 22:31:35.927190 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-inventory\") pod \"install-os-dataplane-os-edpm-9fg6m\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:35.970833 master-0 kubenswrapper[36504]: I1203 22:31:35.966188 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-ssh-key\") pod \"install-os-dataplane-os-edpm-9fg6m\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:35.986817 master-0 kubenswrapper[36504]: I1203 22:31:35.980040 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s6jh\" (UniqueName: \"kubernetes.io/projected/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-kube-api-access-9s6jh\") pod \"install-os-dataplane-os-edpm-9fg6m\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:35.986817 master-0 kubenswrapper[36504]: I1203 22:31:35.980703 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-inventory\") pod \"install-os-dataplane-os-edpm-9fg6m\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:36.029543 master-0 kubenswrapper[36504]: E1203 22:31:36.029399 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:31:36.055226 master-0 kubenswrapper[36504]: I1203 22:31:36.055147 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:31:36.165926 master-0 kubenswrapper[36504]: I1203 22:31:36.165596 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-js9sr"] Dec 03 22:31:36.429874 master-0 kubenswrapper[36504]: W1203 22:31:36.429278 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95de3c6d_6ec9_4ec8_8285_6cb03f8f992a.slice/crio-bb2146e456c6663dde9612904d16c3983066cfdcaf8a922c3eb40f09c37a918e WatchSource:0}: Error finding container bb2146e456c6663dde9612904d16c3983066cfdcaf8a922c3eb40f09c37a918e: Status 404 returned error can't find the container with id bb2146e456c6663dde9612904d16c3983066cfdcaf8a922c3eb40f09c37a918e Dec 03 22:31:36.586181 master-0 kubenswrapper[36504]: I1203 22:31:36.585596 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-js9sr" event={"ID":"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a","Type":"ContainerStarted","Data":"bb2146e456c6663dde9612904d16c3983066cfdcaf8a922c3eb40f09c37a918e"} Dec 03 22:31:38.624572 master-0 kubenswrapper[36504]: I1203 22:31:38.624470 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-2vjcj" event={"ID":"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3","Type":"ContainerStarted","Data":"679396699f5e64176ea6e8fd8f90e93e240104f32d6a56c8bed1b6a40c40ba46"} Dec 03 22:31:39.384402 master-0 kubenswrapper[36504]: I1203 22:31:39.381348 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-dataplane-os-edpm-9fg6m"] Dec 03 22:31:40.682936 master-0 kubenswrapper[36504]: I1203 22:31:40.682561 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-os-edpm-9fg6m" event={"ID":"839db6a2-6896-4e0f-b3ee-b0f54800ecd5","Type":"ContainerStarted","Data":"c74e7681bea7ec2df9440b7769bf8992451d846b04123fc17b99f9115fe1b210"} Dec 03 22:31:40.684713 master-0 kubenswrapper[36504]: I1203 22:31:40.684414 36504 generic.go:334] "Generic (PLEG): container finished" podID="70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3" containerID="679396699f5e64176ea6e8fd8f90e93e240104f32d6a56c8bed1b6a40c40ba46" exitCode=0 Dec 03 22:31:40.684713 master-0 kubenswrapper[36504]: I1203 22:31:40.684461 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-2vjcj" event={"ID":"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3","Type":"ContainerDied","Data":"679396699f5e64176ea6e8fd8f90e93e240104f32d6a56c8bed1b6a40c40ba46"} Dec 03 22:31:41.702136 master-0 kubenswrapper[36504]: I1203 22:31:41.702045 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-os-edpm-9fg6m" event={"ID":"839db6a2-6896-4e0f-b3ee-b0f54800ecd5","Type":"ContainerStarted","Data":"fec40eb67dbc0afe4f8e31cd28ac0f3db4c8b75308097d88e33b9a5d2cf9b2ea"} Dec 03 22:31:41.705394 master-0 kubenswrapper[36504]: I1203 22:31:41.705340 36504 generic.go:334] "Generic (PLEG): container finished" podID="95de3c6d-6ec9-4ec8-8285-6cb03f8f992a" containerID="7678b9b54911a4af5941a49aa975de035e207ad6504259935c9c571958d434e3" exitCode=0 Dec 03 22:31:41.706425 master-0 kubenswrapper[36504]: I1203 22:31:41.705448 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-js9sr" event={"ID":"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a","Type":"ContainerDied","Data":"7678b9b54911a4af5941a49aa975de035e207ad6504259935c9c571958d434e3"} Dec 03 22:31:41.708662 master-0 kubenswrapper[36504]: I1203 
22:31:41.708557 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-2vjcj" event={"ID":"70c31cf5-b2e2-40c5-a0b3-31820fe7a3a3","Type":"ContainerStarted","Data":"c46d3ea20da1ec6ac610d0372d54be6142e3be162e2ec6b964d78939e9cb2029"} Dec 03 22:31:41.709049 master-0 kubenswrapper[36504]: I1203 22:31:41.709016 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:41.758737 master-0 kubenswrapper[36504]: I1203 22:31:41.758619 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-dataplane-os-edpm-9fg6m" podStartSLOduration=6.333872448 podStartE2EDuration="6.758590666s" podCreationTimestamp="2025-12-03 22:31:35 +0000 UTC" firstStartedPulling="2025-12-03 22:31:39.946846732 +0000 UTC m=+1265.166618739" lastFinishedPulling="2025-12-03 22:31:40.37156494 +0000 UTC m=+1265.591336957" observedRunningTime="2025-12-03 22:31:41.723496348 +0000 UTC m=+1266.943268425" watchObservedRunningTime="2025-12-03 22:31:41.758590666 +0000 UTC m=+1266.978362673" Dec 03 22:31:41.775968 master-0 kubenswrapper[36504]: I1203 22:31:41.775856 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-2vjcj" podStartSLOduration=6.360378643 podStartE2EDuration="8.775828266s" podCreationTimestamp="2025-12-03 22:31:33 +0000 UTC" firstStartedPulling="2025-12-03 22:31:34.078946904 +0000 UTC m=+1259.298718911" lastFinishedPulling="2025-12-03 22:31:36.494396527 +0000 UTC m=+1261.714168534" observedRunningTime="2025-12-03 22:31:41.75202018 +0000 UTC m=+1266.971792187" watchObservedRunningTime="2025-12-03 22:31:41.775828266 +0000 UTC m=+1266.995600273" Dec 03 22:31:42.165173 master-0 kubenswrapper[36504]: I1203 22:31:42.164812 36504 scope.go:117] "RemoveContainer" containerID="3a0cd1e926d33bb474ede50ac54ad7dac595519714f4608d71bc2d26bdc7afa9" Dec 03 22:31:42.221286 master-0 kubenswrapper[36504]: I1203 22:31:42.221208 36504 scope.go:117] "RemoveContainer" containerID="243fb3e7052bb47c1d450eafc9fdd3cc50be6dd1c068b3d430b5cfc2cf787cf2" Dec 03 22:31:42.335889 master-0 kubenswrapper[36504]: I1203 22:31:42.335835 36504 scope.go:117] "RemoveContainer" containerID="1134b64cbd728364a1ecbcc5af5ff8ad9874f65a096416a5ff243337bacd7f54" Dec 03 22:31:42.747512 master-0 kubenswrapper[36504]: I1203 22:31:42.747175 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-js9sr" event={"ID":"95de3c6d-6ec9-4ec8-8285-6cb03f8f992a","Type":"ContainerStarted","Data":"69ddcd62e79cd421621d8f9374139154d4176a26348a3754e84f12f7f06f0171"} Dec 03 22:31:42.748317 master-0 kubenswrapper[36504]: I1203 22:31:42.748269 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-js9sr" Dec 03 22:31:42.787263 master-0 kubenswrapper[36504]: I1203 22:31:42.776172 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-js9sr" podStartSLOduration=5.292326795 podStartE2EDuration="8.776152207s" podCreationTimestamp="2025-12-03 22:31:34 +0000 UTC" firstStartedPulling="2025-12-03 22:31:36.469330634 +0000 UTC m=+1261.689102641" lastFinishedPulling="2025-12-03 22:31:39.953156046 +0000 UTC m=+1265.172928053" observedRunningTime="2025-12-03 22:31:42.773011888 +0000 UTC m=+1267.992783895" watchObservedRunningTime="2025-12-03 22:31:42.776152207 +0000 UTC m=+1267.995924214" Dec 03 22:31:46.355303 master-0 kubenswrapper[36504]: I1203 22:31:46.355235 36504 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-74hkl" Dec 03 22:31:48.502820 master-0 kubenswrapper[36504]: I1203 22:31:48.502669 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-2vjcj" Dec 03 22:31:50.104631 master-0 kubenswrapper[36504]: I1203 22:31:50.104556 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-js9sr" Dec 03 22:32:19.097033 master-0 kubenswrapper[36504]: I1203 22:32:19.096947 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:32:28.096264 master-0 kubenswrapper[36504]: I1203 22:32:28.096095 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:32:35.726112 master-0 kubenswrapper[36504]: E1203 22:32:35.726033 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:32:42.503480 master-0 kubenswrapper[36504]: I1203 22:32:42.503405 36504 scope.go:117] "RemoveContainer" containerID="979a6f05ddb40cd49bdfca416b102462f2afe9382b6f0e57dc84fd3f58196ff9" Dec 03 22:32:51.897930 master-0 kubenswrapper[36504]: I1203 22:32:51.897848 36504 generic.go:334] "Generic (PLEG): container finished" podID="839db6a2-6896-4e0f-b3ee-b0f54800ecd5" containerID="fec40eb67dbc0afe4f8e31cd28ac0f3db4c8b75308097d88e33b9a5d2cf9b2ea" exitCode=0 Dec 03 22:32:51.898843 master-0 kubenswrapper[36504]: I1203 22:32:51.897941 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-os-edpm-9fg6m" event={"ID":"839db6a2-6896-4e0f-b3ee-b0f54800ecd5","Type":"ContainerDied","Data":"fec40eb67dbc0afe4f8e31cd28ac0f3db4c8b75308097d88e33b9a5d2cf9b2ea"} Dec 03 22:32:53.538524 master-0 kubenswrapper[36504]: I1203 22:32:53.538450 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:32:53.657096 master-0 kubenswrapper[36504]: I1203 22:32:53.656916 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-inventory\") pod \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " Dec 03 22:32:53.657424 master-0 kubenswrapper[36504]: I1203 22:32:53.657294 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s6jh\" (UniqueName: \"kubernetes.io/projected/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-kube-api-access-9s6jh\") pod \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " Dec 03 22:32:53.657540 master-0 kubenswrapper[36504]: I1203 22:32:53.657499 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-ssh-key\") pod \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\" (UID: \"839db6a2-6896-4e0f-b3ee-b0f54800ecd5\") " Dec 03 22:32:53.663498 master-0 kubenswrapper[36504]: I1203 22:32:53.663363 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-kube-api-access-9s6jh" (OuterVolumeSpecName: "kube-api-access-9s6jh") pod "839db6a2-6896-4e0f-b3ee-b0f54800ecd5" (UID: "839db6a2-6896-4e0f-b3ee-b0f54800ecd5"). InnerVolumeSpecName "kube-api-access-9s6jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:32:53.698463 master-0 kubenswrapper[36504]: I1203 22:32:53.698329 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "839db6a2-6896-4e0f-b3ee-b0f54800ecd5" (UID: "839db6a2-6896-4e0f-b3ee-b0f54800ecd5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:32:53.723313 master-0 kubenswrapper[36504]: I1203 22:32:53.723221 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-inventory" (OuterVolumeSpecName: "inventory") pod "839db6a2-6896-4e0f-b3ee-b0f54800ecd5" (UID: "839db6a2-6896-4e0f-b3ee-b0f54800ecd5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:32:53.762192 master-0 kubenswrapper[36504]: I1203 22:32:53.762103 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:32:53.762192 master-0 kubenswrapper[36504]: I1203 22:32:53.762136 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:32:53.762192 master-0 kubenswrapper[36504]: I1203 22:32:53.762147 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s6jh\" (UniqueName: \"kubernetes.io/projected/839db6a2-6896-4e0f-b3ee-b0f54800ecd5-kube-api-access-9s6jh\") on node \"master-0\" DevicePath \"\"" Dec 03 22:32:53.930954 master-0 kubenswrapper[36504]: I1203 22:32:53.930591 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-dataplane-os-edpm-9fg6m" event={"ID":"839db6a2-6896-4e0f-b3ee-b0f54800ecd5","Type":"ContainerDied","Data":"c74e7681bea7ec2df9440b7769bf8992451d846b04123fc17b99f9115fe1b210"} Dec 03 22:32:53.930954 master-0 kubenswrapper[36504]: I1203 22:32:53.930654 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c74e7681bea7ec2df9440b7769bf8992451d846b04123fc17b99f9115fe1b210" Dec 03 22:32:53.930954 master-0 kubenswrapper[36504]: I1203 22:32:53.930794 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-dataplane-os-edpm-9fg6m" Dec 03 22:32:54.154087 master-0 kubenswrapper[36504]: I1203 22:32:54.154003 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-dataplane-os-edpm-gkcmv"] Dec 03 22:32:54.155087 master-0 kubenswrapper[36504]: E1203 22:32:54.155045 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839db6a2-6896-4e0f-b3ee-b0f54800ecd5" containerName="install-os-dataplane-os-edpm" Dec 03 22:32:54.155087 master-0 kubenswrapper[36504]: I1203 22:32:54.155075 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="839db6a2-6896-4e0f-b3ee-b0f54800ecd5" containerName="install-os-dataplane-os-edpm" Dec 03 22:32:54.155604 master-0 kubenswrapper[36504]: I1203 22:32:54.155565 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="839db6a2-6896-4e0f-b3ee-b0f54800ecd5" containerName="install-os-dataplane-os-edpm" Dec 03 22:32:54.156873 master-0 kubenswrapper[36504]: I1203 22:32:54.156838 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.161103 master-0 kubenswrapper[36504]: I1203 22:32:54.161047 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:32:54.161219 master-0 kubenswrapper[36504]: I1203 22:32:54.161145 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:32:54.161420 master-0 kubenswrapper[36504]: I1203 22:32:54.161375 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:32:54.172297 master-0 kubenswrapper[36504]: I1203 22:32:54.172226 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-dataplane-os-edpm-gkcmv"] Dec 03 22:32:54.287570 master-0 kubenswrapper[36504]: I1203 22:32:54.287509 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-inventory\") pod \"configure-os-dataplane-os-edpm-gkcmv\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.287914 master-0 kubenswrapper[36504]: I1203 22:32:54.287736 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-ssh-key\") pod \"configure-os-dataplane-os-edpm-gkcmv\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.287914 master-0 kubenswrapper[36504]: I1203 22:32:54.287811 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xppwf\" (UniqueName: \"kubernetes.io/projected/0a5b11be-af24-4a43-b945-9417d6f48f3a-kube-api-access-xppwf\") pod \"configure-os-dataplane-os-edpm-gkcmv\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.391590 master-0 kubenswrapper[36504]: I1203 22:32:54.391511 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-ssh-key\") pod \"configure-os-dataplane-os-edpm-gkcmv\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.391887 master-0 kubenswrapper[36504]: I1203 22:32:54.391609 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xppwf\" (UniqueName: \"kubernetes.io/projected/0a5b11be-af24-4a43-b945-9417d6f48f3a-kube-api-access-xppwf\") pod \"configure-os-dataplane-os-edpm-gkcmv\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.391887 master-0 kubenswrapper[36504]: I1203 22:32:54.391793 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-inventory\") pod \"configure-os-dataplane-os-edpm-gkcmv\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.399096 master-0 kubenswrapper[36504]: I1203 22:32:54.396155 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-inventory\") pod \"configure-os-dataplane-os-edpm-gkcmv\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.401975 master-0 kubenswrapper[36504]: I1203 22:32:54.401924 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-ssh-key\") pod \"configure-os-dataplane-os-edpm-gkcmv\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.418584 master-0 kubenswrapper[36504]: I1203 22:32:54.418506 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xppwf\" (UniqueName: \"kubernetes.io/projected/0a5b11be-af24-4a43-b945-9417d6f48f3a-kube-api-access-xppwf\") pod \"configure-os-dataplane-os-edpm-gkcmv\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:54.523930 master-0 kubenswrapper[36504]: I1203 22:32:54.523733 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:32:55.162585 master-0 kubenswrapper[36504]: W1203 22:32:55.162491 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a5b11be_af24_4a43_b945_9417d6f48f3a.slice/crio-7219a8a2b6b2efa99f497cd67bf63204584945839af4ef9d07c5455f1c12db60 WatchSource:0}: Error finding container 7219a8a2b6b2efa99f497cd67bf63204584945839af4ef9d07c5455f1c12db60: Status 404 returned error can't find the container with id 7219a8a2b6b2efa99f497cd67bf63204584945839af4ef9d07c5455f1c12db60 Dec 03 22:32:55.174201 master-0 kubenswrapper[36504]: I1203 22:32:55.173724 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-dataplane-os-edpm-gkcmv"] Dec 03 22:32:55.965484 master-0 kubenswrapper[36504]: I1203 22:32:55.965227 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-os-edpm-gkcmv" event={"ID":"0a5b11be-af24-4a43-b945-9417d6f48f3a","Type":"ContainerStarted","Data":"7219a8a2b6b2efa99f497cd67bf63204584945839af4ef9d07c5455f1c12db60"} Dec 03 22:32:56.987395 master-0 kubenswrapper[36504]: I1203 22:32:56.987266 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-os-edpm-gkcmv" event={"ID":"0a5b11be-af24-4a43-b945-9417d6f48f3a","Type":"ContainerStarted","Data":"e2a66a754886953091992d0bc63be125714bc5a35b5fe8bd902327494aa70553"} Dec 03 22:32:57.407488 master-0 kubenswrapper[36504]: I1203 22:32:57.407371 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-dataplane-os-edpm-gkcmv" podStartSLOduration=2.882787811 podStartE2EDuration="3.407338461s" podCreationTimestamp="2025-12-03 22:32:54 +0000 UTC" firstStartedPulling="2025-12-03 22:32:55.165998196 +0000 UTC m=+1340.385770203" lastFinishedPulling="2025-12-03 22:32:55.690548846 +0000 UTC m=+1340.910320853" observedRunningTime="2025-12-03 22:32:57.384362213 +0000 UTC m=+1342.604134250" watchObservedRunningTime="2025-12-03 22:32:57.407338461 +0000 UTC m=+1342.627110468" Dec 03 22:33:25.118512 master-0 kubenswrapper[36504]: I1203 22:33:25.118439 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:33:32.096357 master-0 kubenswrapper[36504]: I1203 22:33:32.096269 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:33:35.712164 master-0 kubenswrapper[36504]: E1203 22:33:35.712082 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:33:43.781809 master-0 kubenswrapper[36504]: I1203 22:33:43.781695 36504 generic.go:334] "Generic (PLEG): container finished" podID="0a5b11be-af24-4a43-b945-9417d6f48f3a" containerID="e2a66a754886953091992d0bc63be125714bc5a35b5fe8bd902327494aa70553" exitCode=0 Dec 03 22:33:43.781809 master-0 kubenswrapper[36504]: I1203 22:33:43.781792 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-os-edpm-gkcmv" event={"ID":"0a5b11be-af24-4a43-b945-9417d6f48f3a","Type":"ContainerDied","Data":"e2a66a754886953091992d0bc63be125714bc5a35b5fe8bd902327494aa70553"} Dec 03 22:33:45.371933 master-0 kubenswrapper[36504]: I1203 22:33:45.371881 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:33:45.447167 master-0 kubenswrapper[36504]: I1203 22:33:45.447114 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-inventory\") pod \"0a5b11be-af24-4a43-b945-9417d6f48f3a\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " Dec 03 22:33:45.447947 master-0 kubenswrapper[36504]: I1203 22:33:45.447923 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-ssh-key\") pod \"0a5b11be-af24-4a43-b945-9417d6f48f3a\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " Dec 03 22:33:45.448266 master-0 kubenswrapper[36504]: I1203 22:33:45.448252 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xppwf\" (UniqueName: \"kubernetes.io/projected/0a5b11be-af24-4a43-b945-9417d6f48f3a-kube-api-access-xppwf\") pod \"0a5b11be-af24-4a43-b945-9417d6f48f3a\" (UID: \"0a5b11be-af24-4a43-b945-9417d6f48f3a\") " Dec 03 22:33:45.453439 master-0 kubenswrapper[36504]: I1203 22:33:45.453413 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a5b11be-af24-4a43-b945-9417d6f48f3a-kube-api-access-xppwf" (OuterVolumeSpecName: "kube-api-access-xppwf") pod "0a5b11be-af24-4a43-b945-9417d6f48f3a" (UID: "0a5b11be-af24-4a43-b945-9417d6f48f3a"). InnerVolumeSpecName "kube-api-access-xppwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:33:45.481481 master-0 kubenswrapper[36504]: I1203 22:33:45.481373 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-inventory" (OuterVolumeSpecName: "inventory") pod "0a5b11be-af24-4a43-b945-9417d6f48f3a" (UID: "0a5b11be-af24-4a43-b945-9417d6f48f3a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:33:45.492951 master-0 kubenswrapper[36504]: I1203 22:33:45.492822 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0a5b11be-af24-4a43-b945-9417d6f48f3a" (UID: "0a5b11be-af24-4a43-b945-9417d6f48f3a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:33:45.553361 master-0 kubenswrapper[36504]: I1203 22:33:45.553287 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xppwf\" (UniqueName: \"kubernetes.io/projected/0a5b11be-af24-4a43-b945-9417d6f48f3a-kube-api-access-xppwf\") on node \"master-0\" DevicePath \"\"" Dec 03 22:33:45.553361 master-0 kubenswrapper[36504]: I1203 22:33:45.553343 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:33:45.553764 master-0 kubenswrapper[36504]: I1203 22:33:45.553399 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0a5b11be-af24-4a43-b945-9417d6f48f3a-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:33:45.831119 master-0 kubenswrapper[36504]: I1203 22:33:45.830938 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-dataplane-os-edpm-gkcmv" event={"ID":"0a5b11be-af24-4a43-b945-9417d6f48f3a","Type":"ContainerDied","Data":"7219a8a2b6b2efa99f497cd67bf63204584945839af4ef9d07c5455f1c12db60"} Dec 03 22:33:45.831415 master-0 kubenswrapper[36504]: I1203 22:33:45.831212 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7219a8a2b6b2efa99f497cd67bf63204584945839af4ef9d07c5455f1c12db60" Dec 03 22:33:45.831415 master-0 kubenswrapper[36504]: I1203 22:33:45.831321 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-dataplane-os-edpm-gkcmv" Dec 03 22:33:45.928583 master-0 kubenswrapper[36504]: I1203 22:33:45.928479 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-dataplane-os-6m7rs"] Dec 03 22:33:45.929308 master-0 kubenswrapper[36504]: E1203 22:33:45.929278 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a5b11be-af24-4a43-b945-9417d6f48f3a" containerName="configure-os-dataplane-os-edpm" Dec 03 22:33:45.929308 master-0 kubenswrapper[36504]: I1203 22:33:45.929303 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a5b11be-af24-4a43-b945-9417d6f48f3a" containerName="configure-os-dataplane-os-edpm" Dec 03 22:33:45.929656 master-0 kubenswrapper[36504]: I1203 22:33:45.929627 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a5b11be-af24-4a43-b945-9417d6f48f3a" containerName="configure-os-dataplane-os-edpm" Dec 03 22:33:45.931096 master-0 kubenswrapper[36504]: I1203 22:33:45.931064 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:45.944155 master-0 kubenswrapper[36504]: I1203 22:33:45.944104 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:33:45.944578 master-0 kubenswrapper[36504]: I1203 22:33:45.944355 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:33:45.944819 master-0 kubenswrapper[36504]: I1203 22:33:45.944428 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:33:45.967002 master-0 kubenswrapper[36504]: I1203 22:33:45.966922 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-edpm\" (UniqueName: \"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-ssh-key-edpm\") pod \"ssh-known-hosts-dataplane-os-6m7rs\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:45.967340 master-0 kubenswrapper[36504]: I1203 22:33:45.967050 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-dataplane-os-6m7rs"] Dec 03 22:33:45.967340 master-0 kubenswrapper[36504]: I1203 22:33:45.967244 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn524\" (UniqueName: \"kubernetes.io/projected/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-kube-api-access-nn524\") pod \"ssh-known-hosts-dataplane-os-6m7rs\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:45.967434 master-0 kubenswrapper[36504]: I1203 22:33:45.967364 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-inventory-0\") pod \"ssh-known-hosts-dataplane-os-6m7rs\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:46.071793 master-0 kubenswrapper[36504]: I1203 22:33:46.071711 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn524\" (UniqueName: \"kubernetes.io/projected/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-kube-api-access-nn524\") pod \"ssh-known-hosts-dataplane-os-6m7rs\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:46.073428 master-0 kubenswrapper[36504]: I1203 22:33:46.071871 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-inventory-0\") pod \"ssh-known-hosts-dataplane-os-6m7rs\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:46.073428 master-0 kubenswrapper[36504]: I1203 22:33:46.072045 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-edpm\" (UniqueName: \"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-ssh-key-edpm\") pod \"ssh-known-hosts-dataplane-os-6m7rs\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:46.078951 master-0 kubenswrapper[36504]: I1203 22:33:46.077173 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-edpm\" (UniqueName: 
\"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-ssh-key-edpm\") pod \"ssh-known-hosts-dataplane-os-6m7rs\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:46.078951 master-0 kubenswrapper[36504]: I1203 22:33:46.078791 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-inventory-0\") pod \"ssh-known-hosts-dataplane-os-6m7rs\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:46.094952 master-0 kubenswrapper[36504]: I1203 22:33:46.094808 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn524\" (UniqueName: \"kubernetes.io/projected/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-kube-api-access-nn524\") pod \"ssh-known-hosts-dataplane-os-6m7rs\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:46.289021 master-0 kubenswrapper[36504]: I1203 22:33:46.288909 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:46.900619 master-0 kubenswrapper[36504]: I1203 22:33:46.900541 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-dataplane-os-6m7rs"] Dec 03 22:33:47.873648 master-0 kubenswrapper[36504]: I1203 22:33:47.873568 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" event={"ID":"b3a696d2-bcea-4fb3-bffc-64b13f710a7a","Type":"ContainerStarted","Data":"730fa9bacebaf1b5c967bc6733b9e305d0d9fbe5a2d67493b62456fcb6a0feae"} Dec 03 22:33:47.873648 master-0 kubenswrapper[36504]: I1203 22:33:47.873654 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" event={"ID":"b3a696d2-bcea-4fb3-bffc-64b13f710a7a","Type":"ContainerStarted","Data":"35b7ddeb4c761b0ffe4f84aea7ff91c77d66fc7a64a12faf3f5f3badf6ef5c6b"} Dec 03 22:33:47.904714 master-0 kubenswrapper[36504]: I1203 22:33:47.904625 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" podStartSLOduration=2.3778815030000002 podStartE2EDuration="2.904600789s" podCreationTimestamp="2025-12-03 22:33:45 +0000 UTC" firstStartedPulling="2025-12-03 22:33:46.902559615 +0000 UTC m=+1392.122331622" lastFinishedPulling="2025-12-03 22:33:47.429278911 +0000 UTC m=+1392.649050908" observedRunningTime="2025-12-03 22:33:47.894213754 +0000 UTC m=+1393.113985771" watchObservedRunningTime="2025-12-03 22:33:47.904600789 +0000 UTC m=+1393.124372786" Dec 03 22:33:54.065409 master-0 kubenswrapper[36504]: I1203 22:33:54.065312 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-84f8-account-create-update-ds2dl"] Dec 03 22:33:54.083593 master-0 kubenswrapper[36504]: I1203 22:33:54.083519 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-84f8-account-create-update-ds2dl"] Dec 03 22:33:54.988170 master-0 kubenswrapper[36504]: I1203 22:33:54.988100 36504 generic.go:334] "Generic (PLEG): container finished" podID="b3a696d2-bcea-4fb3-bffc-64b13f710a7a" containerID="730fa9bacebaf1b5c967bc6733b9e305d0d9fbe5a2d67493b62456fcb6a0feae" exitCode=0 Dec 03 22:33:54.988495 master-0 kubenswrapper[36504]: I1203 22:33:54.988180 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" event={"ID":"b3a696d2-bcea-4fb3-bffc-64b13f710a7a","Type":"ContainerDied","Data":"730fa9bacebaf1b5c967bc6733b9e305d0d9fbe5a2d67493b62456fcb6a0feae"} Dec 03 22:33:55.113535 master-0 kubenswrapper[36504]: I1203 22:33:55.113453 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e8ecdd-ac48-4859-9bc3-a17949d9879c" path="/var/lib/kubelet/pods/d5e8ecdd-ac48-4859-9bc3-a17949d9879c/volumes" Dec 03 22:33:56.715749 master-0 kubenswrapper[36504]: I1203 22:33:56.715676 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:56.753823 master-0 kubenswrapper[36504]: I1203 22:33:56.752343 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-inventory-0\") pod \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " Dec 03 22:33:56.753823 master-0 kubenswrapper[36504]: I1203 22:33:56.752589 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-edpm\" (UniqueName: \"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-ssh-key-edpm\") pod \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " Dec 03 22:33:56.753823 master-0 kubenswrapper[36504]: I1203 22:33:56.752842 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn524\" (UniqueName: \"kubernetes.io/projected/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-kube-api-access-nn524\") pod \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\" (UID: \"b3a696d2-bcea-4fb3-bffc-64b13f710a7a\") " Dec 03 22:33:56.758949 master-0 kubenswrapper[36504]: I1203 22:33:56.758869 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-kube-api-access-nn524" (OuterVolumeSpecName: "kube-api-access-nn524") pod "b3a696d2-bcea-4fb3-bffc-64b13f710a7a" (UID: "b3a696d2-bcea-4fb3-bffc-64b13f710a7a"). InnerVolumeSpecName "kube-api-access-nn524". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:33:56.794181 master-0 kubenswrapper[36504]: I1203 22:33:56.792578 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "b3a696d2-bcea-4fb3-bffc-64b13f710a7a" (UID: "b3a696d2-bcea-4fb3-bffc-64b13f710a7a"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:33:56.794181 master-0 kubenswrapper[36504]: I1203 22:33:56.792678 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-ssh-key-edpm" (OuterVolumeSpecName: "ssh-key-edpm") pod "b3a696d2-bcea-4fb3-bffc-64b13f710a7a" (UID: "b3a696d2-bcea-4fb3-bffc-64b13f710a7a"). InnerVolumeSpecName "ssh-key-edpm". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:33:56.860357 master-0 kubenswrapper[36504]: I1203 22:33:56.860185 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key-edpm\" (UniqueName: \"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-ssh-key-edpm\") on node \"master-0\" DevicePath \"\"" Dec 03 22:33:56.860357 master-0 kubenswrapper[36504]: I1203 22:33:56.860263 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn524\" (UniqueName: \"kubernetes.io/projected/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-kube-api-access-nn524\") on node \"master-0\" DevicePath \"\"" Dec 03 22:33:56.860357 master-0 kubenswrapper[36504]: I1203 22:33:56.860317 36504 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/b3a696d2-bcea-4fb3-bffc-64b13f710a7a-inventory-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:33:57.022116 master-0 kubenswrapper[36504]: I1203 22:33:57.021926 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" Dec 03 22:33:57.022437 master-0 kubenswrapper[36504]: I1203 22:33:57.021928 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-dataplane-os-6m7rs" event={"ID":"b3a696d2-bcea-4fb3-bffc-64b13f710a7a","Type":"ContainerDied","Data":"35b7ddeb4c761b0ffe4f84aea7ff91c77d66fc7a64a12faf3f5f3badf6ef5c6b"} Dec 03 22:33:57.022437 master-0 kubenswrapper[36504]: I1203 22:33:57.022277 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35b7ddeb4c761b0ffe4f84aea7ff91c77d66fc7a64a12faf3f5f3badf6ef5c6b" Dec 03 22:33:57.220543 master-0 kubenswrapper[36504]: I1203 22:33:57.220414 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-dataplane-os-edpm-tlmrq"] Dec 03 22:33:57.221284 master-0 kubenswrapper[36504]: E1203 22:33:57.221263 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a696d2-bcea-4fb3-bffc-64b13f710a7a" containerName="ssh-known-hosts-dataplane-os" Dec 03 22:33:57.221284 master-0 kubenswrapper[36504]: I1203 22:33:57.221284 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a696d2-bcea-4fb3-bffc-64b13f710a7a" containerName="ssh-known-hosts-dataplane-os" Dec 03 22:33:57.221838 master-0 kubenswrapper[36504]: I1203 22:33:57.221716 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a696d2-bcea-4fb3-bffc-64b13f710a7a" containerName="ssh-known-hosts-dataplane-os" Dec 03 22:33:57.223033 master-0 kubenswrapper[36504]: I1203 22:33:57.223003 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.230273 master-0 kubenswrapper[36504]: I1203 22:33:57.230216 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:33:57.230554 master-0 kubenswrapper[36504]: I1203 22:33:57.230458 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:33:57.230694 master-0 kubenswrapper[36504]: I1203 22:33:57.230648 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:33:57.238571 master-0 kubenswrapper[36504]: I1203 22:33:57.238483 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-dataplane-os-edpm-tlmrq"] Dec 03 22:33:57.277876 master-0 kubenswrapper[36504]: I1203 22:33:57.277064 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-inventory\") pod \"run-os-dataplane-os-edpm-tlmrq\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.277876 master-0 kubenswrapper[36504]: I1203 22:33:57.277436 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-ssh-key\") pod \"run-os-dataplane-os-edpm-tlmrq\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.279265 master-0 kubenswrapper[36504]: I1203 22:33:57.278483 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grzn8\" (UniqueName: \"kubernetes.io/projected/5620656e-5c79-45cf-a58b-7d89ff171b94-kube-api-access-grzn8\") pod \"run-os-dataplane-os-edpm-tlmrq\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.382261 master-0 kubenswrapper[36504]: I1203 22:33:57.382176 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grzn8\" (UniqueName: \"kubernetes.io/projected/5620656e-5c79-45cf-a58b-7d89ff171b94-kube-api-access-grzn8\") pod \"run-os-dataplane-os-edpm-tlmrq\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.382567 master-0 kubenswrapper[36504]: I1203 22:33:57.382280 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-inventory\") pod \"run-os-dataplane-os-edpm-tlmrq\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.382567 master-0 kubenswrapper[36504]: I1203 22:33:57.382378 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-ssh-key\") pod \"run-os-dataplane-os-edpm-tlmrq\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.388551 master-0 kubenswrapper[36504]: I1203 22:33:57.388335 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-inventory\") pod 
\"run-os-dataplane-os-edpm-tlmrq\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.389283 master-0 kubenswrapper[36504]: I1203 22:33:57.389225 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-ssh-key\") pod \"run-os-dataplane-os-edpm-tlmrq\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.405605 master-0 kubenswrapper[36504]: I1203 22:33:57.403070 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grzn8\" (UniqueName: \"kubernetes.io/projected/5620656e-5c79-45cf-a58b-7d89ff171b94-kube-api-access-grzn8\") pod \"run-os-dataplane-os-edpm-tlmrq\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:57.548815 master-0 kubenswrapper[36504]: I1203 22:33:57.548662 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:33:58.062000 master-0 kubenswrapper[36504]: I1203 22:33:58.061927 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0ba6-account-create-update-shlcw"] Dec 03 22:33:58.090299 master-0 kubenswrapper[36504]: I1203 22:33:58.090192 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6lvls"] Dec 03 22:33:58.125565 master-0 kubenswrapper[36504]: I1203 22:33:58.125477 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-v6gf7"] Dec 03 22:33:58.142683 master-0 kubenswrapper[36504]: I1203 22:33:58.142607 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0ba6-account-create-update-shlcw"] Dec 03 22:33:58.167934 master-0 kubenswrapper[36504]: I1203 22:33:58.166910 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6lvls"] Dec 03 22:33:58.190862 master-0 kubenswrapper[36504]: I1203 22:33:58.190717 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-v6gf7"] Dec 03 22:33:58.207379 master-0 kubenswrapper[36504]: I1203 22:33:58.207323 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-dataplane-os-edpm-tlmrq"] Dec 03 22:33:59.117416 master-0 kubenswrapper[36504]: I1203 22:33:59.117222 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51e40216-8f57-45da-827f-a42c5fc3fb96" path="/var/lib/kubelet/pods/51e40216-8f57-45da-827f-a42c5fc3fb96/volumes" Dec 03 22:33:59.118782 master-0 kubenswrapper[36504]: I1203 22:33:59.118677 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afa046c9-0892-46ec-8325-4739744f5c72" path="/var/lib/kubelet/pods/afa046c9-0892-46ec-8325-4739744f5c72/volumes" Dec 03 22:33:59.120676 master-0 kubenswrapper[36504]: I1203 22:33:59.120641 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad6d333-f7eb-48b5-bce2-53d69d9b2ab8" path="/var/lib/kubelet/pods/bad6d333-f7eb-48b5-bce2-53d69d9b2ab8/volumes" Dec 03 22:33:59.122048 master-0 kubenswrapper[36504]: I1203 22:33:59.121822 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-os-edpm-tlmrq" event={"ID":"5620656e-5c79-45cf-a58b-7d89ff171b94","Type":"ContainerStarted","Data":"4f4199ec0a8001d77912c634c23d2470f1a7ca6ba64cc353f8cb98841939ec90"} Dec 03 22:33:59.122048 master-0 
kubenswrapper[36504]: I1203 22:33:59.121874 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-os-edpm-tlmrq" event={"ID":"5620656e-5c79-45cf-a58b-7d89ff171b94","Type":"ContainerStarted","Data":"4deb1fb72fb5a9b355d58ca61ce0a9b1a5a8ffbe00e4879ead6e45c189925800"} Dec 03 22:33:59.139475 master-0 kubenswrapper[36504]: I1203 22:33:59.139366 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-dataplane-os-edpm-tlmrq" podStartSLOduration=1.637853057 podStartE2EDuration="2.139341174s" podCreationTimestamp="2025-12-03 22:33:57 +0000 UTC" firstStartedPulling="2025-12-03 22:33:58.175116273 +0000 UTC m=+1403.394888280" lastFinishedPulling="2025-12-03 22:33:58.67660439 +0000 UTC m=+1403.896376397" observedRunningTime="2025-12-03 22:33:59.12514313 +0000 UTC m=+1404.344915157" watchObservedRunningTime="2025-12-03 22:33:59.139341174 +0000 UTC m=+1404.359113181" Dec 03 22:34:00.054141 master-0 kubenswrapper[36504]: I1203 22:34:00.054037 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jcc6q"] Dec 03 22:34:00.072128 master-0 kubenswrapper[36504]: I1203 22:34:00.072048 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jcc6q"] Dec 03 22:34:01.045210 master-0 kubenswrapper[36504]: I1203 22:34:01.045133 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2d6c-account-create-update-55xwh"] Dec 03 22:34:01.062662 master-0 kubenswrapper[36504]: I1203 22:34:01.062574 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2d6c-account-create-update-55xwh"] Dec 03 22:34:01.113931 master-0 kubenswrapper[36504]: I1203 22:34:01.113830 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92513a58-839a-4447-97a0-8291a5ba09c9" path="/var/lib/kubelet/pods/92513a58-839a-4447-97a0-8291a5ba09c9/volumes" Dec 03 22:34:01.115226 master-0 kubenswrapper[36504]: I1203 22:34:01.115171 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e27ac245-600a-4798-a749-bd7884a3fe95" path="/var/lib/kubelet/pods/e27ac245-600a-4798-a749-bd7884a3fe95/volumes" Dec 03 22:34:07.247595 master-0 kubenswrapper[36504]: I1203 22:34:07.247459 36504 generic.go:334] "Generic (PLEG): container finished" podID="5620656e-5c79-45cf-a58b-7d89ff171b94" containerID="4f4199ec0a8001d77912c634c23d2470f1a7ca6ba64cc353f8cb98841939ec90" exitCode=0 Dec 03 22:34:07.247595 master-0 kubenswrapper[36504]: I1203 22:34:07.247534 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-os-edpm-tlmrq" event={"ID":"5620656e-5c79-45cf-a58b-7d89ff171b94","Type":"ContainerDied","Data":"4f4199ec0a8001d77912c634c23d2470f1a7ca6ba64cc353f8cb98841939ec90"} Dec 03 22:34:08.987394 master-0 kubenswrapper[36504]: I1203 22:34:08.987337 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:34:09.109931 master-0 kubenswrapper[36504]: I1203 22:34:09.109688 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-ssh-key\") pod \"5620656e-5c79-45cf-a58b-7d89ff171b94\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " Dec 03 22:34:09.109931 master-0 kubenswrapper[36504]: I1203 22:34:09.109759 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grzn8\" (UniqueName: \"kubernetes.io/projected/5620656e-5c79-45cf-a58b-7d89ff171b94-kube-api-access-grzn8\") pod \"5620656e-5c79-45cf-a58b-7d89ff171b94\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " Dec 03 22:34:09.110263 master-0 kubenswrapper[36504]: I1203 22:34:09.109997 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-inventory\") pod \"5620656e-5c79-45cf-a58b-7d89ff171b94\" (UID: \"5620656e-5c79-45cf-a58b-7d89ff171b94\") " Dec 03 22:34:09.114141 master-0 kubenswrapper[36504]: I1203 22:34:09.114076 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5620656e-5c79-45cf-a58b-7d89ff171b94-kube-api-access-grzn8" (OuterVolumeSpecName: "kube-api-access-grzn8") pod "5620656e-5c79-45cf-a58b-7d89ff171b94" (UID: "5620656e-5c79-45cf-a58b-7d89ff171b94"). InnerVolumeSpecName "kube-api-access-grzn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:34:09.152187 master-0 kubenswrapper[36504]: I1203 22:34:09.152084 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-inventory" (OuterVolumeSpecName: "inventory") pod "5620656e-5c79-45cf-a58b-7d89ff171b94" (UID: "5620656e-5c79-45cf-a58b-7d89ff171b94"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:34:09.152519 master-0 kubenswrapper[36504]: I1203 22:34:09.152199 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5620656e-5c79-45cf-a58b-7d89ff171b94" (UID: "5620656e-5c79-45cf-a58b-7d89ff171b94"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:34:09.214233 master-0 kubenswrapper[36504]: I1203 22:34:09.214170 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:34:09.214233 master-0 kubenswrapper[36504]: I1203 22:34:09.214221 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5620656e-5c79-45cf-a58b-7d89ff171b94-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:34:09.214233 master-0 kubenswrapper[36504]: I1203 22:34:09.214236 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grzn8\" (UniqueName: \"kubernetes.io/projected/5620656e-5c79-45cf-a58b-7d89ff171b94-kube-api-access-grzn8\") on node \"master-0\" DevicePath \"\"" Dec 03 22:34:09.281988 master-0 kubenswrapper[36504]: I1203 22:34:09.281913 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-dataplane-os-edpm-tlmrq" event={"ID":"5620656e-5c79-45cf-a58b-7d89ff171b94","Type":"ContainerDied","Data":"4deb1fb72fb5a9b355d58ca61ce0a9b1a5a8ffbe00e4879ead6e45c189925800"} Dec 03 22:34:09.281988 master-0 kubenswrapper[36504]: I1203 22:34:09.281984 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4deb1fb72fb5a9b355d58ca61ce0a9b1a5a8ffbe00e4879ead6e45c189925800" Dec 03 22:34:09.282330 master-0 kubenswrapper[36504]: I1203 22:34:09.282058 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-dataplane-os-edpm-tlmrq" Dec 03 22:34:09.441793 master-0 kubenswrapper[36504]: I1203 22:34:09.435795 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-dataplane-os-edpm-9j8wn"] Dec 03 22:34:09.441793 master-0 kubenswrapper[36504]: E1203 22:34:09.436991 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5620656e-5c79-45cf-a58b-7d89ff171b94" containerName="run-os-dataplane-os-edpm" Dec 03 22:34:09.441793 master-0 kubenswrapper[36504]: I1203 22:34:09.437014 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="5620656e-5c79-45cf-a58b-7d89ff171b94" containerName="run-os-dataplane-os-edpm" Dec 03 22:34:09.441793 master-0 kubenswrapper[36504]: I1203 22:34:09.437453 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="5620656e-5c79-45cf-a58b-7d89ff171b94" containerName="run-os-dataplane-os-edpm" Dec 03 22:34:09.441793 master-0 kubenswrapper[36504]: I1203 22:34:09.438735 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.450797 master-0 kubenswrapper[36504]: I1203 22:34:09.448480 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:34:09.450797 master-0 kubenswrapper[36504]: I1203 22:34:09.449708 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:34:09.458931 master-0 kubenswrapper[36504]: I1203 22:34:09.455063 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:34:09.482780 master-0 kubenswrapper[36504]: I1203 22:34:09.482686 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-dataplane-os-edpm-9j8wn"] Dec 03 22:34:09.525047 master-0 kubenswrapper[36504]: I1203 22:34:09.524960 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-inventory\") pod \"reboot-os-dataplane-os-edpm-9j8wn\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.525518 master-0 kubenswrapper[36504]: I1203 22:34:09.525469 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5p7q\" (UniqueName: \"kubernetes.io/projected/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-kube-api-access-b5p7q\") pod \"reboot-os-dataplane-os-edpm-9j8wn\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.526610 master-0 kubenswrapper[36504]: I1203 22:34:09.526416 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-ssh-key\") pod \"reboot-os-dataplane-os-edpm-9j8wn\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.583583 master-0 kubenswrapper[36504]: E1203 22:34:09.583467 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5620656e_5c79_45cf_a58b_7d89ff171b94.slice/crio-4deb1fb72fb5a9b355d58ca61ce0a9b1a5a8ffbe00e4879ead6e45c189925800\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5620656e_5c79_45cf_a58b_7d89ff171b94.slice\": RecentStats: unable to find data in memory cache]" Dec 03 22:34:09.630150 master-0 kubenswrapper[36504]: I1203 22:34:09.629959 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-ssh-key\") pod \"reboot-os-dataplane-os-edpm-9j8wn\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.630562 master-0 kubenswrapper[36504]: I1203 22:34:09.630181 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-inventory\") pod \"reboot-os-dataplane-os-edpm-9j8wn\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.630562 master-0 kubenswrapper[36504]: I1203 
22:34:09.630287 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5p7q\" (UniqueName: \"kubernetes.io/projected/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-kube-api-access-b5p7q\") pod \"reboot-os-dataplane-os-edpm-9j8wn\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.636793 master-0 kubenswrapper[36504]: I1203 22:34:09.634648 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-inventory\") pod \"reboot-os-dataplane-os-edpm-9j8wn\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.636793 master-0 kubenswrapper[36504]: I1203 22:34:09.635191 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-ssh-key\") pod \"reboot-os-dataplane-os-edpm-9j8wn\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.655850 master-0 kubenswrapper[36504]: I1203 22:34:09.651542 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5p7q\" (UniqueName: \"kubernetes.io/projected/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-kube-api-access-b5p7q\") pod \"reboot-os-dataplane-os-edpm-9j8wn\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:09.767442 master-0 kubenswrapper[36504]: I1203 22:34:09.767254 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:34:10.446122 master-0 kubenswrapper[36504]: W1203 22:34:10.446031 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96d763b6_6b9f_426b_b2f6_b9e007b7e67f.slice/crio-27190ca2afb8983aa37eea0c138f9f6e050477ab51265fa55351bd00dc27fe8e WatchSource:0}: Error finding container 27190ca2afb8983aa37eea0c138f9f6e050477ab51265fa55351bd00dc27fe8e: Status 404 returned error can't find the container with id 27190ca2afb8983aa37eea0c138f9f6e050477ab51265fa55351bd00dc27fe8e Dec 03 22:34:10.452814 master-0 kubenswrapper[36504]: I1203 22:34:10.452736 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-dataplane-os-edpm-9j8wn"] Dec 03 22:34:11.317743 master-0 kubenswrapper[36504]: I1203 22:34:11.317647 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" event={"ID":"96d763b6-6b9f-426b-b2f6-b9e007b7e67f","Type":"ContainerStarted","Data":"27190ca2afb8983aa37eea0c138f9f6e050477ab51265fa55351bd00dc27fe8e"} Dec 03 22:34:12.357560 master-0 kubenswrapper[36504]: I1203 22:34:12.357471 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" event={"ID":"96d763b6-6b9f-426b-b2f6-b9e007b7e67f","Type":"ContainerStarted","Data":"5d0c9aad20567a2e7c1021e19c31de3d8e04d90ba5b01c8179feeb1b7861538a"} Dec 03 22:34:12.397592 master-0 kubenswrapper[36504]: I1203 22:34:12.397410 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" podStartSLOduration=2.895656256 podStartE2EDuration="3.39737944s" podCreationTimestamp="2025-12-03 22:34:09 +0000 UTC" firstStartedPulling="2025-12-03 22:34:10.449151319 +0000 UTC 
m=+1415.668923336" lastFinishedPulling="2025-12-03 22:34:10.950874513 +0000 UTC m=+1416.170646520" observedRunningTime="2025-12-03 22:34:12.385011423 +0000 UTC m=+1417.604783450" watchObservedRunningTime="2025-12-03 22:34:12.39737944 +0000 UTC m=+1417.617151447" Dec 03 22:34:33.095883 master-0 kubenswrapper[36504]: I1203 22:34:33.095792 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:34:35.079572 master-0 kubenswrapper[36504]: I1203 22:34:35.079428 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-113a-account-create-update-6xzl6"] Dec 03 22:34:35.116166 master-0 kubenswrapper[36504]: I1203 22:34:35.115608 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-113a-account-create-update-6xzl6"] Dec 03 22:34:35.719332 master-0 kubenswrapper[36504]: E1203 22:34:35.719248 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:34:37.114475 master-0 kubenswrapper[36504]: I1203 22:34:37.114404 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6914d837-9818-4960-aa61-a0ea34ae574b" path="/var/lib/kubelet/pods/6914d837-9818-4960-aa61-a0ea34ae574b/volumes" Dec 03 22:34:38.076444 master-0 kubenswrapper[36504]: I1203 22:34:38.076373 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d991-account-create-update-n4kt9"] Dec 03 22:34:38.097460 master-0 kubenswrapper[36504]: I1203 22:34:38.097393 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d991-account-create-update-n4kt9"] Dec 03 22:34:38.111453 master-0 kubenswrapper[36504]: I1203 22:34:38.111392 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-a73c-account-create-update-2m5sr"] Dec 03 22:34:38.125184 master-0 kubenswrapper[36504]: I1203 22:34:38.125116 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-8lq7d"] Dec 03 22:34:38.136733 master-0 kubenswrapper[36504]: I1203 22:34:38.136693 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8bwst"] Dec 03 22:34:38.149529 master-0 kubenswrapper[36504]: I1203 22:34:38.149500 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-sglc2"] Dec 03 22:34:38.161450 master-0 kubenswrapper[36504]: I1203 22:34:38.161418 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a73c-account-create-update-2m5sr"] Dec 03 22:34:38.175258 master-0 kubenswrapper[36504]: I1203 22:34:38.175206 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-66454"] Dec 03 22:34:38.188449 master-0 kubenswrapper[36504]: I1203 22:34:38.188387 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-8lq7d"] Dec 03 22:34:38.210982 master-0 kubenswrapper[36504]: I1203 22:34:38.210904 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8bwst"] Dec 03 22:34:38.225859 master-0 
kubenswrapper[36504]: I1203 22:34:38.225779 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-sglc2"] Dec 03 22:34:38.240932 master-0 kubenswrapper[36504]: I1203 22:34:38.240884 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-66454"] Dec 03 22:34:38.253934 master-0 kubenswrapper[36504]: I1203 22:34:38.253872 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0539-account-create-update-h6tq8"] Dec 03 22:34:38.265589 master-0 kubenswrapper[36504]: I1203 22:34:38.265504 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0539-account-create-update-h6tq8"] Dec 03 22:34:39.052894 master-0 kubenswrapper[36504]: I1203 22:34:39.052814 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-cj96j"] Dec 03 22:34:39.071301 master-0 kubenswrapper[36504]: I1203 22:34:39.071190 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-cj96j"] Dec 03 22:34:39.113258 master-0 kubenswrapper[36504]: I1203 22:34:39.112858 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="185d021a-be33-4c90-a8b6-ddd16d53b53d" path="/var/lib/kubelet/pods/185d021a-be33-4c90-a8b6-ddd16d53b53d/volumes" Dec 03 22:34:39.113829 master-0 kubenswrapper[36504]: I1203 22:34:39.113751 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c70d18-a17c-4ceb-a58a-6cb354489cc8" path="/var/lib/kubelet/pods/39c70d18-a17c-4ceb-a58a-6cb354489cc8/volumes" Dec 03 22:34:39.114544 master-0 kubenswrapper[36504]: I1203 22:34:39.114497 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55659879-0543-4282-abb3-d03ca6ecf2ce" path="/var/lib/kubelet/pods/55659879-0543-4282-abb3-d03ca6ecf2ce/volumes" Dec 03 22:34:39.115350 master-0 kubenswrapper[36504]: I1203 22:34:39.115305 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e04fc00-ca89-4a25-9bb4-2f59d9f14388" path="/var/lib/kubelet/pods/9e04fc00-ca89-4a25-9bb4-2f59d9f14388/volumes" Dec 03 22:34:39.117006 master-0 kubenswrapper[36504]: I1203 22:34:39.116902 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27383b2-a1e5-4e55-9709-136413c4c96a" path="/var/lib/kubelet/pods/a27383b2-a1e5-4e55-9709-136413c4c96a/volumes" Dec 03 22:34:39.117848 master-0 kubenswrapper[36504]: I1203 22:34:39.117809 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d98163-c65a-412b-a4c4-32ce8bff30a8" path="/var/lib/kubelet/pods/b0d98163-c65a-412b-a4c4-32ce8bff30a8/volumes" Dec 03 22:34:39.118576 master-0 kubenswrapper[36504]: I1203 22:34:39.118532 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbce4b8e-8c01-42bf-b475-ec56b388fa80" path="/var/lib/kubelet/pods/dbce4b8e-8c01-42bf-b475-ec56b388fa80/volumes" Dec 03 22:34:39.119886 master-0 kubenswrapper[36504]: I1203 22:34:39.119842 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10040d6-faaa-4181-b651-e26b54ae8a33" path="/var/lib/kubelet/pods/f10040d6-faaa-4181-b651-e26b54ae8a33/volumes" Dec 03 22:34:42.633763 master-0 kubenswrapper[36504]: I1203 22:34:42.633693 36504 scope.go:117] "RemoveContainer" containerID="97596bfea8c5ebd68ff6974cd790e0daa9d71633b399ed223de65410251a1448" Dec 03 22:34:42.669418 master-0 kubenswrapper[36504]: I1203 22:34:42.668956 36504 scope.go:117] "RemoveContainer" containerID="bcb1f1322002bce60ffab99d1283a433117edab66db6e25f19984b4bed5604ab" Dec 03 
22:34:42.759503 master-0 kubenswrapper[36504]: I1203 22:34:42.759444 36504 scope.go:117] "RemoveContainer" containerID="fe9df3e82d8605ffe93bfdab93d76cc8f20a10afb1ef1485946567c82cdb3209" Dec 03 22:34:42.801156 master-0 kubenswrapper[36504]: I1203 22:34:42.801082 36504 scope.go:117] "RemoveContainer" containerID="2e1195f7c14306b10e00e53907ae64b41ec55a6cfecbb723f31437fbc1228799" Dec 03 22:34:42.886444 master-0 kubenswrapper[36504]: I1203 22:34:42.886273 36504 scope.go:117] "RemoveContainer" containerID="38cb0943a7e5d3013c44a9450a646d8483eb910274e21b5318223c637bc23680" Dec 03 22:34:42.935970 master-0 kubenswrapper[36504]: I1203 22:34:42.935924 36504 scope.go:117] "RemoveContainer" containerID="24b2dbf2959087f1218c3c20db41bf10628f97b5c6560d2fc0ee9ed39d8c8944" Dec 03 22:34:42.972915 master-0 kubenswrapper[36504]: I1203 22:34:42.972718 36504 scope.go:117] "RemoveContainer" containerID="bc21d4349c616b40b28ce13c6bf39a7d9f4dd91b8968bbaab7c158228c652214" Dec 03 22:34:43.065003 master-0 kubenswrapper[36504]: I1203 22:34:43.064939 36504 scope.go:117] "RemoveContainer" containerID="06378f7a35baeb277d82156dd96ffe06496d39394eed645f16cf5f7848c9200e" Dec 03 22:34:43.143864 master-0 kubenswrapper[36504]: I1203 22:34:43.143797 36504 scope.go:117] "RemoveContainer" containerID="2ab2101c3564d8e46df5763baa7b94814c3cc8f0d1b338e27e85182a9d97a871" Dec 03 22:34:43.174611 master-0 kubenswrapper[36504]: I1203 22:34:43.174158 36504 scope.go:117] "RemoveContainer" containerID="77a5d364d521b3672d4e1b7973c8c11b8d59e9103eb2ce07e5e595bd0e9613eb" Dec 03 22:34:43.204305 master-0 kubenswrapper[36504]: I1203 22:34:43.204252 36504 scope.go:117] "RemoveContainer" containerID="66860b89689a005b87841659b3bc16c01f3a81e68efcd96c5f3f64340f611f25" Dec 03 22:34:43.243154 master-0 kubenswrapper[36504]: I1203 22:34:43.243106 36504 scope.go:117] "RemoveContainer" containerID="aec52144dd58623639f37392efde253da5e0a918f5f8e908536bbfe7320902af" Dec 03 22:34:43.274335 master-0 kubenswrapper[36504]: I1203 22:34:43.274286 36504 scope.go:117] "RemoveContainer" containerID="1240ac558d6fad0b18f98df9311dcfd6c5752f54920f5c1b317ddc02cbe0ca26" Dec 03 22:34:43.303703 master-0 kubenswrapper[36504]: I1203 22:34:43.303654 36504 scope.go:117] "RemoveContainer" containerID="3f7f81cf5e2c3ae297d2b11e646a0fe97de4ef4507232b9045ef00bc1f3f3453" Dec 03 22:34:43.333369 master-0 kubenswrapper[36504]: I1203 22:34:43.333316 36504 scope.go:117] "RemoveContainer" containerID="e38349a8e2c4760958e3f3aeb3a37d7dc01f25c0108f92de9d4f787600919d20" Dec 03 22:34:43.388209 master-0 kubenswrapper[36504]: I1203 22:34:43.388147 36504 scope.go:117] "RemoveContainer" containerID="834322f81c10cc9b200b6b0795cef57bcd622a98f1882bc1e3cc706b92e74906" Dec 03 22:34:43.416412 master-0 kubenswrapper[36504]: I1203 22:34:43.413400 36504 scope.go:117] "RemoveContainer" containerID="9d3b1c630b279b2e5edf98141096ff43c7e6717a6d38ed6ddb1d9a68ae164702" Dec 03 22:34:43.442476 master-0 kubenswrapper[36504]: I1203 22:34:43.442415 36504 scope.go:117] "RemoveContainer" containerID="7686b747fcda6baeebfab44a169cdaf949623061781dd9e89816ab06feee6aeb" Dec 03 22:34:43.479611 master-0 kubenswrapper[36504]: I1203 22:34:43.479563 36504 scope.go:117] "RemoveContainer" containerID="c0506e45f23e5b48d58dc5df76675eee7a757807270faeadfc27b651e298570b" Dec 03 22:34:43.547334 master-0 kubenswrapper[36504]: I1203 22:34:43.547156 36504 scope.go:117] "RemoveContainer" containerID="410682d4622016aedc9341f9ee150e7a656ba0a58d7a6865d5e7fac7cd751f8b" Dec 03 22:34:43.587263 master-0 kubenswrapper[36504]: I1203 22:34:43.587203 36504 
scope.go:117] "RemoveContainer" containerID="5f2bde362e7220f54a9e1257828cfea401e9ab086abf600d81456241c3b24b1a" Dec 03 22:34:46.062626 master-0 kubenswrapper[36504]: I1203 22:34:46.062541 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8r7tj"] Dec 03 22:34:46.083469 master-0 kubenswrapper[36504]: I1203 22:34:46.083396 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8r7tj"] Dec 03 22:34:47.112622 master-0 kubenswrapper[36504]: I1203 22:34:47.112569 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d143967f-1161-4824-b45f-67d0b695536f" path="/var/lib/kubelet/pods/d143967f-1161-4824-b45f-67d0b695536f/volumes" Dec 03 22:34:52.096307 master-0 kubenswrapper[36504]: I1203 22:34:52.096149 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:35:18.065842 master-0 kubenswrapper[36504]: I1203 22:35:18.065713 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-n4ht5"] Dec 03 22:35:18.084007 master-0 kubenswrapper[36504]: I1203 22:35:18.083916 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-n4ht5"] Dec 03 22:35:19.112224 master-0 kubenswrapper[36504]: I1203 22:35:19.112159 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99d0cdcb-2d58-4530-a135-0e89f8c77065" path="/var/lib/kubelet/pods/99d0cdcb-2d58-4530-a135-0e89f8c77065/volumes" Dec 03 22:35:26.563019 master-0 kubenswrapper[36504]: I1203 22:35:26.562948 36504 generic.go:334] "Generic (PLEG): container finished" podID="96d763b6-6b9f-426b-b2f6-b9e007b7e67f" containerID="5d0c9aad20567a2e7c1021e19c31de3d8e04d90ba5b01c8179feeb1b7861538a" exitCode=0 Dec 03 22:35:26.563019 master-0 kubenswrapper[36504]: I1203 22:35:26.563003 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" event={"ID":"96d763b6-6b9f-426b-b2f6-b9e007b7e67f","Type":"ContainerDied","Data":"5d0c9aad20567a2e7c1021e19c31de3d8e04d90ba5b01c8179feeb1b7861538a"} Dec 03 22:35:27.048307 master-0 kubenswrapper[36504]: I1203 22:35:27.048102 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hlr94"] Dec 03 22:35:27.065124 master-0 kubenswrapper[36504]: I1203 22:35:27.064927 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-v4x4d"] Dec 03 22:35:27.089782 master-0 kubenswrapper[36504]: I1203 22:35:27.089467 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hlr94"] Dec 03 22:35:27.112528 master-0 kubenswrapper[36504]: I1203 22:35:27.112451 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22075cd6-19d1-4b03-8517-a9d678a23e23" path="/var/lib/kubelet/pods/22075cd6-19d1-4b03-8517-a9d678a23e23/volumes" Dec 03 22:35:27.113391 master-0 kubenswrapper[36504]: I1203 22:35:27.113360 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-v4x4d"] Dec 03 22:35:28.200323 master-0 kubenswrapper[36504]: I1203 22:35:28.200260 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:35:28.292110 master-0 kubenswrapper[36504]: I1203 22:35:28.291967 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-inventory\") pod \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " Dec 03 22:35:28.292704 master-0 kubenswrapper[36504]: I1203 22:35:28.292672 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-ssh-key\") pod \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " Dec 03 22:35:28.292819 master-0 kubenswrapper[36504]: I1203 22:35:28.292790 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5p7q\" (UniqueName: \"kubernetes.io/projected/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-kube-api-access-b5p7q\") pod \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\" (UID: \"96d763b6-6b9f-426b-b2f6-b9e007b7e67f\") " Dec 03 22:35:28.296780 master-0 kubenswrapper[36504]: I1203 22:35:28.296689 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-kube-api-access-b5p7q" (OuterVolumeSpecName: "kube-api-access-b5p7q") pod "96d763b6-6b9f-426b-b2f6-b9e007b7e67f" (UID: "96d763b6-6b9f-426b-b2f6-b9e007b7e67f"). InnerVolumeSpecName "kube-api-access-b5p7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:35:28.331162 master-0 kubenswrapper[36504]: I1203 22:35:28.330970 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-inventory" (OuterVolumeSpecName: "inventory") pod "96d763b6-6b9f-426b-b2f6-b9e007b7e67f" (UID: "96d763b6-6b9f-426b-b2f6-b9e007b7e67f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:35:28.333594 master-0 kubenswrapper[36504]: I1203 22:35:28.333457 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "96d763b6-6b9f-426b-b2f6-b9e007b7e67f" (UID: "96d763b6-6b9f-426b-b2f6-b9e007b7e67f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:35:28.398114 master-0 kubenswrapper[36504]: I1203 22:35:28.398021 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:35:28.398114 master-0 kubenswrapper[36504]: I1203 22:35:28.398083 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5p7q\" (UniqueName: \"kubernetes.io/projected/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-kube-api-access-b5p7q\") on node \"master-0\" DevicePath \"\"" Dec 03 22:35:28.398114 master-0 kubenswrapper[36504]: I1203 22:35:28.398099 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d763b6-6b9f-426b-b2f6-b9e007b7e67f-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:35:28.597372 master-0 kubenswrapper[36504]: I1203 22:35:28.597183 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" event={"ID":"96d763b6-6b9f-426b-b2f6-b9e007b7e67f","Type":"ContainerDied","Data":"27190ca2afb8983aa37eea0c138f9f6e050477ab51265fa55351bd00dc27fe8e"} Dec 03 22:35:28.597372 master-0 kubenswrapper[36504]: I1203 22:35:28.597257 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27190ca2afb8983aa37eea0c138f9f6e050477ab51265fa55351bd00dc27fe8e" Dec 03 22:35:28.597372 master-0 kubenswrapper[36504]: I1203 22:35:28.597261 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-dataplane-os-edpm-9j8wn" Dec 03 22:35:29.117298 master-0 kubenswrapper[36504]: I1203 22:35:29.117214 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b12b6b8-6e84-4e56-b403-d4d31d309852" path="/var/lib/kubelet/pods/3b12b6b8-6e84-4e56-b403-d4d31d309852/volumes" Dec 03 22:35:35.729653 master-0 kubenswrapper[36504]: E1203 22:35:35.728883 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:35:36.850163 master-0 kubenswrapper[36504]: I1203 22:35:36.850072 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-dataplane-services-edpm-99dtv"] Dec 03 22:35:36.851124 master-0 kubenswrapper[36504]: E1203 22:35:36.851091 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d763b6-6b9f-426b-b2f6-b9e007b7e67f" containerName="reboot-os-dataplane-os-edpm" Dec 03 22:35:36.851204 master-0 kubenswrapper[36504]: I1203 22:35:36.851126 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d763b6-6b9f-426b-b2f6-b9e007b7e67f" containerName="reboot-os-dataplane-os-edpm" Dec 03 22:35:36.851599 master-0 kubenswrapper[36504]: I1203 22:35:36.851572 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="96d763b6-6b9f-426b-b2f6-b9e007b7e67f" containerName="reboot-os-dataplane-os-edpm" Dec 03 22:35:36.853053 master-0 kubenswrapper[36504]: I1203 22:35:36.853022 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.855985 master-0 kubenswrapper[36504]: I1203 22:35:36.855945 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-telemetry-default-certs-0" Dec 03 22:35:36.856238 master-0 kubenswrapper[36504]: I1203 22:35:36.856191 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-neutron-metadata-default-certs-0" Dec 03 22:35:36.856293 master-0 kubenswrapper[36504]: I1203 22:35:36.856228 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-libvirt-default-certs-0" Dec 03 22:35:36.856473 master-0 kubenswrapper[36504]: I1203 22:35:36.856446 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:35:36.856473 master-0 kubenswrapper[36504]: I1203 22:35:36.856464 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:35:36.856581 master-0 kubenswrapper[36504]: I1203 22:35:36.856556 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"edpm-ovn-default-certs-0" Dec 03 22:35:36.857165 master-0 kubenswrapper[36504]: I1203 22:35:36.857133 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:35:36.890209 master-0 kubenswrapper[36504]: I1203 22:35:36.872713 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-dataplane-services-edpm-99dtv"] Dec 03 22:35:36.893576 master-0 kubenswrapper[36504]: I1203 22:35:36.893524 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-inventory\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.893717 master-0 kubenswrapper[36504]: I1203 22:35:36.893598 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-nova-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.893717 master-0 kubenswrapper[36504]: I1203 22:35:36.893642 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ssh-key\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.893717 master-0 kubenswrapper[36504]: I1203 22:35:36.893682 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-libvirt-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.893717 master-0 kubenswrapper[36504]: I1203 22:35:36.893710 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-telemetry-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.893918 master-0 kubenswrapper[36504]: I1203 22:35:36.893815 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-ovn-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.893918 master-0 kubenswrapper[36504]: I1203 22:35:36.893852 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ovn-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.893918 master-0 kubenswrapper[36504]: I1203 22:35:36.893878 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-telemetry-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.894017 master-0 kubenswrapper[36504]: I1203 22:35:36.893927 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"edpm-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-neutron-metadata-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.894017 master-0 kubenswrapper[36504]: I1203 22:35:36.893969 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.894017 master-0 kubenswrapper[36504]: I1203 22:35:36.893999 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhq6h\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-kube-api-access-fhq6h\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.894109 master-0 kubenswrapper[36504]: I1203 22:35:36.894099 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-libvirt-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.996891 master-0 kubenswrapper[36504]: I1203 22:35:36.996811 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-nova-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997237 master-0 kubenswrapper[36504]: I1203 22:35:36.996911 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ssh-key\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997237 master-0 kubenswrapper[36504]: I1203 22:35:36.996951 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-libvirt-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997237 master-0 kubenswrapper[36504]: I1203 22:35:36.996981 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-telemetry-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997237 master-0 kubenswrapper[36504]: I1203 22:35:36.997067 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"edpm-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-ovn-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997237 master-0 kubenswrapper[36504]: I1203 22:35:36.997112 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ovn-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997237 master-0 kubenswrapper[36504]: I1203 22:35:36.997140 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-telemetry-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997237 master-0 kubenswrapper[36504]: I1203 22:35:36.997187 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"edpm-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-neutron-metadata-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997237 master-0 kubenswrapper[36504]: I1203 22:35:36.997235 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997560 master-0 kubenswrapper[36504]: I1203 22:35:36.997265 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhq6h\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-kube-api-access-fhq6h\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997560 master-0 kubenswrapper[36504]: I1203 22:35:36.997331 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-libvirt-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:36.997560 master-0 kubenswrapper[36504]: I1203 22:35:36.997472 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-inventory\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.004880 master-0 kubenswrapper[36504]: I1203 22:35:37.004795 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-libvirt-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.005091 master-0 kubenswrapper[36504]: I1203 22:35:37.004904 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-nova-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.005188 master-0 kubenswrapper[36504]: I1203 22:35:37.005129 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.005421 master-0 kubenswrapper[36504]: I1203 22:35:37.005357 
36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-libvirt-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.006228 master-0 kubenswrapper[36504]: I1203 22:35:37.006183 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ssh-key\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.006528 master-0 kubenswrapper[36504]: I1203 22:35:37.006495 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-neutron-metadata-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.006857 master-0 kubenswrapper[36504]: I1203 22:35:37.006803 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-inventory\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.006930 master-0 kubenswrapper[36504]: I1203 22:35:37.006816 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ovn-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.007840 master-0 kubenswrapper[36504]: I1203 22:35:37.007738 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-telemetry-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.007939 master-0 kubenswrapper[36504]: I1203 22:35:37.007758 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-telemetry-combined-ca-bundle\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.017517 master-0 kubenswrapper[36504]: I1203 22:35:37.016979 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"edpm-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-ovn-default-certs-0\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.017517 master-0 kubenswrapper[36504]: I1203 22:35:37.017429 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhq6h\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-kube-api-access-fhq6h\") pod \"install-certs-dataplane-services-edpm-99dtv\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.055712 master-0 kubenswrapper[36504]: I1203 22:35:37.055610 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8lw82"] Dec 03 22:35:37.081195 master-0 kubenswrapper[36504]: I1203 22:35:37.081091 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8lw82"] Dec 03 22:35:37.120684 master-0 kubenswrapper[36504]: I1203 22:35:37.120556 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486e7352-51c4-4e75-9ff9-ead9eb721c74" path="/var/lib/kubelet/pods/486e7352-51c4-4e75-9ff9-ead9eb721c74/volumes" Dec 03 22:35:37.206546 master-0 kubenswrapper[36504]: I1203 22:35:37.206399 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:35:37.878287 master-0 kubenswrapper[36504]: I1203 22:35:37.878157 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-dataplane-services-edpm-99dtv"] Dec 03 22:35:38.056547 master-0 kubenswrapper[36504]: I1203 22:35:38.056375 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-baebb-db-sync-wq9c6"] Dec 03 22:35:38.075078 master-0 kubenswrapper[36504]: I1203 22:35:38.074989 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-hghcd"] Dec 03 22:35:38.090455 master-0 kubenswrapper[36504]: I1203 22:35:38.090384 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-baebb-db-sync-wq9c6"] Dec 03 22:35:38.096161 master-0 kubenswrapper[36504]: I1203 22:35:38.096102 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:35:38.109342 master-0 kubenswrapper[36504]: I1203 22:35:38.109254 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-hghcd"] Dec 03 22:35:38.760243 master-0 kubenswrapper[36504]: I1203 22:35:38.760156 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-services-edpm-99dtv" event={"ID":"8bc03a9b-2490-4ece-9707-25a2e5db6ae1","Type":"ContainerStarted","Data":"8f5a997826a42537c57480d3efdba723e958af5ffa2ff198e5586f09267d23f4"} Dec 03 22:35:38.760243 master-0 kubenswrapper[36504]: I1203 22:35:38.760234 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-services-edpm-99dtv" event={"ID":"8bc03a9b-2490-4ece-9707-25a2e5db6ae1","Type":"ContainerStarted","Data":"347c4fd1afb9b41a9695a5d297247b55ba89eacc3a81d93f76d3b4359dd91c31"} Dec 03 22:35:38.792585 master-0 kubenswrapper[36504]: I1203 22:35:38.792438 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-dataplane-services-edpm-99dtv" podStartSLOduration=2.342766296 podStartE2EDuration="2.792401529s" podCreationTimestamp="2025-12-03 22:35:36 +0000 UTC" firstStartedPulling="2025-12-03 22:35:37.864051152 +0000 UTC m=+1503.083823159" lastFinishedPulling="2025-12-03 22:35:38.313686385 +0000 UTC m=+1503.533458392" observedRunningTime="2025-12-03 22:35:38.780684812 +0000 UTC m=+1504.000456829" watchObservedRunningTime="2025-12-03 22:35:38.792401529 +0000 UTC m=+1504.012173536" Dec 03 22:35:39.118341 master-0 kubenswrapper[36504]: I1203 22:35:39.118256 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79800ee3-07d8-43e7-9263-15e8cdeef26d" path="/var/lib/kubelet/pods/79800ee3-07d8-43e7-9263-15e8cdeef26d/volumes" Dec 03 22:35:39.119195 master-0 kubenswrapper[36504]: I1203 22:35:39.119157 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4f03542-de5f-457c-8843-01ea5d2febce" path="/var/lib/kubelet/pods/c4f03542-de5f-457c-8843-01ea5d2febce/volumes" Dec 03 22:35:44.016701 master-0 kubenswrapper[36504]: I1203 22:35:44.016618 36504 scope.go:117] "RemoveContainer" containerID="f73b6b3ecae967d9bb29ee494c0c19b97d70da3f1232551294416b1005658838" Dec 03 22:35:44.063087 master-0 kubenswrapper[36504]: I1203 22:35:44.062722 36504 scope.go:117] "RemoveContainer" containerID="0f9ea37d9377d62f31c3bf6f416c2a3014c34f4af1eb0915aa7afcfc799796bb" Dec 03 22:35:44.128644 master-0 kubenswrapper[36504]: I1203 22:35:44.128588 36504 scope.go:117] "RemoveContainer" containerID="33176e2d1fb664c8b39f0ac856d7f6bdbdc4e6ffadca38e31d0edd73a918f866" Dec 03 22:35:44.231288 master-0 kubenswrapper[36504]: I1203 22:35:44.231240 36504 scope.go:117] "RemoveContainer" containerID="0b0691b47f7c7349fc28ccbb6511cc23b35ecb7ebcbd369f30bb089fed3c7142" Dec 03 22:35:44.283862 master-0 kubenswrapper[36504]: I1203 22:35:44.283801 36504 scope.go:117] "RemoveContainer" containerID="ed22f8f052b7eb8696fb7d126014ca7ee249f63bbee1339902a7792b160ced4b" Dec 03 22:35:44.367125 master-0 kubenswrapper[36504]: I1203 22:35:44.366375 36504 scope.go:117] "RemoveContainer" containerID="4ba7fbc701630d14c229d5082ae0c81d6fecad586043aa1c771966dd3808b291" Dec 03 22:35:44.422066 master-0 kubenswrapper[36504]: I1203 22:35:44.421995 36504 scope.go:117] "RemoveContainer" containerID="3c4552f29ecea698a744372808b816cad64f4c05790364bc740c3ca6a59ee260" Dec 03 22:36:03.095898 master-0 
kubenswrapper[36504]: I1203 22:36:03.095749 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:36:12.327977 master-0 kubenswrapper[36504]: I1203 22:36:12.327884 36504 generic.go:334] "Generic (PLEG): container finished" podID="8bc03a9b-2490-4ece-9707-25a2e5db6ae1" containerID="8f5a997826a42537c57480d3efdba723e958af5ffa2ff198e5586f09267d23f4" exitCode=0 Dec 03 22:36:12.328871 master-0 kubenswrapper[36504]: I1203 22:36:12.327969 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-services-edpm-99dtv" event={"ID":"8bc03a9b-2490-4ece-9707-25a2e5db6ae1","Type":"ContainerDied","Data":"8f5a997826a42537c57480d3efdba723e958af5ffa2ff198e5586f09267d23f4"} Dec 03 22:36:14.131573 master-0 kubenswrapper[36504]: I1203 22:36:14.131494 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:36:14.333009 master-0 kubenswrapper[36504]: I1203 22:36:14.332725 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ovn-combined-ca-bundle\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333009 master-0 kubenswrapper[36504]: I1203 22:36:14.332883 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhq6h\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-kube-api-access-fhq6h\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333009 master-0 kubenswrapper[36504]: I1203 22:36:14.333008 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-neutron-metadata-default-certs-0\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333629 master-0 kubenswrapper[36504]: I1203 22:36:14.333047 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-ovn-default-certs-0\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333629 master-0 kubenswrapper[36504]: I1203 22:36:14.333070 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-neutron-metadata-combined-ca-bundle\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333629 master-0 kubenswrapper[36504]: I1203 22:36:14.333108 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-telemetry-combined-ca-bundle\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333629 master-0 kubenswrapper[36504]: I1203 22:36:14.333257 36504 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-libvirt-combined-ca-bundle\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333629 master-0 kubenswrapper[36504]: I1203 22:36:14.333349 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-inventory\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333629 master-0 kubenswrapper[36504]: I1203 22:36:14.333372 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ssh-key\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333629 master-0 kubenswrapper[36504]: I1203 22:36:14.333397 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-libvirt-default-certs-0\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333629 master-0 kubenswrapper[36504]: I1203 22:36:14.333419 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-nova-combined-ca-bundle\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.333629 master-0 kubenswrapper[36504]: I1203 22:36:14.333490 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"edpm-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-telemetry-default-certs-0\") pod \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\" (UID: \"8bc03a9b-2490-4ece-9707-25a2e5db6ae1\") " Dec 03 22:36:14.339347 master-0 kubenswrapper[36504]: I1203 22:36:14.336996 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "edpm-neutron-metadata-default-certs-0") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "edpm-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:36:14.339909 master-0 kubenswrapper[36504]: I1203 22:36:14.339860 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:14.340141 master-0 kubenswrapper[36504]: I1203 22:36:14.340079 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-kube-api-access-fhq6h" (OuterVolumeSpecName: "kube-api-access-fhq6h") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). 
InnerVolumeSpecName "kube-api-access-fhq6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:36:14.348651 master-0 kubenswrapper[36504]: I1203 22:36:14.347760 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-libvirt-default-certs-0" (OuterVolumeSpecName: "edpm-libvirt-default-certs-0") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "edpm-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:36:14.348651 master-0 kubenswrapper[36504]: I1203 22:36:14.347958 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-telemetry-default-certs-0" (OuterVolumeSpecName: "edpm-telemetry-default-certs-0") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "edpm-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:36:14.348651 master-0 kubenswrapper[36504]: I1203 22:36:14.348005 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:14.348651 master-0 kubenswrapper[36504]: I1203 22:36:14.348027 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:14.350781 master-0 kubenswrapper[36504]: I1203 22:36:14.350729 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:14.351482 master-0 kubenswrapper[36504]: I1203 22:36:14.351406 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-ovn-default-certs-0" (OuterVolumeSpecName: "edpm-ovn-default-certs-0") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "edpm-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:36:14.359903 master-0 kubenswrapper[36504]: I1203 22:36:14.356962 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:14.367848 master-0 kubenswrapper[36504]: I1203 22:36:14.367562 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-dataplane-services-edpm-99dtv" event={"ID":"8bc03a9b-2490-4ece-9707-25a2e5db6ae1","Type":"ContainerDied","Data":"347c4fd1afb9b41a9695a5d297247b55ba89eacc3a81d93f76d3b4359dd91c31"} Dec 03 22:36:14.367848 master-0 kubenswrapper[36504]: I1203 22:36:14.367626 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347c4fd1afb9b41a9695a5d297247b55ba89eacc3a81d93f76d3b4359dd91c31" Dec 03 22:36:14.367848 master-0 kubenswrapper[36504]: I1203 22:36:14.367696 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-dataplane-services-edpm-99dtv" Dec 03 22:36:14.377940 master-0 kubenswrapper[36504]: I1203 22:36:14.377883 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:14.384362 master-0 kubenswrapper[36504]: I1203 22:36:14.383084 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-inventory" (OuterVolumeSpecName: "inventory") pod "8bc03a9b-2490-4ece-9707-25a2e5db6ae1" (UID: "8bc03a9b-2490-4ece-9707-25a2e5db6ae1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:36:14.438916 master-0 kubenswrapper[36504]: I1203 22:36:14.438856 36504 reconciler_common.go:293] "Volume detached for volume \"edpm-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-libvirt-default-certs-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.439272 master-0 kubenswrapper[36504]: I1203 22:36:14.439253 36504 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-nova-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.439409 master-0 kubenswrapper[36504]: I1203 22:36:14.439365 36504 reconciler_common.go:293] "Volume detached for volume \"edpm-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-telemetry-default-certs-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.439511 master-0 kubenswrapper[36504]: I1203 22:36:14.439493 36504 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ovn-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.439614 master-0 kubenswrapper[36504]: I1203 22:36:14.439600 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhq6h\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-kube-api-access-fhq6h\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.439737 master-0 kubenswrapper[36504]: I1203 22:36:14.439719 36504 reconciler_common.go:293] "Volume detached for volume \"edpm-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-neutron-metadata-default-certs-0\") on node 
\"master-0\" DevicePath \"\"" Dec 03 22:36:14.439862 master-0 kubenswrapper[36504]: I1203 22:36:14.439843 36504 reconciler_common.go:293] "Volume detached for volume \"edpm-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-edpm-ovn-default-certs-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.441295 master-0 kubenswrapper[36504]: I1203 22:36:14.441256 36504 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.441418 master-0 kubenswrapper[36504]: I1203 22:36:14.441300 36504 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-telemetry-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.441418 master-0 kubenswrapper[36504]: I1203 22:36:14.441313 36504 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-libvirt-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.441418 master-0 kubenswrapper[36504]: I1203 22:36:14.441328 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.441418 master-0 kubenswrapper[36504]: I1203 22:36:14.441339 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8bc03a9b-2490-4ece-9707-25a2e5db6ae1-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:14.520795 master-0 kubenswrapper[36504]: I1203 22:36:14.498481 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-dataplane-services-edpm-hgmsd"] Dec 03 22:36:14.520795 master-0 kubenswrapper[36504]: E1203 22:36:14.499469 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bc03a9b-2490-4ece-9707-25a2e5db6ae1" containerName="install-certs-dataplane-services-edpm" Dec 03 22:36:14.520795 master-0 kubenswrapper[36504]: I1203 22:36:14.499491 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc03a9b-2490-4ece-9707-25a2e5db6ae1" containerName="install-certs-dataplane-services-edpm" Dec 03 22:36:14.520795 master-0 kubenswrapper[36504]: I1203 22:36:14.499850 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bc03a9b-2490-4ece-9707-25a2e5db6ae1" containerName="install-certs-dataplane-services-edpm" Dec 03 22:36:14.520795 master-0 kubenswrapper[36504]: I1203 22:36:14.502430 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.520795 master-0 kubenswrapper[36504]: I1203 22:36:14.509108 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 22:36:14.520795 master-0 kubenswrapper[36504]: I1203 22:36:14.516313 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-dataplane-services-edpm-hgmsd"] Dec 03 22:36:14.648361 master-0 kubenswrapper[36504]: I1203 22:36:14.648259 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-inventory\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.648657 master-0 kubenswrapper[36504]: I1203 22:36:14.648438 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovn-combined-ca-bundle\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.648657 master-0 kubenswrapper[36504]: I1203 22:36:14.648484 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-895q6\" (UniqueName: \"kubernetes.io/projected/5684b100-8e95-4df2-89d2-f5aa9ce5d281-kube-api-access-895q6\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.648657 master-0 kubenswrapper[36504]: I1203 22:36:14.648530 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ssh-key\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.648760 master-0 kubenswrapper[36504]: I1203 22:36:14.648658 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovncontroller-config-0\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.751397 master-0 kubenswrapper[36504]: I1203 22:36:14.751323 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovncontroller-config-0\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.751397 master-0 kubenswrapper[36504]: I1203 22:36:14.751410 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-inventory\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.751808 master-0 kubenswrapper[36504]: I1203 22:36:14.751530 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovn-combined-ca-bundle\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.751808 master-0 kubenswrapper[36504]: I1203 22:36:14.751579 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-895q6\" (UniqueName: \"kubernetes.io/projected/5684b100-8e95-4df2-89d2-f5aa9ce5d281-kube-api-access-895q6\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.751808 master-0 kubenswrapper[36504]: I1203 22:36:14.751631 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ssh-key\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.752391 master-0 kubenswrapper[36504]: I1203 22:36:14.752351 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovncontroller-config-0\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.758828 master-0 kubenswrapper[36504]: I1203 22:36:14.757267 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-inventory\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.758828 master-0 kubenswrapper[36504]: I1203 22:36:14.757510 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovn-combined-ca-bundle\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.758828 master-0 kubenswrapper[36504]: I1203 22:36:14.758007 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ssh-key\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.771112 master-0 kubenswrapper[36504]: I1203 22:36:14.771057 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-895q6\" (UniqueName: \"kubernetes.io/projected/5684b100-8e95-4df2-89d2-f5aa9ce5d281-kube-api-access-895q6\") pod \"ovn-dataplane-services-edpm-hgmsd\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:14.873386 master-0 kubenswrapper[36504]: I1203 22:36:14.873308 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:36:15.450414 master-0 kubenswrapper[36504]: I1203 22:36:15.450337 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-dataplane-services-edpm-hgmsd"] Dec 03 22:36:15.454252 master-0 kubenswrapper[36504]: W1203 22:36:15.454178 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5684b100_8e95_4df2_89d2_f5aa9ce5d281.slice/crio-334ae9d6ce34df166db1ead5a72d4d5cddb818ad9739b0cac177cbcbbcd6f112 WatchSource:0}: Error finding container 334ae9d6ce34df166db1ead5a72d4d5cddb818ad9739b0cac177cbcbbcd6f112: Status 404 returned error can't find the container with id 334ae9d6ce34df166db1ead5a72d4d5cddb818ad9739b0cac177cbcbbcd6f112 Dec 03 22:36:16.423180 master-0 kubenswrapper[36504]: I1203 22:36:16.421476 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-services-edpm-hgmsd" event={"ID":"5684b100-8e95-4df2-89d2-f5aa9ce5d281","Type":"ContainerStarted","Data":"c834e4d46f16de3472c5e2e4d17bf436f631d1db639d6c83428da30021141459"} Dec 03 22:36:16.423180 master-0 kubenswrapper[36504]: I1203 22:36:16.421601 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-services-edpm-hgmsd" event={"ID":"5684b100-8e95-4df2-89d2-f5aa9ce5d281","Type":"ContainerStarted","Data":"334ae9d6ce34df166db1ead5a72d4d5cddb818ad9739b0cac177cbcbbcd6f112"} Dec 03 22:36:16.452669 master-0 kubenswrapper[36504]: I1203 22:36:16.452556 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-dataplane-services-edpm-hgmsd" podStartSLOduration=1.809701885 podStartE2EDuration="2.452528887s" podCreationTimestamp="2025-12-03 22:36:14 +0000 UTC" firstStartedPulling="2025-12-03 22:36:15.458027808 +0000 UTC m=+1540.677799815" lastFinishedPulling="2025-12-03 22:36:16.10085481 +0000 UTC m=+1541.320626817" observedRunningTime="2025-12-03 22:36:16.445402365 +0000 UTC m=+1541.665174382" watchObservedRunningTime="2025-12-03 22:36:16.452528887 +0000 UTC m=+1541.672300894" Dec 03 22:36:35.724481 master-0 kubenswrapper[36504]: E1203 22:36:35.724382 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:36:45.079312 master-0 kubenswrapper[36504]: I1203 22:36:45.076273 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-whrpw"] Dec 03 22:36:45.131343 master-0 kubenswrapper[36504]: I1203 22:36:45.131228 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-whrpw"] Dec 03 22:36:46.048276 master-0 kubenswrapper[36504]: I1203 22:36:46.048186 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-c69rr"] Dec 03 22:36:46.062913 master-0 kubenswrapper[36504]: I1203 22:36:46.062837 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-c69rr"] Dec 03 22:36:46.451812 master-0 kubenswrapper[36504]: I1203 22:36:46.442537 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sc47k"] Dec 03 
22:36:46.451812 master-0 kubenswrapper[36504]: I1203 22:36:46.447370 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.472168 master-0 kubenswrapper[36504]: I1203 22:36:46.472086 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sc47k"] Dec 03 22:36:46.508419 master-0 kubenswrapper[36504]: I1203 22:36:46.507266 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-catalog-content\") pod \"certified-operators-sc47k\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.508419 master-0 kubenswrapper[36504]: I1203 22:36:46.507602 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-utilities\") pod \"certified-operators-sc47k\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.508419 master-0 kubenswrapper[36504]: I1203 22:36:46.507946 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhdsf\" (UniqueName: \"kubernetes.io/projected/119f779b-b4d0-4469-924d-f2f1680edcd1-kube-api-access-fhdsf\") pod \"certified-operators-sc47k\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.611171 master-0 kubenswrapper[36504]: I1203 22:36:46.611083 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-utilities\") pod \"certified-operators-sc47k\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.611505 master-0 kubenswrapper[36504]: I1203 22:36:46.611228 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhdsf\" (UniqueName: \"kubernetes.io/projected/119f779b-b4d0-4469-924d-f2f1680edcd1-kube-api-access-fhdsf\") pod \"certified-operators-sc47k\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.611505 master-0 kubenswrapper[36504]: I1203 22:36:46.611368 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-catalog-content\") pod \"certified-operators-sc47k\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.611755 master-0 kubenswrapper[36504]: I1203 22:36:46.611717 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-utilities\") pod \"certified-operators-sc47k\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.611918 master-0 kubenswrapper[36504]: I1203 22:36:46.611792 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-catalog-content\") pod \"certified-operators-sc47k\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.636632 master-0 kubenswrapper[36504]: I1203 22:36:46.636568 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhdsf\" (UniqueName: \"kubernetes.io/projected/119f779b-b4d0-4469-924d-f2f1680edcd1-kube-api-access-fhdsf\") pod \"certified-operators-sc47k\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:46.779680 master-0 kubenswrapper[36504]: I1203 22:36:46.779506 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:47.120922 master-0 kubenswrapper[36504]: I1203 22:36:47.120761 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6193a696-93bd-4303-82fd-bf39ce403d80" path="/var/lib/kubelet/pods/6193a696-93bd-4303-82fd-bf39ce403d80/volumes" Dec 03 22:36:47.122250 master-0 kubenswrapper[36504]: I1203 22:36:47.121650 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c228f8-3d7c-49c0-b63c-ce25dbe64ca1" path="/var/lib/kubelet/pods/76c228f8-3d7c-49c0-b63c-ce25dbe64ca1/volumes" Dec 03 22:36:47.456420 master-0 kubenswrapper[36504]: W1203 22:36:47.456339 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod119f779b_b4d0_4469_924d_f2f1680edcd1.slice/crio-4e4139363e7c8a028269fc05b1de8f2f38940fcebb8f343299539caf7dd72bbc WatchSource:0}: Error finding container 4e4139363e7c8a028269fc05b1de8f2f38940fcebb8f343299539caf7dd72bbc: Status 404 returned error can't find the container with id 4e4139363e7c8a028269fc05b1de8f2f38940fcebb8f343299539caf7dd72bbc Dec 03 22:36:47.462917 master-0 kubenswrapper[36504]: I1203 22:36:47.462843 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sc47k"] Dec 03 22:36:47.886886 master-0 kubenswrapper[36504]: I1203 22:36:47.886804 36504 generic.go:334] "Generic (PLEG): container finished" podID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerID="170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97" exitCode=0 Dec 03 22:36:47.887211 master-0 kubenswrapper[36504]: I1203 22:36:47.886898 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sc47k" event={"ID":"119f779b-b4d0-4469-924d-f2f1680edcd1","Type":"ContainerDied","Data":"170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97"} Dec 03 22:36:47.887211 master-0 kubenswrapper[36504]: I1203 22:36:47.886981 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sc47k" event={"ID":"119f779b-b4d0-4469-924d-f2f1680edcd1","Type":"ContainerStarted","Data":"4e4139363e7c8a028269fc05b1de8f2f38940fcebb8f343299539caf7dd72bbc"} Dec 03 22:36:47.890890 master-0 kubenswrapper[36504]: I1203 22:36:47.890801 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:36:48.047511 master-0 kubenswrapper[36504]: I1203 22:36:48.047356 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3742-account-create-update-frr2z"] Dec 03 22:36:48.084951 master-0 kubenswrapper[36504]: I1203 22:36:48.082092 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-e258-account-create-update-lf7cr"] Dec 03 22:36:48.109561 master-0 kubenswrapper[36504]: I1203 22:36:48.109391 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3742-account-create-update-frr2z"] Dec 03 22:36:48.130652 master-0 kubenswrapper[36504]: I1203 22:36:48.130380 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e258-account-create-update-lf7cr"] Dec 03 22:36:48.912798 master-0 kubenswrapper[36504]: I1203 22:36:48.910245 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sc47k" event={"ID":"119f779b-b4d0-4469-924d-f2f1680edcd1","Type":"ContainerStarted","Data":"4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15"} Dec 03 22:36:49.128226 master-0 kubenswrapper[36504]: I1203 22:36:49.127544 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a046886b-4b70-4fb1-82ce-ef92db1677f7" path="/var/lib/kubelet/pods/a046886b-4b70-4fb1-82ce-ef92db1677f7/volumes" Dec 03 22:36:49.128650 master-0 kubenswrapper[36504]: I1203 22:36:49.128613 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad30900d-dcdc-49cb-ab6e-601c1d18a77c" path="/var/lib/kubelet/pods/ad30900d-dcdc-49cb-ab6e-601c1d18a77c/volumes" Dec 03 22:36:49.129695 master-0 kubenswrapper[36504]: I1203 22:36:49.129659 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-w6cn8"] Dec 03 22:36:49.129753 master-0 kubenswrapper[36504]: I1203 22:36:49.129702 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-3ad6-account-create-update-gj27c"] Dec 03 22:36:49.142795 master-0 kubenswrapper[36504]: I1203 22:36:49.141017 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-w6cn8"] Dec 03 22:36:49.166638 master-0 kubenswrapper[36504]: I1203 22:36:49.166561 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-3ad6-account-create-update-gj27c"] Dec 03 22:36:49.927692 master-0 kubenswrapper[36504]: I1203 22:36:49.927550 36504 generic.go:334] "Generic (PLEG): container finished" podID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerID="4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15" exitCode=0 Dec 03 22:36:49.928704 master-0 kubenswrapper[36504]: I1203 22:36:49.927681 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sc47k" event={"ID":"119f779b-b4d0-4469-924d-f2f1680edcd1","Type":"ContainerDied","Data":"4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15"} Dec 03 22:36:50.946464 master-0 kubenswrapper[36504]: I1203 22:36:50.946384 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sc47k" event={"ID":"119f779b-b4d0-4469-924d-f2f1680edcd1","Type":"ContainerStarted","Data":"733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14"} Dec 03 22:36:50.978885 master-0 kubenswrapper[36504]: I1203 22:36:50.978785 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sc47k" podStartSLOduration=2.543815295 podStartE2EDuration="4.97873752s" podCreationTimestamp="2025-12-03 22:36:46 +0000 UTC" firstStartedPulling="2025-12-03 22:36:47.890637431 +0000 UTC m=+1573.110409438" lastFinishedPulling="2025-12-03 22:36:50.325559656 +0000 UTC m=+1575.545331663" observedRunningTime="2025-12-03 22:36:50.970210953 +0000 UTC m=+1576.189982980" 
watchObservedRunningTime="2025-12-03 22:36:50.97873752 +0000 UTC m=+1576.198509527" Dec 03 22:36:51.113351 master-0 kubenswrapper[36504]: I1203 22:36:51.113247 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b698a34-7bd9-4125-ad5c-885db2cf4959" path="/var/lib/kubelet/pods/1b698a34-7bd9-4125-ad5c-885db2cf4959/volumes" Dec 03 22:36:51.114175 master-0 kubenswrapper[36504]: I1203 22:36:51.114142 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6954864a-d445-46ef-8b09-dcb0543b8b23" path="/var/lib/kubelet/pods/6954864a-d445-46ef-8b09-dcb0543b8b23/volumes" Dec 03 22:36:56.780402 master-0 kubenswrapper[36504]: I1203 22:36:56.780302 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:56.780402 master-0 kubenswrapper[36504]: I1203 22:36:56.780407 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:56.836275 master-0 kubenswrapper[36504]: I1203 22:36:56.836208 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:57.081615 master-0 kubenswrapper[36504]: I1203 22:36:57.081461 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:57.158084 master-0 kubenswrapper[36504]: I1203 22:36:57.157980 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sc47k"] Dec 03 22:36:58.096407 master-0 kubenswrapper[36504]: I1203 22:36:58.096336 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:36:59.051319 master-0 kubenswrapper[36504]: I1203 22:36:59.051204 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sc47k" podUID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerName="registry-server" containerID="cri-o://733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14" gracePeriod=2 Dec 03 22:36:59.667674 master-0 kubenswrapper[36504]: I1203 22:36:59.667632 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:36:59.823501 master-0 kubenswrapper[36504]: I1203 22:36:59.823257 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-utilities\") pod \"119f779b-b4d0-4469-924d-f2f1680edcd1\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " Dec 03 22:36:59.823832 master-0 kubenswrapper[36504]: I1203 22:36:59.823611 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-catalog-content\") pod \"119f779b-b4d0-4469-924d-f2f1680edcd1\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " Dec 03 22:36:59.823832 master-0 kubenswrapper[36504]: I1203 22:36:59.823755 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhdsf\" (UniqueName: \"kubernetes.io/projected/119f779b-b4d0-4469-924d-f2f1680edcd1-kube-api-access-fhdsf\") pod \"119f779b-b4d0-4469-924d-f2f1680edcd1\" (UID: \"119f779b-b4d0-4469-924d-f2f1680edcd1\") " Dec 03 22:36:59.824809 master-0 kubenswrapper[36504]: I1203 22:36:59.824714 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-utilities" (OuterVolumeSpecName: "utilities") pod "119f779b-b4d0-4469-924d-f2f1680edcd1" (UID: "119f779b-b4d0-4469-924d-f2f1680edcd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:36:59.826406 master-0 kubenswrapper[36504]: I1203 22:36:59.826341 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:59.830161 master-0 kubenswrapper[36504]: I1203 22:36:59.827536 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119f779b-b4d0-4469-924d-f2f1680edcd1-kube-api-access-fhdsf" (OuterVolumeSpecName: "kube-api-access-fhdsf") pod "119f779b-b4d0-4469-924d-f2f1680edcd1" (UID: "119f779b-b4d0-4469-924d-f2f1680edcd1"). InnerVolumeSpecName "kube-api-access-fhdsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:36:59.889851 master-0 kubenswrapper[36504]: I1203 22:36:59.889595 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "119f779b-b4d0-4469-924d-f2f1680edcd1" (UID: "119f779b-b4d0-4469-924d-f2f1680edcd1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:36:59.928734 master-0 kubenswrapper[36504]: I1203 22:36:59.928663 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119f779b-b4d0-4469-924d-f2f1680edcd1-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:36:59.928734 master-0 kubenswrapper[36504]: I1203 22:36:59.928716 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhdsf\" (UniqueName: \"kubernetes.io/projected/119f779b-b4d0-4469-924d-f2f1680edcd1-kube-api-access-fhdsf\") on node \"master-0\" DevicePath \"\"" Dec 03 22:37:00.074841 master-0 kubenswrapper[36504]: I1203 22:37:00.074652 36504 generic.go:334] "Generic (PLEG): container finished" podID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerID="733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14" exitCode=0 Dec 03 22:37:00.074841 master-0 kubenswrapper[36504]: I1203 22:37:00.074720 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sc47k" event={"ID":"119f779b-b4d0-4469-924d-f2f1680edcd1","Type":"ContainerDied","Data":"733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14"} Dec 03 22:37:00.074841 master-0 kubenswrapper[36504]: I1203 22:37:00.074755 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sc47k" event={"ID":"119f779b-b4d0-4469-924d-f2f1680edcd1","Type":"ContainerDied","Data":"4e4139363e7c8a028269fc05b1de8f2f38940fcebb8f343299539caf7dd72bbc"} Dec 03 22:37:00.074841 master-0 kubenswrapper[36504]: I1203 22:37:00.074798 36504 scope.go:117] "RemoveContainer" containerID="733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14" Dec 03 22:37:00.075215 master-0 kubenswrapper[36504]: I1203 22:37:00.074977 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sc47k" Dec 03 22:37:00.104085 master-0 kubenswrapper[36504]: I1203 22:37:00.102564 36504 scope.go:117] "RemoveContainer" containerID="4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15" Dec 03 22:37:00.134448 master-0 kubenswrapper[36504]: I1203 22:37:00.134374 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sc47k"] Dec 03 22:37:00.141889 master-0 kubenswrapper[36504]: I1203 22:37:00.141833 36504 scope.go:117] "RemoveContainer" containerID="170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97" Dec 03 22:37:00.148016 master-0 kubenswrapper[36504]: I1203 22:37:00.147944 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sc47k"] Dec 03 22:37:00.199515 master-0 kubenswrapper[36504]: I1203 22:37:00.199451 36504 scope.go:117] "RemoveContainer" containerID="733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14" Dec 03 22:37:00.200089 master-0 kubenswrapper[36504]: E1203 22:37:00.200028 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14\": container with ID starting with 733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14 not found: ID does not exist" containerID="733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14" Dec 03 22:37:00.200170 master-0 kubenswrapper[36504]: I1203 22:37:00.200094 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14"} err="failed to get container status \"733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14\": rpc error: code = NotFound desc = could not find container \"733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14\": container with ID starting with 733b2d94131577d3b84366b97c3a463e18773228a08298f402b943b697b15a14 not found: ID does not exist" Dec 03 22:37:00.200170 master-0 kubenswrapper[36504]: I1203 22:37:00.200136 36504 scope.go:117] "RemoveContainer" containerID="4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15" Dec 03 22:37:00.200703 master-0 kubenswrapper[36504]: E1203 22:37:00.200666 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15\": container with ID starting with 4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15 not found: ID does not exist" containerID="4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15" Dec 03 22:37:00.200780 master-0 kubenswrapper[36504]: I1203 22:37:00.200707 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15"} err="failed to get container status \"4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15\": rpc error: code = NotFound desc = could not find container \"4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15\": container with ID starting with 4f64069afc82e34d63e7f7911fdfbaa00798aaa13ad1b364e0f9f48a2db4bf15 not found: ID does not exist" Dec 03 22:37:00.200780 master-0 kubenswrapper[36504]: I1203 22:37:00.200733 36504 scope.go:117] "RemoveContainer" 
containerID="170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97" Dec 03 22:37:00.201288 master-0 kubenswrapper[36504]: E1203 22:37:00.201246 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97\": container with ID starting with 170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97 not found: ID does not exist" containerID="170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97" Dec 03 22:37:00.201288 master-0 kubenswrapper[36504]: I1203 22:37:00.201281 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97"} err="failed to get container status \"170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97\": rpc error: code = NotFound desc = could not find container \"170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97\": container with ID starting with 170a8cf7ab6b9ce9123a8ecbfda185dd828d53ce74bf21a87c2ac87f6404ce97 not found: ID does not exist" Dec 03 22:37:01.113932 master-0 kubenswrapper[36504]: I1203 22:37:01.113856 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119f779b-b4d0-4469-924d-f2f1680edcd1" path="/var/lib/kubelet/pods/119f779b-b4d0-4469-924d-f2f1680edcd1/volumes" Dec 03 22:37:17.097206 master-0 kubenswrapper[36504]: I1203 22:37:17.096839 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:37:28.063839 master-0 kubenswrapper[36504]: I1203 22:37:28.062355 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h9tlt"] Dec 03 22:37:28.082265 master-0 kubenswrapper[36504]: I1203 22:37:28.082128 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-h9tlt"] Dec 03 22:37:28.605418 master-0 kubenswrapper[36504]: I1203 22:37:28.605336 36504 generic.go:334] "Generic (PLEG): container finished" podID="5684b100-8e95-4df2-89d2-f5aa9ce5d281" containerID="c834e4d46f16de3472c5e2e4d17bf436f631d1db639d6c83428da30021141459" exitCode=0 Dec 03 22:37:28.605418 master-0 kubenswrapper[36504]: I1203 22:37:28.605409 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-services-edpm-hgmsd" event={"ID":"5684b100-8e95-4df2-89d2-f5aa9ce5d281","Type":"ContainerDied","Data":"c834e4d46f16de3472c5e2e4d17bf436f631d1db639d6c83428da30021141459"} Dec 03 22:37:29.111238 master-0 kubenswrapper[36504]: I1203 22:37:29.111173 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="900943d0-fa02-401a-ab6a-7ed811802669" path="/var/lib/kubelet/pods/900943d0-fa02-401a-ab6a-7ed811802669/volumes" Dec 03 22:37:30.277784 master-0 kubenswrapper[36504]: I1203 22:37:30.277708 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:37:30.393999 master-0 kubenswrapper[36504]: I1203 22:37:30.393832 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ssh-key\") pod \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " Dec 03 22:37:30.393999 master-0 kubenswrapper[36504]: I1203 22:37:30.393908 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovncontroller-config-0\") pod \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " Dec 03 22:37:30.393999 master-0 kubenswrapper[36504]: I1203 22:37:30.393950 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-inventory\") pod \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " Dec 03 22:37:30.394561 master-0 kubenswrapper[36504]: I1203 22:37:30.394014 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovn-combined-ca-bundle\") pod \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " Dec 03 22:37:30.394561 master-0 kubenswrapper[36504]: I1203 22:37:30.394145 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-895q6\" (UniqueName: \"kubernetes.io/projected/5684b100-8e95-4df2-89d2-f5aa9ce5d281-kube-api-access-895q6\") pod \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\" (UID: \"5684b100-8e95-4df2-89d2-f5aa9ce5d281\") " Dec 03 22:37:30.398289 master-0 kubenswrapper[36504]: I1203 22:37:30.398227 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5684b100-8e95-4df2-89d2-f5aa9ce5d281-kube-api-access-895q6" (OuterVolumeSpecName: "kube-api-access-895q6") pod "5684b100-8e95-4df2-89d2-f5aa9ce5d281" (UID: "5684b100-8e95-4df2-89d2-f5aa9ce5d281"). InnerVolumeSpecName "kube-api-access-895q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:37:30.398792 master-0 kubenswrapper[36504]: I1203 22:37:30.398732 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "5684b100-8e95-4df2-89d2-f5aa9ce5d281" (UID: "5684b100-8e95-4df2-89d2-f5aa9ce5d281"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:37:30.425753 master-0 kubenswrapper[36504]: I1203 22:37:30.425671 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "5684b100-8e95-4df2-89d2-f5aa9ce5d281" (UID: "5684b100-8e95-4df2-89d2-f5aa9ce5d281"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:37:30.429712 master-0 kubenswrapper[36504]: I1203 22:37:30.429646 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-inventory" (OuterVolumeSpecName: "inventory") pod "5684b100-8e95-4df2-89d2-f5aa9ce5d281" (UID: "5684b100-8e95-4df2-89d2-f5aa9ce5d281"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:37:30.435624 master-0 kubenswrapper[36504]: I1203 22:37:30.435587 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5684b100-8e95-4df2-89d2-f5aa9ce5d281" (UID: "5684b100-8e95-4df2-89d2-f5aa9ce5d281"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:37:30.498943 master-0 kubenswrapper[36504]: I1203 22:37:30.498856 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:37:30.498943 master-0 kubenswrapper[36504]: I1203 22:37:30.498925 36504 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovncontroller-config-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:37:30.498943 master-0 kubenswrapper[36504]: I1203 22:37:30.498948 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:37:30.498943 master-0 kubenswrapper[36504]: I1203 22:37:30.498964 36504 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5684b100-8e95-4df2-89d2-f5aa9ce5d281-ovn-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:37:30.499368 master-0 kubenswrapper[36504]: I1203 22:37:30.498979 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-895q6\" (UniqueName: \"kubernetes.io/projected/5684b100-8e95-4df2-89d2-f5aa9ce5d281-kube-api-access-895q6\") on node \"master-0\" DevicePath \"\"" Dec 03 22:37:30.645002 master-0 kubenswrapper[36504]: I1203 22:37:30.643445 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-dataplane-services-edpm-hgmsd" event={"ID":"5684b100-8e95-4df2-89d2-f5aa9ce5d281","Type":"ContainerDied","Data":"334ae9d6ce34df166db1ead5a72d4d5cddb818ad9739b0cac177cbcbbcd6f112"} Dec 03 22:37:30.645002 master-0 kubenswrapper[36504]: I1203 22:37:30.643522 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="334ae9d6ce34df166db1ead5a72d4d5cddb818ad9739b0cac177cbcbbcd6f112" Dec 03 22:37:30.645002 master-0 kubenswrapper[36504]: I1203 22:37:30.643567 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-dataplane-services-edpm-hgmsd" Dec 03 22:37:30.759626 master-0 kubenswrapper[36504]: I1203 22:37:30.759537 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-dataplane-services-edpm-r588g"] Dec 03 22:37:30.760610 master-0 kubenswrapper[36504]: E1203 22:37:30.760565 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerName="extract-content" Dec 03 22:37:30.760610 master-0 kubenswrapper[36504]: I1203 22:37:30.760599 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerName="extract-content" Dec 03 22:37:30.760746 master-0 kubenswrapper[36504]: E1203 22:37:30.760628 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerName="registry-server" Dec 03 22:37:30.760746 master-0 kubenswrapper[36504]: I1203 22:37:30.760643 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerName="registry-server" Dec 03 22:37:30.760746 master-0 kubenswrapper[36504]: E1203 22:37:30.760718 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5684b100-8e95-4df2-89d2-f5aa9ce5d281" containerName="ovn-dataplane-services-edpm" Dec 03 22:37:30.760746 master-0 kubenswrapper[36504]: I1203 22:37:30.760729 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="5684b100-8e95-4df2-89d2-f5aa9ce5d281" containerName="ovn-dataplane-services-edpm" Dec 03 22:37:30.760954 master-0 kubenswrapper[36504]: E1203 22:37:30.760792 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerName="extract-utilities" Dec 03 22:37:30.760954 master-0 kubenswrapper[36504]: I1203 22:37:30.760806 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerName="extract-utilities" Dec 03 22:37:30.761233 master-0 kubenswrapper[36504]: I1203 22:37:30.761193 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="119f779b-b4d0-4469-924d-f2f1680edcd1" containerName="registry-server" Dec 03 22:37:30.761295 master-0 kubenswrapper[36504]: I1203 22:37:30.761263 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="5684b100-8e95-4df2-89d2-f5aa9ce5d281" containerName="ovn-dataplane-services-edpm" Dec 03 22:37:30.762674 master-0 kubenswrapper[36504]: I1203 22:37:30.762632 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:30.766453 master-0 kubenswrapper[36504]: I1203 22:37:30.766398 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 22:37:30.766609 master-0 kubenswrapper[36504]: I1203 22:37:30.766565 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:37:30.766696 master-0 kubenswrapper[36504]: I1203 22:37:30.766614 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 22:37:30.766836 master-0 kubenswrapper[36504]: I1203 22:37:30.766792 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:37:30.766911 master-0 kubenswrapper[36504]: I1203 22:37:30.766866 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:37:30.777799 master-0 kubenswrapper[36504]: I1203 22:37:30.777715 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-services-edpm-r588g"] Dec 03 22:37:30.929185 master-0 kubenswrapper[36504]: I1203 22:37:30.929023 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhptd\" (UniqueName: \"kubernetes.io/projected/73cc4b4d-689d-4eac-a022-97bace1bfb8c-kube-api-access-hhptd\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:30.929457 master-0 kubenswrapper[36504]: I1203 22:37:30.929193 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:30.929457 master-0 kubenswrapper[36504]: I1203 22:37:30.929258 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-inventory\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:30.929984 master-0 kubenswrapper[36504]: I1203 22:37:30.929931 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:30.930263 master-0 kubenswrapper[36504]: I1203 22:37:30.930235 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-ssh-key\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " 
pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:30.930314 master-0 kubenswrapper[36504]: I1203 22:37:30.930282 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.034561 master-0 kubenswrapper[36504]: I1203 22:37:31.034467 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.034882 master-0 kubenswrapper[36504]: I1203 22:37:31.034635 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-ssh-key\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.034882 master-0 kubenswrapper[36504]: I1203 22:37:31.034691 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.034882 master-0 kubenswrapper[36504]: I1203 22:37:31.034835 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhptd\" (UniqueName: \"kubernetes.io/projected/73cc4b4d-689d-4eac-a022-97bace1bfb8c-kube-api-access-hhptd\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.034997 master-0 kubenswrapper[36504]: I1203 22:37:31.034967 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.035046 master-0 kubenswrapper[36504]: I1203 22:37:31.035003 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-inventory\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.039001 master-0 kubenswrapper[36504]: I1203 22:37:31.038949 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-ssh-key\") pod 
\"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.039621 master-0 kubenswrapper[36504]: I1203 22:37:31.039595 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-inventory\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.040195 master-0 kubenswrapper[36504]: I1203 22:37:31.040125 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.040365 master-0 kubenswrapper[36504]: I1203 22:37:31.040279 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.042645 master-0 kubenswrapper[36504]: I1203 22:37:31.042590 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-nova-metadata-neutron-config-0\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.052880 master-0 kubenswrapper[36504]: I1203 22:37:31.052758 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhptd\" (UniqueName: \"kubernetes.io/projected/73cc4b4d-689d-4eac-a022-97bace1bfb8c-kube-api-access-hhptd\") pod \"neutron-metadata-dataplane-services-edpm-r588g\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.086215 master-0 kubenswrapper[36504]: I1203 22:37:31.086142 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:37:31.674629 master-0 kubenswrapper[36504]: I1203 22:37:31.674573 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-dataplane-services-edpm-r588g"] Dec 03 22:37:32.679229 master-0 kubenswrapper[36504]: I1203 22:37:32.679155 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" event={"ID":"73cc4b4d-689d-4eac-a022-97bace1bfb8c","Type":"ContainerStarted","Data":"991d29c724807770a429744dab614c328f5088ce62b69aa156db9ddf3e5c0152"} Dec 03 22:37:32.679229 master-0 kubenswrapper[36504]: I1203 22:37:32.679218 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" event={"ID":"73cc4b4d-689d-4eac-a022-97bace1bfb8c","Type":"ContainerStarted","Data":"40e0a97d201af92a9e4eaeba0f7a29ed4252004ad7ea45f40ed0f6223b732b52"} Dec 03 22:37:34.283926 master-0 kubenswrapper[36504]: I1203 22:37:34.282513 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" podStartSLOduration=3.747336677 podStartE2EDuration="4.282482598s" podCreationTimestamp="2025-12-03 22:37:30 +0000 UTC" firstStartedPulling="2025-12-03 22:37:31.669448098 +0000 UTC m=+1616.889220105" lastFinishedPulling="2025-12-03 22:37:32.204594019 +0000 UTC m=+1617.424366026" observedRunningTime="2025-12-03 22:37:32.713027263 +0000 UTC m=+1617.932799290" watchObservedRunningTime="2025-12-03 22:37:34.282482598 +0000 UTC m=+1619.502254635" Dec 03 22:37:34.293122 master-0 kubenswrapper[36504]: I1203 22:37:34.293036 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-ftthv"] Dec 03 22:37:34.321137 master-0 kubenswrapper[36504]: I1203 22:37:34.321071 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-ftthv"] Dec 03 22:37:34.340017 master-0 kubenswrapper[36504]: I1203 22:37:34.339893 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-3663-account-create-update-7qg85"] Dec 03 22:37:34.356814 master-0 kubenswrapper[36504]: I1203 22:37:34.356688 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-3663-account-create-update-7qg85"] Dec 03 22:37:34.762603 master-0 kubenswrapper[36504]: I1203 22:37:34.762487 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pdqmn"] Dec 03 22:37:34.773170 master-0 kubenswrapper[36504]: I1203 22:37:34.773022 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:34.793249 master-0 kubenswrapper[36504]: I1203 22:37:34.793066 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pdqmn"] Dec 03 22:37:34.826241 master-0 kubenswrapper[36504]: I1203 22:37:34.818035 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88h2z\" (UniqueName: \"kubernetes.io/projected/a9c3b206-5750-4988-84d6-d1fc1627d632-kube-api-access-88h2z\") pod \"redhat-operators-pdqmn\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:34.826241 master-0 kubenswrapper[36504]: I1203 22:37:34.818147 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-catalog-content\") pod \"redhat-operators-pdqmn\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:34.826241 master-0 kubenswrapper[36504]: I1203 22:37:34.818299 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-utilities\") pod \"redhat-operators-pdqmn\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:34.921526 master-0 kubenswrapper[36504]: I1203 22:37:34.921412 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-utilities\") pod \"redhat-operators-pdqmn\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:34.922088 master-0 kubenswrapper[36504]: I1203 22:37:34.921951 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88h2z\" (UniqueName: \"kubernetes.io/projected/a9c3b206-5750-4988-84d6-d1fc1627d632-kube-api-access-88h2z\") pod \"redhat-operators-pdqmn\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:34.922088 master-0 kubenswrapper[36504]: I1203 22:37:34.922010 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-catalog-content\") pod \"redhat-operators-pdqmn\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:34.922849 master-0 kubenswrapper[36504]: I1203 22:37:34.922814 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-catalog-content\") pod \"redhat-operators-pdqmn\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:34.923145 master-0 kubenswrapper[36504]: I1203 22:37:34.923087 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-utilities\") pod \"redhat-operators-pdqmn\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:34.955008 
master-0 kubenswrapper[36504]: I1203 22:37:34.954925 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88h2z\" (UniqueName: \"kubernetes.io/projected/a9c3b206-5750-4988-84d6-d1fc1627d632-kube-api-access-88h2z\") pod \"redhat-operators-pdqmn\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:35.115528 master-0 kubenswrapper[36504]: I1203 22:37:35.115475 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a" path="/var/lib/kubelet/pods/0ebd83a6-08e5-4f36-b253-fd52e0c6ab4a/volumes" Dec 03 22:37:35.120885 master-0 kubenswrapper[36504]: I1203 22:37:35.120851 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac6200f-b04a-499d-befd-180a060fa107" path="/var/lib/kubelet/pods/cac6200f-b04a-499d-befd-180a060fa107/volumes" Dec 03 22:37:35.122387 master-0 kubenswrapper[36504]: I1203 22:37:35.122361 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:35.667166 master-0 kubenswrapper[36504]: I1203 22:37:35.667101 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pdqmn"] Dec 03 22:37:35.732251 master-0 kubenswrapper[36504]: I1203 22:37:35.732182 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdqmn" event={"ID":"a9c3b206-5750-4988-84d6-d1fc1627d632","Type":"ContainerStarted","Data":"7bb5027e6b47e9dbe7ccb4e2e6bcdd45ee7a3d5354abc83d6d7f2c34c9cd74ca"} Dec 03 22:37:35.748056 master-0 kubenswrapper[36504]: E1203 22:37:35.747988 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:37:36.746161 master-0 kubenswrapper[36504]: I1203 22:37:36.746079 36504 generic.go:334] "Generic (PLEG): container finished" podID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerID="9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd" exitCode=0 Dec 03 22:37:36.746958 master-0 kubenswrapper[36504]: I1203 22:37:36.746183 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdqmn" event={"ID":"a9c3b206-5750-4988-84d6-d1fc1627d632","Type":"ContainerDied","Data":"9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd"} Dec 03 22:37:37.762048 master-0 kubenswrapper[36504]: I1203 22:37:37.761977 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdqmn" event={"ID":"a9c3b206-5750-4988-84d6-d1fc1627d632","Type":"ContainerStarted","Data":"9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23"} Dec 03 22:37:38.778634 master-0 kubenswrapper[36504]: I1203 22:37:38.778562 36504 generic.go:334] "Generic (PLEG): container finished" podID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerID="9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23" exitCode=0 Dec 03 22:37:38.779647 master-0 kubenswrapper[36504]: I1203 22:37:38.779571 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdqmn" 
event={"ID":"a9c3b206-5750-4988-84d6-d1fc1627d632","Type":"ContainerDied","Data":"9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23"} Dec 03 22:37:40.812553 master-0 kubenswrapper[36504]: I1203 22:37:40.812477 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdqmn" event={"ID":"a9c3b206-5750-4988-84d6-d1fc1627d632","Type":"ContainerStarted","Data":"bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7"} Dec 03 22:37:40.854266 master-0 kubenswrapper[36504]: I1203 22:37:40.853803 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pdqmn" podStartSLOduration=3.744891292 podStartE2EDuration="6.853763313s" podCreationTimestamp="2025-12-03 22:37:34 +0000 UTC" firstStartedPulling="2025-12-03 22:37:36.748493885 +0000 UTC m=+1621.968265892" lastFinishedPulling="2025-12-03 22:37:39.857365906 +0000 UTC m=+1625.077137913" observedRunningTime="2025-12-03 22:37:40.839236559 +0000 UTC m=+1626.059008576" watchObservedRunningTime="2025-12-03 22:37:40.853763313 +0000 UTC m=+1626.073535330" Dec 03 22:37:44.696170 master-0 kubenswrapper[36504]: I1203 22:37:44.696099 36504 scope.go:117] "RemoveContainer" containerID="821cb11c51b954e4d57104a6195f336f4d6ac6d743668fa80deb00a7ba68d5cc" Dec 03 22:37:44.725601 master-0 kubenswrapper[36504]: I1203 22:37:44.725533 36504 scope.go:117] "RemoveContainer" containerID="c0a35831890dae029eccdf90661de68d520117c870fcf940dfb47072e61bb8bf" Dec 03 22:37:44.797932 master-0 kubenswrapper[36504]: I1203 22:37:44.797874 36504 scope.go:117] "RemoveContainer" containerID="f0e9c5ed4f553b79cb34acd474a2008633c4b1f9f75e4cd26e4e2843eabf92f6" Dec 03 22:37:44.899386 master-0 kubenswrapper[36504]: I1203 22:37:44.899327 36504 scope.go:117] "RemoveContainer" containerID="88bbcb3ed67ebc1dfcb335b10768d701956846acbdefc3dfaac0ca483d36eeeb" Dec 03 22:37:44.928909 master-0 kubenswrapper[36504]: I1203 22:37:44.926620 36504 scope.go:117] "RemoveContainer" containerID="a4978bc2541d6eea4f37344e4ff6e83411dd9e7f06b5a532e0f86331d9b5593b" Dec 03 22:37:44.988954 master-0 kubenswrapper[36504]: I1203 22:37:44.988860 36504 scope.go:117] "RemoveContainer" containerID="82aa7af98106cf32c5d8e139da882a326a2f55e112fee3d8177eecbc252ab61d" Dec 03 22:37:45.050735 master-0 kubenswrapper[36504]: I1203 22:37:45.050446 36504 scope.go:117] "RemoveContainer" containerID="f086d403457a9dff4d9e008bda0b8e7a8feda5e5f24a4bb4be068d364eaf654d" Dec 03 22:37:45.086753 master-0 kubenswrapper[36504]: I1203 22:37:45.086675 36504 scope.go:117] "RemoveContainer" containerID="e3dc98d1f10f75204ba94845d8f275947fd6f668502cc1755ae53718f00e44e5" Dec 03 22:37:45.115196 master-0 kubenswrapper[36504]: I1203 22:37:45.113384 36504 scope.go:117] "RemoveContainer" containerID="556fdebdc51cb83ab8b96b22316114542b4ce68660f4fcb52774343ec06bac65" Dec 03 22:37:45.123130 master-0 kubenswrapper[36504]: I1203 22:37:45.123056 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:45.124126 master-0 kubenswrapper[36504]: I1203 22:37:45.123809 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:45.179502 master-0 kubenswrapper[36504]: I1203 22:37:45.179455 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:45.959755 master-0 kubenswrapper[36504]: I1203 22:37:45.959660 36504 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:46.031395 master-0 kubenswrapper[36504]: I1203 22:37:46.031197 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdqmn"] Dec 03 22:37:47.924857 master-0 kubenswrapper[36504]: I1203 22:37:47.924752 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pdqmn" podUID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerName="registry-server" containerID="cri-o://bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7" gracePeriod=2 Dec 03 22:37:48.519475 master-0 kubenswrapper[36504]: I1203 22:37:48.519435 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:48.663360 master-0 kubenswrapper[36504]: I1203 22:37:48.662174 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-utilities\") pod \"a9c3b206-5750-4988-84d6-d1fc1627d632\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " Dec 03 22:37:48.663360 master-0 kubenswrapper[36504]: I1203 22:37:48.663202 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-utilities" (OuterVolumeSpecName: "utilities") pod "a9c3b206-5750-4988-84d6-d1fc1627d632" (UID: "a9c3b206-5750-4988-84d6-d1fc1627d632"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:37:48.663718 master-0 kubenswrapper[36504]: I1203 22:37:48.663484 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-catalog-content\") pod \"a9c3b206-5750-4988-84d6-d1fc1627d632\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " Dec 03 22:37:48.674807 master-0 kubenswrapper[36504]: I1203 22:37:48.674731 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88h2z\" (UniqueName: \"kubernetes.io/projected/a9c3b206-5750-4988-84d6-d1fc1627d632-kube-api-access-88h2z\") pod \"a9c3b206-5750-4988-84d6-d1fc1627d632\" (UID: \"a9c3b206-5750-4988-84d6-d1fc1627d632\") " Dec 03 22:37:48.678599 master-0 kubenswrapper[36504]: I1203 22:37:48.678532 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c3b206-5750-4988-84d6-d1fc1627d632-kube-api-access-88h2z" (OuterVolumeSpecName: "kube-api-access-88h2z") pod "a9c3b206-5750-4988-84d6-d1fc1627d632" (UID: "a9c3b206-5750-4988-84d6-d1fc1627d632"). InnerVolumeSpecName "kube-api-access-88h2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:37:48.678873 master-0 kubenswrapper[36504]: I1203 22:37:48.678730 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:37:48.782297 master-0 kubenswrapper[36504]: I1203 22:37:48.782236 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88h2z\" (UniqueName: \"kubernetes.io/projected/a9c3b206-5750-4988-84d6-d1fc1627d632-kube-api-access-88h2z\") on node \"master-0\" DevicePath \"\"" Dec 03 22:37:48.799336 master-0 kubenswrapper[36504]: I1203 22:37:48.798748 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9c3b206-5750-4988-84d6-d1fc1627d632" (UID: "a9c3b206-5750-4988-84d6-d1fc1627d632"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:37:48.886462 master-0 kubenswrapper[36504]: I1203 22:37:48.886376 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9c3b206-5750-4988-84d6-d1fc1627d632-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:37:48.944583 master-0 kubenswrapper[36504]: I1203 22:37:48.944375 36504 generic.go:334] "Generic (PLEG): container finished" podID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerID="bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7" exitCode=0 Dec 03 22:37:48.944583 master-0 kubenswrapper[36504]: I1203 22:37:48.944456 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdqmn" event={"ID":"a9c3b206-5750-4988-84d6-d1fc1627d632","Type":"ContainerDied","Data":"bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7"} Dec 03 22:37:48.944583 master-0 kubenswrapper[36504]: I1203 22:37:48.944530 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pdqmn" event={"ID":"a9c3b206-5750-4988-84d6-d1fc1627d632","Type":"ContainerDied","Data":"7bb5027e6b47e9dbe7ccb4e2e6bcdd45ee7a3d5354abc83d6d7f2c34c9cd74ca"} Dec 03 22:37:48.944583 master-0 kubenswrapper[36504]: I1203 22:37:48.944560 36504 scope.go:117] "RemoveContainer" containerID="bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7" Dec 03 22:37:48.945456 master-0 kubenswrapper[36504]: I1203 22:37:48.944600 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pdqmn" Dec 03 22:37:48.999238 master-0 kubenswrapper[36504]: I1203 22:37:48.999154 36504 scope.go:117] "RemoveContainer" containerID="9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23" Dec 03 22:37:49.055092 master-0 kubenswrapper[36504]: I1203 22:37:49.054999 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pdqmn"] Dec 03 22:37:49.063597 master-0 kubenswrapper[36504]: I1203 22:37:49.063554 36504 scope.go:117] "RemoveContainer" containerID="9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd" Dec 03 22:37:49.080487 master-0 kubenswrapper[36504]: I1203 22:37:49.080397 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pdqmn"] Dec 03 22:37:49.118914 master-0 kubenswrapper[36504]: I1203 22:37:49.118798 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c3b206-5750-4988-84d6-d1fc1627d632" path="/var/lib/kubelet/pods/a9c3b206-5750-4988-84d6-d1fc1627d632/volumes" Dec 03 22:37:49.136086 master-0 kubenswrapper[36504]: I1203 22:37:49.136032 36504 scope.go:117] "RemoveContainer" containerID="bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7" Dec 03 22:37:49.136917 master-0 kubenswrapper[36504]: E1203 22:37:49.136753 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7\": container with ID starting with bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7 not found: ID does not exist" containerID="bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7" Dec 03 22:37:49.137010 master-0 kubenswrapper[36504]: I1203 22:37:49.136941 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7"} err="failed to get container status \"bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7\": rpc error: code = NotFound desc = could not find container \"bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7\": container with ID starting with bc00dc0b582a984a280edba05148b57a0875d8e46d888ca8e472a61465ad25a7 not found: ID does not exist" Dec 03 22:37:49.137010 master-0 kubenswrapper[36504]: I1203 22:37:49.136982 36504 scope.go:117] "RemoveContainer" containerID="9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23" Dec 03 22:37:49.137694 master-0 kubenswrapper[36504]: E1203 22:37:49.137665 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23\": container with ID starting with 9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23 not found: ID does not exist" containerID="9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23" Dec 03 22:37:49.137761 master-0 kubenswrapper[36504]: I1203 22:37:49.137705 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23"} err="failed to get container status \"9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23\": rpc error: code = NotFound desc = could not find container \"9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23\": container with ID starting 
with 9b77073bd99cd70658eb9f9223270aefc43625d89ccc97a4b68d0771df8a7c23 not found: ID does not exist" Dec 03 22:37:49.137761 master-0 kubenswrapper[36504]: I1203 22:37:49.137735 36504 scope.go:117] "RemoveContainer" containerID="9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd" Dec 03 22:37:49.138167 master-0 kubenswrapper[36504]: E1203 22:37:49.138119 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd\": container with ID starting with 9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd not found: ID does not exist" containerID="9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd" Dec 03 22:37:49.138243 master-0 kubenswrapper[36504]: I1203 22:37:49.138164 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd"} err="failed to get container status \"9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd\": rpc error: code = NotFound desc = could not find container \"9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd\": container with ID starting with 9dd7e4bd766e708236ccfa2814932eeb19dc8f2b75248f0625c543ec88bda7dd not found: ID does not exist" Dec 03 22:37:54.114547 master-0 kubenswrapper[36504]: I1203 22:37:54.114459 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9ckgg"] Dec 03 22:37:54.142456 master-0 kubenswrapper[36504]: I1203 22:37:54.142360 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9ckgg"] Dec 03 22:37:55.122726 master-0 kubenswrapper[36504]: I1203 22:37:55.116543 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa3c88c3-0ad0-4fc0-941c-52bde77db555" path="/var/lib/kubelet/pods/aa3c88c3-0ad0-4fc0-941c-52bde77db555/volumes" Dec 03 22:37:56.043909 master-0 kubenswrapper[36504]: I1203 22:37:56.043841 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-2lkqc"] Dec 03 22:37:56.061404 master-0 kubenswrapper[36504]: I1203 22:37:56.061326 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-2lkqc"] Dec 03 22:37:57.054728 master-0 kubenswrapper[36504]: I1203 22:37:57.054542 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nn2xp"] Dec 03 22:37:57.070484 master-0 kubenswrapper[36504]: I1203 22:37:57.070379 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nn2xp"] Dec 03 22:37:57.112849 master-0 kubenswrapper[36504]: I1203 22:37:57.112766 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a26d6e38-228d-4663-9826-d79e438d324e" path="/var/lib/kubelet/pods/a26d6e38-228d-4663-9826-d79e438d324e/volumes" Dec 03 22:37:57.114130 master-0 kubenswrapper[36504]: I1203 22:37:57.114106 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0b5f74-16f0-4381-86a2-4ba49c379be8" path="/var/lib/kubelet/pods/bc0b5f74-16f0-4381-86a2-4ba49c379be8/volumes" Dec 03 22:38:04.103707 master-0 kubenswrapper[36504]: I1203 22:38:04.103592 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:38:20.098887 master-0 kubenswrapper[36504]: I1203 22:38:20.098828 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:38:33.634853 master-0 kubenswrapper[36504]: I1203 22:38:33.634743 36504 generic.go:334] "Generic (PLEG): container finished" podID="73cc4b4d-689d-4eac-a022-97bace1bfb8c" containerID="991d29c724807770a429744dab614c328f5088ce62b69aa156db9ddf3e5c0152" exitCode=0 Dec 03 22:38:33.635624 master-0 kubenswrapper[36504]: I1203 22:38:33.634848 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" event={"ID":"73cc4b4d-689d-4eac-a022-97bace1bfb8c","Type":"ContainerDied","Data":"991d29c724807770a429744dab614c328f5088ce62b69aa156db9ddf3e5c0152"} Dec 03 22:38:35.246300 master-0 kubenswrapper[36504]: I1203 22:38:35.246215 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:38:35.375222 master-0 kubenswrapper[36504]: I1203 22:38:35.375156 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhptd\" (UniqueName: \"kubernetes.io/projected/73cc4b4d-689d-4eac-a022-97bace1bfb8c-kube-api-access-hhptd\") pod \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " Dec 03 22:38:35.375520 master-0 kubenswrapper[36504]: I1203 22:38:35.375437 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-ssh-key\") pod \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " Dec 03 22:38:35.375520 master-0 kubenswrapper[36504]: I1203 22:38:35.375491 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-nova-metadata-neutron-config-0\") pod \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " Dec 03 22:38:35.375520 master-0 kubenswrapper[36504]: I1203 22:38:35.375517 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-inventory\") pod \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " Dec 03 22:38:35.375719 master-0 kubenswrapper[36504]: I1203 22:38:35.375689 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-ovn-metadata-agent-neutron-config-0\") pod \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " Dec 03 22:38:35.375839 master-0 kubenswrapper[36504]: I1203 22:38:35.375797 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-metadata-combined-ca-bundle\") pod \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\" (UID: \"73cc4b4d-689d-4eac-a022-97bace1bfb8c\") " Dec 
03 22:38:35.379463 master-0 kubenswrapper[36504]: I1203 22:38:35.379345 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73cc4b4d-689d-4eac-a022-97bace1bfb8c-kube-api-access-hhptd" (OuterVolumeSpecName: "kube-api-access-hhptd") pod "73cc4b4d-689d-4eac-a022-97bace1bfb8c" (UID: "73cc4b4d-689d-4eac-a022-97bace1bfb8c"). InnerVolumeSpecName "kube-api-access-hhptd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:38:35.382726 master-0 kubenswrapper[36504]: I1203 22:38:35.382635 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "73cc4b4d-689d-4eac-a022-97bace1bfb8c" (UID: "73cc4b4d-689d-4eac-a022-97bace1bfb8c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:38:35.409869 master-0 kubenswrapper[36504]: I1203 22:38:35.409755 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "73cc4b4d-689d-4eac-a022-97bace1bfb8c" (UID: "73cc4b4d-689d-4eac-a022-97bace1bfb8c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:38:35.419601 master-0 kubenswrapper[36504]: I1203 22:38:35.419531 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-inventory" (OuterVolumeSpecName: "inventory") pod "73cc4b4d-689d-4eac-a022-97bace1bfb8c" (UID: "73cc4b4d-689d-4eac-a022-97bace1bfb8c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:38:35.422439 master-0 kubenswrapper[36504]: I1203 22:38:35.422400 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "73cc4b4d-689d-4eac-a022-97bace1bfb8c" (UID: "73cc4b4d-689d-4eac-a022-97bace1bfb8c"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:38:35.424012 master-0 kubenswrapper[36504]: I1203 22:38:35.423964 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "73cc4b4d-689d-4eac-a022-97bace1bfb8c" (UID: "73cc4b4d-689d-4eac-a022-97bace1bfb8c"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:38:35.481070 master-0 kubenswrapper[36504]: I1203 22:38:35.480931 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:38:35.481070 master-0 kubenswrapper[36504]: I1203 22:38:35.480998 36504 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-nova-metadata-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:38:35.481070 master-0 kubenswrapper[36504]: I1203 22:38:35.481013 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:38:35.481070 master-0 kubenswrapper[36504]: I1203 22:38:35.481023 36504 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-ovn-metadata-agent-neutron-config-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:38:35.481070 master-0 kubenswrapper[36504]: I1203 22:38:35.481036 36504 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc4b4d-689d-4eac-a022-97bace1bfb8c-neutron-metadata-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:38:35.481070 master-0 kubenswrapper[36504]: I1203 22:38:35.481059 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhptd\" (UniqueName: \"kubernetes.io/projected/73cc4b4d-689d-4eac-a022-97bace1bfb8c-kube-api-access-hhptd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:38:35.667090 master-0 kubenswrapper[36504]: I1203 22:38:35.666880 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" event={"ID":"73cc4b4d-689d-4eac-a022-97bace1bfb8c","Type":"ContainerDied","Data":"40e0a97d201af92a9e4eaeba0f7a29ed4252004ad7ea45f40ed0f6223b732b52"} Dec 03 22:38:35.667090 master-0 kubenswrapper[36504]: I1203 22:38:35.666940 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40e0a97d201af92a9e4eaeba0f7a29ed4252004ad7ea45f40ed0f6223b732b52" Dec 03 22:38:35.667090 master-0 kubenswrapper[36504]: I1203 22:38:35.667009 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-dataplane-services-edpm-r588g" Dec 03 22:38:35.738371 master-0 kubenswrapper[36504]: E1203 22:38:35.738300 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:38:35.781359 master-0 kubenswrapper[36504]: I1203 22:38:35.781246 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-dataplane-services-edpm-tb2h9"] Dec 03 22:38:35.782277 master-0 kubenswrapper[36504]: E1203 22:38:35.782236 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerName="extract-content" Dec 03 22:38:35.782277 master-0 kubenswrapper[36504]: I1203 22:38:35.782268 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerName="extract-content" Dec 03 22:38:35.782436 master-0 kubenswrapper[36504]: E1203 22:38:35.782307 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerName="registry-server" Dec 03 22:38:35.782436 master-0 kubenswrapper[36504]: I1203 22:38:35.782318 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerName="registry-server" Dec 03 22:38:35.782436 master-0 kubenswrapper[36504]: E1203 22:38:35.782383 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerName="extract-utilities" Dec 03 22:38:35.782436 master-0 kubenswrapper[36504]: I1203 22:38:35.782392 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerName="extract-utilities" Dec 03 22:38:35.782436 master-0 kubenswrapper[36504]: E1203 22:38:35.782407 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73cc4b4d-689d-4eac-a022-97bace1bfb8c" containerName="neutron-metadata-dataplane-services-edpm" Dec 03 22:38:35.782436 master-0 kubenswrapper[36504]: I1203 22:38:35.782413 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="73cc4b4d-689d-4eac-a022-97bace1bfb8c" containerName="neutron-metadata-dataplane-services-edpm" Dec 03 22:38:35.782845 master-0 kubenswrapper[36504]: I1203 22:38:35.782782 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c3b206-5750-4988-84d6-d1fc1627d632" containerName="registry-server" Dec 03 22:38:35.782845 master-0 kubenswrapper[36504]: I1203 22:38:35.782844 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="73cc4b4d-689d-4eac-a022-97bace1bfb8c" containerName="neutron-metadata-dataplane-services-edpm" Dec 03 22:38:35.784081 master-0 kubenswrapper[36504]: I1203 22:38:35.784056 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.786636 master-0 kubenswrapper[36504]: I1203 22:38:35.786548 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 22:38:35.787456 master-0 kubenswrapper[36504]: I1203 22:38:35.787416 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:38:35.787531 master-0 kubenswrapper[36504]: I1203 22:38:35.787487 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:38:35.787802 master-0 kubenswrapper[36504]: I1203 22:38:35.787750 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:38:35.795038 master-0 kubenswrapper[36504]: I1203 22:38:35.794969 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-ssh-key\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.795289 master-0 kubenswrapper[36504]: I1203 22:38:35.795250 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-combined-ca-bundle\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.795401 master-0 kubenswrapper[36504]: I1203 22:38:35.795305 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-inventory\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.795730 master-0 kubenswrapper[36504]: I1203 22:38:35.795702 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lmtd\" (UniqueName: \"kubernetes.io/projected/04707111-d21e-4af6-b198-76291f4c06d9-kube-api-access-5lmtd\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.795860 master-0 kubenswrapper[36504]: I1203 22:38:35.795844 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-secret-0\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.821719 master-0 kubenswrapper[36504]: I1203 22:38:35.821611 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-dataplane-services-edpm-tb2h9"] Dec 03 22:38:35.897661 master-0 kubenswrapper[36504]: I1203 22:38:35.897483 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-combined-ca-bundle\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: 
\"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.897661 master-0 kubenswrapper[36504]: I1203 22:38:35.897553 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-inventory\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.898074 master-0 kubenswrapper[36504]: I1203 22:38:35.898031 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lmtd\" (UniqueName: \"kubernetes.io/projected/04707111-d21e-4af6-b198-76291f4c06d9-kube-api-access-5lmtd\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.898074 master-0 kubenswrapper[36504]: I1203 22:38:35.898067 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-secret-0\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.898173 master-0 kubenswrapper[36504]: I1203 22:38:35.898130 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-ssh-key\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.902533 master-0 kubenswrapper[36504]: I1203 22:38:35.902497 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-inventory\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.902692 master-0 kubenswrapper[36504]: I1203 22:38:35.902606 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-ssh-key\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.902692 master-0 kubenswrapper[36504]: I1203 22:38:35.902606 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-secret-0\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.904342 master-0 kubenswrapper[36504]: I1203 22:38:35.904287 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-combined-ca-bundle\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:35.916333 master-0 kubenswrapper[36504]: I1203 22:38:35.916272 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-5lmtd\" (UniqueName: \"kubernetes.io/projected/04707111-d21e-4af6-b198-76291f4c06d9-kube-api-access-5lmtd\") pod \"libvirt-dataplane-services-edpm-tb2h9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:36.105122 master-0 kubenswrapper[36504]: I1203 22:38:36.104954 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:38:36.674054 master-0 kubenswrapper[36504]: W1203 22:38:36.673798 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04707111_d21e_4af6_b198_76291f4c06d9.slice/crio-a7dc0701414f9f615392d475255ff6d1f6df4a811868df8c2ffc8126e2d5c5a8 WatchSource:0}: Error finding container a7dc0701414f9f615392d475255ff6d1f6df4a811868df8c2ffc8126e2d5c5a8: Status 404 returned error can't find the container with id a7dc0701414f9f615392d475255ff6d1f6df4a811868df8c2ffc8126e2d5c5a8 Dec 03 22:38:36.680804 master-0 kubenswrapper[36504]: I1203 22:38:36.680721 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-dataplane-services-edpm-tb2h9"] Dec 03 22:38:37.704573 master-0 kubenswrapper[36504]: I1203 22:38:37.704485 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-dataplane-services-edpm-tb2h9" event={"ID":"04707111-d21e-4af6-b198-76291f4c06d9","Type":"ContainerStarted","Data":"4d28758097faaaa12c8d21e52b4a27a3036ffc555e5687bb593157069297ae16"} Dec 03 22:38:37.704573 master-0 kubenswrapper[36504]: I1203 22:38:37.704574 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-dataplane-services-edpm-tb2h9" event={"ID":"04707111-d21e-4af6-b198-76291f4c06d9","Type":"ContainerStarted","Data":"a7dc0701414f9f615392d475255ff6d1f6df4a811868df8c2ffc8126e2d5c5a8"} Dec 03 22:38:37.746309 master-0 kubenswrapper[36504]: I1203 22:38:37.745092 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-dataplane-services-edpm-tb2h9" podStartSLOduration=2.216074602 podStartE2EDuration="2.745056559s" podCreationTimestamp="2025-12-03 22:38:35 +0000 UTC" firstStartedPulling="2025-12-03 22:38:36.678505926 +0000 UTC m=+1681.898277933" lastFinishedPulling="2025-12-03 22:38:37.207487883 +0000 UTC m=+1682.427259890" observedRunningTime="2025-12-03 22:38:37.731274498 +0000 UTC m=+1682.951046515" watchObservedRunningTime="2025-12-03 22:38:37.745056559 +0000 UTC m=+1682.964828566" Dec 03 22:38:43.069829 master-0 kubenswrapper[36504]: I1203 22:38:43.069740 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzdd5"] Dec 03 22:38:43.087788 master-0 kubenswrapper[36504]: I1203 22:38:43.087708 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fzdd5"] Dec 03 22:38:43.118482 master-0 kubenswrapper[36504]: I1203 22:38:43.118395 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b524cbe9-1597-4642-a543-a29a92375c0d" path="/var/lib/kubelet/pods/b524cbe9-1597-4642-a543-a29a92375c0d/volumes" Dec 03 22:38:45.387443 master-0 kubenswrapper[36504]: I1203 22:38:45.387376 36504 scope.go:117] "RemoveContainer" containerID="649d688dfef8ebcff089c631cb163152da87a58648d7b1a66c36c2a31818aaf6" Dec 03 22:38:45.443448 master-0 kubenswrapper[36504]: I1203 22:38:45.443381 36504 scope.go:117] "RemoveContainer" 
containerID="f8361c0f5cf9a6c8c5afb7de59d0f6b25c830c2067447840afc674bc481aa945" Dec 03 22:38:45.525706 master-0 kubenswrapper[36504]: I1203 22:38:45.525644 36504 scope.go:117] "RemoveContainer" containerID="31d50adcee68d6ae036c3da047ab3a89d7d449701ffbbf10d419bc89bae5d3b0" Dec 03 22:38:45.587882 master-0 kubenswrapper[36504]: I1203 22:38:45.587825 36504 scope.go:117] "RemoveContainer" containerID="204f255c14881341d8537a4d603b58e261b7f1036050cd39c7ecc4c9e13f4d6c" Dec 03 22:39:34.096637 master-0 kubenswrapper[36504]: I1203 22:39:34.096546 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:39:35.085507 master-0 kubenswrapper[36504]: I1203 22:39:35.085415 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-5sqwd"] Dec 03 22:39:35.144697 master-0 kubenswrapper[36504]: I1203 22:39:35.144626 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-5sqwd"] Dec 03 22:39:35.726269 master-0 kubenswrapper[36504]: E1203 22:39:35.726193 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:39:37.058737 master-0 kubenswrapper[36504]: I1203 22:39:37.058528 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-ce39-account-create-update-ngk2l"] Dec 03 22:39:37.072953 master-0 kubenswrapper[36504]: I1203 22:39:37.072857 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-ce39-account-create-update-ngk2l"] Dec 03 22:39:37.114102 master-0 kubenswrapper[36504]: I1203 22:39:37.114033 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d55dcc-3278-423b-b8a5-f0677d010e28" path="/var/lib/kubelet/pods/23d55dcc-3278-423b-b8a5-f0677d010e28/volumes" Dec 03 22:39:37.114800 master-0 kubenswrapper[36504]: I1203 22:39:37.114748 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef8451d-cd60-4d12-8a93-231274fbccf2" path="/var/lib/kubelet/pods/eef8451d-cd60-4d12-8a93-231274fbccf2/volumes" Dec 03 22:39:43.065031 master-0 kubenswrapper[36504]: I1203 22:39:43.064927 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-c4sh9"] Dec 03 22:39:43.078252 master-0 kubenswrapper[36504]: I1203 22:39:43.078155 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-c4sh9"] Dec 03 22:39:43.125583 master-0 kubenswrapper[36504]: I1203 22:39:43.125469 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2682f06-2124-466c-8041-403c8a864821" path="/var/lib/kubelet/pods/f2682f06-2124-466c-8041-403c8a864821/volumes" Dec 03 22:39:44.064331 master-0 kubenswrapper[36504]: I1203 22:39:44.064212 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-c786-account-create-update-x6vz9"] Dec 03 22:39:44.082074 master-0 kubenswrapper[36504]: I1203 22:39:44.081961 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-c786-account-create-update-x6vz9"] 
Dec 03 22:39:45.119455 master-0 kubenswrapper[36504]: I1203 22:39:45.119346 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e10bb0-61c3-423b-b9b8-086b32e604ec" path="/var/lib/kubelet/pods/f8e10bb0-61c3-423b-b9b8-086b32e604ec/volumes" Dec 03 22:39:45.805693 master-0 kubenswrapper[36504]: I1203 22:39:45.804702 36504 scope.go:117] "RemoveContainer" containerID="5cdc24b245f4fe12309797c09b0449423c3401d8e3176e5d6eace3d1bd4afae6" Dec 03 22:39:45.843597 master-0 kubenswrapper[36504]: I1203 22:39:45.843553 36504 scope.go:117] "RemoveContainer" containerID="6ff51e950cdbb5d7a1e4aed3edc76b4bfb56d4e6eab93bf801e2e53f4e5b6510" Dec 03 22:39:45.899042 master-0 kubenswrapper[36504]: I1203 22:39:45.898993 36504 scope.go:117] "RemoveContainer" containerID="5da860a370d517d8bd16379dab0a91f8ac71bf4cff2427aacff9c3fe72f1bb30" Dec 03 22:39:45.959083 master-0 kubenswrapper[36504]: I1203 22:39:45.959015 36504 scope.go:117] "RemoveContainer" containerID="13147e5ac3b7e4b019bb8f8e41a53efa49011505dce1c173e6002716576cf6c2" Dec 03 22:39:50.096472 master-0 kubenswrapper[36504]: I1203 22:39:50.096330 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:40:33.675515 master-0 kubenswrapper[36504]: I1203 22:40:33.675426 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k5mxs"] Dec 03 22:40:33.680100 master-0 kubenswrapper[36504]: I1203 22:40:33.680057 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:33.694662 master-0 kubenswrapper[36504]: I1203 22:40:33.694584 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5mxs"] Dec 03 22:40:33.842508 master-0 kubenswrapper[36504]: I1203 22:40:33.842420 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-catalog-content\") pod \"community-operators-k5mxs\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:33.843564 master-0 kubenswrapper[36504]: I1203 22:40:33.843490 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-utilities\") pod \"community-operators-k5mxs\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:33.844411 master-0 kubenswrapper[36504]: I1203 22:40:33.844330 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klxdg\" (UniqueName: \"kubernetes.io/projected/e8d3636d-7640-447e-aa8c-1fd42e0679d8-kube-api-access-klxdg\") pod \"community-operators-k5mxs\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:33.947984 master-0 kubenswrapper[36504]: I1203 22:40:33.947592 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-utilities\") pod \"community-operators-k5mxs\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " 
pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:33.947984 master-0 kubenswrapper[36504]: I1203 22:40:33.947899 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klxdg\" (UniqueName: \"kubernetes.io/projected/e8d3636d-7640-447e-aa8c-1fd42e0679d8-kube-api-access-klxdg\") pod \"community-operators-k5mxs\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:33.947984 master-0 kubenswrapper[36504]: I1203 22:40:33.947949 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-catalog-content\") pod \"community-operators-k5mxs\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:33.948415 master-0 kubenswrapper[36504]: I1203 22:40:33.948365 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-utilities\") pod \"community-operators-k5mxs\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:33.949403 master-0 kubenswrapper[36504]: I1203 22:40:33.949375 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-catalog-content\") pod \"community-operators-k5mxs\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:33.968246 master-0 kubenswrapper[36504]: I1203 22:40:33.968199 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klxdg\" (UniqueName: \"kubernetes.io/projected/e8d3636d-7640-447e-aa8c-1fd42e0679d8-kube-api-access-klxdg\") pod \"community-operators-k5mxs\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:34.010539 master-0 kubenswrapper[36504]: I1203 22:40:34.010466 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:34.671601 master-0 kubenswrapper[36504]: W1203 22:40:34.671190 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8d3636d_7640_447e_aa8c_1fd42e0679d8.slice/crio-8d64f9d163eb65e409489feec55522d98850c3d4085ee3f87990cf2c2953b7ea WatchSource:0}: Error finding container 8d64f9d163eb65e409489feec55522d98850c3d4085ee3f87990cf2c2953b7ea: Status 404 returned error can't find the container with id 8d64f9d163eb65e409489feec55522d98850c3d4085ee3f87990cf2c2953b7ea Dec 03 22:40:34.677825 master-0 kubenswrapper[36504]: I1203 22:40:34.677426 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5mxs"] Dec 03 22:40:34.700843 master-0 kubenswrapper[36504]: I1203 22:40:34.699971 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jnwwt"] Dec 03 22:40:34.707968 master-0 kubenswrapper[36504]: I1203 22:40:34.707912 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:34.727887 master-0 kubenswrapper[36504]: I1203 22:40:34.727827 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnwwt"] Dec 03 22:40:34.788455 master-0 kubenswrapper[36504]: I1203 22:40:34.788237 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-catalog-content\") pod \"redhat-marketplace-jnwwt\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:34.789300 master-0 kubenswrapper[36504]: I1203 22:40:34.789200 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k29w\" (UniqueName: \"kubernetes.io/projected/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-kube-api-access-5k29w\") pod \"redhat-marketplace-jnwwt\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:34.789380 master-0 kubenswrapper[36504]: I1203 22:40:34.789339 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-utilities\") pod \"redhat-marketplace-jnwwt\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:34.893209 master-0 kubenswrapper[36504]: I1203 22:40:34.893130 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k29w\" (UniqueName: \"kubernetes.io/projected/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-kube-api-access-5k29w\") pod \"redhat-marketplace-jnwwt\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:34.893627 master-0 kubenswrapper[36504]: I1203 22:40:34.893601 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-utilities\") pod \"redhat-marketplace-jnwwt\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:34.893951 master-0 kubenswrapper[36504]: I1203 22:40:34.893929 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-catalog-content\") pod \"redhat-marketplace-jnwwt\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:34.894469 master-0 kubenswrapper[36504]: I1203 22:40:34.894421 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-utilities\") pod \"redhat-marketplace-jnwwt\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:34.894877 master-0 kubenswrapper[36504]: I1203 22:40:34.894811 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-catalog-content\") pod \"redhat-marketplace-jnwwt\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " 
pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:34.913874 master-0 kubenswrapper[36504]: I1203 22:40:34.913632 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k29w\" (UniqueName: \"kubernetes.io/projected/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-kube-api-access-5k29w\") pod \"redhat-marketplace-jnwwt\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:35.057634 master-0 kubenswrapper[36504]: I1203 22:40:35.057555 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:35.531656 master-0 kubenswrapper[36504]: I1203 22:40:35.531456 36504 generic.go:334] "Generic (PLEG): container finished" podID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerID="732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062" exitCode=0 Dec 03 22:40:35.531656 master-0 kubenswrapper[36504]: I1203 22:40:35.531534 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5mxs" event={"ID":"e8d3636d-7640-447e-aa8c-1fd42e0679d8","Type":"ContainerDied","Data":"732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062"} Dec 03 22:40:35.531656 master-0 kubenswrapper[36504]: I1203 22:40:35.531579 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5mxs" event={"ID":"e8d3636d-7640-447e-aa8c-1fd42e0679d8","Type":"ContainerStarted","Data":"8d64f9d163eb65e409489feec55522d98850c3d4085ee3f87990cf2c2953b7ea"} Dec 03 22:40:35.622414 master-0 kubenswrapper[36504]: W1203 22:40:35.622348 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782ee702_cc5d_4fd5_9efd_a3b97138bb2d.slice/crio-265521cf2f444e18fb681696dafb4280762b9cf1ab897e190eaf2133a5c8bc50 WatchSource:0}: Error finding container 265521cf2f444e18fb681696dafb4280762b9cf1ab897e190eaf2133a5c8bc50: Status 404 returned error can't find the container with id 265521cf2f444e18fb681696dafb4280762b9cf1ab897e190eaf2133a5c8bc50 Dec 03 22:40:35.622560 master-0 kubenswrapper[36504]: I1203 22:40:35.622423 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnwwt"] Dec 03 22:40:35.733719 master-0 kubenswrapper[36504]: E1203 22:40:35.733639 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:40:36.559274 master-0 kubenswrapper[36504]: I1203 22:40:36.559203 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5mxs" event={"ID":"e8d3636d-7640-447e-aa8c-1fd42e0679d8","Type":"ContainerStarted","Data":"eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6"} Dec 03 22:40:36.562409 master-0 kubenswrapper[36504]: I1203 22:40:36.562349 36504 generic.go:334] "Generic (PLEG): container finished" podID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerID="738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef" exitCode=0 Dec 03 22:40:36.562488 master-0 kubenswrapper[36504]: I1203 
22:40:36.562418 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnwwt" event={"ID":"782ee702-cc5d-4fd5-9efd-a3b97138bb2d","Type":"ContainerDied","Data":"738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef"} Dec 03 22:40:36.562488 master-0 kubenswrapper[36504]: I1203 22:40:36.562452 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnwwt" event={"ID":"782ee702-cc5d-4fd5-9efd-a3b97138bb2d","Type":"ContainerStarted","Data":"265521cf2f444e18fb681696dafb4280762b9cf1ab897e190eaf2133a5c8bc50"} Dec 03 22:40:37.585133 master-0 kubenswrapper[36504]: I1203 22:40:37.585044 36504 generic.go:334] "Generic (PLEG): container finished" podID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerID="eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6" exitCode=0 Dec 03 22:40:37.588981 master-0 kubenswrapper[36504]: I1203 22:40:37.585161 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5mxs" event={"ID":"e8d3636d-7640-447e-aa8c-1fd42e0679d8","Type":"ContainerDied","Data":"eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6"} Dec 03 22:40:37.596513 master-0 kubenswrapper[36504]: I1203 22:40:37.596447 36504 generic.go:334] "Generic (PLEG): container finished" podID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerID="64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87" exitCode=0 Dec 03 22:40:37.596639 master-0 kubenswrapper[36504]: I1203 22:40:37.596516 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnwwt" event={"ID":"782ee702-cc5d-4fd5-9efd-a3b97138bb2d","Type":"ContainerDied","Data":"64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87"} Dec 03 22:40:38.613684 master-0 kubenswrapper[36504]: I1203 22:40:38.613496 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnwwt" event={"ID":"782ee702-cc5d-4fd5-9efd-a3b97138bb2d","Type":"ContainerStarted","Data":"758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c"} Dec 03 22:40:38.617504 master-0 kubenswrapper[36504]: I1203 22:40:38.617452 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5mxs" event={"ID":"e8d3636d-7640-447e-aa8c-1fd42e0679d8","Type":"ContainerStarted","Data":"7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7"} Dec 03 22:40:38.653685 master-0 kubenswrapper[36504]: I1203 22:40:38.653564 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jnwwt" podStartSLOduration=3.227088788 podStartE2EDuration="4.653534414s" podCreationTimestamp="2025-12-03 22:40:34 +0000 UTC" firstStartedPulling="2025-12-03 22:40:36.565498971 +0000 UTC m=+1801.785270978" lastFinishedPulling="2025-12-03 22:40:37.991944597 +0000 UTC m=+1803.211716604" observedRunningTime="2025-12-03 22:40:38.638194669 +0000 UTC m=+1803.857966676" watchObservedRunningTime="2025-12-03 22:40:38.653534414 +0000 UTC m=+1803.873306421" Dec 03 22:40:38.675255 master-0 kubenswrapper[36504]: I1203 22:40:38.675147 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k5mxs" podStartSLOduration=3.223785739 podStartE2EDuration="5.675120384s" podCreationTimestamp="2025-12-03 22:40:33 +0000 UTC" firstStartedPulling="2025-12-03 22:40:35.534247635 +0000 UTC m=+1800.754019662" 
lastFinishedPulling="2025-12-03 22:40:37.9855823 +0000 UTC m=+1803.205354307" observedRunningTime="2025-12-03 22:40:38.668184988 +0000 UTC m=+1803.887957055" watchObservedRunningTime="2025-12-03 22:40:38.675120384 +0000 UTC m=+1803.894892391" Dec 03 22:40:44.011234 master-0 kubenswrapper[36504]: I1203 22:40:44.011042 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:44.012018 master-0 kubenswrapper[36504]: I1203 22:40:44.011600 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:44.068838 master-0 kubenswrapper[36504]: I1203 22:40:44.068711 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:44.781886 master-0 kubenswrapper[36504]: I1203 22:40:44.781702 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:44.855719 master-0 kubenswrapper[36504]: I1203 22:40:44.855649 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k5mxs"] Dec 03 22:40:45.058484 master-0 kubenswrapper[36504]: I1203 22:40:45.058300 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:45.058484 master-0 kubenswrapper[36504]: I1203 22:40:45.058369 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:45.109116 master-0 kubenswrapper[36504]: I1203 22:40:45.109044 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:40:45.125370 master-0 kubenswrapper[36504]: I1203 22:40:45.125283 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:45.789973 master-0 kubenswrapper[36504]: I1203 22:40:45.789883 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:46.748494 master-0 kubenswrapper[36504]: I1203 22:40:46.748432 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k5mxs" podUID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerName="registry-server" containerID="cri-o://7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7" gracePeriod=2 Dec 03 22:40:46.846512 master-0 kubenswrapper[36504]: I1203 22:40:46.846433 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnwwt"] Dec 03 22:40:47.368941 master-0 kubenswrapper[36504]: I1203 22:40:47.363077 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:47.507866 master-0 kubenswrapper[36504]: I1203 22:40:47.507698 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-catalog-content\") pod \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " Dec 03 22:40:47.507866 master-0 kubenswrapper[36504]: I1203 22:40:47.507762 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-utilities\") pod \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " Dec 03 22:40:47.508301 master-0 kubenswrapper[36504]: I1203 22:40:47.508270 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klxdg\" (UniqueName: \"kubernetes.io/projected/e8d3636d-7640-447e-aa8c-1fd42e0679d8-kube-api-access-klxdg\") pod \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\" (UID: \"e8d3636d-7640-447e-aa8c-1fd42e0679d8\") " Dec 03 22:40:47.508498 master-0 kubenswrapper[36504]: I1203 22:40:47.508463 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-utilities" (OuterVolumeSpecName: "utilities") pod "e8d3636d-7640-447e-aa8c-1fd42e0679d8" (UID: "e8d3636d-7640-447e-aa8c-1fd42e0679d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:40:47.509545 master-0 kubenswrapper[36504]: I1203 22:40:47.509513 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:40:47.512044 master-0 kubenswrapper[36504]: I1203 22:40:47.511984 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8d3636d-7640-447e-aa8c-1fd42e0679d8-kube-api-access-klxdg" (OuterVolumeSpecName: "kube-api-access-klxdg") pod "e8d3636d-7640-447e-aa8c-1fd42e0679d8" (UID: "e8d3636d-7640-447e-aa8c-1fd42e0679d8"). InnerVolumeSpecName "kube-api-access-klxdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:40:47.559882 master-0 kubenswrapper[36504]: I1203 22:40:47.559714 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8d3636d-7640-447e-aa8c-1fd42e0679d8" (UID: "e8d3636d-7640-447e-aa8c-1fd42e0679d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:40:47.612863 master-0 kubenswrapper[36504]: I1203 22:40:47.612756 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8d3636d-7640-447e-aa8c-1fd42e0679d8-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:40:47.612863 master-0 kubenswrapper[36504]: I1203 22:40:47.612837 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klxdg\" (UniqueName: \"kubernetes.io/projected/e8d3636d-7640-447e-aa8c-1fd42e0679d8-kube-api-access-klxdg\") on node \"master-0\" DevicePath \"\"" Dec 03 22:40:47.767667 master-0 kubenswrapper[36504]: I1203 22:40:47.767501 36504 generic.go:334] "Generic (PLEG): container finished" podID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerID="7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7" exitCode=0 Dec 03 22:40:47.767667 master-0 kubenswrapper[36504]: I1203 22:40:47.767576 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5mxs" Dec 03 22:40:47.768639 master-0 kubenswrapper[36504]: I1203 22:40:47.767560 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5mxs" event={"ID":"e8d3636d-7640-447e-aa8c-1fd42e0679d8","Type":"ContainerDied","Data":"7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7"} Dec 03 22:40:47.768639 master-0 kubenswrapper[36504]: I1203 22:40:47.767711 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5mxs" event={"ID":"e8d3636d-7640-447e-aa8c-1fd42e0679d8","Type":"ContainerDied","Data":"8d64f9d163eb65e409489feec55522d98850c3d4085ee3f87990cf2c2953b7ea"} Dec 03 22:40:47.768639 master-0 kubenswrapper[36504]: I1203 22:40:47.767733 36504 scope.go:117] "RemoveContainer" containerID="7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7" Dec 03 22:40:47.768639 master-0 kubenswrapper[36504]: I1203 22:40:47.768149 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jnwwt" podUID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerName="registry-server" containerID="cri-o://758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c" gracePeriod=2 Dec 03 22:40:47.831534 master-0 kubenswrapper[36504]: I1203 22:40:47.823240 36504 scope.go:117] "RemoveContainer" containerID="eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6" Dec 03 22:40:47.831534 master-0 kubenswrapper[36504]: I1203 22:40:47.823402 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k5mxs"] Dec 03 22:40:47.838276 master-0 kubenswrapper[36504]: I1203 22:40:47.838215 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k5mxs"] Dec 03 22:40:47.876456 master-0 kubenswrapper[36504]: I1203 22:40:47.876411 36504 scope.go:117] "RemoveContainer" containerID="732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062" Dec 03 22:40:48.027070 master-0 kubenswrapper[36504]: I1203 22:40:48.026988 36504 scope.go:117] "RemoveContainer" containerID="7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7" Dec 03 22:40:48.027854 master-0 kubenswrapper[36504]: E1203 22:40:48.027763 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7\": container with ID starting with 7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7 not found: ID does not exist" containerID="7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7" Dec 03 22:40:48.027930 master-0 kubenswrapper[36504]: I1203 22:40:48.027885 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7"} err="failed to get container status \"7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7\": rpc error: code = NotFound desc = could not find container \"7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7\": container with ID starting with 7f58f851b8657c3c7a2886ab4aa2fef535fdcc2d4579bc23ae82403999b5c6c7 not found: ID does not exist" Dec 03 22:40:48.027930 master-0 kubenswrapper[36504]: I1203 22:40:48.027929 36504 scope.go:117] "RemoveContainer" containerID="eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6" Dec 03 22:40:48.028331 master-0 kubenswrapper[36504]: E1203 22:40:48.028283 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6\": container with ID starting with eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6 not found: ID does not exist" containerID="eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6" Dec 03 22:40:48.028391 master-0 kubenswrapper[36504]: I1203 22:40:48.028337 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6"} err="failed to get container status \"eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6\": rpc error: code = NotFound desc = could not find container \"eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6\": container with ID starting with eb3d11044524bdf5790cd00f463a3a71cd365134e0f4f159ef817f5b38db57d6 not found: ID does not exist" Dec 03 22:40:48.028391 master-0 kubenswrapper[36504]: I1203 22:40:48.028370 36504 scope.go:117] "RemoveContainer" containerID="732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062" Dec 03 22:40:48.028799 master-0 kubenswrapper[36504]: E1203 22:40:48.028723 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062\": container with ID starting with 732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062 not found: ID does not exist" containerID="732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062" Dec 03 22:40:48.028869 master-0 kubenswrapper[36504]: I1203 22:40:48.028802 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062"} err="failed to get container status \"732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062\": rpc error: code = NotFound desc = could not find container \"732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062\": container with ID starting with 732c1872f06eb5e813fc1fd9abb4cfe6e4fcececdde44fd970f798dc47400062 not found: ID does not exist" Dec 03 22:40:48.379473 master-0 kubenswrapper[36504]: I1203 22:40:48.379436 36504 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:48.441498 master-0 kubenswrapper[36504]: I1203 22:40:48.441421 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-catalog-content\") pod \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " Dec 03 22:40:48.441498 master-0 kubenswrapper[36504]: I1203 22:40:48.441496 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-utilities\") pod \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " Dec 03 22:40:48.441919 master-0 kubenswrapper[36504]: I1203 22:40:48.441802 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k29w\" (UniqueName: \"kubernetes.io/projected/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-kube-api-access-5k29w\") pod \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\" (UID: \"782ee702-cc5d-4fd5-9efd-a3b97138bb2d\") " Dec 03 22:40:48.444960 master-0 kubenswrapper[36504]: I1203 22:40:48.444896 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-utilities" (OuterVolumeSpecName: "utilities") pod "782ee702-cc5d-4fd5-9efd-a3b97138bb2d" (UID: "782ee702-cc5d-4fd5-9efd-a3b97138bb2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:40:48.447755 master-0 kubenswrapper[36504]: I1203 22:40:48.447686 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-kube-api-access-5k29w" (OuterVolumeSpecName: "kube-api-access-5k29w") pod "782ee702-cc5d-4fd5-9efd-a3b97138bb2d" (UID: "782ee702-cc5d-4fd5-9efd-a3b97138bb2d"). InnerVolumeSpecName "kube-api-access-5k29w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:40:48.462420 master-0 kubenswrapper[36504]: I1203 22:40:48.462328 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "782ee702-cc5d-4fd5-9efd-a3b97138bb2d" (UID: "782ee702-cc5d-4fd5-9efd-a3b97138bb2d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:40:48.546588 master-0 kubenswrapper[36504]: I1203 22:40:48.546403 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:40:48.546588 master-0 kubenswrapper[36504]: I1203 22:40:48.546471 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:40:48.546588 master-0 kubenswrapper[36504]: I1203 22:40:48.546492 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k29w\" (UniqueName: \"kubernetes.io/projected/782ee702-cc5d-4fd5-9efd-a3b97138bb2d-kube-api-access-5k29w\") on node \"master-0\" DevicePath \"\"" Dec 03 22:40:48.787616 master-0 kubenswrapper[36504]: I1203 22:40:48.787552 36504 generic.go:334] "Generic (PLEG): container finished" podID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerID="758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c" exitCode=0 Dec 03 22:40:48.787616 master-0 kubenswrapper[36504]: I1203 22:40:48.787605 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnwwt" event={"ID":"782ee702-cc5d-4fd5-9efd-a3b97138bb2d","Type":"ContainerDied","Data":"758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c"} Dec 03 22:40:48.788390 master-0 kubenswrapper[36504]: I1203 22:40:48.787667 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnwwt" Dec 03 22:40:48.788390 master-0 kubenswrapper[36504]: I1203 22:40:48.787694 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnwwt" event={"ID":"782ee702-cc5d-4fd5-9efd-a3b97138bb2d","Type":"ContainerDied","Data":"265521cf2f444e18fb681696dafb4280762b9cf1ab897e190eaf2133a5c8bc50"} Dec 03 22:40:48.788390 master-0 kubenswrapper[36504]: I1203 22:40:48.787744 36504 scope.go:117] "RemoveContainer" containerID="758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c" Dec 03 22:40:48.814798 master-0 kubenswrapper[36504]: I1203 22:40:48.814733 36504 scope.go:117] "RemoveContainer" containerID="64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87" Dec 03 22:40:48.871799 master-0 kubenswrapper[36504]: I1203 22:40:48.871610 36504 scope.go:117] "RemoveContainer" containerID="738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef" Dec 03 22:40:48.876136 master-0 kubenswrapper[36504]: I1203 22:40:48.876067 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnwwt"] Dec 03 22:40:48.897639 master-0 kubenswrapper[36504]: I1203 22:40:48.897547 36504 scope.go:117] "RemoveContainer" containerID="758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c" Dec 03 22:40:48.898034 master-0 kubenswrapper[36504]: I1203 22:40:48.897765 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnwwt"] Dec 03 22:40:48.898443 master-0 kubenswrapper[36504]: E1203 22:40:48.898384 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c\": container with ID starting with 
758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c not found: ID does not exist" containerID="758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c" Dec 03 22:40:48.898524 master-0 kubenswrapper[36504]: I1203 22:40:48.898460 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c"} err="failed to get container status \"758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c\": rpc error: code = NotFound desc = could not find container \"758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c\": container with ID starting with 758579d21ddecf52e2d6d88628319f491658a8a7f41862fce2dbd829853def7c not found: ID does not exist" Dec 03 22:40:48.898524 master-0 kubenswrapper[36504]: I1203 22:40:48.898515 36504 scope.go:117] "RemoveContainer" containerID="64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87" Dec 03 22:40:48.899091 master-0 kubenswrapper[36504]: E1203 22:40:48.899042 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87\": container with ID starting with 64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87 not found: ID does not exist" containerID="64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87" Dec 03 22:40:48.899174 master-0 kubenswrapper[36504]: I1203 22:40:48.899093 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87"} err="failed to get container status \"64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87\": rpc error: code = NotFound desc = could not find container \"64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87\": container with ID starting with 64b46a0a8e620d73fb6aac1f446bebe4babb8d687b7b2201c04e766fbbefda87 not found: ID does not exist" Dec 03 22:40:48.899174 master-0 kubenswrapper[36504]: I1203 22:40:48.899135 36504 scope.go:117] "RemoveContainer" containerID="738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef" Dec 03 22:40:48.899808 master-0 kubenswrapper[36504]: E1203 22:40:48.899743 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef\": container with ID starting with 738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef not found: ID does not exist" containerID="738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef" Dec 03 22:40:48.899893 master-0 kubenswrapper[36504]: I1203 22:40:48.899814 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef"} err="failed to get container status \"738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef\": rpc error: code = NotFound desc = could not find container \"738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef\": container with ID starting with 738d634ecf12e31c7d62fc1182ba45ca562915c68fead0432153ee500da27fef not found: ID does not exist" Dec 03 22:40:49.115921 master-0 kubenswrapper[36504]: I1203 22:40:49.113467 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" 
path="/var/lib/kubelet/pods/782ee702-cc5d-4fd5-9efd-a3b97138bb2d/volumes" Dec 03 22:40:49.115921 master-0 kubenswrapper[36504]: I1203 22:40:49.114420 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" path="/var/lib/kubelet/pods/e8d3636d-7640-447e-aa8c-1fd42e0679d8/volumes" Dec 03 22:40:58.067901 master-0 kubenswrapper[36504]: I1203 22:40:58.067809 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-nbld9"] Dec 03 22:40:58.083486 master-0 kubenswrapper[36504]: I1203 22:40:58.083413 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-nbld9"] Dec 03 22:40:59.097218 master-0 kubenswrapper[36504]: I1203 22:40:59.097126 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:40:59.125305 master-0 kubenswrapper[36504]: I1203 22:40:59.124598 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c78c17-5e9b-4583-ab1c-6067e305c3de" path="/var/lib/kubelet/pods/a6c78c17-5e9b-4583-ab1c-6067e305c3de/volumes" Dec 03 22:41:35.727189 master-0 kubenswrapper[36504]: E1203 22:41:35.727113 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:41:46.157619 master-0 kubenswrapper[36504]: I1203 22:41:46.157541 36504 scope.go:117] "RemoveContainer" containerID="2dbd9e1c9414c257cc93f177b87fc808d48e40e22cfe21c24b9d5cbf30251c3f" Dec 03 22:41:46.199707 master-0 kubenswrapper[36504]: I1203 22:41:46.199665 36504 scope.go:117] "RemoveContainer" containerID="bb44caac1b06e75cd223259e13c1684297a0475c7dc2eb3ebba39a5d8ededb79" Dec 03 22:42:00.095803 master-0 kubenswrapper[36504]: I1203 22:42:00.095715 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:42:04.095730 master-0 kubenswrapper[36504]: I1203 22:42:04.095650 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:42:10.943466 master-0 kubenswrapper[36504]: I1203 22:42:10.943403 36504 generic.go:334] "Generic (PLEG): container finished" podID="04707111-d21e-4af6-b198-76291f4c06d9" containerID="4d28758097faaaa12c8d21e52b4a27a3036ffc555e5687bb593157069297ae16" exitCode=0 Dec 03 22:42:10.944221 master-0 kubenswrapper[36504]: I1203 22:42:10.943502 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-dataplane-services-edpm-tb2h9" event={"ID":"04707111-d21e-4af6-b198-76291f4c06d9","Type":"ContainerDied","Data":"4d28758097faaaa12c8d21e52b4a27a3036ffc555e5687bb593157069297ae16"} Dec 03 22:42:12.574217 master-0 kubenswrapper[36504]: I1203 22:42:12.573664 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:42:12.686682 master-0 kubenswrapper[36504]: I1203 22:42:12.686187 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-ssh-key\") pod \"04707111-d21e-4af6-b198-76291f4c06d9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " Dec 03 22:42:12.687077 master-0 kubenswrapper[36504]: I1203 22:42:12.687043 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-combined-ca-bundle\") pod \"04707111-d21e-4af6-b198-76291f4c06d9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " Dec 03 22:42:12.687188 master-0 kubenswrapper[36504]: I1203 22:42:12.687158 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-secret-0\") pod \"04707111-d21e-4af6-b198-76291f4c06d9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " Dec 03 22:42:12.687350 master-0 kubenswrapper[36504]: I1203 22:42:12.687325 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lmtd\" (UniqueName: \"kubernetes.io/projected/04707111-d21e-4af6-b198-76291f4c06d9-kube-api-access-5lmtd\") pod \"04707111-d21e-4af6-b198-76291f4c06d9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " Dec 03 22:42:12.687423 master-0 kubenswrapper[36504]: I1203 22:42:12.687399 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-inventory\") pod \"04707111-d21e-4af6-b198-76291f4c06d9\" (UID: \"04707111-d21e-4af6-b198-76291f4c06d9\") " Dec 03 22:42:12.690900 master-0 kubenswrapper[36504]: I1203 22:42:12.690829 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "04707111-d21e-4af6-b198-76291f4c06d9" (UID: "04707111-d21e-4af6-b198-76291f4c06d9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:12.691992 master-0 kubenswrapper[36504]: I1203 22:42:12.691924 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04707111-d21e-4af6-b198-76291f4c06d9-kube-api-access-5lmtd" (OuterVolumeSpecName: "kube-api-access-5lmtd") pod "04707111-d21e-4af6-b198-76291f4c06d9" (UID: "04707111-d21e-4af6-b198-76291f4c06d9"). InnerVolumeSpecName "kube-api-access-5lmtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:42:12.725309 master-0 kubenswrapper[36504]: I1203 22:42:12.725173 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "04707111-d21e-4af6-b198-76291f4c06d9" (UID: "04707111-d21e-4af6-b198-76291f4c06d9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:12.727461 master-0 kubenswrapper[36504]: I1203 22:42:12.727394 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "04707111-d21e-4af6-b198-76291f4c06d9" (UID: "04707111-d21e-4af6-b198-76291f4c06d9"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:12.728007 master-0 kubenswrapper[36504]: I1203 22:42:12.727969 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-inventory" (OuterVolumeSpecName: "inventory") pod "04707111-d21e-4af6-b198-76291f4c06d9" (UID: "04707111-d21e-4af6-b198-76291f4c06d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:42:12.794461 master-0 kubenswrapper[36504]: I1203 22:42:12.793953 36504 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:42:12.794461 master-0 kubenswrapper[36504]: I1203 22:42:12.794006 36504 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-libvirt-secret-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:42:12.794461 master-0 kubenswrapper[36504]: I1203 22:42:12.794031 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lmtd\" (UniqueName: \"kubernetes.io/projected/04707111-d21e-4af6-b198-76291f4c06d9-kube-api-access-5lmtd\") on node \"master-0\" DevicePath \"\"" Dec 03 22:42:12.794461 master-0 kubenswrapper[36504]: I1203 22:42:12.794053 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:42:12.794461 master-0 kubenswrapper[36504]: I1203 22:42:12.794064 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/04707111-d21e-4af6-b198-76291f4c06d9-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:42:12.972620 master-0 kubenswrapper[36504]: I1203 22:42:12.972442 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-dataplane-services-edpm-tb2h9" event={"ID":"04707111-d21e-4af6-b198-76291f4c06d9","Type":"ContainerDied","Data":"a7dc0701414f9f615392d475255ff6d1f6df4a811868df8c2ffc8126e2d5c5a8"} Dec 03 22:42:12.972620 master-0 kubenswrapper[36504]: I1203 22:42:12.972508 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7dc0701414f9f615392d475255ff6d1f6df4a811868df8c2ffc8126e2d5c5a8" Dec 03 22:42:12.972620 master-0 kubenswrapper[36504]: I1203 22:42:12.972596 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-dataplane-services-edpm-tb2h9" Dec 03 22:42:13.119452 master-0 kubenswrapper[36504]: I1203 22:42:13.119364 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-dataplane-services-edpm-ldv7n"] Dec 03 22:42:13.120458 master-0 kubenswrapper[36504]: E1203 22:42:13.120419 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerName="extract-content" Dec 03 22:42:13.120458 master-0 kubenswrapper[36504]: I1203 22:42:13.120452 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerName="extract-content" Dec 03 22:42:13.120580 master-0 kubenswrapper[36504]: E1203 22:42:13.120468 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerName="registry-server" Dec 03 22:42:13.120580 master-0 kubenswrapper[36504]: I1203 22:42:13.120477 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerName="registry-server" Dec 03 22:42:13.120580 master-0 kubenswrapper[36504]: E1203 22:42:13.120535 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerName="extract-utilities" Dec 03 22:42:13.120580 master-0 kubenswrapper[36504]: I1203 22:42:13.120546 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerName="extract-utilities" Dec 03 22:42:13.120580 master-0 kubenswrapper[36504]: E1203 22:42:13.120563 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerName="extract-content" Dec 03 22:42:13.120580 master-0 kubenswrapper[36504]: I1203 22:42:13.120572 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerName="extract-content" Dec 03 22:42:13.120580 master-0 kubenswrapper[36504]: E1203 22:42:13.120584 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerName="registry-server" Dec 03 22:42:13.120876 master-0 kubenswrapper[36504]: I1203 22:42:13.120595 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerName="registry-server" Dec 03 22:42:13.120876 master-0 kubenswrapper[36504]: E1203 22:42:13.120631 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerName="extract-utilities" Dec 03 22:42:13.120876 master-0 kubenswrapper[36504]: I1203 22:42:13.120642 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerName="extract-utilities" Dec 03 22:42:13.120876 master-0 kubenswrapper[36504]: E1203 22:42:13.120667 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04707111-d21e-4af6-b198-76291f4c06d9" containerName="libvirt-dataplane-services-edpm" Dec 03 22:42:13.120876 master-0 kubenswrapper[36504]: I1203 22:42:13.120675 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="04707111-d21e-4af6-b198-76291f4c06d9" containerName="libvirt-dataplane-services-edpm" Dec 03 22:42:13.121103 master-0 kubenswrapper[36504]: I1203 22:42:13.121070 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8d3636d-7640-447e-aa8c-1fd42e0679d8" containerName="registry-server" Dec 03 22:42:13.121147 master-0 kubenswrapper[36504]: I1203 
22:42:13.121121 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="04707111-d21e-4af6-b198-76291f4c06d9" containerName="libvirt-dataplane-services-edpm" Dec 03 22:42:13.121185 master-0 kubenswrapper[36504]: I1203 22:42:13.121154 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="782ee702-cc5d-4fd5-9efd-a3b97138bb2d" containerName="registry-server" Dec 03 22:42:13.122715 master-0 kubenswrapper[36504]: I1203 22:42:13.122678 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.126492 master-0 kubenswrapper[36504]: I1203 22:42:13.126444 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:42:13.126680 master-0 kubenswrapper[36504]: I1203 22:42:13.126653 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:42:13.126849 master-0 kubenswrapper[36504]: I1203 22:42:13.126812 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 22:42:13.126975 master-0 kubenswrapper[36504]: I1203 22:42:13.126948 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 22:42:13.127179 master-0 kubenswrapper[36504]: I1203 22:42:13.127152 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:42:13.160629 master-0 kubenswrapper[36504]: I1203 22:42:13.155983 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-dataplane-services-edpm-ldv7n"] Dec 03 22:42:13.309357 master-0 kubenswrapper[36504]: I1203 22:42:13.309198 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-1\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.309357 master-0 kubenswrapper[36504]: I1203 22:42:13.309258 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-inventory\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.309357 master-0 kubenswrapper[36504]: I1203 22:42:13.309313 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-1\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.309810 master-0 kubenswrapper[36504]: I1203 22:42:13.309696 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-ssh-key\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.309810 master-0 kubenswrapper[36504]: I1203 22:42:13.309749 36504 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-0\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.309988 master-0 kubenswrapper[36504]: I1203 22:42:13.309949 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-combined-ca-bundle\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.310038 master-0 kubenswrapper[36504]: I1203 22:42:13.310003 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68q5\" (UniqueName: \"kubernetes.io/projected/0b812398-2798-4234-a991-5978100dc9b0-kube-api-access-b68q5\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.310097 master-0 kubenswrapper[36504]: I1203 22:42:13.310075 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-0\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.413326 master-0 kubenswrapper[36504]: I1203 22:42:13.413246 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-combined-ca-bundle\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.413326 master-0 kubenswrapper[36504]: I1203 22:42:13.413325 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b68q5\" (UniqueName: \"kubernetes.io/projected/0b812398-2798-4234-a991-5978100dc9b0-kube-api-access-b68q5\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.413684 master-0 kubenswrapper[36504]: I1203 22:42:13.413371 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-0\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.413684 master-0 kubenswrapper[36504]: I1203 22:42:13.413404 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-1\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.413684 master-0 kubenswrapper[36504]: I1203 22:42:13.413428 36504 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-inventory\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.413684 master-0 kubenswrapper[36504]: I1203 22:42:13.413464 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-1\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.413684 master-0 kubenswrapper[36504]: I1203 22:42:13.413514 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-ssh-key\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.413684 master-0 kubenswrapper[36504]: I1203 22:42:13.413590 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-0\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.417497 master-0 kubenswrapper[36504]: I1203 22:42:13.417443 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-1\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.417497 master-0 kubenswrapper[36504]: I1203 22:42:13.417481 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-0\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.418608 master-0 kubenswrapper[36504]: I1203 22:42:13.418580 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-inventory\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.419357 master-0 kubenswrapper[36504]: I1203 22:42:13.419323 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-1\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.420205 master-0 kubenswrapper[36504]: I1203 22:42:13.420142 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-ssh-key\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: 
\"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.421538 master-0 kubenswrapper[36504]: I1203 22:42:13.421469 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-combined-ca-bundle\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.424437 master-0 kubenswrapper[36504]: I1203 22:42:13.424365 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-0\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.433956 master-0 kubenswrapper[36504]: I1203 22:42:13.433899 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68q5\" (UniqueName: \"kubernetes.io/projected/0b812398-2798-4234-a991-5978100dc9b0-kube-api-access-b68q5\") pod \"nova-dataplane-services-edpm-ldv7n\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:13.464918 master-0 kubenswrapper[36504]: I1203 22:42:13.463806 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:42:14.095704 master-0 kubenswrapper[36504]: I1203 22:42:14.095595 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-dataplane-services-edpm-ldv7n"] Dec 03 22:42:14.098812 master-0 kubenswrapper[36504]: I1203 22:42:14.098753 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:42:15.007378 master-0 kubenswrapper[36504]: I1203 22:42:15.007206 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-dataplane-services-edpm-ldv7n" event={"ID":"0b812398-2798-4234-a991-5978100dc9b0","Type":"ContainerStarted","Data":"0ccc68ba5bbb38c966287f18b1115a6aaa7810e44f01f90a99c2aede92733f0c"} Dec 03 22:42:15.007378 master-0 kubenswrapper[36504]: I1203 22:42:15.007274 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-dataplane-services-edpm-ldv7n" event={"ID":"0b812398-2798-4234-a991-5978100dc9b0","Type":"ContainerStarted","Data":"81858c64c9a0caca259d8480640f97fa91376c24aa4bc1327ce2c7c2134348ae"} Dec 03 22:42:15.082631 master-0 kubenswrapper[36504]: I1203 22:42:15.082425 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-dataplane-services-edpm-ldv7n" podStartSLOduration=1.625927189 podStartE2EDuration="2.082394189s" podCreationTimestamp="2025-12-03 22:42:13 +0000 UTC" firstStartedPulling="2025-12-03 22:42:14.098708778 +0000 UTC m=+1899.318480785" lastFinishedPulling="2025-12-03 22:42:14.555175778 +0000 UTC m=+1899.774947785" observedRunningTime="2025-12-03 22:42:15.067937601 +0000 UTC m=+1900.287709608" watchObservedRunningTime="2025-12-03 22:42:15.082394189 +0000 UTC m=+1900.302166196" Dec 03 22:42:35.733410 master-0 kubenswrapper[36504]: E1203 22:42:35.733338 36504 manager.go:1116] Failed to create existing container: 
/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:43:18.096260 master-0 kubenswrapper[36504]: I1203 22:43:18.096185 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:43:28.096511 master-0 kubenswrapper[36504]: I1203 22:43:28.096432 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:43:35.720324 master-0 kubenswrapper[36504]: E1203 22:43:35.720231 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:44:27.096917 master-0 kubenswrapper[36504]: I1203 22:44:27.096828 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:44:29.096068 master-0 kubenswrapper[36504]: I1203 22:44:29.095999 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:44:35.729577 master-0 kubenswrapper[36504]: E1203 22:44:35.729509 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:45:00.174210 master-0 kubenswrapper[36504]: I1203 22:45:00.174121 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z"] Dec 03 22:45:00.177181 master-0 kubenswrapper[36504]: I1203 22:45:00.177113 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.180128 master-0 kubenswrapper[36504]: I1203 22:45:00.180064 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-nwpql" Dec 03 22:45:00.192369 master-0 kubenswrapper[36504]: I1203 22:45:00.192215 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 22:45:00.221015 master-0 kubenswrapper[36504]: I1203 22:45:00.218182 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z"] Dec 03 22:45:00.322664 master-0 kubenswrapper[36504]: I1203 22:45:00.322537 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ab5d874-2936-4181-a579-3b3da719081f-secret-volume\") pod \"collect-profiles-29413365-mbz2z\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.323197 master-0 kubenswrapper[36504]: I1203 22:45:00.323156 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85jn4\" (UniqueName: \"kubernetes.io/projected/9ab5d874-2936-4181-a579-3b3da719081f-kube-api-access-85jn4\") pod \"collect-profiles-29413365-mbz2z\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.323787 master-0 kubenswrapper[36504]: I1203 22:45:00.323693 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ab5d874-2936-4181-a579-3b3da719081f-config-volume\") pod \"collect-profiles-29413365-mbz2z\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.425476 master-0 kubenswrapper[36504]: I1203 22:45:00.425267 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85jn4\" (UniqueName: \"kubernetes.io/projected/9ab5d874-2936-4181-a579-3b3da719081f-kube-api-access-85jn4\") pod \"collect-profiles-29413365-mbz2z\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.425476 master-0 kubenswrapper[36504]: I1203 22:45:00.425423 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ab5d874-2936-4181-a579-3b3da719081f-config-volume\") pod \"collect-profiles-29413365-mbz2z\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.425918 master-0 kubenswrapper[36504]: I1203 22:45:00.425581 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ab5d874-2936-4181-a579-3b3da719081f-secret-volume\") pod \"collect-profiles-29413365-mbz2z\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.426763 master-0 kubenswrapper[36504]: I1203 22:45:00.426707 36504 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ab5d874-2936-4181-a579-3b3da719081f-config-volume\") pod \"collect-profiles-29413365-mbz2z\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.431677 master-0 kubenswrapper[36504]: I1203 22:45:00.431604 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ab5d874-2936-4181-a579-3b3da719081f-secret-volume\") pod \"collect-profiles-29413365-mbz2z\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.442753 master-0 kubenswrapper[36504]: I1203 22:45:00.442676 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85jn4\" (UniqueName: \"kubernetes.io/projected/9ab5d874-2936-4181-a579-3b3da719081f-kube-api-access-85jn4\") pod \"collect-profiles-29413365-mbz2z\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:00.543125 master-0 kubenswrapper[36504]: I1203 22:45:00.543020 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:01.042377 master-0 kubenswrapper[36504]: I1203 22:45:01.042295 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z"] Dec 03 22:45:01.042710 master-0 kubenswrapper[36504]: W1203 22:45:01.042457 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab5d874_2936_4181_a579_3b3da719081f.slice/crio-22f1721f24147836b8d3ab425e0c6b6ad39a43f7ee4dc3376a89d6db7b6dfc2e WatchSource:0}: Error finding container 22f1721f24147836b8d3ab425e0c6b6ad39a43f7ee4dc3376a89d6db7b6dfc2e: Status 404 returned error can't find the container with id 22f1721f24147836b8d3ab425e0c6b6ad39a43f7ee4dc3376a89d6db7b6dfc2e Dec 03 22:45:01.518463 master-0 kubenswrapper[36504]: I1203 22:45:01.518395 36504 generic.go:334] "Generic (PLEG): container finished" podID="9ab5d874-2936-4181-a579-3b3da719081f" containerID="8c6ae18dd798993682f984a97ff8a834c1c36f8a63cb3a7dababb0bcd39b3a1d" exitCode=0 Dec 03 22:45:01.518463 master-0 kubenswrapper[36504]: I1203 22:45:01.518458 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" event={"ID":"9ab5d874-2936-4181-a579-3b3da719081f","Type":"ContainerDied","Data":"8c6ae18dd798993682f984a97ff8a834c1c36f8a63cb3a7dababb0bcd39b3a1d"} Dec 03 22:45:01.520830 master-0 kubenswrapper[36504]: I1203 22:45:01.518490 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" event={"ID":"9ab5d874-2936-4181-a579-3b3da719081f","Type":"ContainerStarted","Data":"22f1721f24147836b8d3ab425e0c6b6ad39a43f7ee4dc3376a89d6db7b6dfc2e"} Dec 03 22:45:02.936467 master-0 kubenswrapper[36504]: I1203 22:45:02.936384 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:03.017989 master-0 kubenswrapper[36504]: I1203 22:45:03.017925 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ab5d874-2936-4181-a579-3b3da719081f-config-volume\") pod \"9ab5d874-2936-4181-a579-3b3da719081f\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " Dec 03 22:45:03.017989 master-0 kubenswrapper[36504]: I1203 22:45:03.017996 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ab5d874-2936-4181-a579-3b3da719081f-secret-volume\") pod \"9ab5d874-2936-4181-a579-3b3da719081f\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " Dec 03 22:45:03.019123 master-0 kubenswrapper[36504]: I1203 22:45:03.018687 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab5d874-2936-4181-a579-3b3da719081f-config-volume" (OuterVolumeSpecName: "config-volume") pod "9ab5d874-2936-4181-a579-3b3da719081f" (UID: "9ab5d874-2936-4181-a579-3b3da719081f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:45:03.020200 master-0 kubenswrapper[36504]: I1203 22:45:03.020167 36504 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ab5d874-2936-4181-a579-3b3da719081f-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:03.022156 master-0 kubenswrapper[36504]: I1203 22:45:03.022086 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ab5d874-2936-4181-a579-3b3da719081f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9ab5d874-2936-4181-a579-3b3da719081f" (UID: "9ab5d874-2936-4181-a579-3b3da719081f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:45:03.121844 master-0 kubenswrapper[36504]: I1203 22:45:03.121756 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85jn4\" (UniqueName: \"kubernetes.io/projected/9ab5d874-2936-4181-a579-3b3da719081f-kube-api-access-85jn4\") pod \"9ab5d874-2936-4181-a579-3b3da719081f\" (UID: \"9ab5d874-2936-4181-a579-3b3da719081f\") " Dec 03 22:45:03.123601 master-0 kubenswrapper[36504]: I1203 22:45:03.123565 36504 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ab5d874-2936-4181-a579-3b3da719081f-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:03.125651 master-0 kubenswrapper[36504]: I1203 22:45:03.125578 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab5d874-2936-4181-a579-3b3da719081f-kube-api-access-85jn4" (OuterVolumeSpecName: "kube-api-access-85jn4") pod "9ab5d874-2936-4181-a579-3b3da719081f" (UID: "9ab5d874-2936-4181-a579-3b3da719081f"). InnerVolumeSpecName "kube-api-access-85jn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:45:03.228382 master-0 kubenswrapper[36504]: I1203 22:45:03.228168 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85jn4\" (UniqueName: \"kubernetes.io/projected/9ab5d874-2936-4181-a579-3b3da719081f-kube-api-access-85jn4\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:03.566232 master-0 kubenswrapper[36504]: I1203 22:45:03.566170 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" event={"ID":"9ab5d874-2936-4181-a579-3b3da719081f","Type":"ContainerDied","Data":"22f1721f24147836b8d3ab425e0c6b6ad39a43f7ee4dc3376a89d6db7b6dfc2e"} Dec 03 22:45:03.566586 master-0 kubenswrapper[36504]: I1203 22:45:03.566570 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f1721f24147836b8d3ab425e0c6b6ad39a43f7ee4dc3376a89d6db7b6dfc2e" Dec 03 22:45:03.566667 master-0 kubenswrapper[36504]: I1203 22:45:03.566226 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z" Dec 03 22:45:04.046083 master-0 kubenswrapper[36504]: I1203 22:45:04.045998 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m"] Dec 03 22:45:04.061902 master-0 kubenswrapper[36504]: I1203 22:45:04.061815 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413320-ndk8m"] Dec 03 22:45:05.111568 master-0 kubenswrapper[36504]: I1203 22:45:05.111497 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea59f59-8970-4eea-994d-9763792ee704" path="/var/lib/kubelet/pods/4ea59f59-8970-4eea-994d-9763792ee704/volumes" Dec 03 22:45:14.717410 master-0 kubenswrapper[36504]: I1203 22:45:14.717346 36504 generic.go:334] "Generic (PLEG): container finished" podID="0b812398-2798-4234-a991-5978100dc9b0" containerID="0ccc68ba5bbb38c966287f18b1115a6aaa7810e44f01f90a99c2aede92733f0c" exitCode=0 Dec 03 22:45:14.718247 master-0 kubenswrapper[36504]: I1203 22:45:14.717425 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-dataplane-services-edpm-ldv7n" event={"ID":"0b812398-2798-4234-a991-5978100dc9b0","Type":"ContainerDied","Data":"0ccc68ba5bbb38c966287f18b1115a6aaa7810e44f01f90a99c2aede92733f0c"} Dec 03 22:45:16.344513 master-0 kubenswrapper[36504]: I1203 22:45:16.344450 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:45:16.480657 master-0 kubenswrapper[36504]: I1203 22:45:16.480480 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-1\") pod \"0b812398-2798-4234-a991-5978100dc9b0\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " Dec 03 22:45:16.480657 master-0 kubenswrapper[36504]: I1203 22:45:16.480622 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-0\") pod \"0b812398-2798-4234-a991-5978100dc9b0\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " Dec 03 22:45:16.485025 master-0 kubenswrapper[36504]: I1203 22:45:16.484954 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-combined-ca-bundle\") pod \"0b812398-2798-4234-a991-5978100dc9b0\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " Dec 03 22:45:16.485199 master-0 kubenswrapper[36504]: I1203 22:45:16.485138 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-ssh-key\") pod \"0b812398-2798-4234-a991-5978100dc9b0\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " Dec 03 22:45:16.485413 master-0 kubenswrapper[36504]: I1203 22:45:16.485378 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-inventory\") pod \"0b812398-2798-4234-a991-5978100dc9b0\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " Dec 03 22:45:16.486876 master-0 kubenswrapper[36504]: I1203 22:45:16.485605 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-0\") pod \"0b812398-2798-4234-a991-5978100dc9b0\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " Dec 03 22:45:16.486876 master-0 kubenswrapper[36504]: I1203 22:45:16.485638 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-1\") pod \"0b812398-2798-4234-a991-5978100dc9b0\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " Dec 03 22:45:16.486876 master-0 kubenswrapper[36504]: I1203 22:45:16.485681 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b68q5\" (UniqueName: \"kubernetes.io/projected/0b812398-2798-4234-a991-5978100dc9b0-kube-api-access-b68q5\") pod \"0b812398-2798-4234-a991-5978100dc9b0\" (UID: \"0b812398-2798-4234-a991-5978100dc9b0\") " Dec 03 22:45:16.489238 master-0 kubenswrapper[36504]: I1203 22:45:16.489154 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0b812398-2798-4234-a991-5978100dc9b0" (UID: "0b812398-2798-4234-a991-5978100dc9b0"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:45:16.490590 master-0 kubenswrapper[36504]: I1203 22:45:16.490411 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b812398-2798-4234-a991-5978100dc9b0-kube-api-access-b68q5" (OuterVolumeSpecName: "kube-api-access-b68q5") pod "0b812398-2798-4234-a991-5978100dc9b0" (UID: "0b812398-2798-4234-a991-5978100dc9b0"). InnerVolumeSpecName "kube-api-access-b68q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:45:16.521311 master-0 kubenswrapper[36504]: I1203 22:45:16.521235 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0b812398-2798-4234-a991-5978100dc9b0" (UID: "0b812398-2798-4234-a991-5978100dc9b0"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:45:16.521592 master-0 kubenswrapper[36504]: I1203 22:45:16.521362 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-inventory" (OuterVolumeSpecName: "inventory") pod "0b812398-2798-4234-a991-5978100dc9b0" (UID: "0b812398-2798-4234-a991-5978100dc9b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:45:16.524230 master-0 kubenswrapper[36504]: I1203 22:45:16.524140 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0b812398-2798-4234-a991-5978100dc9b0" (UID: "0b812398-2798-4234-a991-5978100dc9b0"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:45:16.524694 master-0 kubenswrapper[36504]: I1203 22:45:16.524559 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0b812398-2798-4234-a991-5978100dc9b0" (UID: "0b812398-2798-4234-a991-5978100dc9b0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:45:16.533199 master-0 kubenswrapper[36504]: I1203 22:45:16.533129 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0b812398-2798-4234-a991-5978100dc9b0" (UID: "0b812398-2798-4234-a991-5978100dc9b0"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:45:16.537099 master-0 kubenswrapper[36504]: I1203 22:45:16.537023 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0b812398-2798-4234-a991-5978100dc9b0" (UID: "0b812398-2798-4234-a991-5978100dc9b0"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:45:16.590289 master-0 kubenswrapper[36504]: I1203 22:45:16.590220 36504 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:16.590289 master-0 kubenswrapper[36504]: I1203 22:45:16.590273 36504 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-cell1-compute-config-1\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:16.590289 master-0 kubenswrapper[36504]: I1203 22:45:16.590286 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b68q5\" (UniqueName: \"kubernetes.io/projected/0b812398-2798-4234-a991-5978100dc9b0-kube-api-access-b68q5\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:16.590289 master-0 kubenswrapper[36504]: I1203 22:45:16.590298 36504 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-1\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:16.590289 master-0 kubenswrapper[36504]: I1203 22:45:16.590308 36504 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-migration-ssh-key-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:16.590728 master-0 kubenswrapper[36504]: I1203 22:45:16.590329 36504 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-nova-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:16.590728 master-0 kubenswrapper[36504]: I1203 22:45:16.590350 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:16.590728 master-0 kubenswrapper[36504]: I1203 22:45:16.590400 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b812398-2798-4234-a991-5978100dc9b0-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:45:16.750472 master-0 kubenswrapper[36504]: I1203 22:45:16.750311 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-dataplane-services-edpm-ldv7n" event={"ID":"0b812398-2798-4234-a991-5978100dc9b0","Type":"ContainerDied","Data":"81858c64c9a0caca259d8480640f97fa91376c24aa4bc1327ce2c7c2134348ae"} Dec 03 22:45:16.750472 master-0 kubenswrapper[36504]: I1203 22:45:16.750374 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81858c64c9a0caca259d8480640f97fa91376c24aa4bc1327ce2c7c2134348ae" Dec 03 22:45:16.750472 master-0 kubenswrapper[36504]: I1203 22:45:16.750449 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-dataplane-services-edpm-ldv7n" Dec 03 22:45:16.874797 master-0 kubenswrapper[36504]: I1203 22:45:16.874700 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-dataplane-services-edpm-2jkt6"] Dec 03 22:45:16.876927 master-0 kubenswrapper[36504]: E1203 22:45:16.875615 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab5d874-2936-4181-a579-3b3da719081f" containerName="collect-profiles" Dec 03 22:45:16.876927 master-0 kubenswrapper[36504]: I1203 22:45:16.875639 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab5d874-2936-4181-a579-3b3da719081f" containerName="collect-profiles" Dec 03 22:45:16.876927 master-0 kubenswrapper[36504]: E1203 22:45:16.875730 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b812398-2798-4234-a991-5978100dc9b0" containerName="nova-dataplane-services-edpm" Dec 03 22:45:16.876927 master-0 kubenswrapper[36504]: I1203 22:45:16.875739 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b812398-2798-4234-a991-5978100dc9b0" containerName="nova-dataplane-services-edpm" Dec 03 22:45:16.876927 master-0 kubenswrapper[36504]: I1203 22:45:16.876070 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab5d874-2936-4181-a579-3b3da719081f" containerName="collect-profiles" Dec 03 22:45:16.876927 master-0 kubenswrapper[36504]: I1203 22:45:16.876109 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b812398-2798-4234-a991-5978100dc9b0" containerName="nova-dataplane-services-edpm" Dec 03 22:45:16.877431 master-0 kubenswrapper[36504]: I1203 22:45:16.877294 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:16.880055 master-0 kubenswrapper[36504]: I1203 22:45:16.879993 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 22:45:16.880189 master-0 kubenswrapper[36504]: I1203 22:45:16.880135 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-edpm" Dec 03 22:45:16.880300 master-0 kubenswrapper[36504]: I1203 22:45:16.880275 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 22:45:16.887837 master-0 kubenswrapper[36504]: I1203 22:45:16.887764 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 22:45:16.901501 master-0 kubenswrapper[36504]: I1203 22:45:16.901427 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjzqj\" (UniqueName: \"kubernetes.io/projected/7aaf0e4b-65ab-49f5-9786-3115aba68d33-kube-api-access-xjzqj\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:16.901844 master-0 kubenswrapper[36504]: I1203 22:45:16.901572 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-2\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:16.901844 master-0 kubenswrapper[36504]: I1203 
22:45:16.901660 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-inventory\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:16.901844 master-0 kubenswrapper[36504]: I1203 22:45:16.901702 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-1\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:16.901844 master-0 kubenswrapper[36504]: I1203 22:45:16.901820 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ssh-key\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:16.901844 master-0 kubenswrapper[36504]: I1203 22:45:16.901845 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-0\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:16.902039 master-0 kubenswrapper[36504]: I1203 22:45:16.901867 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-telemetry-combined-ca-bundle\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:16.904534 master-0 kubenswrapper[36504]: I1203 22:45:16.904469 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-dataplane-services-edpm-2jkt6"] Dec 03 22:45:17.007341 master-0 kubenswrapper[36504]: I1203 22:45:17.007009 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-inventory\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.007718 master-0 kubenswrapper[36504]: I1203 22:45:17.007436 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-1\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.009181 master-0 kubenswrapper[36504]: I1203 22:45:17.008240 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ssh-key\") pod 
\"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.009181 master-0 kubenswrapper[36504]: I1203 22:45:17.008312 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-0\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.009181 master-0 kubenswrapper[36504]: I1203 22:45:17.008855 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-telemetry-combined-ca-bundle\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.009181 master-0 kubenswrapper[36504]: I1203 22:45:17.009155 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjzqj\" (UniqueName: \"kubernetes.io/projected/7aaf0e4b-65ab-49f5-9786-3115aba68d33-kube-api-access-xjzqj\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.009841 master-0 kubenswrapper[36504]: I1203 22:45:17.009721 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-2\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.013506 master-0 kubenswrapper[36504]: I1203 22:45:17.013418 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-0\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.013919 master-0 kubenswrapper[36504]: I1203 22:45:17.013898 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-inventory\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.014073 master-0 kubenswrapper[36504]: I1203 22:45:17.014033 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-telemetry-combined-ca-bundle\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.015153 master-0 kubenswrapper[36504]: I1203 22:45:17.014544 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ssh-key\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: 
\"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.017779 master-0 kubenswrapper[36504]: I1203 22:45:17.017287 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-1\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.017779 master-0 kubenswrapper[36504]: I1203 22:45:17.017722 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-2\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.028723 master-0 kubenswrapper[36504]: I1203 22:45:17.028654 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjzqj\" (UniqueName: \"kubernetes.io/projected/7aaf0e4b-65ab-49f5-9786-3115aba68d33-kube-api-access-xjzqj\") pod \"telemetry-dataplane-services-edpm-2jkt6\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.213240 master-0 kubenswrapper[36504]: I1203 22:45:17.213168 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:45:17.804823 master-0 kubenswrapper[36504]: I1203 22:45:17.804760 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-dataplane-services-edpm-2jkt6"] Dec 03 22:45:18.783528 master-0 kubenswrapper[36504]: I1203 22:45:18.783453 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-dataplane-services-edpm-2jkt6" event={"ID":"7aaf0e4b-65ab-49f5-9786-3115aba68d33","Type":"ContainerStarted","Data":"c3ce43531853a1568b24bf579104fab6715d8b0a84317979ea758a1ce85d1cc4"} Dec 03 22:45:18.783528 master-0 kubenswrapper[36504]: I1203 22:45:18.783521 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-dataplane-services-edpm-2jkt6" event={"ID":"7aaf0e4b-65ab-49f5-9786-3115aba68d33","Type":"ContainerStarted","Data":"219d31ada86fe554d5a656c55f68c0886f54ec5bdd0dace7cf31a61acd55f407"} Dec 03 22:45:18.817637 master-0 kubenswrapper[36504]: I1203 22:45:18.817502 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-dataplane-services-edpm-2jkt6" podStartSLOduration=2.277586368 podStartE2EDuration="2.817480343s" podCreationTimestamp="2025-12-03 22:45:16 +0000 UTC" firstStartedPulling="2025-12-03 22:45:17.802477641 +0000 UTC m=+2083.022249648" lastFinishedPulling="2025-12-03 22:45:18.342371616 +0000 UTC m=+2083.562143623" observedRunningTime="2025-12-03 22:45:18.806511233 +0000 UTC m=+2084.026283250" watchObservedRunningTime="2025-12-03 22:45:18.817480343 +0000 UTC m=+2084.037252340" Dec 03 22:45:35.721100 master-0 kubenswrapper[36504]: E1203 22:45:35.721025 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 
1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:45:45.109563 master-0 kubenswrapper[36504]: I1203 22:45:45.106947 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:45:46.430908 master-0 kubenswrapper[36504]: I1203 22:45:46.430835 36504 scope.go:117] "RemoveContainer" containerID="e80985ff027f59006e6b5b28f1d88811cb0ea27e878e3be4ff3dc70b657dbfd8" Dec 03 22:45:51.097132 master-0 kubenswrapper[36504]: I1203 22:45:51.097048 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:46:35.723547 master-0 kubenswrapper[36504]: E1203 22:46:35.723449 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:47:06.096311 master-0 kubenswrapper[36504]: I1203 22:47:06.096155 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:47:09.109322 master-0 kubenswrapper[36504]: I1203 22:47:09.109218 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:47:15.687012 master-0 kubenswrapper[36504]: I1203 22:47:15.686934 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hqsz"] Dec 03 22:47:15.692359 master-0 kubenswrapper[36504]: I1203 22:47:15.692317 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:15.703445 master-0 kubenswrapper[36504]: I1203 22:47:15.703357 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hqsz"] Dec 03 22:47:15.734242 master-0 kubenswrapper[36504]: I1203 22:47:15.733422 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxzjs\" (UniqueName: \"kubernetes.io/projected/8c0990ef-af5f-4c98-87b2-09da15a10703-kube-api-access-xxzjs\") pod \"certified-operators-2hqsz\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:15.737928 master-0 kubenswrapper[36504]: I1203 22:47:15.736033 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-catalog-content\") pod \"certified-operators-2hqsz\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:15.737928 master-0 kubenswrapper[36504]: I1203 22:47:15.736450 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-utilities\") pod \"certified-operators-2hqsz\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:15.844128 master-0 kubenswrapper[36504]: I1203 22:47:15.844016 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxzjs\" (UniqueName: \"kubernetes.io/projected/8c0990ef-af5f-4c98-87b2-09da15a10703-kube-api-access-xxzjs\") pod \"certified-operators-2hqsz\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:15.844466 master-0 kubenswrapper[36504]: I1203 22:47:15.844160 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-catalog-content\") pod \"certified-operators-2hqsz\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:15.844466 master-0 kubenswrapper[36504]: I1203 22:47:15.844238 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-utilities\") pod \"certified-operators-2hqsz\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:15.845015 master-0 kubenswrapper[36504]: I1203 22:47:15.844977 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-utilities\") pod \"certified-operators-2hqsz\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:15.845294 master-0 kubenswrapper[36504]: I1203 22:47:15.845208 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-catalog-content\") pod \"certified-operators-2hqsz\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " 
pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:15.866989 master-0 kubenswrapper[36504]: I1203 22:47:15.866912 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxzjs\" (UniqueName: \"kubernetes.io/projected/8c0990ef-af5f-4c98-87b2-09da15a10703-kube-api-access-xxzjs\") pod \"certified-operators-2hqsz\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:16.044641 master-0 kubenswrapper[36504]: I1203 22:47:16.044451 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:16.697900 master-0 kubenswrapper[36504]: W1203 22:47:16.697763 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c0990ef_af5f_4c98_87b2_09da15a10703.slice/crio-afebb37bb76619568d6ddbea4f88f2107469bed0ead2a6c55eb2489d2f1cdf96 WatchSource:0}: Error finding container afebb37bb76619568d6ddbea4f88f2107469bed0ead2a6c55eb2489d2f1cdf96: Status 404 returned error can't find the container with id afebb37bb76619568d6ddbea4f88f2107469bed0ead2a6c55eb2489d2f1cdf96 Dec 03 22:47:16.718562 master-0 kubenswrapper[36504]: I1203 22:47:16.718461 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hqsz"] Dec 03 22:47:17.605112 master-0 kubenswrapper[36504]: I1203 22:47:17.604983 36504 generic.go:334] "Generic (PLEG): container finished" podID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerID="622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96" exitCode=0 Dec 03 22:47:17.605463 master-0 kubenswrapper[36504]: I1203 22:47:17.605099 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hqsz" event={"ID":"8c0990ef-af5f-4c98-87b2-09da15a10703","Type":"ContainerDied","Data":"622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96"} Dec 03 22:47:17.605463 master-0 kubenswrapper[36504]: I1203 22:47:17.605177 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hqsz" event={"ID":"8c0990ef-af5f-4c98-87b2-09da15a10703","Type":"ContainerStarted","Data":"afebb37bb76619568d6ddbea4f88f2107469bed0ead2a6c55eb2489d2f1cdf96"} Dec 03 22:47:17.607738 master-0 kubenswrapper[36504]: I1203 22:47:17.607709 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:47:18.622556 master-0 kubenswrapper[36504]: I1203 22:47:18.622467 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hqsz" event={"ID":"8c0990ef-af5f-4c98-87b2-09da15a10703","Type":"ContainerStarted","Data":"d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70"} Dec 03 22:47:19.639755 master-0 kubenswrapper[36504]: I1203 22:47:19.639672 36504 generic.go:334] "Generic (PLEG): container finished" podID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerID="d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70" exitCode=0 Dec 03 22:47:19.639755 master-0 kubenswrapper[36504]: I1203 22:47:19.639744 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hqsz" event={"ID":"8c0990ef-af5f-4c98-87b2-09da15a10703","Type":"ContainerDied","Data":"d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70"} Dec 03 22:47:20.657548 master-0 kubenswrapper[36504]: I1203 
22:47:20.657476 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hqsz" event={"ID":"8c0990ef-af5f-4c98-87b2-09da15a10703","Type":"ContainerStarted","Data":"7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454"} Dec 03 22:47:20.694868 master-0 kubenswrapper[36504]: I1203 22:47:20.693699 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hqsz" podStartSLOduration=3.2739375920000002 podStartE2EDuration="5.693673458s" podCreationTimestamp="2025-12-03 22:47:15 +0000 UTC" firstStartedPulling="2025-12-03 22:47:17.607627638 +0000 UTC m=+2202.827399635" lastFinishedPulling="2025-12-03 22:47:20.027363494 +0000 UTC m=+2205.247135501" observedRunningTime="2025-12-03 22:47:20.685941867 +0000 UTC m=+2205.905713874" watchObservedRunningTime="2025-12-03 22:47:20.693673458 +0000 UTC m=+2205.913445465" Dec 03 22:47:26.044983 master-0 kubenswrapper[36504]: I1203 22:47:26.044920 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:26.047110 master-0 kubenswrapper[36504]: I1203 22:47:26.047073 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:26.104131 master-0 kubenswrapper[36504]: I1203 22:47:26.103502 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:26.813054 master-0 kubenswrapper[36504]: I1203 22:47:26.812843 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:27.022714 master-0 kubenswrapper[36504]: I1203 22:47:27.022631 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hqsz"] Dec 03 22:47:28.780349 master-0 kubenswrapper[36504]: I1203 22:47:28.780262 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hqsz" podUID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerName="registry-server" containerID="cri-o://7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454" gracePeriod=2 Dec 03 22:47:29.403557 master-0 kubenswrapper[36504]: I1203 22:47:29.403494 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:29.557705 master-0 kubenswrapper[36504]: I1203 22:47:29.557468 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxzjs\" (UniqueName: \"kubernetes.io/projected/8c0990ef-af5f-4c98-87b2-09da15a10703-kube-api-access-xxzjs\") pod \"8c0990ef-af5f-4c98-87b2-09da15a10703\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " Dec 03 22:47:29.557705 master-0 kubenswrapper[36504]: I1203 22:47:29.557625 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-catalog-content\") pod \"8c0990ef-af5f-4c98-87b2-09da15a10703\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " Dec 03 22:47:29.559705 master-0 kubenswrapper[36504]: I1203 22:47:29.559653 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-utilities\") pod \"8c0990ef-af5f-4c98-87b2-09da15a10703\" (UID: \"8c0990ef-af5f-4c98-87b2-09da15a10703\") " Dec 03 22:47:29.561048 master-0 kubenswrapper[36504]: I1203 22:47:29.560980 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-utilities" (OuterVolumeSpecName: "utilities") pod "8c0990ef-af5f-4c98-87b2-09da15a10703" (UID: "8c0990ef-af5f-4c98-87b2-09da15a10703"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:47:29.562863 master-0 kubenswrapper[36504]: I1203 22:47:29.562814 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c0990ef-af5f-4c98-87b2-09da15a10703-kube-api-access-xxzjs" (OuterVolumeSpecName: "kube-api-access-xxzjs") pod "8c0990ef-af5f-4c98-87b2-09da15a10703" (UID: "8c0990ef-af5f-4c98-87b2-09da15a10703"). InnerVolumeSpecName "kube-api-access-xxzjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:47:29.607079 master-0 kubenswrapper[36504]: I1203 22:47:29.606985 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c0990ef-af5f-4c98-87b2-09da15a10703" (UID: "8c0990ef-af5f-4c98-87b2-09da15a10703"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:47:29.664797 master-0 kubenswrapper[36504]: I1203 22:47:29.664732 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxzjs\" (UniqueName: \"kubernetes.io/projected/8c0990ef-af5f-4c98-87b2-09da15a10703-kube-api-access-xxzjs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:29.664797 master-0 kubenswrapper[36504]: I1203 22:47:29.664793 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:29.664797 master-0 kubenswrapper[36504]: I1203 22:47:29.664808 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c0990ef-af5f-4c98-87b2-09da15a10703-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:29.804215 master-0 kubenswrapper[36504]: I1203 22:47:29.804149 36504 generic.go:334] "Generic (PLEG): container finished" podID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerID="7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454" exitCode=0 Dec 03 22:47:29.805008 master-0 kubenswrapper[36504]: I1203 22:47:29.804221 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hqsz" event={"ID":"8c0990ef-af5f-4c98-87b2-09da15a10703","Type":"ContainerDied","Data":"7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454"} Dec 03 22:47:29.805008 master-0 kubenswrapper[36504]: I1203 22:47:29.804244 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hqsz" Dec 03 22:47:29.805008 master-0 kubenswrapper[36504]: I1203 22:47:29.804275 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hqsz" event={"ID":"8c0990ef-af5f-4c98-87b2-09da15a10703","Type":"ContainerDied","Data":"afebb37bb76619568d6ddbea4f88f2107469bed0ead2a6c55eb2489d2f1cdf96"} Dec 03 22:47:29.805008 master-0 kubenswrapper[36504]: I1203 22:47:29.804301 36504 scope.go:117] "RemoveContainer" containerID="7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454" Dec 03 22:47:29.841710 master-0 kubenswrapper[36504]: I1203 22:47:29.841642 36504 scope.go:117] "RemoveContainer" containerID="d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70" Dec 03 22:47:29.877423 master-0 kubenswrapper[36504]: I1203 22:47:29.877118 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hqsz"] Dec 03 22:47:29.881448 master-0 kubenswrapper[36504]: I1203 22:47:29.881341 36504 scope.go:117] "RemoveContainer" containerID="622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96" Dec 03 22:47:29.894376 master-0 kubenswrapper[36504]: I1203 22:47:29.894290 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hqsz"] Dec 03 22:47:29.947290 master-0 kubenswrapper[36504]: I1203 22:47:29.947233 36504 scope.go:117] "RemoveContainer" containerID="7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454" Dec 03 22:47:29.948157 master-0 kubenswrapper[36504]: E1203 22:47:29.947984 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454\": container with ID starting with 
7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454 not found: ID does not exist" containerID="7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454" Dec 03 22:47:29.948157 master-0 kubenswrapper[36504]: I1203 22:47:29.948027 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454"} err="failed to get container status \"7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454\": rpc error: code = NotFound desc = could not find container \"7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454\": container with ID starting with 7f05a6d9001e7f99adb2f01c3849e0c95ceec508060a934df14bca7208089454 not found: ID does not exist" Dec 03 22:47:29.948157 master-0 kubenswrapper[36504]: I1203 22:47:29.948055 36504 scope.go:117] "RemoveContainer" containerID="d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70" Dec 03 22:47:29.948713 master-0 kubenswrapper[36504]: E1203 22:47:29.948650 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70\": container with ID starting with d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70 not found: ID does not exist" containerID="d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70" Dec 03 22:47:29.948837 master-0 kubenswrapper[36504]: I1203 22:47:29.948729 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70"} err="failed to get container status \"d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70\": rpc error: code = NotFound desc = could not find container \"d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70\": container with ID starting with d8867ad5ad28b8155578fdc56541770d7b6c47ef696c74e409495caa11a26f70 not found: ID does not exist" Dec 03 22:47:29.948837 master-0 kubenswrapper[36504]: I1203 22:47:29.948801 36504 scope.go:117] "RemoveContainer" containerID="622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96" Dec 03 22:47:29.949293 master-0 kubenswrapper[36504]: E1203 22:47:29.949224 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96\": container with ID starting with 622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96 not found: ID does not exist" containerID="622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96" Dec 03 22:47:29.949293 master-0 kubenswrapper[36504]: I1203 22:47:29.949263 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96"} err="failed to get container status \"622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96\": rpc error: code = NotFound desc = could not find container \"622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96\": container with ID starting with 622e870d4235651be865b4f9ef9ec4a12aa93f42e2b4a2a83326ee828c37fa96 not found: ID does not exist" Dec 03 22:47:31.113410 master-0 kubenswrapper[36504]: I1203 22:47:31.113319 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c0990ef-af5f-4c98-87b2-09da15a10703" 
path="/var/lib/kubelet/pods/8c0990ef-af5f-4c98-87b2-09da15a10703/volumes" Dec 03 22:47:35.726306 master-0 kubenswrapper[36504]: E1203 22:47:35.726219 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:47:38.807034 master-0 kubenswrapper[36504]: I1203 22:47:38.806911 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mkgdz"] Dec 03 22:47:38.807947 master-0 kubenswrapper[36504]: E1203 22:47:38.807710 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerName="registry-server" Dec 03 22:47:38.807947 master-0 kubenswrapper[36504]: I1203 22:47:38.807730 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerName="registry-server" Dec 03 22:47:38.807947 master-0 kubenswrapper[36504]: E1203 22:47:38.807923 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerName="extract-utilities" Dec 03 22:47:38.807947 master-0 kubenswrapper[36504]: I1203 22:47:38.807939 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerName="extract-utilities" Dec 03 22:47:38.808275 master-0 kubenswrapper[36504]: E1203 22:47:38.807955 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerName="extract-content" Dec 03 22:47:38.808275 master-0 kubenswrapper[36504]: I1203 22:47:38.807966 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerName="extract-content" Dec 03 22:47:38.808368 master-0 kubenswrapper[36504]: I1203 22:47:38.808350 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c0990ef-af5f-4c98-87b2-09da15a10703" containerName="registry-server" Dec 03 22:47:38.811599 master-0 kubenswrapper[36504]: I1203 22:47:38.811532 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:38.837318 master-0 kubenswrapper[36504]: I1203 22:47:38.837250 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mkgdz"] Dec 03 22:47:38.979960 master-0 kubenswrapper[36504]: I1203 22:47:38.979878 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-utilities\") pod \"redhat-operators-mkgdz\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:38.980297 master-0 kubenswrapper[36504]: I1203 22:47:38.979975 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5ltz\" (UniqueName: \"kubernetes.io/projected/1e9e08c2-3d46-4dde-b52d-f139c46b5561-kube-api-access-x5ltz\") pod \"redhat-operators-mkgdz\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:38.980297 master-0 kubenswrapper[36504]: I1203 22:47:38.980011 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-catalog-content\") pod \"redhat-operators-mkgdz\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:39.083082 master-0 kubenswrapper[36504]: I1203 22:47:39.082892 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-utilities\") pod \"redhat-operators-mkgdz\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:39.083082 master-0 kubenswrapper[36504]: I1203 22:47:39.082985 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5ltz\" (UniqueName: \"kubernetes.io/projected/1e9e08c2-3d46-4dde-b52d-f139c46b5561-kube-api-access-x5ltz\") pod \"redhat-operators-mkgdz\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:39.083393 master-0 kubenswrapper[36504]: I1203 22:47:39.083090 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-catalog-content\") pod \"redhat-operators-mkgdz\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:39.083649 master-0 kubenswrapper[36504]: I1203 22:47:39.083618 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-utilities\") pod \"redhat-operators-mkgdz\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:39.083978 master-0 kubenswrapper[36504]: I1203 22:47:39.083908 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-catalog-content\") pod \"redhat-operators-mkgdz\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:39.110180 
master-0 kubenswrapper[36504]: I1203 22:47:39.110111 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5ltz\" (UniqueName: \"kubernetes.io/projected/1e9e08c2-3d46-4dde-b52d-f139c46b5561-kube-api-access-x5ltz\") pod \"redhat-operators-mkgdz\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:39.148923 master-0 kubenswrapper[36504]: I1203 22:47:39.148789 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:39.746546 master-0 kubenswrapper[36504]: I1203 22:47:39.746472 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mkgdz"] Dec 03 22:47:39.957401 master-0 kubenswrapper[36504]: I1203 22:47:39.957245 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkgdz" event={"ID":"1e9e08c2-3d46-4dde-b52d-f139c46b5561","Type":"ContainerStarted","Data":"4cfa1334f0cfc7071e6f82a7b15c63050e2e4559bd6fbc25eebd1759c5d29e21"} Dec 03 22:47:40.973531 master-0 kubenswrapper[36504]: I1203 22:47:40.973469 36504 generic.go:334] "Generic (PLEG): container finished" podID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerID="fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521" exitCode=0 Dec 03 22:47:40.974315 master-0 kubenswrapper[36504]: I1203 22:47:40.973576 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkgdz" event={"ID":"1e9e08c2-3d46-4dde-b52d-f139c46b5561","Type":"ContainerDied","Data":"fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521"} Dec 03 22:47:41.994232 master-0 kubenswrapper[36504]: I1203 22:47:41.994145 36504 generic.go:334] "Generic (PLEG): container finished" podID="7aaf0e4b-65ab-49f5-9786-3115aba68d33" containerID="c3ce43531853a1568b24bf579104fab6715d8b0a84317979ea758a1ce85d1cc4" exitCode=0 Dec 03 22:47:41.994232 master-0 kubenswrapper[36504]: I1203 22:47:41.994209 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-dataplane-services-edpm-2jkt6" event={"ID":"7aaf0e4b-65ab-49f5-9786-3115aba68d33","Type":"ContainerDied","Data":"c3ce43531853a1568b24bf579104fab6715d8b0a84317979ea758a1ce85d1cc4"} Dec 03 22:47:43.032284 master-0 kubenswrapper[36504]: I1203 22:47:43.032200 36504 generic.go:334] "Generic (PLEG): container finished" podID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerID="d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633" exitCode=0 Dec 03 22:47:43.033147 master-0 kubenswrapper[36504]: I1203 22:47:43.032338 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkgdz" event={"ID":"1e9e08c2-3d46-4dde-b52d-f139c46b5561","Type":"ContainerDied","Data":"d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633"} Dec 03 22:47:43.736497 master-0 kubenswrapper[36504]: I1203 22:47:43.736453 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:47:43.922251 master-0 kubenswrapper[36504]: I1203 22:47:43.922174 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjzqj\" (UniqueName: \"kubernetes.io/projected/7aaf0e4b-65ab-49f5-9786-3115aba68d33-kube-api-access-xjzqj\") pod \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " Dec 03 22:47:43.922564 master-0 kubenswrapper[36504]: I1203 22:47:43.922322 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-inventory\") pod \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " Dec 03 22:47:43.922564 master-0 kubenswrapper[36504]: I1203 22:47:43.922486 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-1\") pod \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " Dec 03 22:47:43.922977 master-0 kubenswrapper[36504]: I1203 22:47:43.922958 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-telemetry-combined-ca-bundle\") pod \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " Dec 03 22:47:43.923035 master-0 kubenswrapper[36504]: I1203 22:47:43.922987 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-2\") pod \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " Dec 03 22:47:43.923078 master-0 kubenswrapper[36504]: I1203 22:47:43.923042 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-0\") pod \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " Dec 03 22:47:43.923132 master-0 kubenswrapper[36504]: I1203 22:47:43.923115 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ssh-key\") pod \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\" (UID: \"7aaf0e4b-65ab-49f5-9786-3115aba68d33\") " Dec 03 22:47:43.929219 master-0 kubenswrapper[36504]: I1203 22:47:43.929127 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7aaf0e4b-65ab-49f5-9786-3115aba68d33" (UID: "7aaf0e4b-65ab-49f5-9786-3115aba68d33"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:47:43.938289 master-0 kubenswrapper[36504]: I1203 22:47:43.938205 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aaf0e4b-65ab-49f5-9786-3115aba68d33-kube-api-access-xjzqj" (OuterVolumeSpecName: "kube-api-access-xjzqj") pod "7aaf0e4b-65ab-49f5-9786-3115aba68d33" (UID: "7aaf0e4b-65ab-49f5-9786-3115aba68d33"). InnerVolumeSpecName "kube-api-access-xjzqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:47:43.964834 master-0 kubenswrapper[36504]: I1203 22:47:43.964713 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7aaf0e4b-65ab-49f5-9786-3115aba68d33" (UID: "7aaf0e4b-65ab-49f5-9786-3115aba68d33"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:47:43.967697 master-0 kubenswrapper[36504]: I1203 22:47:43.967636 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7aaf0e4b-65ab-49f5-9786-3115aba68d33" (UID: "7aaf0e4b-65ab-49f5-9786-3115aba68d33"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:47:43.969039 master-0 kubenswrapper[36504]: I1203 22:47:43.968973 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-inventory" (OuterVolumeSpecName: "inventory") pod "7aaf0e4b-65ab-49f5-9786-3115aba68d33" (UID: "7aaf0e4b-65ab-49f5-9786-3115aba68d33"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:47:43.970580 master-0 kubenswrapper[36504]: I1203 22:47:43.970549 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7aaf0e4b-65ab-49f5-9786-3115aba68d33" (UID: "7aaf0e4b-65ab-49f5-9786-3115aba68d33"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:47:43.980104 master-0 kubenswrapper[36504]: I1203 22:47:43.979838 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7aaf0e4b-65ab-49f5-9786-3115aba68d33" (UID: "7aaf0e4b-65ab-49f5-9786-3115aba68d33"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:47:44.028655 master-0 kubenswrapper[36504]: I1203 22:47:44.028585 36504 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-telemetry-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:44.028655 master-0 kubenswrapper[36504]: I1203 22:47:44.028653 36504 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-2\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:44.028655 master-0 kubenswrapper[36504]: I1203 22:47:44.028673 36504 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-0\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:44.028655 master-0 kubenswrapper[36504]: I1203 22:47:44.028687 36504 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ssh-key\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:44.029112 master-0 kubenswrapper[36504]: I1203 22:47:44.028724 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjzqj\" (UniqueName: \"kubernetes.io/projected/7aaf0e4b-65ab-49f5-9786-3115aba68d33-kube-api-access-xjzqj\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:44.029112 master-0 kubenswrapper[36504]: I1203 22:47:44.028743 36504 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-inventory\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:44.029112 master-0 kubenswrapper[36504]: I1203 22:47:44.028755 36504 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7aaf0e4b-65ab-49f5-9786-3115aba68d33-ceilometer-compute-config-data-1\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:44.071285 master-0 kubenswrapper[36504]: I1203 22:47:44.071121 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-dataplane-services-edpm-2jkt6" event={"ID":"7aaf0e4b-65ab-49f5-9786-3115aba68d33","Type":"ContainerDied","Data":"219d31ada86fe554d5a656c55f68c0886f54ec5bdd0dace7cf31a61acd55f407"} Dec 03 22:47:44.071285 master-0 kubenswrapper[36504]: I1203 22:47:44.071284 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="219d31ada86fe554d5a656c55f68c0886f54ec5bdd0dace7cf31a61acd55f407" Dec 03 22:47:44.074407 master-0 kubenswrapper[36504]: I1203 22:47:44.074307 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-dataplane-services-edpm-2jkt6" Dec 03 22:47:44.093252 master-0 kubenswrapper[36504]: I1203 22:47:44.093162 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkgdz" event={"ID":"1e9e08c2-3d46-4dde-b52d-f139c46b5561","Type":"ContainerStarted","Data":"7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16"} Dec 03 22:47:44.126244 master-0 kubenswrapper[36504]: I1203 22:47:44.126142 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mkgdz" podStartSLOduration=3.619964666 podStartE2EDuration="6.12611736s" podCreationTimestamp="2025-12-03 22:47:38 +0000 UTC" firstStartedPulling="2025-12-03 22:47:40.978084839 +0000 UTC m=+2226.197856846" lastFinishedPulling="2025-12-03 22:47:43.484237523 +0000 UTC m=+2228.704009540" observedRunningTime="2025-12-03 22:47:44.119606498 +0000 UTC m=+2229.339378515" watchObservedRunningTime="2025-12-03 22:47:44.12611736 +0000 UTC m=+2229.345889357" Dec 03 22:47:49.149269 master-0 kubenswrapper[36504]: I1203 22:47:49.149192 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:49.149269 master-0 kubenswrapper[36504]: I1203 22:47:49.149276 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:49.208411 master-0 kubenswrapper[36504]: I1203 22:47:49.208320 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:49.283655 master-0 kubenswrapper[36504]: I1203 22:47:49.283585 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:49.463439 master-0 kubenswrapper[36504]: I1203 22:47:49.463261 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mkgdz"] Dec 03 22:47:51.197338 master-0 kubenswrapper[36504]: I1203 22:47:51.197228 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mkgdz" podUID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerName="registry-server" containerID="cri-o://7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16" gracePeriod=2 Dec 03 22:47:51.901958 master-0 kubenswrapper[36504]: I1203 22:47:51.901895 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:52.058142 master-0 kubenswrapper[36504]: I1203 22:47:52.047816 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5ltz\" (UniqueName: \"kubernetes.io/projected/1e9e08c2-3d46-4dde-b52d-f139c46b5561-kube-api-access-x5ltz\") pod \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " Dec 03 22:47:52.058142 master-0 kubenswrapper[36504]: I1203 22:47:52.048132 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-utilities\") pod \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " Dec 03 22:47:52.058142 master-0 kubenswrapper[36504]: I1203 22:47:52.048481 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-catalog-content\") pod \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\" (UID: \"1e9e08c2-3d46-4dde-b52d-f139c46b5561\") " Dec 03 22:47:52.058142 master-0 kubenswrapper[36504]: I1203 22:47:52.049275 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-utilities" (OuterVolumeSpecName: "utilities") pod "1e9e08c2-3d46-4dde-b52d-f139c46b5561" (UID: "1e9e08c2-3d46-4dde-b52d-f139c46b5561"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:47:52.058142 master-0 kubenswrapper[36504]: I1203 22:47:52.050754 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:52.058142 master-0 kubenswrapper[36504]: I1203 22:47:52.051096 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9e08c2-3d46-4dde-b52d-f139c46b5561-kube-api-access-x5ltz" (OuterVolumeSpecName: "kube-api-access-x5ltz") pod "1e9e08c2-3d46-4dde-b52d-f139c46b5561" (UID: "1e9e08c2-3d46-4dde-b52d-f139c46b5561"). InnerVolumeSpecName "kube-api-access-x5ltz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:47:52.157144 master-0 kubenswrapper[36504]: I1203 22:47:52.157088 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5ltz\" (UniqueName: \"kubernetes.io/projected/1e9e08c2-3d46-4dde-b52d-f139c46b5561-kube-api-access-x5ltz\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:52.172905 master-0 kubenswrapper[36504]: I1203 22:47:52.172757 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e9e08c2-3d46-4dde-b52d-f139c46b5561" (UID: "1e9e08c2-3d46-4dde-b52d-f139c46b5561"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:47:52.214627 master-0 kubenswrapper[36504]: I1203 22:47:52.214553 36504 generic.go:334] "Generic (PLEG): container finished" podID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerID="7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16" exitCode=0 Dec 03 22:47:52.215357 master-0 kubenswrapper[36504]: I1203 22:47:52.214628 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mkgdz" Dec 03 22:47:52.215357 master-0 kubenswrapper[36504]: I1203 22:47:52.214626 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkgdz" event={"ID":"1e9e08c2-3d46-4dde-b52d-f139c46b5561","Type":"ContainerDied","Data":"7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16"} Dec 03 22:47:52.215357 master-0 kubenswrapper[36504]: I1203 22:47:52.214795 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mkgdz" event={"ID":"1e9e08c2-3d46-4dde-b52d-f139c46b5561","Type":"ContainerDied","Data":"4cfa1334f0cfc7071e6f82a7b15c63050e2e4559bd6fbc25eebd1759c5d29e21"} Dec 03 22:47:52.215357 master-0 kubenswrapper[36504]: I1203 22:47:52.214887 36504 scope.go:117] "RemoveContainer" containerID="7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16" Dec 03 22:47:52.245256 master-0 kubenswrapper[36504]: I1203 22:47:52.245177 36504 scope.go:117] "RemoveContainer" containerID="d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633" Dec 03 22:47:52.260352 master-0 kubenswrapper[36504]: I1203 22:47:52.260292 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e9e08c2-3d46-4dde-b52d-f139c46b5561-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:47:52.287809 master-0 kubenswrapper[36504]: I1203 22:47:52.286023 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mkgdz"] Dec 03 22:47:52.290514 master-0 kubenswrapper[36504]: I1203 22:47:52.290439 36504 scope.go:117] "RemoveContainer" containerID="fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521" Dec 03 22:47:52.304914 master-0 kubenswrapper[36504]: I1203 22:47:52.304819 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mkgdz"] Dec 03 22:47:52.368216 master-0 kubenswrapper[36504]: I1203 22:47:52.368161 36504 scope.go:117] "RemoveContainer" containerID="7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16" Dec 03 22:47:52.368925 master-0 kubenswrapper[36504]: E1203 22:47:52.368858 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16\": container with ID starting with 7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16 not found: ID does not exist" containerID="7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16" Dec 03 22:47:52.368985 master-0 kubenswrapper[36504]: I1203 22:47:52.368931 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16"} err="failed to get container status \"7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16\": rpc error: code = NotFound desc = could not find container \"7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16\": container with ID starting with 7b876fa307e844eca527747c3ebfc43d5e5f04c9d046e3117eeaa7149ef69a16 not found: ID does not exist" Dec 03 22:47:52.368985 master-0 kubenswrapper[36504]: I1203 22:47:52.368962 36504 scope.go:117] "RemoveContainer" containerID="d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633" Dec 03 22:47:52.369345 master-0 kubenswrapper[36504]: E1203 22:47:52.369302 36504 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633\": container with ID starting with d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633 not found: ID does not exist" containerID="d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633" Dec 03 22:47:52.369345 master-0 kubenswrapper[36504]: I1203 22:47:52.369330 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633"} err="failed to get container status \"d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633\": rpc error: code = NotFound desc = could not find container \"d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633\": container with ID starting with d139d52cbb97e30911a78b3e58b683caa13b78d573db64d969bb3333b187a633 not found: ID does not exist" Dec 03 22:47:52.369345 master-0 kubenswrapper[36504]: I1203 22:47:52.369344 36504 scope.go:117] "RemoveContainer" containerID="fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521" Dec 03 22:47:52.369887 master-0 kubenswrapper[36504]: E1203 22:47:52.369858 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521\": container with ID starting with fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521 not found: ID does not exist" containerID="fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521" Dec 03 22:47:52.369962 master-0 kubenswrapper[36504]: I1203 22:47:52.369885 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521"} err="failed to get container status \"fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521\": rpc error: code = NotFound desc = could not find container \"fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521\": container with ID starting with fba51cc8da676279dd6e791d61ab61046f7ddf51fd5efd6e084884c600a22521 not found: ID does not exist" Dec 03 22:47:53.111911 master-0 kubenswrapper[36504]: I1203 22:47:53.111826 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" path="/var/lib/kubelet/pods/1e9e08c2-3d46-4dde-b52d-f139c46b5561/volumes" Dec 03 22:48:14.096700 master-0 kubenswrapper[36504]: I1203 22:48:14.096611 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:48:16.106264 master-0 kubenswrapper[36504]: I1203 22:48:16.106184 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:48:35.724473 master-0 kubenswrapper[36504]: E1203 22:48:35.724421 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:48:57.138802 master-0 kubenswrapper[36504]: I1203 22:48:57.137812 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-s00-smoke-and-plugin-tests"] Dec 03 22:48:57.138802 master-0 kubenswrapper[36504]: E1203 22:48:57.138717 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerName="extract-utilities" Dec 03 22:48:57.138802 master-0 kubenswrapper[36504]: I1203 22:48:57.138744 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerName="extract-utilities" Dec 03 22:48:57.138802 master-0 kubenswrapper[36504]: E1203 22:48:57.138790 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aaf0e4b-65ab-49f5-9786-3115aba68d33" containerName="telemetry-dataplane-services-edpm" Dec 03 22:48:57.138802 master-0 kubenswrapper[36504]: I1203 22:48:57.138803 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aaf0e4b-65ab-49f5-9786-3115aba68d33" containerName="telemetry-dataplane-services-edpm" Dec 03 22:48:57.140089 master-0 kubenswrapper[36504]: E1203 22:48:57.138860 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerName="registry-server" Dec 03 22:48:57.140089 master-0 kubenswrapper[36504]: I1203 22:48:57.138871 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerName="registry-server" Dec 03 22:48:57.140089 master-0 kubenswrapper[36504]: E1203 22:48:57.138888 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerName="extract-content" Dec 03 22:48:57.140089 master-0 kubenswrapper[36504]: I1203 22:48:57.138897 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerName="extract-content" Dec 03 22:48:57.155799 master-0 kubenswrapper[36504]: I1203 22:48:57.139246 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aaf0e4b-65ab-49f5-9786-3115aba68d33" containerName="telemetry-dataplane-services-edpm" Dec 03 22:48:57.155799 master-0 kubenswrapper[36504]: I1203 22:48:57.140448 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9e08c2-3d46-4dde-b52d-f139c46b5561" containerName="registry-server" Dec 03 22:48:57.155799 master-0 kubenswrapper[36504]: I1203 22:48:57.142282 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.155799 master-0 kubenswrapper[36504]: I1203 22:48:57.144904 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-env-vars-s0" Dec 03 22:48:57.155799 master-0 kubenswrapper[36504]: I1203 22:48:57.145487 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-custom-data-s0" Dec 03 22:48:57.217000 master-0 kubenswrapper[36504]: I1203 22:48:57.216943 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-s00-smoke-and-plugin-tests"] Dec 03 22:48:57.244477 master-0 kubenswrapper[36504]: I1203 22:48:57.244368 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-ca-certs\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.244864 master-0 kubenswrapper[36504]: I1203 22:48:57.244568 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-config-data\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.244864 master-0 kubenswrapper[36504]: I1203 22:48:57.244621 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.244970 master-0 kubenswrapper[36504]: I1203 22:48:57.244928 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-workdir\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.245020 master-0 kubenswrapper[36504]: I1203 22:48:57.244978 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-temporary\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.245057 master-0 kubenswrapper[36504]: I1203 22:48:57.245039 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config-secret\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.245199 master-0 kubenswrapper[36504]: I1203 22:48:57.245161 36504 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.349118 master-0 kubenswrapper[36504]: I1203 22:48:57.349010 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-workdir\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.349118 master-0 kubenswrapper[36504]: I1203 22:48:57.349134 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-temporary\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.349548 master-0 kubenswrapper[36504]: I1203 22:48:57.349220 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config-secret\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.349548 master-0 kubenswrapper[36504]: I1203 22:48:57.349432 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.349657 master-0 kubenswrapper[36504]: I1203 22:48:57.349597 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-ca-certs\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.350333 master-0 kubenswrapper[36504]: I1203 22:48:57.350074 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-workdir\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.350941 master-0 kubenswrapper[36504]: I1203 22:48:57.350684 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-temporary\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.351803 master-0 kubenswrapper[36504]: I1203 22:48:57.351574 36504 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.351894 master-0 kubenswrapper[36504]: I1203 22:48:57.351797 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-config-data\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.352686 master-0 kubenswrapper[36504]: I1203 22:48:57.349755 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-config-data\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.352686 master-0 kubenswrapper[36504]: I1203 22:48:57.352508 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.355926 master-0 kubenswrapper[36504]: I1203 22:48:57.354524 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config-secret\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.356269 master-0 kubenswrapper[36504]: I1203 22:48:57.355950 36504 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
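
Note on the csi_attacher message just above: because the topolvm.io node plugin did not advertise the STAGE_UNSTAGE_VOLUME capability here, the kubelet reports MountVolume.MountDevice as succeeded without performing a staging step, and the PVC is made available through MountVolume.SetUp alone (which succeeds at 22:48:58 below).

The "Observed pod startup duration" entries in this log (certified-operators-2hqsz and redhat-operators-mkgdz above, tempest-tests-s00-smoke-and-plugin-tests below) each print a podStartSLOduration smaller than podStartE2EDuration. The printed values are consistent with podStartE2EDuration being watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration being that end-to-end duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling, taken on the monotonic clock, i.e. the "m=+" offsets). A minimal Python check of that relationship using the numbers from the redhat-operators-mkgdz entry above; the variable names are illustrative, not kubelet identifiers:

    # Monotonic-clock offsets ("m=+...") and reported durations copied from the
    # "Observed pod startup duration" entry for redhat-operators-mkgdz above.
    first_started_pulling = 2226.197856846   # firstStartedPulling  m=+...
    last_finished_pulling = 2228.704009540   # lastFinishedPulling  m=+...
    reported_e2e          = 6.12611736       # podStartE2EDuration, seconds
    reported_slo          = 3.619964666      # podStartSLOduration, seconds

    # Time spent pulling images, then the E2E duration with that window removed.
    pull_window = last_finished_pulling - first_started_pulling
    slo         = reported_e2e - pull_window

    print(f"image pull window    : {pull_window:.9f}s")               # 2.506152694s
    print(f"E2E minus pull time  : {slo:.9f}s")                        # 3.619964666s
    print(f"matches reported SLO : {abs(slo - reported_slo) < 1e-9}")  # True

Substituting the offsets from the certified-operators-2hqsz entry (m=+2202.827399635 and m=+2205.247135501 against its reported 5.693673458s E2E) or from the tempest-tests entry further down reproduces their reported podStartSLOduration values the same way.
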
Dec 03 22:48:57.356269 master-0 kubenswrapper[36504]: I1203 22:48:57.355990 36504 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/d414658fda1a3dd8a99504616d47789bce27df9b4d8feaa31a1985e1ad21f05f/globalmount\"" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:57.356856 master-0 kubenswrapper[36504]: I1203 22:48:57.356800 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-ca-certs\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:58.564503 master-0 kubenswrapper[36504]: I1203 22:48:58.564424 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") pod \"tempest-tests-s00-smoke-and-plugin-tests\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:58.725473 master-0 kubenswrapper[36504]: I1203 22:48:58.725274 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:48:59.247428 master-0 kubenswrapper[36504]: I1203 22:48:59.246523 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-s00-smoke-and-plugin-tests"] Dec 03 22:49:00.236561 master-0 kubenswrapper[36504]: I1203 22:49:00.236489 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" event={"ID":"23c32606-4f1d-433d-b273-b0a4fca4c4a0","Type":"ContainerStarted","Data":"d07c5521def713f5592f3e34a9eec721a4bdcb55d1f9a577d744e5423ba55931"} Dec 03 22:49:17.098299 master-0 kubenswrapper[36504]: I1203 22:49:17.098199 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:49:26.097427 master-0 kubenswrapper[36504]: I1203 22:49:26.097180 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:49:30.756029 master-0 kubenswrapper[36504]: I1203 22:49:30.755948 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" event={"ID":"23c32606-4f1d-433d-b273-b0a4fca4c4a0","Type":"ContainerStarted","Data":"84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227"} Dec 03 22:49:30.792813 master-0 kubenswrapper[36504]: I1203 22:49:30.792634 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" podStartSLOduration=5.062270127 podStartE2EDuration="35.792597889s" podCreationTimestamp="2025-12-03 22:48:55 +0000 UTC" firstStartedPulling="2025-12-03 22:48:59.254211738 +0000 UTC m=+2304.473983745" lastFinishedPulling="2025-12-03 22:49:29.9845395 +0000 UTC m=+2335.204311507" observedRunningTime="2025-12-03 22:49:30.777141145 +0000 UTC m=+2335.996913172" watchObservedRunningTime="2025-12-03 22:49:30.792597889 +0000 UTC m=+2336.012369896" Dec 03 22:49:35.715983 master-0 kubenswrapper[36504]: E1203 22:49:35.715857 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:50:32.099848 master-0 kubenswrapper[36504]: I1203 22:50:32.099790 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:50:35.888241 master-0 kubenswrapper[36504]: E1203 22:50:35.888127 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:50:36.099862 master-0 kubenswrapper[36504]: I1203 22:50:36.099786 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:50:41.328071 master-0 kubenswrapper[36504]: I1203 22:50:41.327973 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2j79t"] Dec 03 22:50:41.335926 master-0 kubenswrapper[36504]: I1203 22:50:41.334269 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:41.364472 master-0 kubenswrapper[36504]: I1203 22:50:41.363796 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2j79t"] Dec 03 22:50:41.400499 master-0 kubenswrapper[36504]: I1203 22:50:41.400420 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-utilities\") pod \"community-operators-2j79t\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:41.400873 master-0 kubenswrapper[36504]: I1203 22:50:41.400541 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-catalog-content\") pod \"community-operators-2j79t\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:41.400873 master-0 kubenswrapper[36504]: I1203 22:50:41.400610 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fhmb\" (UniqueName: \"kubernetes.io/projected/22859fd1-6a11-49f3-8593-d54f4f47c29d-kube-api-access-9fhmb\") pod \"community-operators-2j79t\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:41.506699 master-0 kubenswrapper[36504]: I1203 22:50:41.504485 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-catalog-content\") pod \"community-operators-2j79t\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:41.506699 master-0 kubenswrapper[36504]: I1203 22:50:41.505326 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fhmb\" (UniqueName: \"kubernetes.io/projected/22859fd1-6a11-49f3-8593-d54f4f47c29d-kube-api-access-9fhmb\") pod \"community-operators-2j79t\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:41.506699 master-0 kubenswrapper[36504]: I1203 22:50:41.505585 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-utilities\") pod \"community-operators-2j79t\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:41.506699 master-0 kubenswrapper[36504]: I1203 22:50:41.505166 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-catalog-content\") pod \"community-operators-2j79t\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:41.506699 master-0 kubenswrapper[36504]: I1203 22:50:41.505991 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-utilities\") pod \"community-operators-2j79t\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " 
pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:41.868962 master-0 kubenswrapper[36504]: I1203 22:50:41.868754 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fhmb\" (UniqueName: \"kubernetes.io/projected/22859fd1-6a11-49f3-8593-d54f4f47c29d-kube-api-access-9fhmb\") pod \"community-operators-2j79t\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:42.014887 master-0 kubenswrapper[36504]: I1203 22:50:42.013850 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:43.770843 master-0 kubenswrapper[36504]: I1203 22:50:43.769144 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2j79t"] Dec 03 22:50:44.020645 master-0 kubenswrapper[36504]: I1203 22:50:44.020379 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j79t" event={"ID":"22859fd1-6a11-49f3-8593-d54f4f47c29d","Type":"ContainerStarted","Data":"9770ec3e079847402e8c00fbc8a31ee17a67c45b8c2ad73500d1d13055a0cb27"} Dec 03 22:50:45.040728 master-0 kubenswrapper[36504]: I1203 22:50:45.040660 36504 generic.go:334] "Generic (PLEG): container finished" podID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerID="6590c1068c52454c39c9dfa2ce3d35a4f191d599c4552672a4fb7548a433043c" exitCode=0 Dec 03 22:50:45.040728 master-0 kubenswrapper[36504]: I1203 22:50:45.040729 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j79t" event={"ID":"22859fd1-6a11-49f3-8593-d54f4f47c29d","Type":"ContainerDied","Data":"6590c1068c52454c39c9dfa2ce3d35a4f191d599c4552672a4fb7548a433043c"} Dec 03 22:50:47.089925 master-0 kubenswrapper[36504]: I1203 22:50:47.089023 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j79t" event={"ID":"22859fd1-6a11-49f3-8593-d54f4f47c29d","Type":"ContainerStarted","Data":"535d814f04ac720c44b9800fdf674c554da50be47f987c4558bf19064278bbda"} Dec 03 22:50:48.117787 master-0 kubenswrapper[36504]: I1203 22:50:48.117696 36504 generic.go:334] "Generic (PLEG): container finished" podID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerID="535d814f04ac720c44b9800fdf674c554da50be47f987c4558bf19064278bbda" exitCode=0 Dec 03 22:50:48.118525 master-0 kubenswrapper[36504]: I1203 22:50:48.117805 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j79t" event={"ID":"22859fd1-6a11-49f3-8593-d54f4f47c29d","Type":"ContainerDied","Data":"535d814f04ac720c44b9800fdf674c554da50be47f987c4558bf19064278bbda"} Dec 03 22:50:48.118525 master-0 kubenswrapper[36504]: I1203 22:50:48.117856 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j79t" event={"ID":"22859fd1-6a11-49f3-8593-d54f4f47c29d","Type":"ContainerStarted","Data":"03f61398c64443064d0a39774fe29f438e3537c361491758f30ea50e42b5849f"} Dec 03 22:50:48.163886 master-0 kubenswrapper[36504]: I1203 22:50:48.163483 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2j79t" podStartSLOduration=4.67691047 podStartE2EDuration="7.163449683s" podCreationTimestamp="2025-12-03 22:50:41 +0000 UTC" firstStartedPulling="2025-12-03 22:50:45.046897067 +0000 UTC m=+2410.266669074" lastFinishedPulling="2025-12-03 22:50:47.53343627 +0000 UTC 
m=+2412.753208287" observedRunningTime="2025-12-03 22:50:48.148914689 +0000 UTC m=+2413.368686726" watchObservedRunningTime="2025-12-03 22:50:48.163449683 +0000 UTC m=+2413.383221690" Dec 03 22:50:52.015576 master-0 kubenswrapper[36504]: I1203 22:50:52.015497 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:52.015576 master-0 kubenswrapper[36504]: I1203 22:50:52.015575 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:52.076477 master-0 kubenswrapper[36504]: I1203 22:50:52.076425 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:52.239869 master-0 kubenswrapper[36504]: I1203 22:50:52.239791 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:53.832694 master-0 kubenswrapper[36504]: I1203 22:50:53.830087 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2j79t"] Dec 03 22:50:54.249581 master-0 kubenswrapper[36504]: I1203 22:50:54.249446 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2j79t" podUID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerName="registry-server" containerID="cri-o://03f61398c64443064d0a39774fe29f438e3537c361491758f30ea50e42b5849f" gracePeriod=2 Dec 03 22:50:55.273158 master-0 kubenswrapper[36504]: I1203 22:50:55.272983 36504 generic.go:334] "Generic (PLEG): container finished" podID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerID="03f61398c64443064d0a39774fe29f438e3537c361491758f30ea50e42b5849f" exitCode=0 Dec 03 22:50:55.273158 master-0 kubenswrapper[36504]: I1203 22:50:55.273075 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j79t" event={"ID":"22859fd1-6a11-49f3-8593-d54f4f47c29d","Type":"ContainerDied","Data":"03f61398c64443064d0a39774fe29f438e3537c361491758f30ea50e42b5849f"} Dec 03 22:50:55.646050 master-0 kubenswrapper[36504]: I1203 22:50:55.645989 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:56.182235 master-0 kubenswrapper[36504]: I1203 22:50:56.182171 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fhmb\" (UniqueName: \"kubernetes.io/projected/22859fd1-6a11-49f3-8593-d54f4f47c29d-kube-api-access-9fhmb\") pod \"22859fd1-6a11-49f3-8593-d54f4f47c29d\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " Dec 03 22:50:56.186721 master-0 kubenswrapper[36504]: I1203 22:50:56.186642 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22859fd1-6a11-49f3-8593-d54f4f47c29d-kube-api-access-9fhmb" (OuterVolumeSpecName: "kube-api-access-9fhmb") pod "22859fd1-6a11-49f3-8593-d54f4f47c29d" (UID: "22859fd1-6a11-49f3-8593-d54f4f47c29d"). InnerVolumeSpecName "kube-api-access-9fhmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:50:56.192627 master-0 kubenswrapper[36504]: I1203 22:50:56.192564 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-utilities\") pod \"22859fd1-6a11-49f3-8593-d54f4f47c29d\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " Dec 03 22:50:56.192900 master-0 kubenswrapper[36504]: I1203 22:50:56.192874 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-catalog-content\") pod \"22859fd1-6a11-49f3-8593-d54f4f47c29d\" (UID: \"22859fd1-6a11-49f3-8593-d54f4f47c29d\") " Dec 03 22:50:56.194534 master-0 kubenswrapper[36504]: I1203 22:50:56.194044 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-utilities" (OuterVolumeSpecName: "utilities") pod "22859fd1-6a11-49f3-8593-d54f4f47c29d" (UID: "22859fd1-6a11-49f3-8593-d54f4f47c29d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:50:56.198276 master-0 kubenswrapper[36504]: I1203 22:50:56.198207 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fhmb\" (UniqueName: \"kubernetes.io/projected/22859fd1-6a11-49f3-8593-d54f4f47c29d-kube-api-access-9fhmb\") on node \"master-0\" DevicePath \"\"" Dec 03 22:50:56.199743 master-0 kubenswrapper[36504]: I1203 22:50:56.199687 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:50:56.279962 master-0 kubenswrapper[36504]: I1203 22:50:56.279360 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22859fd1-6a11-49f3-8593-d54f4f47c29d" (UID: "22859fd1-6a11-49f3-8593-d54f4f47c29d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:50:56.300335 master-0 kubenswrapper[36504]: I1203 22:50:56.299232 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2j79t" event={"ID":"22859fd1-6a11-49f3-8593-d54f4f47c29d","Type":"ContainerDied","Data":"9770ec3e079847402e8c00fbc8a31ee17a67c45b8c2ad73500d1d13055a0cb27"} Dec 03 22:50:56.300335 master-0 kubenswrapper[36504]: I1203 22:50:56.299324 36504 scope.go:117] "RemoveContainer" containerID="03f61398c64443064d0a39774fe29f438e3537c361491758f30ea50e42b5849f" Dec 03 22:50:56.300335 master-0 kubenswrapper[36504]: I1203 22:50:56.299386 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2j79t" Dec 03 22:50:56.303904 master-0 kubenswrapper[36504]: I1203 22:50:56.303005 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22859fd1-6a11-49f3-8593-d54f4f47c29d-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:50:56.338588 master-0 kubenswrapper[36504]: I1203 22:50:56.338526 36504 scope.go:117] "RemoveContainer" containerID="535d814f04ac720c44b9800fdf674c554da50be47f987c4558bf19064278bbda" Dec 03 22:50:56.368201 master-0 kubenswrapper[36504]: I1203 22:50:56.368084 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2j79t"] Dec 03 22:50:56.386248 master-0 kubenswrapper[36504]: I1203 22:50:56.386161 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2j79t"] Dec 03 22:50:56.413414 master-0 kubenswrapper[36504]: I1203 22:50:56.413328 36504 scope.go:117] "RemoveContainer" containerID="6590c1068c52454c39c9dfa2ce3d35a4f191d599c4552672a4fb7548a433043c" Dec 03 22:50:57.116069 master-0 kubenswrapper[36504]: I1203 22:50:57.115971 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22859fd1-6a11-49f3-8593-d54f4f47c29d" path="/var/lib/kubelet/pods/22859fd1-6a11-49f3-8593-d54f4f47c29d/volumes" Dec 03 22:51:35.742069 master-0 kubenswrapper[36504]: E1203 22:51:35.741973 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:51:37.096351 master-0 kubenswrapper[36504]: I1203 22:51:37.095964 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:51:39.097589 master-0 kubenswrapper[36504]: I1203 22:51:39.097491 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:52:35.730489 master-0 kubenswrapper[36504]: E1203 22:52:35.730410 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:52:40.096404 master-0 kubenswrapper[36504]: I1203 22:52:40.096338 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:52:51.099876 master-0 kubenswrapper[36504]: I1203 22:52:51.099710 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:53:35.734539 master-0 kubenswrapper[36504]: E1203 22:53:35.734468 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:53:53.414288 master-0 kubenswrapper[36504]: I1203 22:53:53.414204 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f86ng"] Dec 03 22:53:53.415262 master-0 kubenswrapper[36504]: E1203 22:53:53.415134 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerName="extract-content" Dec 03 22:53:53.415262 master-0 kubenswrapper[36504]: I1203 22:53:53.415151 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerName="extract-content" Dec 03 22:53:53.415262 master-0 kubenswrapper[36504]: E1203 22:53:53.415196 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerName="registry-server" Dec 03 22:53:53.415262 master-0 kubenswrapper[36504]: I1203 22:53:53.415202 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerName="registry-server" Dec 03 22:53:53.415262 master-0 kubenswrapper[36504]: E1203 22:53:53.415263 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerName="extract-utilities" Dec 03 22:53:53.415438 master-0 kubenswrapper[36504]: I1203 22:53:53.415271 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerName="extract-utilities" Dec 03 22:53:53.415604 master-0 kubenswrapper[36504]: I1203 22:53:53.415586 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="22859fd1-6a11-49f3-8593-d54f4f47c29d" containerName="registry-server" Dec 03 22:53:53.418365 master-0 kubenswrapper[36504]: I1203 22:53:53.418335 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.462915 master-0 kubenswrapper[36504]: I1203 22:53:53.459340 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f86ng"] Dec 03 22:53:53.589302 master-0 kubenswrapper[36504]: I1203 22:53:53.589109 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dsck\" (UniqueName: \"kubernetes.io/projected/3cc81a8d-89e3-40bb-84ac-9c26993726fe-kube-api-access-2dsck\") pod \"redhat-marketplace-f86ng\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.589950 master-0 kubenswrapper[36504]: I1203 22:53:53.589606 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-catalog-content\") pod \"redhat-marketplace-f86ng\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.589950 master-0 kubenswrapper[36504]: I1203 22:53:53.589817 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-utilities\") pod \"redhat-marketplace-f86ng\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.695650 master-0 kubenswrapper[36504]: I1203 22:53:53.695392 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dsck\" (UniqueName: \"kubernetes.io/projected/3cc81a8d-89e3-40bb-84ac-9c26993726fe-kube-api-access-2dsck\") pod \"redhat-marketplace-f86ng\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.695650 master-0 kubenswrapper[36504]: I1203 22:53:53.695558 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-catalog-content\") pod \"redhat-marketplace-f86ng\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.695650 master-0 kubenswrapper[36504]: I1203 22:53:53.695599 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-utilities\") pod \"redhat-marketplace-f86ng\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.696318 master-0 kubenswrapper[36504]: I1203 22:53:53.696252 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-utilities\") pod \"redhat-marketplace-f86ng\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.696585 master-0 kubenswrapper[36504]: I1203 22:53:53.696552 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-catalog-content\") pod \"redhat-marketplace-f86ng\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " 
pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.719246 master-0 kubenswrapper[36504]: I1203 22:53:53.719179 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dsck\" (UniqueName: \"kubernetes.io/projected/3cc81a8d-89e3-40bb-84ac-9c26993726fe-kube-api-access-2dsck\") pod \"redhat-marketplace-f86ng\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:53.773494 master-0 kubenswrapper[36504]: I1203 22:53:53.773377 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:53:54.433385 master-0 kubenswrapper[36504]: I1203 22:53:54.430655 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f86ng"] Dec 03 22:53:55.257728 master-0 kubenswrapper[36504]: I1203 22:53:55.257625 36504 generic.go:334] "Generic (PLEG): container finished" podID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerID="a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5" exitCode=0 Dec 03 22:53:55.257728 master-0 kubenswrapper[36504]: I1203 22:53:55.257723 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f86ng" event={"ID":"3cc81a8d-89e3-40bb-84ac-9c26993726fe","Type":"ContainerDied","Data":"a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5"} Dec 03 22:53:55.258093 master-0 kubenswrapper[36504]: I1203 22:53:55.257795 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f86ng" event={"ID":"3cc81a8d-89e3-40bb-84ac-9c26993726fe","Type":"ContainerStarted","Data":"1406e11884a5ad2aee214a2932dc69ba43022dc3d308f04c74749c04bd3c3ab4"} Dec 03 22:53:55.261603 master-0 kubenswrapper[36504]: I1203 22:53:55.261368 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 22:53:56.098491 master-0 kubenswrapper[36504]: I1203 22:53:56.097624 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:53:56.290087 master-0 kubenswrapper[36504]: I1203 22:53:56.289999 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f86ng" event={"ID":"3cc81a8d-89e3-40bb-84ac-9c26993726fe","Type":"ContainerStarted","Data":"d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e"} Dec 03 22:53:57.311409 master-0 kubenswrapper[36504]: I1203 22:53:57.311195 36504 generic.go:334] "Generic (PLEG): container finished" podID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerID="d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e" exitCode=0 Dec 03 22:53:57.311409 master-0 kubenswrapper[36504]: I1203 22:53:57.311287 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f86ng" event={"ID":"3cc81a8d-89e3-40bb-84ac-9c26993726fe","Type":"ContainerDied","Data":"d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e"} Dec 03 22:53:58.342582 master-0 kubenswrapper[36504]: I1203 22:53:58.342515 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f86ng" event={"ID":"3cc81a8d-89e3-40bb-84ac-9c26993726fe","Type":"ContainerStarted","Data":"0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f"} Dec 03 22:53:58.381304 master-0 kubenswrapper[36504]: I1203 22:53:58.381184 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f86ng" podStartSLOduration=2.919000154 podStartE2EDuration="5.381153735s" podCreationTimestamp="2025-12-03 22:53:53 +0000 UTC" firstStartedPulling="2025-12-03 22:53:55.261301095 +0000 UTC m=+2600.481073102" lastFinishedPulling="2025-12-03 22:53:57.723454676 +0000 UTC m=+2602.943226683" observedRunningTime="2025-12-03 22:53:58.366490406 +0000 UTC m=+2603.586262423" watchObservedRunningTime="2025-12-03 22:53:58.381153735 +0000 UTC m=+2603.600925742" Dec 03 22:54:03.773966 master-0 kubenswrapper[36504]: I1203 22:54:03.773887 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:54:03.775042 master-0 kubenswrapper[36504]: I1203 22:54:03.774984 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:54:03.848409 master-0 kubenswrapper[36504]: I1203 22:54:03.848245 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:54:04.533060 master-0 kubenswrapper[36504]: I1203 22:54:04.532886 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:54:04.636168 master-0 kubenswrapper[36504]: I1203 22:54:04.636069 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f86ng"] Dec 03 22:54:06.503321 master-0 kubenswrapper[36504]: I1203 22:54:06.503096 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f86ng" podUID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerName="registry-server" containerID="cri-o://0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f" gracePeriod=2 Dec 03 22:54:07.097120 master-0 kubenswrapper[36504]: I1203 22:54:07.096930 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, 
the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:54:07.402685 master-0 kubenswrapper[36504]: I1203 22:54:07.402597 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:54:07.523867 master-0 kubenswrapper[36504]: I1203 22:54:07.523660 36504 generic.go:334] "Generic (PLEG): container finished" podID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerID="0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f" exitCode=0 Dec 03 22:54:07.524603 master-0 kubenswrapper[36504]: I1203 22:54:07.523743 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f86ng" event={"ID":"3cc81a8d-89e3-40bb-84ac-9c26993726fe","Type":"ContainerDied","Data":"0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f"} Dec 03 22:54:07.524762 master-0 kubenswrapper[36504]: I1203 22:54:07.523799 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f86ng" Dec 03 22:54:07.524898 master-0 kubenswrapper[36504]: I1203 22:54:07.524878 36504 scope.go:117] "RemoveContainer" containerID="0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f" Dec 03 22:54:07.525032 master-0 kubenswrapper[36504]: I1203 22:54:07.524739 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f86ng" event={"ID":"3cc81a8d-89e3-40bb-84ac-9c26993726fe","Type":"ContainerDied","Data":"1406e11884a5ad2aee214a2932dc69ba43022dc3d308f04c74749c04bd3c3ab4"} Dec 03 22:54:07.550884 master-0 kubenswrapper[36504]: I1203 22:54:07.550822 36504 scope.go:117] "RemoveContainer" containerID="d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e" Dec 03 22:54:07.574893 master-0 kubenswrapper[36504]: I1203 22:54:07.574728 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-utilities\") pod \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " Dec 03 22:54:07.574893 master-0 kubenswrapper[36504]: I1203 22:54:07.574878 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dsck\" (UniqueName: \"kubernetes.io/projected/3cc81a8d-89e3-40bb-84ac-9c26993726fe-kube-api-access-2dsck\") pod \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " Dec 03 22:54:07.575250 master-0 kubenswrapper[36504]: I1203 22:54:07.574915 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-catalog-content\") pod \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\" (UID: \"3cc81a8d-89e3-40bb-84ac-9c26993726fe\") " Dec 03 22:54:07.577567 master-0 kubenswrapper[36504]: I1203 22:54:07.576147 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-utilities" (OuterVolumeSpecName: "utilities") pod "3cc81a8d-89e3-40bb-84ac-9c26993726fe" (UID: "3cc81a8d-89e3-40bb-84ac-9c26993726fe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:54:07.580946 master-0 kubenswrapper[36504]: I1203 22:54:07.579301 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cc81a8d-89e3-40bb-84ac-9c26993726fe-kube-api-access-2dsck" (OuterVolumeSpecName: "kube-api-access-2dsck") pod "3cc81a8d-89e3-40bb-84ac-9c26993726fe" (UID: "3cc81a8d-89e3-40bb-84ac-9c26993726fe"). InnerVolumeSpecName "kube-api-access-2dsck". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:54:07.589620 master-0 kubenswrapper[36504]: I1203 22:54:07.587906 36504 scope.go:117] "RemoveContainer" containerID="a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5" Dec 03 22:54:07.600629 master-0 kubenswrapper[36504]: I1203 22:54:07.600419 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cc81a8d-89e3-40bb-84ac-9c26993726fe" (UID: "3cc81a8d-89e3-40bb-84ac-9c26993726fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:54:07.680058 master-0 kubenswrapper[36504]: I1203 22:54:07.679902 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:54:07.680058 master-0 kubenswrapper[36504]: I1203 22:54:07.679956 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dsck\" (UniqueName: \"kubernetes.io/projected/3cc81a8d-89e3-40bb-84ac-9c26993726fe-kube-api-access-2dsck\") on node \"master-0\" DevicePath \"\"" Dec 03 22:54:07.680058 master-0 kubenswrapper[36504]: I1203 22:54:07.679967 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cc81a8d-89e3-40bb-84ac-9c26993726fe-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:54:07.704702 master-0 kubenswrapper[36504]: I1203 22:54:07.704653 36504 scope.go:117] "RemoveContainer" containerID="0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f" Dec 03 22:54:07.705458 master-0 kubenswrapper[36504]: E1203 22:54:07.705385 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f\": container with ID starting with 0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f not found: ID does not exist" containerID="0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f" Dec 03 22:54:07.705532 master-0 kubenswrapper[36504]: I1203 22:54:07.705466 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f"} err="failed to get container status \"0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f\": rpc error: code = NotFound desc = could not find container \"0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f\": container with ID starting with 0b2d4f61bcb8eaff499042937c6eff481aac7f71d5e0657cb5c96cb987c8334f not found: ID does not exist" Dec 03 22:54:07.705532 master-0 kubenswrapper[36504]: I1203 22:54:07.705507 36504 scope.go:117] "RemoveContainer" containerID="d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e" Dec 03 22:54:07.706395 
master-0 kubenswrapper[36504]: E1203 22:54:07.706337 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e\": container with ID starting with d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e not found: ID does not exist" containerID="d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e" Dec 03 22:54:07.706395 master-0 kubenswrapper[36504]: I1203 22:54:07.706374 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e"} err="failed to get container status \"d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e\": rpc error: code = NotFound desc = could not find container \"d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e\": container with ID starting with d3256342f166255aa5cb0f2787f93c9cdd26955d24a168381b7140005c1a771e not found: ID does not exist" Dec 03 22:54:07.706395 master-0 kubenswrapper[36504]: I1203 22:54:07.706410 36504 scope.go:117] "RemoveContainer" containerID="a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5" Dec 03 22:54:07.706912 master-0 kubenswrapper[36504]: E1203 22:54:07.706862 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5\": container with ID starting with a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5 not found: ID does not exist" containerID="a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5" Dec 03 22:54:07.706971 master-0 kubenswrapper[36504]: I1203 22:54:07.706899 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5"} err="failed to get container status \"a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5\": rpc error: code = NotFound desc = could not find container \"a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5\": container with ID starting with a044acc1481e79e25719fd12404da5ce0f01a6de6510097187debe515eb1c3c5 not found: ID does not exist" Dec 03 22:54:07.882879 master-0 kubenswrapper[36504]: I1203 22:54:07.882752 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f86ng"] Dec 03 22:54:07.898184 master-0 kubenswrapper[36504]: I1203 22:54:07.898092 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f86ng"] Dec 03 22:54:09.118809 master-0 kubenswrapper[36504]: I1203 22:54:09.118715 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" path="/var/lib/kubelet/pods/3cc81a8d-89e3-40bb-84ac-9c26993726fe/volumes" Dec 03 22:54:35.718113 master-0 kubenswrapper[36504]: E1203 22:54:35.718038 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:55:15.110397 master-0 
kubenswrapper[36504]: I1203 22:55:15.110323 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:55:23.096578 master-0 kubenswrapper[36504]: I1203 22:55:23.096493 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:55:35.785517 master-0 kubenswrapper[36504]: E1203 22:55:35.785427 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:55:38.503127 master-0 kubenswrapper[36504]: I1203 22:55:38.503051 36504 trace.go:236] Trace[1649390878]: "Calculate volume metrics of test-operator-logs for pod openstack/tempest-tests-s00-smoke-and-plugin-tests" (03-Dec-2025 22:55:37.284) (total time: 1218ms): Dec 03 22:55:38.503127 master-0 kubenswrapper[36504]: Trace[1649390878]: [1.218616841s] [1.218616841s] END Dec 03 22:56:35.725592 master-0 kubenswrapper[36504]: E1203 22:56:35.725511 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:56:37.096391 master-0 kubenswrapper[36504]: I1203 22:56:37.096312 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:56:51.099026 master-0 kubenswrapper[36504]: I1203 22:56:51.098335 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:57:35.728955 master-0 kubenswrapper[36504]: E1203 22:57:35.728871 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:57:35.901660 master-0 kubenswrapper[36504]: I1203 22:57:35.901588 36504 generic.go:334] "Generic (PLEG): container finished" podID="23c32606-4f1d-433d-b273-b0a4fca4c4a0" containerID="84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227" exitCode=0 Dec 03 22:57:35.901660 master-0 kubenswrapper[36504]: I1203 22:57:35.901661 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" event={"ID":"23c32606-4f1d-433d-b273-b0a4fca4c4a0","Type":"ContainerDied","Data":"84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227"} Dec 03 22:57:37.542830 master-0 kubenswrapper[36504]: I1203 22:57:37.542712 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:57:37.656013 master-0 kubenswrapper[36504]: E1203 22:57:37.648184 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:57:37.656013 master-0 kubenswrapper[36504]: E1203 22:57:37.648600 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:57:37.656013 master-0 kubenswrapper[36504]: E1203 22:57:37.651009 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:57:37.660069 master-0 kubenswrapper[36504]: I1203 22:57:37.660009 
36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-s01-scenario-tests"] Dec 03 22:57:37.661319 master-0 kubenswrapper[36504]: E1203 22:57:37.661291 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerName="extract-utilities" Dec 03 22:57:37.661437 master-0 kubenswrapper[36504]: I1203 22:57:37.661421 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerName="extract-utilities" Dec 03 22:57:37.661618 master-0 kubenswrapper[36504]: E1203 22:57:37.661602 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerName="extract-content" Dec 03 22:57:37.661708 master-0 kubenswrapper[36504]: I1203 22:57:37.661695 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerName="extract-content" Dec 03 22:57:37.661867 master-0 kubenswrapper[36504]: E1203 22:57:37.661851 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23c32606-4f1d-433d-b273-b0a4fca4c4a0" containerName="tempest-tests-tests-runner" Dec 03 22:57:37.661959 master-0 kubenswrapper[36504]: I1203 22:57:37.661945 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="23c32606-4f1d-433d-b273-b0a4fca4c4a0" containerName="tempest-tests-tests-runner" Dec 03 22:57:37.662068 master-0 kubenswrapper[36504]: E1203 22:57:37.662054 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerName="registry-server" Dec 03 22:57:37.662149 master-0 kubenswrapper[36504]: I1203 22:57:37.662136 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerName="registry-server" Dec 03 22:57:37.662605 master-0 kubenswrapper[36504]: I1203 22:57:37.662584 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cc81a8d-89e3-40bb-84ac-9c26993726fe" containerName="registry-server" Dec 03 22:57:37.662754 master-0 kubenswrapper[36504]: I1203 22:57:37.662737 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="23c32606-4f1d-433d-b273-b0a4fca4c4a0" containerName="tempest-tests-tests-runner" Dec 03 22:57:37.664475 master-0 kubenswrapper[36504]: I1203 22:57:37.664451 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.684661 master-0 kubenswrapper[36504]: I1203 22:57:37.684579 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-s01-scenario-tests"] Dec 03 22:57:37.685471 master-0 kubenswrapper[36504]: I1203 22:57:37.685426 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-env-vars-s1" Dec 03 22:57:37.685555 master-0 kubenswrapper[36504]: I1203 22:57:37.685532 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-custom-data-s1" Dec 03 22:57:37.695655 master-0 kubenswrapper[36504]: I1203 22:57:37.695612 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config-secret\") pod \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " Dec 03 22:57:37.696116 master-0 kubenswrapper[36504]: I1203 22:57:37.696092 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config\") pod \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " Dec 03 22:57:37.696391 master-0 kubenswrapper[36504]: I1203 22:57:37.696372 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") pod \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " Dec 03 22:57:37.696562 master-0 kubenswrapper[36504]: I1203 22:57:37.696547 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-ca-certs\") pod \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " Dec 03 22:57:37.696682 master-0 kubenswrapper[36504]: I1203 22:57:37.696668 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-workdir\") pod \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " Dec 03 22:57:37.697061 master-0 kubenswrapper[36504]: I1203 22:57:37.697045 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-temporary\") pod \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " Dec 03 22:57:37.697424 master-0 kubenswrapper[36504]: I1203 22:57:37.697406 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-config-data\") pod \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\" (UID: \"23c32606-4f1d-433d-b273-b0a4fca4c4a0\") " Dec 03 22:57:37.698515 master-0 kubenswrapper[36504]: I1203 22:57:37.697955 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "23c32606-4f1d-433d-b273-b0a4fca4c4a0" (UID: "23c32606-4f1d-433d-b273-b0a4fca4c4a0"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:57:37.698618 master-0 kubenswrapper[36504]: I1203 22:57:37.698360 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-config-data" (OuterVolumeSpecName: "config-data") pod "23c32606-4f1d-433d-b273-b0a4fca4c4a0" (UID: "23c32606-4f1d-433d-b273-b0a4fca4c4a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:57:37.699970 master-0 kubenswrapper[36504]: I1203 22:57:37.699945 36504 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-temporary\") on node \"master-0\" DevicePath \"\"" Dec 03 22:57:37.700110 master-0 kubenswrapper[36504]: I1203 22:57:37.700093 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 22:57:37.708762 master-0 kubenswrapper[36504]: I1203 22:57:37.708478 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "23c32606-4f1d-433d-b273-b0a4fca4c4a0" (UID: "23c32606-4f1d-433d-b273-b0a4fca4c4a0"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:57:37.743571 master-0 kubenswrapper[36504]: I1203 22:57:37.743467 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "23c32606-4f1d-433d-b273-b0a4fca4c4a0" (UID: "23c32606-4f1d-433d-b273-b0a4fca4c4a0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:57:37.748118 master-0 kubenswrapper[36504]: I1203 22:57:37.748042 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43" (OuterVolumeSpecName: "test-operator-logs") pod "23c32606-4f1d-433d-b273-b0a4fca4c4a0" (UID: "23c32606-4f1d-433d-b273-b0a4fca4c4a0"). InnerVolumeSpecName "pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 22:57:37.761804 master-0 kubenswrapper[36504]: I1203 22:57:37.761706 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "23c32606-4f1d-433d-b273-b0a4fca4c4a0" (UID: "23c32606-4f1d-433d-b273-b0a4fca4c4a0"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 22:57:37.785881 master-0 kubenswrapper[36504]: I1203 22:57:37.785802 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "23c32606-4f1d-433d-b273-b0a4fca4c4a0" (UID: "23c32606-4f1d-433d-b273-b0a4fca4c4a0"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 22:57:37.807541 master-0 kubenswrapper[36504]: I1203 22:57:37.807438 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-config-data\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.808104 master-0 kubenswrapper[36504]: I1203 22:57:37.808035 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-ca-certs\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.808286 master-0 kubenswrapper[36504]: I1203 22:57:37.808238 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.808436 master-0 kubenswrapper[36504]: I1203 22:57:37.808392 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-temporary\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.808493 master-0 kubenswrapper[36504]: I1203 22:57:37.808473 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config-secret\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.808762 master-0 kubenswrapper[36504]: I1203 22:57:37.808723 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-workdir\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.808852 master-0 kubenswrapper[36504]: I1203 22:57:37.808830 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.809218 master-0 kubenswrapper[36504]: I1203 22:57:37.809187 36504 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config-secret\") on node \"master-0\" DevicePath \"\"" Dec 03 22:57:37.809218 master-0 kubenswrapper[36504]: I1203 22:57:37.809212 
36504 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/23c32606-4f1d-433d-b273-b0a4fca4c4a0-openstack-config\") on node \"master-0\" DevicePath \"\"" Dec 03 22:57:37.809311 master-0 kubenswrapper[36504]: I1203 22:57:37.809227 36504 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/23c32606-4f1d-433d-b273-b0a4fca4c4a0-ca-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 22:57:37.809311 master-0 kubenswrapper[36504]: I1203 22:57:37.809240 36504 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/23c32606-4f1d-433d-b273-b0a4fca4c4a0-test-operator-ephemeral-workdir\") on node \"master-0\" DevicePath \"\"" Dec 03 22:57:37.912454 master-0 kubenswrapper[36504]: I1203 22:57:37.912270 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-config-data\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.912454 master-0 kubenswrapper[36504]: I1203 22:57:37.912417 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-ca-certs\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.912861 master-0 kubenswrapper[36504]: I1203 22:57:37.912463 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.912861 master-0 kubenswrapper[36504]: I1203 22:57:37.912503 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-temporary\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.912861 master-0 kubenswrapper[36504]: I1203 22:57:37.912534 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config-secret\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.912861 master-0 kubenswrapper[36504]: I1203 22:57:37.912603 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-workdir\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.913388 master-0 kubenswrapper[36504]: I1203 22:57:37.913352 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-temporary\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.913473 master-0 kubenswrapper[36504]: I1203 22:57:37.913442 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-workdir\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.913889 master-0 kubenswrapper[36504]: I1203 22:57:37.913829 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.913956 master-0 kubenswrapper[36504]: I1203 22:57:37.913883 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-config-data\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.917196 master-0 kubenswrapper[36504]: I1203 22:57:37.917135 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-ca-certs\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.917671 master-0 kubenswrapper[36504]: I1203 22:57:37.917630 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config-secret\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:37.937852 master-0 kubenswrapper[36504]: I1203 22:57:37.933799 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" event={"ID":"23c32606-4f1d-433d-b273-b0a4fca4c4a0","Type":"ContainerDied","Data":"d07c5521def713f5592f3e34a9eec721a4bdcb55d1f9a577d744e5423ba55931"} Dec 03 22:57:37.937852 master-0 kubenswrapper[36504]: I1203 22:57:37.933864 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d07c5521def713f5592f3e34a9eec721a4bdcb55d1f9a577d744e5423ba55931" Dec 03 22:57:37.937852 master-0 kubenswrapper[36504]: I1203 22:57:37.933895 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-s00-smoke-and-plugin-tests" Dec 03 22:57:38.688428 master-0 kubenswrapper[36504]: I1203 22:57:38.688340 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") pod \"tempest-tests-s01-scenario-tests\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:38.776358 master-0 kubenswrapper[36504]: I1203 22:57:38.776212 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 22:57:39.348234 master-0 kubenswrapper[36504]: I1203 22:57:39.348179 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-s01-scenario-tests"] Dec 03 22:57:39.985542 master-0 kubenswrapper[36504]: I1203 22:57:39.979289 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-s01-scenario-tests" event={"ID":"e8222600-745f-4d3f-87a9-9a8607c75bbf","Type":"ContainerStarted","Data":"c7630d8bb8b76d2bd906fc5c17d79414c4e1c50143de9d066111121f7484d0f7"} Dec 03 22:57:39.985542 master-0 kubenswrapper[36504]: I1203 22:57:39.979388 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-s01-scenario-tests" event={"ID":"e8222600-745f-4d3f-87a9-9a8607c75bbf","Type":"ContainerStarted","Data":"37f9b36a7be5412cd584afc5ea391f52bdda5743188eb4148f6f1a8e3f6d273b"} Dec 03 22:57:40.011928 master-0 kubenswrapper[36504]: I1203 22:57:40.011807 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-s01-scenario-tests" podStartSLOduration=3.011760075 podStartE2EDuration="3.011760075s" podCreationTimestamp="2025-12-03 22:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 22:57:40.000960609 +0000 UTC m=+2825.220732636" watchObservedRunningTime="2025-12-03 22:57:40.011760075 +0000 UTC m=+2825.231532082" Dec 03 22:57:46.483525 master-0 kubenswrapper[36504]: I1203 22:57:46.483054 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8gqvr"] Dec 03 22:57:46.487612 master-0 kubenswrapper[36504]: I1203 22:57:46.487550 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.530843 master-0 kubenswrapper[36504]: I1203 22:57:46.529663 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gqvr"] Dec 03 22:57:46.549859 master-0 kubenswrapper[36504]: I1203 22:57:46.549736 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvwb2\" (UniqueName: \"kubernetes.io/projected/2ca1b4e1-2889-4624-8c02-e431ac179554-kube-api-access-xvwb2\") pod \"certified-operators-8gqvr\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.550574 master-0 kubenswrapper[36504]: I1203 22:57:46.550493 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-catalog-content\") pod \"certified-operators-8gqvr\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.551188 master-0 kubenswrapper[36504]: I1203 22:57:46.551093 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-utilities\") pod \"certified-operators-8gqvr\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.620472 master-0 kubenswrapper[36504]: E1203 22:57:46.618987 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:57:46.655670 master-0 kubenswrapper[36504]: I1203 22:57:46.655534 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-utilities\") pod \"certified-operators-8gqvr\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.656301 master-0 kubenswrapper[36504]: I1203 22:57:46.656251 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-utilities\") pod \"certified-operators-8gqvr\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.657079 master-0 kubenswrapper[36504]: I1203 22:57:46.655849 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwb2\" (UniqueName: \"kubernetes.io/projected/2ca1b4e1-2889-4624-8c02-e431ac179554-kube-api-access-xvwb2\") pod \"certified-operators-8gqvr\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.657945 master-0 kubenswrapper[36504]: I1203 22:57:46.657894 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-catalog-content\") pod \"certified-operators-8gqvr\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.658578 master-0 kubenswrapper[36504]: I1203 22:57:46.658536 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-catalog-content\") pod \"certified-operators-8gqvr\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.683696 master-0 kubenswrapper[36504]: I1203 22:57:46.683172 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwb2\" (UniqueName: \"kubernetes.io/projected/2ca1b4e1-2889-4624-8c02-e431ac179554-kube-api-access-xvwb2\") pod \"certified-operators-8gqvr\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:46.820457 master-0 kubenswrapper[36504]: I1203 22:57:46.820236 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:47.445507 master-0 kubenswrapper[36504]: I1203 22:57:47.444629 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8gqvr"] Dec 03 22:57:48.073337 master-0 kubenswrapper[36504]: E1203 22:57:48.073170 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:57:48.209286 master-0 kubenswrapper[36504]: I1203 22:57:48.209193 36504 generic.go:334] "Generic (PLEG): container finished" podID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerID="d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9" exitCode=0 Dec 03 22:57:48.209658 master-0 kubenswrapper[36504]: I1203 22:57:48.209303 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gqvr" event={"ID":"2ca1b4e1-2889-4624-8c02-e431ac179554","Type":"ContainerDied","Data":"d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9"} Dec 03 22:57:48.209658 master-0 kubenswrapper[36504]: I1203 22:57:48.209361 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gqvr" event={"ID":"2ca1b4e1-2889-4624-8c02-e431ac179554","Type":"ContainerStarted","Data":"76f9815590b7c0c3c51922e0d12450de5d52cd8ecd382637ee3e0c35194842a6"} Dec 03 22:57:49.230077 master-0 kubenswrapper[36504]: I1203 22:57:49.229846 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gqvr" event={"ID":"2ca1b4e1-2889-4624-8c02-e431ac179554","Type":"ContainerStarted","Data":"42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662"} Dec 03 22:57:50.253084 master-0 kubenswrapper[36504]: I1203 22:57:50.253006 36504 
generic.go:334] "Generic (PLEG): container finished" podID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerID="42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662" exitCode=0 Dec 03 22:57:50.253864 master-0 kubenswrapper[36504]: I1203 22:57:50.253091 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gqvr" event={"ID":"2ca1b4e1-2889-4624-8c02-e431ac179554","Type":"ContainerDied","Data":"42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662"} Dec 03 22:57:51.269467 master-0 kubenswrapper[36504]: I1203 22:57:51.269289 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gqvr" event={"ID":"2ca1b4e1-2889-4624-8c02-e431ac179554","Type":"ContainerStarted","Data":"665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1"} Dec 03 22:57:51.303469 master-0 kubenswrapper[36504]: I1203 22:57:51.303335 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8gqvr" podStartSLOduration=2.831242851 podStartE2EDuration="5.303301712s" podCreationTimestamp="2025-12-03 22:57:46 +0000 UTC" firstStartedPulling="2025-12-03 22:57:48.214307395 +0000 UTC m=+2833.434079402" lastFinishedPulling="2025-12-03 22:57:50.686366266 +0000 UTC m=+2835.906138263" observedRunningTime="2025-12-03 22:57:51.296923244 +0000 UTC m=+2836.516695261" watchObservedRunningTime="2025-12-03 22:57:51.303301712 +0000 UTC m=+2836.523073749" Dec 03 22:57:56.821295 master-0 kubenswrapper[36504]: I1203 22:57:56.821193 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:56.821295 master-0 kubenswrapper[36504]: I1203 22:57:56.821282 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:56.884484 master-0 kubenswrapper[36504]: I1203 22:57:56.884405 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:57.418284 master-0 kubenswrapper[36504]: I1203 22:57:57.418198 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:57:57.492710 master-0 kubenswrapper[36504]: I1203 22:57:57.492607 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8gqvr"] Dec 03 22:57:58.096229 master-0 kubenswrapper[36504]: I1203 22:57:58.096149 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:57:58.441492 master-0 kubenswrapper[36504]: E1203 22:57:58.441381 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:57:59.390164 master-0 kubenswrapper[36504]: I1203 22:57:59.390026 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8gqvr" podUID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerName="registry-server" containerID="cri-o://665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1" gracePeriod=2 Dec 03 22:57:59.553352 master-0 kubenswrapper[36504]: I1203 22:57:59.553300 36504 trace.go:236] Trace[221697561]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (03-Dec-2025 22:57:58.421) (total time: 1131ms): Dec 03 22:57:59.553352 master-0 kubenswrapper[36504]: Trace[221697561]: [1.131730654s] [1.131730654s] END Dec 03 22:58:00.085166 master-0 kubenswrapper[36504]: I1203 22:58:00.085103 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:58:00.101528 master-0 kubenswrapper[36504]: I1203 22:58:00.099448 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:58:00.204206 master-0 kubenswrapper[36504]: I1203 22:58:00.204014 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvwb2\" (UniqueName: \"kubernetes.io/projected/2ca1b4e1-2889-4624-8c02-e431ac179554-kube-api-access-xvwb2\") pod \"2ca1b4e1-2889-4624-8c02-e431ac179554\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " Dec 03 22:58:00.204467 master-0 kubenswrapper[36504]: I1203 22:58:00.204277 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-utilities\") pod \"2ca1b4e1-2889-4624-8c02-e431ac179554\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " Dec 03 22:58:00.204467 master-0 kubenswrapper[36504]: I1203 22:58:00.204316 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-catalog-content\") pod \"2ca1b4e1-2889-4624-8c02-e431ac179554\" (UID: \"2ca1b4e1-2889-4624-8c02-e431ac179554\") " Dec 03 22:58:00.205472 master-0 kubenswrapper[36504]: I1203 22:58:00.205403 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-utilities" (OuterVolumeSpecName: "utilities") pod "2ca1b4e1-2889-4624-8c02-e431ac179554" (UID: "2ca1b4e1-2889-4624-8c02-e431ac179554"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:58:00.211087 master-0 kubenswrapper[36504]: I1203 22:58:00.209107 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca1b4e1-2889-4624-8c02-e431ac179554-kube-api-access-xvwb2" (OuterVolumeSpecName: "kube-api-access-xvwb2") pod "2ca1b4e1-2889-4624-8c02-e431ac179554" (UID: "2ca1b4e1-2889-4624-8c02-e431ac179554"). InnerVolumeSpecName "kube-api-access-xvwb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 22:58:00.276235 master-0 kubenswrapper[36504]: I1203 22:58:00.274642 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ca1b4e1-2889-4624-8c02-e431ac179554" (UID: "2ca1b4e1-2889-4624-8c02-e431ac179554"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 22:58:00.310147 master-0 kubenswrapper[36504]: I1203 22:58:00.310034 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvwb2\" (UniqueName: \"kubernetes.io/projected/2ca1b4e1-2889-4624-8c02-e431ac179554-kube-api-access-xvwb2\") on node \"master-0\" DevicePath \"\"" Dec 03 22:58:00.310147 master-0 kubenswrapper[36504]: I1203 22:58:00.310101 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 22:58:00.310147 master-0 kubenswrapper[36504]: I1203 22:58:00.310112 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ca1b4e1-2889-4624-8c02-e431ac179554-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 22:58:00.410994 master-0 kubenswrapper[36504]: I1203 22:58:00.410753 36504 generic.go:334] "Generic (PLEG): container finished" podID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerID="665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1" exitCode=0 Dec 03 22:58:00.410994 master-0 kubenswrapper[36504]: I1203 22:58:00.410850 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gqvr" event={"ID":"2ca1b4e1-2889-4624-8c02-e431ac179554","Type":"ContainerDied","Data":"665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1"} Dec 03 22:58:00.410994 master-0 kubenswrapper[36504]: I1203 22:58:00.410932 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8gqvr" event={"ID":"2ca1b4e1-2889-4624-8c02-e431ac179554","Type":"ContainerDied","Data":"76f9815590b7c0c3c51922e0d12450de5d52cd8ecd382637ee3e0c35194842a6"} Dec 03 22:58:00.410994 master-0 kubenswrapper[36504]: I1203 22:58:00.410955 36504 scope.go:117] "RemoveContainer" containerID="665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1" Dec 03 22:58:00.413812 master-0 kubenswrapper[36504]: I1203 22:58:00.413056 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8gqvr" Dec 03 22:58:00.446441 master-0 kubenswrapper[36504]: I1203 22:58:00.446329 36504 scope.go:117] "RemoveContainer" containerID="42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662" Dec 03 22:58:00.480370 master-0 kubenswrapper[36504]: I1203 22:58:00.480090 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8gqvr"] Dec 03 22:58:00.495643 master-0 kubenswrapper[36504]: I1203 22:58:00.495527 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8gqvr"] Dec 03 22:58:00.498892 master-0 kubenswrapper[36504]: I1203 22:58:00.498855 36504 scope.go:117] "RemoveContainer" containerID="d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9" Dec 03 22:58:00.572807 master-0 kubenswrapper[36504]: I1203 22:58:00.570067 36504 scope.go:117] "RemoveContainer" containerID="665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1" Dec 03 22:58:00.585498 master-0 kubenswrapper[36504]: E1203 22:58:00.584090 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1\": container with ID starting with 665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1 not found: ID does not exist" containerID="665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1" Dec 03 22:58:00.585498 master-0 kubenswrapper[36504]: I1203 22:58:00.584169 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1"} err="failed to get container status \"665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1\": rpc error: code = NotFound desc = could not find container \"665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1\": container with ID starting with 665af6b925ab77a702a4ed2aa5dce8d2c84bdcc4421ed0d95900928593b79df1 not found: ID does not exist" Dec 03 22:58:00.585498 master-0 kubenswrapper[36504]: I1203 22:58:00.584213 36504 scope.go:117] "RemoveContainer" containerID="42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662" Dec 03 22:58:00.585952 master-0 kubenswrapper[36504]: E1203 22:58:00.585864 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662\": container with ID starting with 42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662 not found: ID does not exist" containerID="42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662" Dec 03 22:58:00.586030 master-0 kubenswrapper[36504]: I1203 22:58:00.585971 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662"} err="failed to get container status \"42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662\": rpc error: code = NotFound desc = could not find container \"42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662\": container with ID starting with 42db183f1e78ad356183dfd8dc8292fc2485f88a6f58bc90102dc1a98924a662 not found: ID does not exist" Dec 03 22:58:00.586084 master-0 kubenswrapper[36504]: I1203 22:58:00.586032 36504 scope.go:117] "RemoveContainer" 
containerID="d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9" Dec 03 22:58:00.591374 master-0 kubenswrapper[36504]: E1203 22:58:00.591307 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9\": container with ID starting with d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9 not found: ID does not exist" containerID="d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9" Dec 03 22:58:00.591571 master-0 kubenswrapper[36504]: I1203 22:58:00.591375 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9"} err="failed to get container status \"d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9\": rpc error: code = NotFound desc = could not find container \"d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9\": container with ID starting with d706de3240a0320cacad998f7926040a8596269a5f0abe45a6e4188e60b038c9 not found: ID does not exist" Dec 03 22:58:01.115701 master-0 kubenswrapper[36504]: I1203 22:58:01.115579 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca1b4e1-2889-4624-8c02-e431ac179554" path="/var/lib/kubelet/pods/2ca1b4e1-2889-4624-8c02-e431ac179554/volumes" Dec 03 22:58:01.621232 master-0 kubenswrapper[36504]: E1203 22:58:01.621113 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:58:08.934200 master-0 kubenswrapper[36504]: E1203 22:58:08.934101 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:58:16.965018 master-0 kubenswrapper[36504]: E1203 22:58:16.964889 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:58:19.007520 master-0 kubenswrapper[36504]: E1203 22:58:19.007413 36504 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:58:29.390717 master-0 kubenswrapper[36504]: E1203 22:58:29.390640 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:58:31.618890 master-0 kubenswrapper[36504]: E1203 22:58:31.618756 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c32606_4f1d_433d_b273_b0a4fca4c4a0.slice/crio-conmon-84b5bd29da138e0664baf80e4a69bba68a83a9943466e9212b8b9a2f8436b227.scope\": RecentStats: unable to find data in memory cache]" Dec 03 22:58:35.735842 master-0 kubenswrapper[36504]: E1203 22:58:35.734623 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:59:09.096852 master-0 kubenswrapper[36504]: I1203 22:59:09.096787 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 22:59:29.096976 master-0 kubenswrapper[36504]: I1203 22:59:29.096903 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 22:59:35.744624 master-0 kubenswrapper[36504]: E1203 22:59:35.744542 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 22:59:58.312025 master-0 kubenswrapper[36504]: I1203 22:59:58.311955 36504 trace.go:236] Trace[2072288397]: "Calculate volume metrics of test-operator-logs for pod openstack/tempest-tests-s01-scenario-tests" (03-Dec-2025 22:59:57.134) (total time: 1177ms): Dec 03 22:59:58.312025 master-0 kubenswrapper[36504]: Trace[2072288397]: [1.177625083s] [1.177625083s] END Dec 03 23:00:00.186158 master-0 kubenswrapper[36504]: I1203 23:00:00.186083 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp"] Dec 03 23:00:00.187031 master-0 kubenswrapper[36504]: E1203 23:00:00.186921 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerName="extract-utilities" Dec 03 23:00:00.187031 master-0 kubenswrapper[36504]: I1203 23:00:00.186944 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerName="extract-utilities" Dec 03 23:00:00.187031 master-0 kubenswrapper[36504]: E1203 23:00:00.187026 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerName="extract-content" Dec 03 23:00:00.187031 master-0 kubenswrapper[36504]: I1203 23:00:00.187035 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerName="extract-content" Dec 03 23:00:00.187233 master-0 kubenswrapper[36504]: E1203 23:00:00.187065 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerName="registry-server" Dec 03 23:00:00.187233 master-0 kubenswrapper[36504]: I1203 23:00:00.187072 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerName="registry-server" Dec 03 23:00:00.190800 master-0 kubenswrapper[36504]: I1203 23:00:00.187477 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca1b4e1-2889-4624-8c02-e431ac179554" containerName="registry-server" Dec 03 23:00:00.190800 master-0 kubenswrapper[36504]: I1203 23:00:00.189111 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.192363 master-0 kubenswrapper[36504]: I1203 23:00:00.192309 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-nwpql" Dec 03 23:00:00.193505 master-0 kubenswrapper[36504]: I1203 23:00:00.193460 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 23:00:00.208350 master-0 kubenswrapper[36504]: I1203 23:00:00.206724 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp"] Dec 03 23:00:00.277588 master-0 kubenswrapper[36504]: I1203 23:00:00.277506 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5lrz\" (UniqueName: \"kubernetes.io/projected/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-kube-api-access-s5lrz\") pod \"collect-profiles-29413380-j25wp\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.277981 master-0 kubenswrapper[36504]: I1203 23:00:00.277581 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-config-volume\") pod \"collect-profiles-29413380-j25wp\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.277981 master-0 kubenswrapper[36504]: I1203 23:00:00.277723 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-secret-volume\") pod \"collect-profiles-29413380-j25wp\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.381365 master-0 kubenswrapper[36504]: I1203 23:00:00.381265 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5lrz\" (UniqueName: \"kubernetes.io/projected/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-kube-api-access-s5lrz\") pod \"collect-profiles-29413380-j25wp\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.381365 master-0 kubenswrapper[36504]: I1203 23:00:00.381384 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-config-volume\") pod \"collect-profiles-29413380-j25wp\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.381786 master-0 kubenswrapper[36504]: I1203 23:00:00.381585 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-secret-volume\") pod \"collect-profiles-29413380-j25wp\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.383118 master-0 kubenswrapper[36504]: I1203 23:00:00.383066 36504 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-config-volume\") pod \"collect-profiles-29413380-j25wp\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.643194 master-0 kubenswrapper[36504]: I1203 23:00:00.643111 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-secret-volume\") pod \"collect-profiles-29413380-j25wp\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.644034 master-0 kubenswrapper[36504]: I1203 23:00:00.643969 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5lrz\" (UniqueName: \"kubernetes.io/projected/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-kube-api-access-s5lrz\") pod \"collect-profiles-29413380-j25wp\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:00.848838 master-0 kubenswrapper[36504]: I1203 23:00:00.848754 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:01.410423 master-0 kubenswrapper[36504]: I1203 23:00:01.410310 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp"] Dec 03 23:00:02.293496 master-0 kubenswrapper[36504]: I1203 23:00:02.293421 36504 generic.go:334] "Generic (PLEG): container finished" podID="b19c4ac6-170d-45a1-9fa8-f8c4962c0760" containerID="64dde6128e163ba33beeed6547110d829aca9f78baaec43beeed6c335453169b" exitCode=0 Dec 03 23:00:02.293885 master-0 kubenswrapper[36504]: I1203 23:00:02.293504 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" event={"ID":"b19c4ac6-170d-45a1-9fa8-f8c4962c0760","Type":"ContainerDied","Data":"64dde6128e163ba33beeed6547110d829aca9f78baaec43beeed6c335453169b"} Dec 03 23:00:02.293885 master-0 kubenswrapper[36504]: I1203 23:00:02.293550 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" event={"ID":"b19c4ac6-170d-45a1-9fa8-f8c4962c0760","Type":"ContainerStarted","Data":"773c1ce5157117fef1569b796b0237e4415b2bbdeb58a01244e6086084cf57b7"} Dec 03 23:00:03.889156 master-0 kubenswrapper[36504]: I1203 23:00:03.888601 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:04.041455 master-0 kubenswrapper[36504]: I1203 23:00:04.041234 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-config-volume\") pod \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " Dec 03 23:00:04.041455 master-0 kubenswrapper[36504]: I1203 23:00:04.041331 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-secret-volume\") pod \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " Dec 03 23:00:04.041455 master-0 kubenswrapper[36504]: I1203 23:00:04.041447 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5lrz\" (UniqueName: \"kubernetes.io/projected/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-kube-api-access-s5lrz\") pod \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\" (UID: \"b19c4ac6-170d-45a1-9fa8-f8c4962c0760\") " Dec 03 23:00:04.042244 master-0 kubenswrapper[36504]: I1203 23:00:04.042134 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-config-volume" (OuterVolumeSpecName: "config-volume") pod "b19c4ac6-170d-45a1-9fa8-f8c4962c0760" (UID: "b19c4ac6-170d-45a1-9fa8-f8c4962c0760"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:00:04.045819 master-0 kubenswrapper[36504]: I1203 23:00:04.045740 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-kube-api-access-s5lrz" (OuterVolumeSpecName: "kube-api-access-s5lrz") pod "b19c4ac6-170d-45a1-9fa8-f8c4962c0760" (UID: "b19c4ac6-170d-45a1-9fa8-f8c4962c0760"). InnerVolumeSpecName "kube-api-access-s5lrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:00:04.046057 master-0 kubenswrapper[36504]: I1203 23:00:04.045860 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b19c4ac6-170d-45a1-9fa8-f8c4962c0760" (UID: "b19c4ac6-170d-45a1-9fa8-f8c4962c0760"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:00:04.145834 master-0 kubenswrapper[36504]: I1203 23:00:04.145713 36504 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 23:00:04.145834 master-0 kubenswrapper[36504]: I1203 23:00:04.145832 36504 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 23:00:04.145834 master-0 kubenswrapper[36504]: I1203 23:00:04.145848 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5lrz\" (UniqueName: \"kubernetes.io/projected/b19c4ac6-170d-45a1-9fa8-f8c4962c0760-kube-api-access-s5lrz\") on node \"master-0\" DevicePath \"\"" Dec 03 23:00:04.325146 master-0 kubenswrapper[36504]: I1203 23:00:04.324986 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" event={"ID":"b19c4ac6-170d-45a1-9fa8-f8c4962c0760","Type":"ContainerDied","Data":"773c1ce5157117fef1569b796b0237e4415b2bbdeb58a01244e6086084cf57b7"} Dec 03 23:00:04.325146 master-0 kubenswrapper[36504]: I1203 23:00:04.325049 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413380-j25wp" Dec 03 23:00:04.325461 master-0 kubenswrapper[36504]: I1203 23:00:04.325064 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="773c1ce5157117fef1569b796b0237e4415b2bbdeb58a01244e6086084cf57b7" Dec 03 23:00:05.005613 master-0 kubenswrapper[36504]: I1203 23:00:05.005521 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l"] Dec 03 23:00:05.021895 master-0 kubenswrapper[36504]: I1203 23:00:05.021810 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413335-9cf9l"] Dec 03 23:00:05.126721 master-0 kubenswrapper[36504]: I1203 23:00:05.126592 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa99791-2fca-46c5-98f7-206de72957bb" path="/var/lib/kubelet/pods/0aa99791-2fca-46c5-98f7-206de72957bb/volumes" Dec 03 23:00:35.734071 master-0 kubenswrapper[36504]: E1203 23:00:35.734001 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:00:38.096353 master-0 kubenswrapper[36504]: I1203 23:00:38.096244 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:00:39.097071 master-0 kubenswrapper[36504]: I1203 23:00:39.096868 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:00:42.362100 master-0 kubenswrapper[36504]: I1203 23:00:42.361932 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6768g"] Dec 03 23:00:42.363036 master-0 kubenswrapper[36504]: E1203 23:00:42.362856 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19c4ac6-170d-45a1-9fa8-f8c4962c0760" containerName="collect-profiles" Dec 03 23:00:42.363036 master-0 kubenswrapper[36504]: I1203 23:00:42.362879 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19c4ac6-170d-45a1-9fa8-f8c4962c0760" containerName="collect-profiles" Dec 03 23:00:42.363967 master-0 kubenswrapper[36504]: I1203 23:00:42.363286 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="b19c4ac6-170d-45a1-9fa8-f8c4962c0760" containerName="collect-profiles" Dec 03 23:00:42.369900 master-0 kubenswrapper[36504]: I1203 23:00:42.369830 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.389111 master-0 kubenswrapper[36504]: I1203 23:00:42.389038 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6768g"] Dec 03 23:00:42.437296 master-0 kubenswrapper[36504]: I1203 23:00:42.437189 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-catalog-content\") pod \"community-operators-6768g\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.437990 master-0 kubenswrapper[36504]: I1203 23:00:42.437625 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-utilities\") pod \"community-operators-6768g\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.437990 master-0 kubenswrapper[36504]: I1203 23:00:42.437907 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82q27\" (UniqueName: \"kubernetes.io/projected/ed044c89-4f92-471f-8b61-24098c440947-kube-api-access-82q27\") pod \"community-operators-6768g\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.541036 master-0 kubenswrapper[36504]: I1203 23:00:42.540850 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-catalog-content\") pod \"community-operators-6768g\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.541389 master-0 kubenswrapper[36504]: I1203 23:00:42.541124 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-utilities\") pod \"community-operators-6768g\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.541389 master-0 kubenswrapper[36504]: I1203 23:00:42.541212 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-82q27\" (UniqueName: \"kubernetes.io/projected/ed044c89-4f92-471f-8b61-24098c440947-kube-api-access-82q27\") pod \"community-operators-6768g\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.541715 master-0 kubenswrapper[36504]: I1203 23:00:42.541675 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-catalog-content\") pod \"community-operators-6768g\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.545783 master-0 kubenswrapper[36504]: I1203 23:00:42.543895 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-utilities\") pod \"community-operators-6768g\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.567483 master-0 kubenswrapper[36504]: I1203 23:00:42.567377 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82q27\" (UniqueName: \"kubernetes.io/projected/ed044c89-4f92-471f-8b61-24098c440947-kube-api-access-82q27\") pod \"community-operators-6768g\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:42.733164 master-0 kubenswrapper[36504]: I1203 23:00:42.732382 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:43.426306 master-0 kubenswrapper[36504]: I1203 23:00:43.426216 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6768g"] Dec 03 23:00:43.427007 master-0 kubenswrapper[36504]: W1203 23:00:43.426602 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded044c89_4f92_471f_8b61_24098c440947.slice/crio-a719b226e89f5ad8fd774e362fbf392b375639d27a80b3d415fa08f57f762cd8 WatchSource:0}: Error finding container a719b226e89f5ad8fd774e362fbf392b375639d27a80b3d415fa08f57f762cd8: Status 404 returned error can't find the container with id a719b226e89f5ad8fd774e362fbf392b375639d27a80b3d415fa08f57f762cd8 Dec 03 23:00:44.104060 master-0 kubenswrapper[36504]: I1203 23:00:44.103783 36504 generic.go:334] "Generic (PLEG): container finished" podID="ed044c89-4f92-471f-8b61-24098c440947" containerID="1fe93b90bf7012ba65440d9837520cd1d0c151576c07518e9531f041015f8d33" exitCode=0 Dec 03 23:00:44.104060 master-0 kubenswrapper[36504]: I1203 23:00:44.103852 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6768g" event={"ID":"ed044c89-4f92-471f-8b61-24098c440947","Type":"ContainerDied","Data":"1fe93b90bf7012ba65440d9837520cd1d0c151576c07518e9531f041015f8d33"} Dec 03 23:00:44.104060 master-0 kubenswrapper[36504]: I1203 23:00:44.103887 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6768g" event={"ID":"ed044c89-4f92-471f-8b61-24098c440947","Type":"ContainerStarted","Data":"a719b226e89f5ad8fd774e362fbf392b375639d27a80b3d415fa08f57f762cd8"} Dec 03 23:00:44.107264 master-0 kubenswrapper[36504]: I1203 23:00:44.106557 36504 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 03 23:00:45.136973 master-0 kubenswrapper[36504]: I1203 23:00:45.136894 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6768g" event={"ID":"ed044c89-4f92-471f-8b61-24098c440947","Type":"ContainerStarted","Data":"becba76ae059c6a7b9371e37d21251f6cda88b40338d7728ecdd773f3c0c3566"} Dec 03 23:00:46.153818 master-0 kubenswrapper[36504]: I1203 23:00:46.153727 36504 generic.go:334] "Generic (PLEG): container finished" podID="ed044c89-4f92-471f-8b61-24098c440947" containerID="becba76ae059c6a7b9371e37d21251f6cda88b40338d7728ecdd773f3c0c3566" exitCode=0 Dec 03 23:00:46.153818 master-0 kubenswrapper[36504]: I1203 23:00:46.153809 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6768g" event={"ID":"ed044c89-4f92-471f-8b61-24098c440947","Type":"ContainerDied","Data":"becba76ae059c6a7b9371e37d21251f6cda88b40338d7728ecdd773f3c0c3566"} Dec 03 23:00:47.072991 master-0 kubenswrapper[36504]: I1203 23:00:47.072818 36504 scope.go:117] "RemoveContainer" containerID="ef6203f31725415f0d2b8868e71e83f9ab91a310fd0c29b08688106ab0bc8c60" Dec 03 23:00:47.173802 master-0 kubenswrapper[36504]: I1203 23:00:47.173012 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6768g" event={"ID":"ed044c89-4f92-471f-8b61-24098c440947","Type":"ContainerStarted","Data":"ea7cf51310e28244d99388fd25a95c0e36d8aee4b88bfabbf5846630b05157fa"} Dec 03 23:00:47.214248 master-0 kubenswrapper[36504]: I1203 23:00:47.214025 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6768g" podStartSLOduration=2.64323344 podStartE2EDuration="5.213990462s" podCreationTimestamp="2025-12-03 23:00:42 +0000 UTC" firstStartedPulling="2025-12-03 23:00:44.106440867 +0000 UTC m=+3009.326212874" lastFinishedPulling="2025-12-03 23:00:46.677197889 +0000 UTC m=+3011.896969896" observedRunningTime="2025-12-03 23:00:47.208337406 +0000 UTC m=+3012.428109413" watchObservedRunningTime="2025-12-03 23:00:47.213990462 +0000 UTC m=+3012.433762469" Dec 03 23:00:52.733401 master-0 kubenswrapper[36504]: I1203 23:00:52.733297 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:52.734982 master-0 kubenswrapper[36504]: I1203 23:00:52.734926 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:52.787403 master-0 kubenswrapper[36504]: I1203 23:00:52.787338 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:53.310892 master-0 kubenswrapper[36504]: I1203 23:00:53.310826 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:53.398674 master-0 kubenswrapper[36504]: I1203 23:00:53.398572 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6768g"] Dec 03 23:00:55.286589 master-0 kubenswrapper[36504]: I1203 23:00:55.286459 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6768g" podUID="ed044c89-4f92-471f-8b61-24098c440947" containerName="registry-server" containerID="cri-o://ea7cf51310e28244d99388fd25a95c0e36d8aee4b88bfabbf5846630b05157fa" 
gracePeriod=2 Dec 03 23:00:56.307245 master-0 kubenswrapper[36504]: I1203 23:00:56.307064 36504 generic.go:334] "Generic (PLEG): container finished" podID="ed044c89-4f92-471f-8b61-24098c440947" containerID="ea7cf51310e28244d99388fd25a95c0e36d8aee4b88bfabbf5846630b05157fa" exitCode=0 Dec 03 23:00:56.307245 master-0 kubenswrapper[36504]: I1203 23:00:56.307147 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6768g" event={"ID":"ed044c89-4f92-471f-8b61-24098c440947","Type":"ContainerDied","Data":"ea7cf51310e28244d99388fd25a95c0e36d8aee4b88bfabbf5846630b05157fa"} Dec 03 23:00:56.469681 master-0 kubenswrapper[36504]: I1203 23:00:56.469482 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:56.580664 master-0 kubenswrapper[36504]: I1203 23:00:56.579534 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82q27\" (UniqueName: \"kubernetes.io/projected/ed044c89-4f92-471f-8b61-24098c440947-kube-api-access-82q27\") pod \"ed044c89-4f92-471f-8b61-24098c440947\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " Dec 03 23:00:56.580664 master-0 kubenswrapper[36504]: I1203 23:00:56.579909 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-catalog-content\") pod \"ed044c89-4f92-471f-8b61-24098c440947\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " Dec 03 23:00:56.580664 master-0 kubenswrapper[36504]: I1203 23:00:56.580039 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-utilities\") pod \"ed044c89-4f92-471f-8b61-24098c440947\" (UID: \"ed044c89-4f92-471f-8b61-24098c440947\") " Dec 03 23:00:56.582015 master-0 kubenswrapper[36504]: I1203 23:00:56.581945 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-utilities" (OuterVolumeSpecName: "utilities") pod "ed044c89-4f92-471f-8b61-24098c440947" (UID: "ed044c89-4f92-471f-8b61-24098c440947"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:00:56.586877 master-0 kubenswrapper[36504]: I1203 23:00:56.586829 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:00:56.590069 master-0 kubenswrapper[36504]: I1203 23:00:56.590008 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed044c89-4f92-471f-8b61-24098c440947-kube-api-access-82q27" (OuterVolumeSpecName: "kube-api-access-82q27") pod "ed044c89-4f92-471f-8b61-24098c440947" (UID: "ed044c89-4f92-471f-8b61-24098c440947"). InnerVolumeSpecName "kube-api-access-82q27". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:00:56.662854 master-0 kubenswrapper[36504]: I1203 23:00:56.662715 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed044c89-4f92-471f-8b61-24098c440947" (UID: "ed044c89-4f92-471f-8b61-24098c440947"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:00:56.690616 master-0 kubenswrapper[36504]: I1203 23:00:56.690543 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82q27\" (UniqueName: \"kubernetes.io/projected/ed044c89-4f92-471f-8b61-24098c440947-kube-api-access-82q27\") on node \"master-0\" DevicePath \"\"" Dec 03 23:00:56.690616 master-0 kubenswrapper[36504]: I1203 23:00:56.690602 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed044c89-4f92-471f-8b61-24098c440947-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:00:57.326409 master-0 kubenswrapper[36504]: I1203 23:00:57.326346 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6768g" event={"ID":"ed044c89-4f92-471f-8b61-24098c440947","Type":"ContainerDied","Data":"a719b226e89f5ad8fd774e362fbf392b375639d27a80b3d415fa08f57f762cd8"} Dec 03 23:00:57.327064 master-0 kubenswrapper[36504]: I1203 23:00:57.326432 36504 scope.go:117] "RemoveContainer" containerID="ea7cf51310e28244d99388fd25a95c0e36d8aee4b88bfabbf5846630b05157fa" Dec 03 23:00:57.327064 master-0 kubenswrapper[36504]: I1203 23:00:57.326436 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6768g" Dec 03 23:00:57.360683 master-0 kubenswrapper[36504]: I1203 23:00:57.360130 36504 scope.go:117] "RemoveContainer" containerID="becba76ae059c6a7b9371e37d21251f6cda88b40338d7728ecdd773f3c0c3566" Dec 03 23:00:57.383603 master-0 kubenswrapper[36504]: I1203 23:00:57.381422 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6768g"] Dec 03 23:00:57.392615 master-0 kubenswrapper[36504]: I1203 23:00:57.391655 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6768g"] Dec 03 23:00:57.413371 master-0 kubenswrapper[36504]: I1203 23:00:57.413300 36504 scope.go:117] "RemoveContainer" containerID="1fe93b90bf7012ba65440d9837520cd1d0c151576c07518e9531f041015f8d33" Dec 03 23:00:59.114024 master-0 kubenswrapper[36504]: I1203 23:00:59.113917 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed044c89-4f92-471f-8b61-24098c440947" path="/var/lib/kubelet/pods/ed044c89-4f92-471f-8b61-24098c440947/volumes" Dec 03 23:01:00.215083 master-0 kubenswrapper[36504]: I1203 23:01:00.214991 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29413381-ph5vl"] Dec 03 23:01:00.216228 master-0 kubenswrapper[36504]: E1203 23:01:00.216190 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed044c89-4f92-471f-8b61-24098c440947" containerName="extract-content" Dec 03 23:01:00.216228 master-0 kubenswrapper[36504]: I1203 23:01:00.216225 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed044c89-4f92-471f-8b61-24098c440947" containerName="extract-content" Dec 03 23:01:00.216326 master-0 kubenswrapper[36504]: E1203 23:01:00.216254 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed044c89-4f92-471f-8b61-24098c440947" containerName="registry-server" Dec 03 23:01:00.216326 master-0 kubenswrapper[36504]: I1203 23:01:00.216266 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed044c89-4f92-471f-8b61-24098c440947" containerName="registry-server" Dec 03 23:01:00.216326 master-0 kubenswrapper[36504]: E1203 23:01:00.216309 36504 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ed044c89-4f92-471f-8b61-24098c440947" containerName="extract-utilities" Dec 03 23:01:00.216326 master-0 kubenswrapper[36504]: I1203 23:01:00.216321 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed044c89-4f92-471f-8b61-24098c440947" containerName="extract-utilities" Dec 03 23:01:00.216876 master-0 kubenswrapper[36504]: I1203 23:01:00.216817 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed044c89-4f92-471f-8b61-24098c440947" containerName="registry-server" Dec 03 23:01:00.221181 master-0 kubenswrapper[36504]: I1203 23:01:00.221130 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.269144 master-0 kubenswrapper[36504]: I1203 23:01:00.269095 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29413381-ph5vl"] Dec 03 23:01:00.349820 master-0 kubenswrapper[36504]: I1203 23:01:00.349731 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-fernet-keys\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.350466 master-0 kubenswrapper[36504]: I1203 23:01:00.350368 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-config-data\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.350466 master-0 kubenswrapper[36504]: I1203 23:01:00.350454 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/56c8f011-6793-4317-9916-d23a1341be8b-kube-api-access-g6np4\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.350687 master-0 kubenswrapper[36504]: I1203 23:01:00.350502 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-combined-ca-bundle\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.454897 master-0 kubenswrapper[36504]: I1203 23:01:00.454806 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-config-data\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.454897 master-0 kubenswrapper[36504]: I1203 23:01:00.454888 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/56c8f011-6793-4317-9916-d23a1341be8b-kube-api-access-g6np4\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.455327 master-0 kubenswrapper[36504]: I1203 23:01:00.454925 36504 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-combined-ca-bundle\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.455327 master-0 kubenswrapper[36504]: I1203 23:01:00.455185 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-fernet-keys\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.460541 master-0 kubenswrapper[36504]: I1203 23:01:00.460400 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-combined-ca-bundle\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.465450 master-0 kubenswrapper[36504]: I1203 23:01:00.465375 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-fernet-keys\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.472721 master-0 kubenswrapper[36504]: I1203 23:01:00.472389 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-config-data\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.475686 master-0 kubenswrapper[36504]: I1203 23:01:00.475537 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/56c8f011-6793-4317-9916-d23a1341be8b-kube-api-access-g6np4\") pod \"keystone-cron-29413381-ph5vl\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:00.571638 master-0 kubenswrapper[36504]: I1203 23:01:00.571488 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:01.089354 master-0 kubenswrapper[36504]: I1203 23:01:01.089249 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29413381-ph5vl"] Dec 03 23:01:02.419938 master-0 kubenswrapper[36504]: I1203 23:01:02.419832 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413381-ph5vl" event={"ID":"56c8f011-6793-4317-9916-d23a1341be8b","Type":"ContainerStarted","Data":"365cbc13a93f769e007b8a31432736b5a006be7400266585542e734ae544a4e1"} Dec 03 23:01:02.419938 master-0 kubenswrapper[36504]: I1203 23:01:02.419927 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413381-ph5vl" event={"ID":"56c8f011-6793-4317-9916-d23a1341be8b","Type":"ContainerStarted","Data":"c373689a3bae3e79fcfeb3eddbcef4b6056d08a81943b87e916bd327cb52c59b"} Dec 03 23:01:02.447399 master-0 kubenswrapper[36504]: I1203 23:01:02.447272 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29413381-ph5vl" podStartSLOduration=2.447240211 podStartE2EDuration="2.447240211s" podCreationTimestamp="2025-12-03 23:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:01:02.444095023 +0000 UTC m=+3027.663867020" watchObservedRunningTime="2025-12-03 23:01:02.447240211 +0000 UTC m=+3027.667012218" Dec 03 23:01:05.484222 master-0 kubenswrapper[36504]: I1203 23:01:05.483700 36504 generic.go:334] "Generic (PLEG): container finished" podID="56c8f011-6793-4317-9916-d23a1341be8b" containerID="365cbc13a93f769e007b8a31432736b5a006be7400266585542e734ae544a4e1" exitCode=0 Dec 03 23:01:05.484222 master-0 kubenswrapper[36504]: I1203 23:01:05.483835 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413381-ph5vl" event={"ID":"56c8f011-6793-4317-9916-d23a1341be8b","Type":"ContainerDied","Data":"365cbc13a93f769e007b8a31432736b5a006be7400266585542e734ae544a4e1"} Dec 03 23:01:07.124650 master-0 kubenswrapper[36504]: I1203 23:01:07.124587 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:07.179483 master-0 kubenswrapper[36504]: I1203 23:01:07.179340 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-config-data\") pod \"56c8f011-6793-4317-9916-d23a1341be8b\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " Dec 03 23:01:07.179904 master-0 kubenswrapper[36504]: I1203 23:01:07.179524 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/56c8f011-6793-4317-9916-d23a1341be8b-kube-api-access-g6np4\") pod \"56c8f011-6793-4317-9916-d23a1341be8b\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " Dec 03 23:01:07.179904 master-0 kubenswrapper[36504]: I1203 23:01:07.179700 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-fernet-keys\") pod \"56c8f011-6793-4317-9916-d23a1341be8b\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " Dec 03 23:01:07.179904 master-0 kubenswrapper[36504]: I1203 23:01:07.179813 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-combined-ca-bundle\") pod \"56c8f011-6793-4317-9916-d23a1341be8b\" (UID: \"56c8f011-6793-4317-9916-d23a1341be8b\") " Dec 03 23:01:07.198105 master-0 kubenswrapper[36504]: I1203 23:01:07.197096 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c8f011-6793-4317-9916-d23a1341be8b-kube-api-access-g6np4" (OuterVolumeSpecName: "kube-api-access-g6np4") pod "56c8f011-6793-4317-9916-d23a1341be8b" (UID: "56c8f011-6793-4317-9916-d23a1341be8b"). InnerVolumeSpecName "kube-api-access-g6np4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:01:07.209923 master-0 kubenswrapper[36504]: I1203 23:01:07.209198 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "56c8f011-6793-4317-9916-d23a1341be8b" (UID: "56c8f011-6793-4317-9916-d23a1341be8b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:01:07.233033 master-0 kubenswrapper[36504]: I1203 23:01:07.232858 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56c8f011-6793-4317-9916-d23a1341be8b" (UID: "56c8f011-6793-4317-9916-d23a1341be8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:01:07.252064 master-0 kubenswrapper[36504]: I1203 23:01:07.250315 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-config-data" (OuterVolumeSpecName: "config-data") pod "56c8f011-6793-4317-9916-d23a1341be8b" (UID: "56c8f011-6793-4317-9916-d23a1341be8b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:01:07.297297 master-0 kubenswrapper[36504]: I1203 23:01:07.297226 36504 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Dec 03 23:01:07.297297 master-0 kubenswrapper[36504]: I1203 23:01:07.297278 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 23:01:07.297297 master-0 kubenswrapper[36504]: I1203 23:01:07.297290 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/56c8f011-6793-4317-9916-d23a1341be8b-kube-api-access-g6np4\") on node \"master-0\" DevicePath \"\"" Dec 03 23:01:07.297297 master-0 kubenswrapper[36504]: I1203 23:01:07.297302 36504 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/56c8f011-6793-4317-9916-d23a1341be8b-fernet-keys\") on node \"master-0\" DevicePath \"\"" Dec 03 23:01:07.525436 master-0 kubenswrapper[36504]: I1203 23:01:07.525267 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413381-ph5vl" event={"ID":"56c8f011-6793-4317-9916-d23a1341be8b","Type":"ContainerDied","Data":"c373689a3bae3e79fcfeb3eddbcef4b6056d08a81943b87e916bd327cb52c59b"} Dec 03 23:01:07.525436 master-0 kubenswrapper[36504]: I1203 23:01:07.525325 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c373689a3bae3e79fcfeb3eddbcef4b6056d08a81943b87e916bd327cb52c59b" Dec 03 23:01:07.525436 master-0 kubenswrapper[36504]: I1203 23:01:07.525333 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413381-ph5vl" Dec 03 23:01:35.737566 master-0 kubenswrapper[36504]: E1203 23:01:35.737496 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:01:45.107984 master-0 kubenswrapper[36504]: I1203 23:01:45.107918 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:01:50.096337 master-0 kubenswrapper[36504]: I1203 23:01:50.096101 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:02:35.738653 master-0 kubenswrapper[36504]: E1203 23:02:35.738564 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:02:51.096534 master-0 kubenswrapper[36504]: I1203 23:02:51.096466 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:02:51.098563 master-0 kubenswrapper[36504]: I1203 23:02:51.098510 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:03:35.723678 master-0 kubenswrapper[36504]: E1203 23:03:35.723600 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:03:45.218366 master-0 kubenswrapper[36504]: I1203 23:03:45.218275 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sphtk"] Dec 03 23:03:45.219304 master-0 kubenswrapper[36504]: E1203 23:03:45.219230 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c8f011-6793-4317-9916-d23a1341be8b" containerName="keystone-cron" Dec 03 23:03:45.219304 master-0 kubenswrapper[36504]: I1203 23:03:45.219291 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c8f011-6793-4317-9916-d23a1341be8b" containerName="keystone-cron" Dec 03 23:03:45.219962 master-0 kubenswrapper[36504]: I1203 23:03:45.219926 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c8f011-6793-4317-9916-d23a1341be8b" containerName="keystone-cron" Dec 03 23:03:45.222755 master-0 kubenswrapper[36504]: I1203 23:03:45.222701 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.263703 master-0 kubenswrapper[36504]: I1203 23:03:45.263645 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sphtk"] Dec 03 23:03:45.315751 master-0 kubenswrapper[36504]: I1203 23:03:45.315607 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmxc\" (UniqueName: \"kubernetes.io/projected/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-kube-api-access-cjmxc\") pod \"redhat-operators-sphtk\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.316071 master-0 kubenswrapper[36504]: I1203 23:03:45.315886 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-catalog-content\") pod \"redhat-operators-sphtk\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.316112 master-0 kubenswrapper[36504]: I1203 23:03:45.316095 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-utilities\") pod \"redhat-operators-sphtk\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.419560 master-0 kubenswrapper[36504]: I1203 23:03:45.419469 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmxc\" (UniqueName: \"kubernetes.io/projected/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-kube-api-access-cjmxc\") pod \"redhat-operators-sphtk\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.419898 master-0 kubenswrapper[36504]: I1203 23:03:45.419682 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-catalog-content\") pod \"redhat-operators-sphtk\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.419898 master-0 kubenswrapper[36504]: I1203 23:03:45.419840 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-utilities\") pod \"redhat-operators-sphtk\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.420561 master-0 kubenswrapper[36504]: I1203 23:03:45.420515 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-catalog-content\") pod \"redhat-operators-sphtk\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.420640 master-0 kubenswrapper[36504]: I1203 23:03:45.420515 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-utilities\") pod \"redhat-operators-sphtk\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.438513 
master-0 kubenswrapper[36504]: I1203 23:03:45.438455 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmxc\" (UniqueName: \"kubernetes.io/projected/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-kube-api-access-cjmxc\") pod \"redhat-operators-sphtk\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:45.566903 master-0 kubenswrapper[36504]: I1203 23:03:45.566696 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:46.127660 master-0 kubenswrapper[36504]: I1203 23:03:46.127582 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sphtk"] Dec 03 23:03:46.985953 master-0 kubenswrapper[36504]: I1203 23:03:46.985868 36504 generic.go:334] "Generic (PLEG): container finished" podID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerID="948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9" exitCode=0 Dec 03 23:03:46.986709 master-0 kubenswrapper[36504]: I1203 23:03:46.985963 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sphtk" event={"ID":"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8","Type":"ContainerDied","Data":"948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9"} Dec 03 23:03:46.986709 master-0 kubenswrapper[36504]: I1203 23:03:46.986019 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sphtk" event={"ID":"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8","Type":"ContainerStarted","Data":"ab4169c1f6fd1ae1a1b13b10ecd7af0a350fc278c9a2c64ee6ad91af07945077"} Dec 03 23:03:49.024188 master-0 kubenswrapper[36504]: I1203 23:03:49.024113 36504 generic.go:334] "Generic (PLEG): container finished" podID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerID="626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3" exitCode=0 Dec 03 23:03:49.024995 master-0 kubenswrapper[36504]: I1203 23:03:49.024217 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sphtk" event={"ID":"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8","Type":"ContainerDied","Data":"626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3"} Dec 03 23:03:50.046677 master-0 kubenswrapper[36504]: I1203 23:03:50.046605 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sphtk" event={"ID":"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8","Type":"ContainerStarted","Data":"a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785"} Dec 03 23:03:50.093571 master-0 kubenswrapper[36504]: I1203 23:03:50.093417 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sphtk" podStartSLOduration=2.663050849 podStartE2EDuration="5.093379381s" podCreationTimestamp="2025-12-03 23:03:45 +0000 UTC" firstStartedPulling="2025-12-03 23:03:46.989617815 +0000 UTC m=+3192.209389812" lastFinishedPulling="2025-12-03 23:03:49.419946337 +0000 UTC m=+3194.639718344" observedRunningTime="2025-12-03 23:03:50.084403633 +0000 UTC m=+3195.304175640" watchObservedRunningTime="2025-12-03 23:03:50.093379381 +0000 UTC m=+3195.313151388" Dec 03 23:03:53.096899 master-0 kubenswrapper[36504]: I1203 23:03:53.096809 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:03:55.568691 master-0 kubenswrapper[36504]: I1203 23:03:55.568525 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:55.568691 master-0 kubenswrapper[36504]: I1203 23:03:55.568634 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:55.628343 master-0 kubenswrapper[36504]: I1203 23:03:55.628272 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:56.193217 master-0 kubenswrapper[36504]: I1203 23:03:56.193118 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:56.268171 master-0 kubenswrapper[36504]: I1203 23:03:56.268086 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sphtk"] Dec 03 23:03:58.158276 master-0 kubenswrapper[36504]: I1203 23:03:58.158160 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sphtk" podUID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerName="registry-server" containerID="cri-o://a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785" gracePeriod=2 Dec 03 23:03:58.430601 master-0 kubenswrapper[36504]: E1203 23:03:58.430427 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda32b60a1_0af3_4dba_85b8_94da5f7cd2c8.slice/crio-a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda32b60a1_0af3_4dba_85b8_94da5f7cd2c8.slice/crio-conmon-a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785.scope\": RecentStats: unable to find data in memory cache]" Dec 03 23:03:58.936669 master-0 kubenswrapper[36504]: I1203 23:03:58.936452 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:59.030669 master-0 kubenswrapper[36504]: I1203 23:03:59.030596 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjmxc\" (UniqueName: \"kubernetes.io/projected/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-kube-api-access-cjmxc\") pod \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " Dec 03 23:03:59.031107 master-0 kubenswrapper[36504]: I1203 23:03:59.030932 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-utilities\") pod \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " Dec 03 23:03:59.031107 master-0 kubenswrapper[36504]: I1203 23:03:59.031043 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-catalog-content\") pod \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\" (UID: \"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8\") " Dec 03 23:03:59.032083 master-0 kubenswrapper[36504]: I1203 23:03:59.031936 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-utilities" (OuterVolumeSpecName: "utilities") pod "a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" (UID: "a32b60a1-0af3-4dba-85b8-94da5f7cd2c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:03:59.032563 master-0 kubenswrapper[36504]: I1203 23:03:59.032531 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:03:59.035715 master-0 kubenswrapper[36504]: I1203 23:03:59.035641 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-kube-api-access-cjmxc" (OuterVolumeSpecName: "kube-api-access-cjmxc") pod "a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" (UID: "a32b60a1-0af3-4dba-85b8-94da5f7cd2c8"). InnerVolumeSpecName "kube-api-access-cjmxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:03:59.135968 master-0 kubenswrapper[36504]: I1203 23:03:59.135853 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjmxc\" (UniqueName: \"kubernetes.io/projected/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-kube-api-access-cjmxc\") on node \"master-0\" DevicePath \"\"" Dec 03 23:03:59.173643 master-0 kubenswrapper[36504]: I1203 23:03:59.173576 36504 generic.go:334] "Generic (PLEG): container finished" podID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerID="a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785" exitCode=0 Dec 03 23:03:59.173643 master-0 kubenswrapper[36504]: I1203 23:03:59.173639 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sphtk" event={"ID":"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8","Type":"ContainerDied","Data":"a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785"} Dec 03 23:03:59.174495 master-0 kubenswrapper[36504]: I1203 23:03:59.173681 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sphtk" event={"ID":"a32b60a1-0af3-4dba-85b8-94da5f7cd2c8","Type":"ContainerDied","Data":"ab4169c1f6fd1ae1a1b13b10ecd7af0a350fc278c9a2c64ee6ad91af07945077"} Dec 03 23:03:59.174495 master-0 kubenswrapper[36504]: I1203 23:03:59.173674 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sphtk" Dec 03 23:03:59.174495 master-0 kubenswrapper[36504]: I1203 23:03:59.173705 36504 scope.go:117] "RemoveContainer" containerID="a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785" Dec 03 23:03:59.202469 master-0 kubenswrapper[36504]: I1203 23:03:59.202413 36504 scope.go:117] "RemoveContainer" containerID="626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3" Dec 03 23:03:59.240178 master-0 kubenswrapper[36504]: I1203 23:03:59.240116 36504 scope.go:117] "RemoveContainer" containerID="948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9" Dec 03 23:03:59.303327 master-0 kubenswrapper[36504]: I1203 23:03:59.303256 36504 scope.go:117] "RemoveContainer" containerID="a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785" Dec 03 23:03:59.304059 master-0 kubenswrapper[36504]: E1203 23:03:59.304004 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785\": container with ID starting with a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785 not found: ID does not exist" containerID="a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785" Dec 03 23:03:59.304187 master-0 kubenswrapper[36504]: I1203 23:03:59.304092 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785"} err="failed to get container status \"a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785\": rpc error: code = NotFound desc = could not find container \"a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785\": container with ID starting with a96365ba3aaec8c4a3f2a613327af18f43cfdf711073cb4e3d2e63963f5cc785 not found: ID does not exist" Dec 03 23:03:59.304250 master-0 kubenswrapper[36504]: I1203 23:03:59.304196 36504 scope.go:117] "RemoveContainer" containerID="626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3" 
Dec 03 23:03:59.304861 master-0 kubenswrapper[36504]: E1203 23:03:59.304806 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3\": container with ID starting with 626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3 not found: ID does not exist" containerID="626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3" Dec 03 23:03:59.304921 master-0 kubenswrapper[36504]: I1203 23:03:59.304869 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3"} err="failed to get container status \"626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3\": rpc error: code = NotFound desc = could not find container \"626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3\": container with ID starting with 626c40b970cea4d4432aaf9c96ba4c4e3ef28d81cebc40b77faf04a47933e1a3 not found: ID does not exist" Dec 03 23:03:59.304921 master-0 kubenswrapper[36504]: I1203 23:03:59.304909 36504 scope.go:117] "RemoveContainer" containerID="948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9" Dec 03 23:03:59.305275 master-0 kubenswrapper[36504]: E1203 23:03:59.305231 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9\": container with ID starting with 948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9 not found: ID does not exist" containerID="948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9" Dec 03 23:03:59.305348 master-0 kubenswrapper[36504]: I1203 23:03:59.305275 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9"} err="failed to get container status \"948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9\": rpc error: code = NotFound desc = could not find container \"948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9\": container with ID starting with 948326738f2a1cf51a5cf9ab2d76f14062ed2b79f0d3e04836e83ee7d6b4d8c9 not found: ID does not exist" Dec 03 23:03:59.766089 master-0 kubenswrapper[36504]: I1203 23:03:59.765833 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" (UID: "a32b60a1-0af3-4dba-85b8-94da5f7cd2c8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:03:59.862759 master-0 kubenswrapper[36504]: I1203 23:03:59.862686 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:03:59.976159 master-0 kubenswrapper[36504]: I1203 23:03:59.975909 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sphtk"] Dec 03 23:04:00.015589 master-0 kubenswrapper[36504]: I1203 23:04:00.015529 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sphtk"] Dec 03 23:04:01.112222 master-0 kubenswrapper[36504]: I1203 23:04:01.112111 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" path="/var/lib/kubelet/pods/a32b60a1-0af3-4dba-85b8-94da5f7cd2c8/volumes" Dec 03 23:04:10.096374 master-0 kubenswrapper[36504]: I1203 23:04:10.096294 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:04:13.493338 master-0 kubenswrapper[36504]: I1203 23:04:13.493266 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lw8qc"] Dec 03 23:04:13.494192 master-0 kubenswrapper[36504]: E1203 23:04:13.494156 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerName="registry-server" Dec 03 23:04:13.494192 master-0 kubenswrapper[36504]: I1203 23:04:13.494181 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerName="registry-server" Dec 03 23:04:13.494312 master-0 kubenswrapper[36504]: E1203 23:04:13.494231 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerName="extract-content" Dec 03 23:04:13.494312 master-0 kubenswrapper[36504]: I1203 23:04:13.494239 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerName="extract-content" Dec 03 23:04:13.494312 master-0 kubenswrapper[36504]: E1203 23:04:13.494251 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerName="extract-utilities" Dec 03 23:04:13.494312 master-0 kubenswrapper[36504]: I1203 23:04:13.494258 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerName="extract-utilities" Dec 03 23:04:13.494704 master-0 kubenswrapper[36504]: I1203 23:04:13.494683 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="a32b60a1-0af3-4dba-85b8-94da5f7cd2c8" containerName="registry-server" Dec 03 23:04:13.497652 master-0 kubenswrapper[36504]: I1203 23:04:13.497601 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.513906 master-0 kubenswrapper[36504]: I1203 23:04:13.513742 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw8qc"] Dec 03 23:04:13.600955 master-0 kubenswrapper[36504]: I1203 23:04:13.599683 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-utilities\") pod \"redhat-marketplace-lw8qc\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.600955 master-0 kubenswrapper[36504]: I1203 23:04:13.600079 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rrfc\" (UniqueName: \"kubernetes.io/projected/b70e4f95-c957-490e-8381-9d30d58aef4d-kube-api-access-9rrfc\") pod \"redhat-marketplace-lw8qc\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.600955 master-0 kubenswrapper[36504]: I1203 23:04:13.600160 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-catalog-content\") pod \"redhat-marketplace-lw8qc\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.703870 master-0 kubenswrapper[36504]: I1203 23:04:13.703436 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rrfc\" (UniqueName: \"kubernetes.io/projected/b70e4f95-c957-490e-8381-9d30d58aef4d-kube-api-access-9rrfc\") pod \"redhat-marketplace-lw8qc\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.704567 master-0 kubenswrapper[36504]: I1203 23:04:13.704541 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-catalog-content\") pod \"redhat-marketplace-lw8qc\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.704940 master-0 kubenswrapper[36504]: I1203 23:04:13.704918 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-utilities\") pod \"redhat-marketplace-lw8qc\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.706141 master-0 kubenswrapper[36504]: I1203 23:04:13.706117 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-utilities\") pod \"redhat-marketplace-lw8qc\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.706647 master-0 kubenswrapper[36504]: I1203 23:04:13.706578 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-catalog-content\") pod \"redhat-marketplace-lw8qc\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " 
pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.732662 master-0 kubenswrapper[36504]: I1203 23:04:13.732583 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rrfc\" (UniqueName: \"kubernetes.io/projected/b70e4f95-c957-490e-8381-9d30d58aef4d-kube-api-access-9rrfc\") pod \"redhat-marketplace-lw8qc\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:13.874294 master-0 kubenswrapper[36504]: I1203 23:04:13.873523 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:14.431571 master-0 kubenswrapper[36504]: I1203 23:04:14.431500 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw8qc"] Dec 03 23:04:15.436246 master-0 kubenswrapper[36504]: I1203 23:04:15.436180 36504 generic.go:334] "Generic (PLEG): container finished" podID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerID="60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f" exitCode=0 Dec 03 23:04:15.436925 master-0 kubenswrapper[36504]: I1203 23:04:15.436258 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw8qc" event={"ID":"b70e4f95-c957-490e-8381-9d30d58aef4d","Type":"ContainerDied","Data":"60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f"} Dec 03 23:04:15.436925 master-0 kubenswrapper[36504]: I1203 23:04:15.436303 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw8qc" event={"ID":"b70e4f95-c957-490e-8381-9d30d58aef4d","Type":"ContainerStarted","Data":"3d94964edd40651ad8a80e329d78f5cc5d3cf472139358da8eb8469585056ff5"} Dec 03 23:04:17.483908 master-0 kubenswrapper[36504]: I1203 23:04:17.483840 36504 generic.go:334] "Generic (PLEG): container finished" podID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerID="5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf" exitCode=0 Dec 03 23:04:17.484856 master-0 kubenswrapper[36504]: I1203 23:04:17.484793 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw8qc" event={"ID":"b70e4f95-c957-490e-8381-9d30d58aef4d","Type":"ContainerDied","Data":"5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf"} Dec 03 23:04:18.511593 master-0 kubenswrapper[36504]: I1203 23:04:18.510285 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw8qc" event={"ID":"b70e4f95-c957-490e-8381-9d30d58aef4d","Type":"ContainerStarted","Data":"134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5"} Dec 03 23:04:18.558584 master-0 kubenswrapper[36504]: I1203 23:04:18.558467 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lw8qc" podStartSLOduration=3.105172406 podStartE2EDuration="5.558436322s" podCreationTimestamp="2025-12-03 23:04:13 +0000 UTC" firstStartedPulling="2025-12-03 23:04:15.438980027 +0000 UTC m=+3220.658752034" lastFinishedPulling="2025-12-03 23:04:17.892243943 +0000 UTC m=+3223.112015950" observedRunningTime="2025-12-03 23:04:18.543525278 +0000 UTC m=+3223.763297305" watchObservedRunningTime="2025-12-03 23:04:18.558436322 +0000 UTC m=+3223.778208329" Dec 03 23:04:23.874390 master-0 kubenswrapper[36504]: I1203 23:04:23.874274 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:23.874390 master-0 kubenswrapper[36504]: I1203 23:04:23.874398 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:23.936734 master-0 kubenswrapper[36504]: I1203 23:04:23.936644 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:24.650973 master-0 kubenswrapper[36504]: I1203 23:04:24.650894 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:24.731936 master-0 kubenswrapper[36504]: I1203 23:04:24.731838 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw8qc"] Dec 03 23:04:26.623750 master-0 kubenswrapper[36504]: I1203 23:04:26.623620 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lw8qc" podUID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerName="registry-server" containerID="cri-o://134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5" gracePeriod=2 Dec 03 23:04:27.400994 master-0 kubenswrapper[36504]: I1203 23:04:27.400950 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:27.549869 master-0 kubenswrapper[36504]: I1203 23:04:27.549600 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rrfc\" (UniqueName: \"kubernetes.io/projected/b70e4f95-c957-490e-8381-9d30d58aef4d-kube-api-access-9rrfc\") pod \"b70e4f95-c957-490e-8381-9d30d58aef4d\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " Dec 03 23:04:27.550358 master-0 kubenswrapper[36504]: I1203 23:04:27.549885 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-utilities\") pod \"b70e4f95-c957-490e-8381-9d30d58aef4d\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " Dec 03 23:04:27.550358 master-0 kubenswrapper[36504]: I1203 23:04:27.550225 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-catalog-content\") pod \"b70e4f95-c957-490e-8381-9d30d58aef4d\" (UID: \"b70e4f95-c957-490e-8381-9d30d58aef4d\") " Dec 03 23:04:27.551581 master-0 kubenswrapper[36504]: I1203 23:04:27.551510 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-utilities" (OuterVolumeSpecName: "utilities") pod "b70e4f95-c957-490e-8381-9d30d58aef4d" (UID: "b70e4f95-c957-490e-8381-9d30d58aef4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:04:27.554271 master-0 kubenswrapper[36504]: I1203 23:04:27.554211 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b70e4f95-c957-490e-8381-9d30d58aef4d-kube-api-access-9rrfc" (OuterVolumeSpecName: "kube-api-access-9rrfc") pod "b70e4f95-c957-490e-8381-9d30d58aef4d" (UID: "b70e4f95-c957-490e-8381-9d30d58aef4d"). InnerVolumeSpecName "kube-api-access-9rrfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:04:27.576855 master-0 kubenswrapper[36504]: I1203 23:04:27.576757 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b70e4f95-c957-490e-8381-9d30d58aef4d" (UID: "b70e4f95-c957-490e-8381-9d30d58aef4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:04:27.641057 master-0 kubenswrapper[36504]: I1203 23:04:27.640996 36504 generic.go:334] "Generic (PLEG): container finished" podID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerID="134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5" exitCode=0 Dec 03 23:04:27.641057 master-0 kubenswrapper[36504]: I1203 23:04:27.641062 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw8qc" event={"ID":"b70e4f95-c957-490e-8381-9d30d58aef4d","Type":"ContainerDied","Data":"134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5"} Dec 03 23:04:27.641834 master-0 kubenswrapper[36504]: I1203 23:04:27.641101 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lw8qc" event={"ID":"b70e4f95-c957-490e-8381-9d30d58aef4d","Type":"ContainerDied","Data":"3d94964edd40651ad8a80e329d78f5cc5d3cf472139358da8eb8469585056ff5"} Dec 03 23:04:27.641834 master-0 kubenswrapper[36504]: I1203 23:04:27.641123 36504 scope.go:117] "RemoveContainer" containerID="134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5" Dec 03 23:04:27.641834 master-0 kubenswrapper[36504]: I1203 23:04:27.641283 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lw8qc" Dec 03 23:04:27.657639 master-0 kubenswrapper[36504]: I1203 23:04:27.657567 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:04:27.657639 master-0 kubenswrapper[36504]: I1203 23:04:27.657626 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rrfc\" (UniqueName: \"kubernetes.io/projected/b70e4f95-c957-490e-8381-9d30d58aef4d-kube-api-access-9rrfc\") on node \"master-0\" DevicePath \"\"" Dec 03 23:04:27.657856 master-0 kubenswrapper[36504]: I1203 23:04:27.657649 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b70e4f95-c957-490e-8381-9d30d58aef4d-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:04:27.693006 master-0 kubenswrapper[36504]: I1203 23:04:27.692911 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw8qc"] Dec 03 23:04:27.700006 master-0 kubenswrapper[36504]: I1203 23:04:27.699754 36504 scope.go:117] "RemoveContainer" containerID="5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf" Dec 03 23:04:27.709172 master-0 kubenswrapper[36504]: I1203 23:04:27.709082 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lw8qc"] Dec 03 23:04:27.763814 master-0 kubenswrapper[36504]: I1203 23:04:27.761394 36504 scope.go:117] "RemoveContainer" containerID="60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f" Dec 03 23:04:27.829980 master-0 kubenswrapper[36504]: I1203 23:04:27.828222 36504 scope.go:117] "RemoveContainer" containerID="134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5" Dec 03 23:04:27.829980 master-0 kubenswrapper[36504]: E1203 23:04:27.828829 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5\": container with ID starting with 134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5 not found: ID does not exist" containerID="134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5" Dec 03 23:04:27.829980 master-0 kubenswrapper[36504]: I1203 23:04:27.828884 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5"} err="failed to get container status \"134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5\": rpc error: code = NotFound desc = could not find container \"134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5\": container with ID starting with 134eafec4974355828e35857a4d9b485815bb618735b94fe508b2713d3ab89a5 not found: ID does not exist" Dec 03 23:04:27.829980 master-0 kubenswrapper[36504]: I1203 23:04:27.828914 36504 scope.go:117] "RemoveContainer" containerID="5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf" Dec 03 23:04:27.829980 master-0 kubenswrapper[36504]: E1203 23:04:27.829623 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf\": container with ID starting with 5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf not 
found: ID does not exist" containerID="5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf" Dec 03 23:04:27.829980 master-0 kubenswrapper[36504]: I1203 23:04:27.829717 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf"} err="failed to get container status \"5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf\": rpc error: code = NotFound desc = could not find container \"5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf\": container with ID starting with 5b804eaf66202f5378eb431e23c4807c3462eeaa038e0648ae125647d2d45bbf not found: ID does not exist" Dec 03 23:04:27.829980 master-0 kubenswrapper[36504]: I1203 23:04:27.829744 36504 scope.go:117] "RemoveContainer" containerID="60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f" Dec 03 23:04:27.830668 master-0 kubenswrapper[36504]: E1203 23:04:27.830607 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f\": container with ID starting with 60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f not found: ID does not exist" containerID="60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f" Dec 03 23:04:27.830719 master-0 kubenswrapper[36504]: I1203 23:04:27.830664 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f"} err="failed to get container status \"60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f\": rpc error: code = NotFound desc = could not find container \"60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f\": container with ID starting with 60b63ba9b0abd5a0b704af6aa31ff3427dd3e29e5dc261773c5362fc00aeb01f not found: ID does not exist" Dec 03 23:04:29.117642 master-0 kubenswrapper[36504]: I1203 23:04:29.117562 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b70e4f95-c957-490e-8381-9d30d58aef4d" path="/var/lib/kubelet/pods/b70e4f95-c957-490e-8381-9d30d58aef4d/volumes" Dec 03 23:04:35.769138 master-0 kubenswrapper[36504]: E1203 23:04:35.769057 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:05:08.096398 master-0 kubenswrapper[36504]: I1203 23:05:08.096302 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:05:28.097062 master-0 kubenswrapper[36504]: I1203 23:05:28.096985 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:05:35.748145 master-0 kubenswrapper[36504]: E1203 23:05:35.748071 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:06:32.097393 master-0 kubenswrapper[36504]: I1203 23:06:32.097310 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:06:35.541675 master-0 kubenswrapper[36504]: I1203 23:06:35.541610 36504 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a1ed8445-bac4-4e69-9462-c1e33f646315" containerName="galera" probeResult="failure" output="command timed out" Dec 03 23:06:35.542487 master-0 kubenswrapper[36504]: I1203 23:06:35.541610 36504 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="a1ed8445-bac4-4e69-9462-c1e33f646315" containerName="galera" probeResult="failure" output="command timed out" Dec 03 23:06:35.728631 master-0 kubenswrapper[36504]: E1203 23:06:35.728548 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:06:55.123198 master-0 kubenswrapper[36504]: I1203 23:06:55.123114 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:07:35.722747 master-0 kubenswrapper[36504]: E1203 23:07:35.722659 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:07:55.110885 master-0 kubenswrapper[36504]: I1203 23:07:55.107184 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:08:07.096358 master-0 kubenswrapper[36504]: I1203 23:08:07.096302 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:08:10.090798 master-0 kubenswrapper[36504]: I1203 23:08:10.090538 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k2gjk"] Dec 03 23:08:10.092993 master-0 kubenswrapper[36504]: E1203 23:08:10.091659 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerName="extract-content" Dec 03 23:08:10.092993 master-0 kubenswrapper[36504]: I1203 23:08:10.091698 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerName="extract-content" Dec 03 23:08:10.092993 master-0 kubenswrapper[36504]: E1203 23:08:10.091868 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerName="extract-utilities" Dec 03 23:08:10.092993 master-0 kubenswrapper[36504]: I1203 23:08:10.091882 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerName="extract-utilities" Dec 03 23:08:10.092993 master-0 kubenswrapper[36504]: E1203 23:08:10.091901 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerName="registry-server" Dec 03 23:08:10.092993 master-0 kubenswrapper[36504]: I1203 23:08:10.091909 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerName="registry-server" Dec 03 23:08:10.094836 master-0 kubenswrapper[36504]: I1203 23:08:10.094698 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70e4f95-c957-490e-8381-9d30d58aef4d" containerName="registry-server" Dec 03 23:08:10.102479 master-0 kubenswrapper[36504]: I1203 23:08:10.101840 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.109206 master-0 kubenswrapper[36504]: I1203 23:08:10.108463 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k2gjk"] Dec 03 23:08:10.223004 master-0 kubenswrapper[36504]: I1203 23:08:10.222929 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-utilities\") pod \"certified-operators-k2gjk\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.223332 master-0 kubenswrapper[36504]: I1203 23:08:10.223202 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtkqg\" (UniqueName: \"kubernetes.io/projected/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-kube-api-access-vtkqg\") pod \"certified-operators-k2gjk\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.223332 master-0 kubenswrapper[36504]: I1203 23:08:10.223295 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-catalog-content\") pod \"certified-operators-k2gjk\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.329160 master-0 kubenswrapper[36504]: I1203 23:08:10.329089 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-utilities\") pod \"certified-operators-k2gjk\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.329626 master-0 kubenswrapper[36504]: I1203 23:08:10.329601 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtkqg\" (UniqueName: \"kubernetes.io/projected/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-kube-api-access-vtkqg\") pod \"certified-operators-k2gjk\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.329750 master-0 kubenswrapper[36504]: I1203 23:08:10.329733 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-catalog-content\") pod \"certified-operators-k2gjk\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.333849 master-0 kubenswrapper[36504]: I1203 23:08:10.330603 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-utilities\") pod \"certified-operators-k2gjk\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.334300 master-0 kubenswrapper[36504]: I1203 23:08:10.334236 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-catalog-content\") pod \"certified-operators-k2gjk\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " 
pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.351423 master-0 kubenswrapper[36504]: I1203 23:08:10.351213 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtkqg\" (UniqueName: \"kubernetes.io/projected/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-kube-api-access-vtkqg\") pod \"certified-operators-k2gjk\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:10.431610 master-0 kubenswrapper[36504]: I1203 23:08:10.431469 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:11.116411 master-0 kubenswrapper[36504]: I1203 23:08:11.116351 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k2gjk"] Dec 03 23:08:11.994789 master-0 kubenswrapper[36504]: I1203 23:08:11.994698 36504 generic.go:334] "Generic (PLEG): container finished" podID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerID="766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6" exitCode=0 Dec 03 23:08:11.994789 master-0 kubenswrapper[36504]: I1203 23:08:11.994763 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2gjk" event={"ID":"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e","Type":"ContainerDied","Data":"766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6"} Dec 03 23:08:11.995130 master-0 kubenswrapper[36504]: I1203 23:08:11.994812 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2gjk" event={"ID":"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e","Type":"ContainerStarted","Data":"edf472e1a75dffe2d223248e6a98fcab98b7304d2a8214e84525ddb6c23a1e7e"} Dec 03 23:08:11.997556 master-0 kubenswrapper[36504]: I1203 23:08:11.997521 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:08:13.033451 master-0 kubenswrapper[36504]: I1203 23:08:13.033274 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2gjk" event={"ID":"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e","Type":"ContainerStarted","Data":"ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa"} Dec 03 23:08:14.056183 master-0 kubenswrapper[36504]: I1203 23:08:14.056115 36504 generic.go:334] "Generic (PLEG): container finished" podID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerID="ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa" exitCode=0 Dec 03 23:08:14.056946 master-0 kubenswrapper[36504]: I1203 23:08:14.056226 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2gjk" event={"ID":"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e","Type":"ContainerDied","Data":"ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa"} Dec 03 23:08:15.088884 master-0 kubenswrapper[36504]: I1203 23:08:15.087802 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2gjk" event={"ID":"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e","Type":"ContainerStarted","Data":"e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3"} Dec 03 23:08:15.130839 master-0 kubenswrapper[36504]: I1203 23:08:15.127922 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k2gjk" podStartSLOduration=2.491302351 podStartE2EDuration="5.127890235s" 
podCreationTimestamp="2025-12-03 23:08:10 +0000 UTC" firstStartedPulling="2025-12-03 23:08:11.997438045 +0000 UTC m=+3457.217210052" lastFinishedPulling="2025-12-03 23:08:14.634025929 +0000 UTC m=+3459.853797936" observedRunningTime="2025-12-03 23:08:15.125391527 +0000 UTC m=+3460.345163554" watchObservedRunningTime="2025-12-03 23:08:15.127890235 +0000 UTC m=+3460.347662252" Dec 03 23:08:20.431795 master-0 kubenswrapper[36504]: I1203 23:08:20.431670 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:20.432554 master-0 kubenswrapper[36504]: I1203 23:08:20.431831 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:20.482711 master-0 kubenswrapper[36504]: I1203 23:08:20.482638 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:21.244542 master-0 kubenswrapper[36504]: I1203 23:08:21.244470 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:21.340021 master-0 kubenswrapper[36504]: I1203 23:08:21.338956 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k2gjk"] Dec 03 23:08:23.209947 master-0 kubenswrapper[36504]: I1203 23:08:23.209833 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k2gjk" podUID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerName="registry-server" containerID="cri-o://e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3" gracePeriod=2 Dec 03 23:08:23.960834 master-0 kubenswrapper[36504]: I1203 23:08:23.960792 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:24.098797 master-0 kubenswrapper[36504]: I1203 23:08:24.098654 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-catalog-content\") pod \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " Dec 03 23:08:24.099412 master-0 kubenswrapper[36504]: I1203 23:08:24.099393 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtkqg\" (UniqueName: \"kubernetes.io/projected/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-kube-api-access-vtkqg\") pod \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " Dec 03 23:08:24.099753 master-0 kubenswrapper[36504]: I1203 23:08:24.099734 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-utilities\") pod \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\" (UID: \"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e\") " Dec 03 23:08:24.101710 master-0 kubenswrapper[36504]: I1203 23:08:24.101635 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-utilities" (OuterVolumeSpecName: "utilities") pod "a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" (UID: "a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:08:24.102387 master-0 kubenswrapper[36504]: I1203 23:08:24.102351 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:08:24.109114 master-0 kubenswrapper[36504]: I1203 23:08:24.109066 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-kube-api-access-vtkqg" (OuterVolumeSpecName: "kube-api-access-vtkqg") pod "a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" (UID: "a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e"). InnerVolumeSpecName "kube-api-access-vtkqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:08:24.151233 master-0 kubenswrapper[36504]: I1203 23:08:24.151107 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" (UID: "a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:08:24.206152 master-0 kubenswrapper[36504]: I1203 23:08:24.206073 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:08:24.206152 master-0 kubenswrapper[36504]: I1203 23:08:24.206131 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtkqg\" (UniqueName: \"kubernetes.io/projected/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e-kube-api-access-vtkqg\") on node \"master-0\" DevicePath \"\"" Dec 03 23:08:24.226472 master-0 kubenswrapper[36504]: I1203 23:08:24.226396 36504 generic.go:334] "Generic (PLEG): container finished" podID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerID="e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3" exitCode=0 Dec 03 23:08:24.226472 master-0 kubenswrapper[36504]: I1203 23:08:24.226464 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2gjk" event={"ID":"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e","Type":"ContainerDied","Data":"e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3"} Dec 03 23:08:24.227278 master-0 kubenswrapper[36504]: I1203 23:08:24.226491 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k2gjk" Dec 03 23:08:24.227278 master-0 kubenswrapper[36504]: I1203 23:08:24.226513 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k2gjk" event={"ID":"a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e","Type":"ContainerDied","Data":"edf472e1a75dffe2d223248e6a98fcab98b7304d2a8214e84525ddb6c23a1e7e"} Dec 03 23:08:24.227278 master-0 kubenswrapper[36504]: I1203 23:08:24.226538 36504 scope.go:117] "RemoveContainer" containerID="e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3" Dec 03 23:08:24.268223 master-0 kubenswrapper[36504]: I1203 23:08:24.268158 36504 scope.go:117] "RemoveContainer" containerID="ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa" Dec 03 23:08:24.288492 master-0 kubenswrapper[36504]: I1203 23:08:24.288407 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k2gjk"] Dec 03 23:08:24.310843 master-0 kubenswrapper[36504]: I1203 23:08:24.310758 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k2gjk"] Dec 03 23:08:24.313029 master-0 kubenswrapper[36504]: I1203 23:08:24.312986 36504 scope.go:117] "RemoveContainer" containerID="766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6" Dec 03 23:08:24.426844 master-0 kubenswrapper[36504]: I1203 23:08:24.383400 36504 scope.go:117] "RemoveContainer" containerID="e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3" Dec 03 23:08:24.426844 master-0 kubenswrapper[36504]: E1203 23:08:24.383884 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3\": container with ID starting with e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3 not found: ID does not exist" containerID="e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3" Dec 03 23:08:24.426844 master-0 kubenswrapper[36504]: I1203 23:08:24.383925 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3"} err="failed to get container status \"e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3\": rpc error: code = NotFound desc = could not find container \"e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3\": container with ID starting with e71991cecab6fac2e3361464c74283a1ce40d99fa2f2f36b4f187ce3fdebb7e3 not found: ID does not exist" Dec 03 23:08:24.426844 master-0 kubenswrapper[36504]: I1203 23:08:24.383958 36504 scope.go:117] "RemoveContainer" containerID="ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa" Dec 03 23:08:24.426844 master-0 kubenswrapper[36504]: E1203 23:08:24.384191 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa\": container with ID starting with ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa not found: ID does not exist" containerID="ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa" Dec 03 23:08:24.426844 master-0 kubenswrapper[36504]: I1203 23:08:24.384215 36504 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa"} err="failed to get container status \"ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa\": rpc error: code = NotFound desc = could not find container \"ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa\": container with ID starting with ce7c518e7c66ced934da9056e3ba69e665c60c5dc660b4a8a9867b8acae05afa not found: ID does not exist" Dec 03 23:08:24.426844 master-0 kubenswrapper[36504]: I1203 23:08:24.384229 36504 scope.go:117] "RemoveContainer" containerID="766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6" Dec 03 23:08:24.426844 master-0 kubenswrapper[36504]: E1203 23:08:24.384453 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6\": container with ID starting with 766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6 not found: ID does not exist" containerID="766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6" Dec 03 23:08:24.426844 master-0 kubenswrapper[36504]: I1203 23:08:24.384475 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6"} err="failed to get container status \"766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6\": rpc error: code = NotFound desc = could not find container \"766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6\": container with ID starting with 766467c0d1c8df015c36494c6c075871b3095f52ab279d95d248da7339308ff6 not found: ID does not exist" Dec 03 23:08:25.111755 master-0 kubenswrapper[36504]: I1203 23:08:25.110942 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" path="/var/lib/kubelet/pods/a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e/volumes" Dec 03 23:08:35.722675 master-0 kubenswrapper[36504]: E1203 23:08:35.722568 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:08:39.054017 master-0 kubenswrapper[36504]: I1203 23:08:39.053963 36504 trace.go:236] Trace[1190469420]: "Calculate volume metrics of glance for pod openstack/glance-baebb-default-external-api-0" (03-Dec-2025 23:08:37.801) (total time: 1251ms): Dec 03 23:08:39.054017 master-0 kubenswrapper[36504]: Trace[1190469420]: [1.251992855s] [1.251992855s] END Dec 03 23:09:06.099333 master-0 kubenswrapper[36504]: I1203 23:09:06.099249 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:09:09.096892 master-0 kubenswrapper[36504]: I1203 23:09:09.096831 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:09:35.721378 master-0 kubenswrapper[36504]: E1203 23:09:35.721300 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:09:58.527647 master-0 kubenswrapper[36504]: I1203 23:09:58.527336 36504 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-65dc9c979b-4jgcr" podUID="93b454f7-f13a-4fa7-9b55-06cc102dd59e" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 03 23:10:14.096152 master-0 kubenswrapper[36504]: I1203 23:10:14.096082 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:10:34.095289 master-0 kubenswrapper[36504]: I1203 23:10:34.095219 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:10:35.724488 master-0 kubenswrapper[36504]: E1203 23:10:35.724413 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:10:49.941109 master-0 kubenswrapper[36504]: I1203 23:10:49.940916 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t88zx"] Dec 03 23:10:49.941994 master-0 kubenswrapper[36504]: E1203 23:10:49.941946 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerName="extract-utilities" Dec 03 23:10:49.941994 master-0 kubenswrapper[36504]: I1203 23:10:49.941970 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerName="extract-utilities" Dec 03 23:10:49.942111 master-0 kubenswrapper[36504]: E1203 23:10:49.942007 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerName="extract-content" Dec 03 23:10:49.942111 master-0 kubenswrapper[36504]: I1203 23:10:49.942021 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerName="extract-content" Dec 03 23:10:49.942111 master-0 kubenswrapper[36504]: E1203 23:10:49.942067 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerName="registry-server" Dec 03 23:10:49.942111 master-0 kubenswrapper[36504]: I1203 23:10:49.942076 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerName="registry-server" Dec 03 23:10:49.942837 master-0 
kubenswrapper[36504]: I1203 23:10:49.942508 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="a99f71e8-be80-4aa2-9ab5-3e4fb13dac5e" containerName="registry-server" Dec 03 23:10:49.945923 master-0 kubenswrapper[36504]: I1203 23:10:49.945741 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:49.955434 master-0 kubenswrapper[36504]: I1203 23:10:49.955327 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t88zx"] Dec 03 23:10:50.100801 master-0 kubenswrapper[36504]: I1203 23:10:50.100723 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-catalog-content\") pod \"community-operators-t88zx\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.103709 master-0 kubenswrapper[36504]: I1203 23:10:50.103521 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-utilities\") pod \"community-operators-t88zx\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.105469 master-0 kubenswrapper[36504]: I1203 23:10:50.105309 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc6bz\" (UniqueName: \"kubernetes.io/projected/3066f9cb-6a59-4316-b041-8eb7e313a05a-kube-api-access-tc6bz\") pod \"community-operators-t88zx\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.208640 master-0 kubenswrapper[36504]: I1203 23:10:50.208480 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc6bz\" (UniqueName: \"kubernetes.io/projected/3066f9cb-6a59-4316-b041-8eb7e313a05a-kube-api-access-tc6bz\") pod \"community-operators-t88zx\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.208640 master-0 kubenswrapper[36504]: I1203 23:10:50.208624 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-catalog-content\") pod \"community-operators-t88zx\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.208963 master-0 kubenswrapper[36504]: I1203 23:10:50.208858 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-utilities\") pod \"community-operators-t88zx\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.209586 master-0 kubenswrapper[36504]: I1203 23:10:50.209551 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-utilities\") pod \"community-operators-t88zx\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.209586 master-0 kubenswrapper[36504]: I1203 
23:10:50.209564 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-catalog-content\") pod \"community-operators-t88zx\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.237825 master-0 kubenswrapper[36504]: I1203 23:10:50.237758 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc6bz\" (UniqueName: \"kubernetes.io/projected/3066f9cb-6a59-4316-b041-8eb7e313a05a-kube-api-access-tc6bz\") pod \"community-operators-t88zx\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.306902 master-0 kubenswrapper[36504]: I1203 23:10:50.306663 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:10:50.902964 master-0 kubenswrapper[36504]: I1203 23:10:50.899031 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t88zx"] Dec 03 23:10:51.459152 master-0 kubenswrapper[36504]: I1203 23:10:51.459000 36504 generic.go:334] "Generic (PLEG): container finished" podID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerID="25470b6bd15974a75004b8177884b2a90e57b9606af326bb52763175d906e61c" exitCode=0 Dec 03 23:10:51.459837 master-0 kubenswrapper[36504]: I1203 23:10:51.459112 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t88zx" event={"ID":"3066f9cb-6a59-4316-b041-8eb7e313a05a","Type":"ContainerDied","Data":"25470b6bd15974a75004b8177884b2a90e57b9606af326bb52763175d906e61c"} Dec 03 23:10:51.460063 master-0 kubenswrapper[36504]: I1203 23:10:51.460038 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t88zx" event={"ID":"3066f9cb-6a59-4316-b041-8eb7e313a05a","Type":"ContainerStarted","Data":"9427671530ba4bbd769653d2741bcf18d1bf95f3877ec0c21491d8ca690ffc86"} Dec 03 23:10:52.477354 master-0 kubenswrapper[36504]: I1203 23:10:52.477293 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t88zx" event={"ID":"3066f9cb-6a59-4316-b041-8eb7e313a05a","Type":"ContainerStarted","Data":"b6d2ef8fc8d6d3af4ce62fcf03e7e4cb3ad41dbe5fbbb572d7939d04d5cf101c"} Dec 03 23:10:53.492921 master-0 kubenswrapper[36504]: I1203 23:10:53.492840 36504 generic.go:334] "Generic (PLEG): container finished" podID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerID="b6d2ef8fc8d6d3af4ce62fcf03e7e4cb3ad41dbe5fbbb572d7939d04d5cf101c" exitCode=0 Dec 03 23:10:53.492921 master-0 kubenswrapper[36504]: I1203 23:10:53.492908 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t88zx" event={"ID":"3066f9cb-6a59-4316-b041-8eb7e313a05a","Type":"ContainerDied","Data":"b6d2ef8fc8d6d3af4ce62fcf03e7e4cb3ad41dbe5fbbb572d7939d04d5cf101c"} Dec 03 23:10:54.509951 master-0 kubenswrapper[36504]: I1203 23:10:54.509788 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t88zx" event={"ID":"3066f9cb-6a59-4316-b041-8eb7e313a05a","Type":"ContainerStarted","Data":"bb73e2b0f95f9bf6f039e545f3a12c24895efa6b5ebed31392fa072a95b9ab7f"} Dec 03 23:10:54.539586 master-0 kubenswrapper[36504]: I1203 23:10:54.539416 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-t88zx" podStartSLOduration=3.135537394 podStartE2EDuration="5.539387538s" podCreationTimestamp="2025-12-03 23:10:49 +0000 UTC" firstStartedPulling="2025-12-03 23:10:51.461571593 +0000 UTC m=+3616.681343600" lastFinishedPulling="2025-12-03 23:10:53.865421737 +0000 UTC m=+3619.085193744" observedRunningTime="2025-12-03 23:10:54.534589308 +0000 UTC m=+3619.754361345" watchObservedRunningTime="2025-12-03 23:10:54.539387538 +0000 UTC m=+3619.759159545" Dec 03 23:11:00.307205 master-0 kubenswrapper[36504]: I1203 23:11:00.306978 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:11:00.307205 master-0 kubenswrapper[36504]: I1203 23:11:00.307069 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:11:00.366664 master-0 kubenswrapper[36504]: I1203 23:11:00.366592 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:11:00.650852 master-0 kubenswrapper[36504]: I1203 23:11:00.650801 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:11:00.726845 master-0 kubenswrapper[36504]: I1203 23:11:00.726722 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t88zx"] Dec 03 23:11:02.627371 master-0 kubenswrapper[36504]: I1203 23:11:02.627222 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t88zx" podUID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerName="registry-server" containerID="cri-o://bb73e2b0f95f9bf6f039e545f3a12c24895efa6b5ebed31392fa072a95b9ab7f" gracePeriod=2 Dec 03 23:11:03.647014 master-0 kubenswrapper[36504]: I1203 23:11:03.646923 36504 generic.go:334] "Generic (PLEG): container finished" podID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerID="bb73e2b0f95f9bf6f039e545f3a12c24895efa6b5ebed31392fa072a95b9ab7f" exitCode=0 Dec 03 23:11:03.647014 master-0 kubenswrapper[36504]: I1203 23:11:03.647006 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t88zx" event={"ID":"3066f9cb-6a59-4316-b041-8eb7e313a05a","Type":"ContainerDied","Data":"bb73e2b0f95f9bf6f039e545f3a12c24895efa6b5ebed31392fa072a95b9ab7f"} Dec 03 23:11:03.894121 master-0 kubenswrapper[36504]: I1203 23:11:03.894054 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:11:04.027265 master-0 kubenswrapper[36504]: I1203 23:11:04.027104 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc6bz\" (UniqueName: \"kubernetes.io/projected/3066f9cb-6a59-4316-b041-8eb7e313a05a-kube-api-access-tc6bz\") pod \"3066f9cb-6a59-4316-b041-8eb7e313a05a\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " Dec 03 23:11:04.027578 master-0 kubenswrapper[36504]: I1203 23:11:04.027283 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-utilities\") pod \"3066f9cb-6a59-4316-b041-8eb7e313a05a\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " Dec 03 23:11:04.027578 master-0 kubenswrapper[36504]: I1203 23:11:04.027502 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-catalog-content\") pod \"3066f9cb-6a59-4316-b041-8eb7e313a05a\" (UID: \"3066f9cb-6a59-4316-b041-8eb7e313a05a\") " Dec 03 23:11:04.029366 master-0 kubenswrapper[36504]: I1203 23:11:04.029285 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-utilities" (OuterVolumeSpecName: "utilities") pod "3066f9cb-6a59-4316-b041-8eb7e313a05a" (UID: "3066f9cb-6a59-4316-b041-8eb7e313a05a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:11:04.033003 master-0 kubenswrapper[36504]: I1203 23:11:04.032925 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3066f9cb-6a59-4316-b041-8eb7e313a05a-kube-api-access-tc6bz" (OuterVolumeSpecName: "kube-api-access-tc6bz") pod "3066f9cb-6a59-4316-b041-8eb7e313a05a" (UID: "3066f9cb-6a59-4316-b041-8eb7e313a05a"). InnerVolumeSpecName "kube-api-access-tc6bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:11:04.081452 master-0 kubenswrapper[36504]: I1203 23:11:04.081350 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3066f9cb-6a59-4316-b041-8eb7e313a05a" (UID: "3066f9cb-6a59-4316-b041-8eb7e313a05a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:11:04.133337 master-0 kubenswrapper[36504]: I1203 23:11:04.133256 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc6bz\" (UniqueName: \"kubernetes.io/projected/3066f9cb-6a59-4316-b041-8eb7e313a05a-kube-api-access-tc6bz\") on node \"master-0\" DevicePath \"\"" Dec 03 23:11:04.133337 master-0 kubenswrapper[36504]: I1203 23:11:04.133313 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:11:04.133337 master-0 kubenswrapper[36504]: I1203 23:11:04.133324 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3066f9cb-6a59-4316-b041-8eb7e313a05a-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:11:04.664072 master-0 kubenswrapper[36504]: I1203 23:11:04.663981 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t88zx" event={"ID":"3066f9cb-6a59-4316-b041-8eb7e313a05a","Type":"ContainerDied","Data":"9427671530ba4bbd769653d2741bcf18d1bf95f3877ec0c21491d8ca690ffc86"} Dec 03 23:11:04.664833 master-0 kubenswrapper[36504]: I1203 23:11:04.664085 36504 scope.go:117] "RemoveContainer" containerID="bb73e2b0f95f9bf6f039e545f3a12c24895efa6b5ebed31392fa072a95b9ab7f" Dec 03 23:11:04.664833 master-0 kubenswrapper[36504]: I1203 23:11:04.664145 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t88zx" Dec 03 23:11:04.693914 master-0 kubenswrapper[36504]: I1203 23:11:04.693855 36504 scope.go:117] "RemoveContainer" containerID="b6d2ef8fc8d6d3af4ce62fcf03e7e4cb3ad41dbe5fbbb572d7939d04d5cf101c" Dec 03 23:11:04.723028 master-0 kubenswrapper[36504]: I1203 23:11:04.722869 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t88zx"] Dec 03 23:11:04.737679 master-0 kubenswrapper[36504]: I1203 23:11:04.737629 36504 scope.go:117] "RemoveContainer" containerID="25470b6bd15974a75004b8177884b2a90e57b9606af326bb52763175d906e61c" Dec 03 23:11:04.738399 master-0 kubenswrapper[36504]: I1203 23:11:04.738208 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t88zx"] Dec 03 23:11:05.113872 master-0 kubenswrapper[36504]: I1203 23:11:05.113708 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3066f9cb-6a59-4316-b041-8eb7e313a05a" path="/var/lib/kubelet/pods/3066f9cb-6a59-4316-b041-8eb7e313a05a/volumes" Dec 03 23:11:35.727400 master-0 kubenswrapper[36504]: E1203 23:11:35.727315 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:11:40.096532 master-0 kubenswrapper[36504]: I1203 23:11:40.096439 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:11:42.096073 master-0 kubenswrapper[36504]: I1203 23:11:42.096009 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:12:35.720183 master-0 kubenswrapper[36504]: E1203 23:12:35.720097 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:12:44.097366 master-0 kubenswrapper[36504]: I1203 23:12:44.097277 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:13:07.096839 master-0 kubenswrapper[36504]: I1203 23:13:07.096751 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:13:35.721931 master-0 kubenswrapper[36504]: E1203 23:13:35.721849 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:14:05.097991 master-0 kubenswrapper[36504]: I1203 23:14:05.097887 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:14:09.096745 master-0 kubenswrapper[36504]: I1203 23:14:09.096686 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:14:16.377321 master-0 kubenswrapper[36504]: I1203 23:14:16.375108 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jcnrr"] Dec 03 23:14:16.377321 master-0 kubenswrapper[36504]: E1203 23:14:16.376299 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerName="extract-content" Dec 03 23:14:16.377321 master-0 kubenswrapper[36504]: I1203 23:14:16.376331 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerName="extract-content" Dec 03 23:14:16.377321 master-0 kubenswrapper[36504]: E1203 23:14:16.376358 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerName="registry-server" Dec 03 23:14:16.377321 master-0 kubenswrapper[36504]: I1203 23:14:16.376367 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerName="registry-server" Dec 03 23:14:16.377321 master-0 kubenswrapper[36504]: E1203 23:14:16.376435 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerName="extract-utilities" Dec 03 23:14:16.377321 master-0 kubenswrapper[36504]: I1203 23:14:16.376446 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerName="extract-utilities" Dec 03 23:14:16.378277 master-0 kubenswrapper[36504]: I1203 23:14:16.377626 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="3066f9cb-6a59-4316-b041-8eb7e313a05a" containerName="registry-server" Dec 03 23:14:16.381180 master-0 kubenswrapper[36504]: I1203 23:14:16.381113 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:16.393421 master-0 kubenswrapper[36504]: I1203 23:14:16.390394 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcnrr"] Dec 03 23:14:16.501078 master-0 kubenswrapper[36504]: I1203 23:14:16.500974 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghjd7\" (UniqueName: \"kubernetes.io/projected/b18e9dcc-7668-4dce-958e-9bb72847a3b6-kube-api-access-ghjd7\") pod \"redhat-marketplace-jcnrr\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:16.501703 master-0 kubenswrapper[36504]: I1203 23:14:16.501626 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-utilities\") pod \"redhat-marketplace-jcnrr\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:16.501811 master-0 kubenswrapper[36504]: I1203 23:14:16.501705 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-catalog-content\") pod \"redhat-marketplace-jcnrr\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:16.605741 master-0 kubenswrapper[36504]: I1203 23:14:16.605672 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghjd7\" (UniqueName: \"kubernetes.io/projected/b18e9dcc-7668-4dce-958e-9bb72847a3b6-kube-api-access-ghjd7\") pod \"redhat-marketplace-jcnrr\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:16.606782 master-0 kubenswrapper[36504]: I1203 23:14:16.605952 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-utilities\") pod \"redhat-marketplace-jcnrr\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:16.606782 master-0 kubenswrapper[36504]: I1203 23:14:16.605987 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-catalog-content\") pod \"redhat-marketplace-jcnrr\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:16.606782 master-0 kubenswrapper[36504]: I1203 23:14:16.606559 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-utilities\") pod \"redhat-marketplace-jcnrr\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:16.606782 master-0 kubenswrapper[36504]: I1203 23:14:16.606612 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-catalog-content\") pod \"redhat-marketplace-jcnrr\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " 
pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:17.343441 master-0 kubenswrapper[36504]: I1203 23:14:17.343378 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghjd7\" (UniqueName: \"kubernetes.io/projected/b18e9dcc-7668-4dce-958e-9bb72847a3b6-kube-api-access-ghjd7\") pod \"redhat-marketplace-jcnrr\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:17.629744 master-0 kubenswrapper[36504]: I1203 23:14:17.629675 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:18.164950 master-0 kubenswrapper[36504]: W1203 23:14:18.162316 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb18e9dcc_7668_4dce_958e_9bb72847a3b6.slice/crio-2b37f19fa2cc8a164d5a43479ae031e7eaf2ffc870338228a0cda5a77ec1a01a WatchSource:0}: Error finding container 2b37f19fa2cc8a164d5a43479ae031e7eaf2ffc870338228a0cda5a77ec1a01a: Status 404 returned error can't find the container with id 2b37f19fa2cc8a164d5a43479ae031e7eaf2ffc870338228a0cda5a77ec1a01a Dec 03 23:14:18.167438 master-0 kubenswrapper[36504]: I1203 23:14:18.167371 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcnrr"] Dec 03 23:14:18.607886 master-0 kubenswrapper[36504]: I1203 23:14:18.607827 36504 generic.go:334] "Generic (PLEG): container finished" podID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerID="9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9" exitCode=0 Dec 03 23:14:18.607886 master-0 kubenswrapper[36504]: I1203 23:14:18.607887 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcnrr" event={"ID":"b18e9dcc-7668-4dce-958e-9bb72847a3b6","Type":"ContainerDied","Data":"9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9"} Dec 03 23:14:18.608232 master-0 kubenswrapper[36504]: I1203 23:14:18.607917 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcnrr" event={"ID":"b18e9dcc-7668-4dce-958e-9bb72847a3b6","Type":"ContainerStarted","Data":"2b37f19fa2cc8a164d5a43479ae031e7eaf2ffc870338228a0cda5a77ec1a01a"} Dec 03 23:14:18.610536 master-0 kubenswrapper[36504]: I1203 23:14:18.610476 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:14:19.629719 master-0 kubenswrapper[36504]: I1203 23:14:19.629239 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcnrr" event={"ID":"b18e9dcc-7668-4dce-958e-9bb72847a3b6","Type":"ContainerStarted","Data":"670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07"} Dec 03 23:14:20.646317 master-0 kubenswrapper[36504]: I1203 23:14:20.646243 36504 generic.go:334] "Generic (PLEG): container finished" podID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerID="670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07" exitCode=0 Dec 03 23:14:20.646317 master-0 kubenswrapper[36504]: I1203 23:14:20.646314 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcnrr" event={"ID":"b18e9dcc-7668-4dce-958e-9bb72847a3b6","Type":"ContainerDied","Data":"670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07"} Dec 03 23:14:21.668474 master-0 kubenswrapper[36504]: I1203 23:14:21.668150 36504 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcnrr" event={"ID":"b18e9dcc-7668-4dce-958e-9bb72847a3b6","Type":"ContainerStarted","Data":"37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416"} Dec 03 23:14:21.723458 master-0 kubenswrapper[36504]: I1203 23:14:21.723342 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jcnrr" podStartSLOduration=3.2596344139999998 podStartE2EDuration="5.723278316s" podCreationTimestamp="2025-12-03 23:14:16 +0000 UTC" firstStartedPulling="2025-12-03 23:14:18.610347383 +0000 UTC m=+3823.830119390" lastFinishedPulling="2025-12-03 23:14:21.073991285 +0000 UTC m=+3826.293763292" observedRunningTime="2025-12-03 23:14:21.692815294 +0000 UTC m=+3826.912587301" watchObservedRunningTime="2025-12-03 23:14:21.723278316 +0000 UTC m=+3826.943050323" Dec 03 23:14:27.630419 master-0 kubenswrapper[36504]: I1203 23:14:27.630325 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:27.630419 master-0 kubenswrapper[36504]: I1203 23:14:27.630408 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:27.686238 master-0 kubenswrapper[36504]: I1203 23:14:27.686167 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:27.808258 master-0 kubenswrapper[36504]: I1203 23:14:27.808178 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:27.936298 master-0 kubenswrapper[36504]: I1203 23:14:27.935523 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcnrr"] Dec 03 23:14:29.778699 master-0 kubenswrapper[36504]: I1203 23:14:29.778478 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jcnrr" podUID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerName="registry-server" containerID="cri-o://37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416" gracePeriod=2 Dec 03 23:14:30.523237 master-0 kubenswrapper[36504]: I1203 23:14:30.523168 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:30.675591 master-0 kubenswrapper[36504]: I1203 23:14:30.675512 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghjd7\" (UniqueName: \"kubernetes.io/projected/b18e9dcc-7668-4dce-958e-9bb72847a3b6-kube-api-access-ghjd7\") pod \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " Dec 03 23:14:30.676041 master-0 kubenswrapper[36504]: I1203 23:14:30.675865 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-catalog-content\") pod \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " Dec 03 23:14:30.676132 master-0 kubenswrapper[36504]: I1203 23:14:30.676045 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-utilities\") pod \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\" (UID: \"b18e9dcc-7668-4dce-958e-9bb72847a3b6\") " Dec 03 23:14:30.677364 master-0 kubenswrapper[36504]: I1203 23:14:30.677271 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-utilities" (OuterVolumeSpecName: "utilities") pod "b18e9dcc-7668-4dce-958e-9bb72847a3b6" (UID: "b18e9dcc-7668-4dce-958e-9bb72847a3b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:14:30.680017 master-0 kubenswrapper[36504]: I1203 23:14:30.679127 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18e9dcc-7668-4dce-958e-9bb72847a3b6-kube-api-access-ghjd7" (OuterVolumeSpecName: "kube-api-access-ghjd7") pod "b18e9dcc-7668-4dce-958e-9bb72847a3b6" (UID: "b18e9dcc-7668-4dce-958e-9bb72847a3b6"). InnerVolumeSpecName "kube-api-access-ghjd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:14:30.695763 master-0 kubenswrapper[36504]: I1203 23:14:30.695673 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b18e9dcc-7668-4dce-958e-9bb72847a3b6" (UID: "b18e9dcc-7668-4dce-958e-9bb72847a3b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:14:30.781935 master-0 kubenswrapper[36504]: I1203 23:14:30.781834 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghjd7\" (UniqueName: \"kubernetes.io/projected/b18e9dcc-7668-4dce-958e-9bb72847a3b6-kube-api-access-ghjd7\") on node \"master-0\" DevicePath \"\"" Dec 03 23:14:30.781935 master-0 kubenswrapper[36504]: I1203 23:14:30.781902 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:14:30.781935 master-0 kubenswrapper[36504]: I1203 23:14:30.781914 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b18e9dcc-7668-4dce-958e-9bb72847a3b6-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:14:30.799287 master-0 kubenswrapper[36504]: I1203 23:14:30.799202 36504 generic.go:334] "Generic (PLEG): container finished" podID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerID="37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416" exitCode=0 Dec 03 23:14:30.799639 master-0 kubenswrapper[36504]: I1203 23:14:30.799314 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcnrr" event={"ID":"b18e9dcc-7668-4dce-958e-9bb72847a3b6","Type":"ContainerDied","Data":"37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416"} Dec 03 23:14:30.799639 master-0 kubenswrapper[36504]: I1203 23:14:30.799358 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jcnrr" event={"ID":"b18e9dcc-7668-4dce-958e-9bb72847a3b6","Type":"ContainerDied","Data":"2b37f19fa2cc8a164d5a43479ae031e7eaf2ffc870338228a0cda5a77ec1a01a"} Dec 03 23:14:30.799639 master-0 kubenswrapper[36504]: I1203 23:14:30.799392 36504 scope.go:117] "RemoveContainer" containerID="37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416" Dec 03 23:14:30.799757 master-0 kubenswrapper[36504]: I1203 23:14:30.799649 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jcnrr" Dec 03 23:14:30.829257 master-0 kubenswrapper[36504]: I1203 23:14:30.829181 36504 scope.go:117] "RemoveContainer" containerID="670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07" Dec 03 23:14:30.869963 master-0 kubenswrapper[36504]: I1203 23:14:30.869801 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcnrr"] Dec 03 23:14:30.876515 master-0 kubenswrapper[36504]: I1203 23:14:30.876441 36504 scope.go:117] "RemoveContainer" containerID="9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9" Dec 03 23:14:30.894103 master-0 kubenswrapper[36504]: I1203 23:14:30.894013 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jcnrr"] Dec 03 23:14:30.927612 master-0 kubenswrapper[36504]: I1203 23:14:30.927460 36504 scope.go:117] "RemoveContainer" containerID="37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416" Dec 03 23:14:30.928282 master-0 kubenswrapper[36504]: E1203 23:14:30.928239 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416\": container with ID starting with 37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416 not found: ID does not exist" containerID="37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416" Dec 03 23:14:30.928391 master-0 kubenswrapper[36504]: I1203 23:14:30.928285 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416"} err="failed to get container status \"37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416\": rpc error: code = NotFound desc = could not find container \"37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416\": container with ID starting with 37bc48fbe7c83496a9566d8a08ba72b3b3a314297597754ba052687254dba416 not found: ID does not exist" Dec 03 23:14:30.928391 master-0 kubenswrapper[36504]: I1203 23:14:30.928325 36504 scope.go:117] "RemoveContainer" containerID="670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07" Dec 03 23:14:30.928732 master-0 kubenswrapper[36504]: E1203 23:14:30.928695 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07\": container with ID starting with 670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07 not found: ID does not exist" containerID="670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07" Dec 03 23:14:30.928732 master-0 kubenswrapper[36504]: I1203 23:14:30.928722 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07"} err="failed to get container status \"670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07\": rpc error: code = NotFound desc = could not find container \"670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07\": container with ID starting with 670212353d1d8397d4bde57f0d6c8e5b02e0d0ab04271ff9b7c8a4c73cf1eb07 not found: ID does not exist" Dec 03 23:14:30.928894 master-0 kubenswrapper[36504]: I1203 23:14:30.928738 36504 scope.go:117] "RemoveContainer" 
containerID="9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9" Dec 03 23:14:30.929177 master-0 kubenswrapper[36504]: E1203 23:14:30.929098 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9\": container with ID starting with 9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9 not found: ID does not exist" containerID="9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9" Dec 03 23:14:30.929262 master-0 kubenswrapper[36504]: I1203 23:14:30.929161 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9"} err="failed to get container status \"9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9\": rpc error: code = NotFound desc = could not find container \"9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9\": container with ID starting with 9213c77fd61b91c2e971708b6cd187fd6f824d6d29df768de4f41d1e1f5a6fc9 not found: ID does not exist" Dec 03 23:14:31.112640 master-0 kubenswrapper[36504]: I1203 23:14:31.112579 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" path="/var/lib/kubelet/pods/b18e9dcc-7668-4dce-958e-9bb72847a3b6/volumes" Dec 03 23:14:35.722997 master-0 kubenswrapper[36504]: E1203 23:14:35.722908 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:15:00.226595 master-0 kubenswrapper[36504]: I1203 23:15:00.226374 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56"] Dec 03 23:15:00.227424 master-0 kubenswrapper[36504]: E1203 23:15:00.227249 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerName="extract-content" Dec 03 23:15:00.227424 master-0 kubenswrapper[36504]: I1203 23:15:00.227269 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerName="extract-content" Dec 03 23:15:00.227424 master-0 kubenswrapper[36504]: E1203 23:15:00.227350 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerName="registry-server" Dec 03 23:15:00.227424 master-0 kubenswrapper[36504]: I1203 23:15:00.227357 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerName="registry-server" Dec 03 23:15:00.227424 master-0 kubenswrapper[36504]: E1203 23:15:00.227400 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerName="extract-utilities" Dec 03 23:15:00.227424 master-0 kubenswrapper[36504]: I1203 23:15:00.227411 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerName="extract-utilities" Dec 03 23:15:00.228082 master-0 kubenswrapper[36504]: I1203 23:15:00.227801 36504 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b18e9dcc-7668-4dce-958e-9bb72847a3b6" containerName="registry-server" Dec 03 23:15:00.229304 master-0 kubenswrapper[36504]: I1203 23:15:00.229253 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.233066 master-0 kubenswrapper[36504]: I1203 23:15:00.232967 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 23:15:00.233243 master-0 kubenswrapper[36504]: I1203 23:15:00.233100 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-nwpql" Dec 03 23:15:00.258512 master-0 kubenswrapper[36504]: I1203 23:15:00.258455 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56"] Dec 03 23:15:00.383819 master-0 kubenswrapper[36504]: I1203 23:15:00.383728 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3acdc87-0aac-4ce0-9695-24923053531d-secret-volume\") pod \"collect-profiles-29413395-8ct56\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.385424 master-0 kubenswrapper[36504]: I1203 23:15:00.385357 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3acdc87-0aac-4ce0-9695-24923053531d-config-volume\") pod \"collect-profiles-29413395-8ct56\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.385606 master-0 kubenswrapper[36504]: I1203 23:15:00.385561 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5kn\" (UniqueName: \"kubernetes.io/projected/b3acdc87-0aac-4ce0-9695-24923053531d-kube-api-access-ql5kn\") pod \"collect-profiles-29413395-8ct56\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.493239 master-0 kubenswrapper[36504]: I1203 23:15:00.493072 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3acdc87-0aac-4ce0-9695-24923053531d-secret-volume\") pod \"collect-profiles-29413395-8ct56\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.494432 master-0 kubenswrapper[36504]: I1203 23:15:00.494365 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3acdc87-0aac-4ce0-9695-24923053531d-config-volume\") pod \"collect-profiles-29413395-8ct56\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.494730 master-0 kubenswrapper[36504]: I1203 23:15:00.494705 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5kn\" (UniqueName: \"kubernetes.io/projected/b3acdc87-0aac-4ce0-9695-24923053531d-kube-api-access-ql5kn\") pod \"collect-profiles-29413395-8ct56\" (UID: 
\"b3acdc87-0aac-4ce0-9695-24923053531d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.497588 master-0 kubenswrapper[36504]: I1203 23:15:00.497172 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3acdc87-0aac-4ce0-9695-24923053531d-config-volume\") pod \"collect-profiles-29413395-8ct56\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.498894 master-0 kubenswrapper[36504]: I1203 23:15:00.498869 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3acdc87-0aac-4ce0-9695-24923053531d-secret-volume\") pod \"collect-profiles-29413395-8ct56\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.512098 master-0 kubenswrapper[36504]: I1203 23:15:00.512024 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5kn\" (UniqueName: \"kubernetes.io/projected/b3acdc87-0aac-4ce0-9695-24923053531d-kube-api-access-ql5kn\") pod \"collect-profiles-29413395-8ct56\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:00.568458 master-0 kubenswrapper[36504]: I1203 23:15:00.568399 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:01.150468 master-0 kubenswrapper[36504]: I1203 23:15:01.150354 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56"] Dec 03 23:15:01.163909 master-0 kubenswrapper[36504]: W1203 23:15:01.163798 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3acdc87_0aac_4ce0_9695_24923053531d.slice/crio-faf7dee77ca60a07ef1a2b5fa4f5f2f4dbc1428c9937e7c25c05a8c431d9db70 WatchSource:0}: Error finding container faf7dee77ca60a07ef1a2b5fa4f5f2f4dbc1428c9937e7c25c05a8c431d9db70: Status 404 returned error can't find the container with id faf7dee77ca60a07ef1a2b5fa4f5f2f4dbc1428c9937e7c25c05a8c431d9db70 Dec 03 23:15:01.294794 master-0 kubenswrapper[36504]: I1203 23:15:01.289763 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" event={"ID":"b3acdc87-0aac-4ce0-9695-24923053531d","Type":"ContainerStarted","Data":"faf7dee77ca60a07ef1a2b5fa4f5f2f4dbc1428c9937e7c25c05a8c431d9db70"} Dec 03 23:15:01.973396 master-0 kubenswrapper[36504]: E1203 23:15:01.973306 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3acdc87_0aac_4ce0_9695_24923053531d.slice/crio-conmon-e392bebb911b6bfc7e44592f6f2ebe33177627b79c5c0d92be24a7d2a7b04cf1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3acdc87_0aac_4ce0_9695_24923053531d.slice/crio-e392bebb911b6bfc7e44592f6f2ebe33177627b79c5c0d92be24a7d2a7b04cf1.scope\": RecentStats: unable to find data in memory cache]" Dec 03 23:15:02.304884 master-0 kubenswrapper[36504]: I1203 23:15:02.304727 36504 generic.go:334] "Generic (PLEG): 
container finished" podID="b3acdc87-0aac-4ce0-9695-24923053531d" containerID="e392bebb911b6bfc7e44592f6f2ebe33177627b79c5c0d92be24a7d2a7b04cf1" exitCode=0 Dec 03 23:15:02.304884 master-0 kubenswrapper[36504]: I1203 23:15:02.304830 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" event={"ID":"b3acdc87-0aac-4ce0-9695-24923053531d","Type":"ContainerDied","Data":"e392bebb911b6bfc7e44592f6f2ebe33177627b79c5c0d92be24a7d2a7b04cf1"} Dec 03 23:15:03.834264 master-0 kubenswrapper[36504]: I1203 23:15:03.834197 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:03.946644 master-0 kubenswrapper[36504]: I1203 23:15:03.946496 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql5kn\" (UniqueName: \"kubernetes.io/projected/b3acdc87-0aac-4ce0-9695-24923053531d-kube-api-access-ql5kn\") pod \"b3acdc87-0aac-4ce0-9695-24923053531d\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " Dec 03 23:15:03.946644 master-0 kubenswrapper[36504]: I1203 23:15:03.946619 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3acdc87-0aac-4ce0-9695-24923053531d-config-volume\") pod \"b3acdc87-0aac-4ce0-9695-24923053531d\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " Dec 03 23:15:03.946644 master-0 kubenswrapper[36504]: I1203 23:15:03.946648 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3acdc87-0aac-4ce0-9695-24923053531d-secret-volume\") pod \"b3acdc87-0aac-4ce0-9695-24923053531d\" (UID: \"b3acdc87-0aac-4ce0-9695-24923053531d\") " Dec 03 23:15:03.947631 master-0 kubenswrapper[36504]: I1203 23:15:03.947568 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3acdc87-0aac-4ce0-9695-24923053531d-config-volume" (OuterVolumeSpecName: "config-volume") pod "b3acdc87-0aac-4ce0-9695-24923053531d" (UID: "b3acdc87-0aac-4ce0-9695-24923053531d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:15:03.948072 master-0 kubenswrapper[36504]: I1203 23:15:03.948047 36504 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b3acdc87-0aac-4ce0-9695-24923053531d-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 23:15:03.951814 master-0 kubenswrapper[36504]: I1203 23:15:03.951758 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3acdc87-0aac-4ce0-9695-24923053531d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b3acdc87-0aac-4ce0-9695-24923053531d" (UID: "b3acdc87-0aac-4ce0-9695-24923053531d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:15:03.951919 master-0 kubenswrapper[36504]: I1203 23:15:03.951889 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3acdc87-0aac-4ce0-9695-24923053531d-kube-api-access-ql5kn" (OuterVolumeSpecName: "kube-api-access-ql5kn") pod "b3acdc87-0aac-4ce0-9695-24923053531d" (UID: "b3acdc87-0aac-4ce0-9695-24923053531d"). InnerVolumeSpecName "kube-api-access-ql5kn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:15:04.051597 master-0 kubenswrapper[36504]: I1203 23:15:04.051441 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql5kn\" (UniqueName: \"kubernetes.io/projected/b3acdc87-0aac-4ce0-9695-24923053531d-kube-api-access-ql5kn\") on node \"master-0\" DevicePath \"\"" Dec 03 23:15:04.051597 master-0 kubenswrapper[36504]: I1203 23:15:04.051528 36504 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b3acdc87-0aac-4ce0-9695-24923053531d-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 23:15:04.338429 master-0 kubenswrapper[36504]: I1203 23:15:04.338249 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" event={"ID":"b3acdc87-0aac-4ce0-9695-24923053531d","Type":"ContainerDied","Data":"faf7dee77ca60a07ef1a2b5fa4f5f2f4dbc1428c9937e7c25c05a8c431d9db70"} Dec 03 23:15:04.338429 master-0 kubenswrapper[36504]: I1203 23:15:04.338336 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faf7dee77ca60a07ef1a2b5fa4f5f2f4dbc1428c9937e7c25c05a8c431d9db70" Dec 03 23:15:04.338429 master-0 kubenswrapper[36504]: I1203 23:15:04.338355 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413395-8ct56" Dec 03 23:15:04.942509 master-0 kubenswrapper[36504]: I1203 23:15:04.942434 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt"] Dec 03 23:15:04.958055 master-0 kubenswrapper[36504]: I1203 23:15:04.957952 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413350-djxjt"] Dec 03 23:15:05.110737 master-0 kubenswrapper[36504]: I1203 23:15:05.110675 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5803fb46-dff1-46a8-b7b8-e52a1261e409" path="/var/lib/kubelet/pods/5803fb46-dff1-46a8-b7b8-e52a1261e409/volumes" Dec 03 23:15:07.208803 master-0 kubenswrapper[36504]: I1203 23:15:07.207192 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sxt4f"] Dec 03 23:15:07.208803 master-0 kubenswrapper[36504]: E1203 23:15:07.208268 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3acdc87-0aac-4ce0-9695-24923053531d" containerName="collect-profiles" Dec 03 23:15:07.208803 master-0 kubenswrapper[36504]: I1203 23:15:07.208288 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3acdc87-0aac-4ce0-9695-24923053531d" containerName="collect-profiles" Dec 03 23:15:07.209630 master-0 kubenswrapper[36504]: I1203 23:15:07.208873 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3acdc87-0aac-4ce0-9695-24923053531d" containerName="collect-profiles" Dec 03 23:15:07.211261 master-0 kubenswrapper[36504]: I1203 23:15:07.211225 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.271167 master-0 kubenswrapper[36504]: I1203 23:15:07.269109 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sxt4f"] Dec 03 23:15:07.367799 master-0 kubenswrapper[36504]: I1203 23:15:07.366119 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls57k\" (UniqueName: \"kubernetes.io/projected/af9bdbcc-033a-4a27-97a3-56f45def5b2e-kube-api-access-ls57k\") pod \"redhat-operators-sxt4f\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.367799 master-0 kubenswrapper[36504]: I1203 23:15:07.366241 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-catalog-content\") pod \"redhat-operators-sxt4f\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.367799 master-0 kubenswrapper[36504]: I1203 23:15:07.366371 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-utilities\") pod \"redhat-operators-sxt4f\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.470316 master-0 kubenswrapper[36504]: I1203 23:15:07.470133 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-utilities\") pod \"redhat-operators-sxt4f\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.470587 master-0 kubenswrapper[36504]: I1203 23:15:07.470402 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls57k\" (UniqueName: \"kubernetes.io/projected/af9bdbcc-033a-4a27-97a3-56f45def5b2e-kube-api-access-ls57k\") pod \"redhat-operators-sxt4f\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.470587 master-0 kubenswrapper[36504]: I1203 23:15:07.470537 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-catalog-content\") pod \"redhat-operators-sxt4f\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.471097 master-0 kubenswrapper[36504]: I1203 23:15:07.471046 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-utilities\") pod \"redhat-operators-sxt4f\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.471542 master-0 kubenswrapper[36504]: I1203 23:15:07.471498 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-catalog-content\") pod \"redhat-operators-sxt4f\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.492189 
master-0 kubenswrapper[36504]: I1203 23:15:07.492117 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls57k\" (UniqueName: \"kubernetes.io/projected/af9bdbcc-033a-4a27-97a3-56f45def5b2e-kube-api-access-ls57k\") pod \"redhat-operators-sxt4f\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:07.558339 master-0 kubenswrapper[36504]: I1203 23:15:07.558264 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:08.142919 master-0 kubenswrapper[36504]: I1203 23:15:08.141950 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sxt4f"] Dec 03 23:15:08.752350 master-0 kubenswrapper[36504]: W1203 23:15:08.749178 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf9bdbcc_033a_4a27_97a3_56f45def5b2e.slice/crio-09f3d2401cd32b45c95909c41dd82c520a20f248290018219acfa92d44c80bf4 WatchSource:0}: Error finding container 09f3d2401cd32b45c95909c41dd82c520a20f248290018219acfa92d44c80bf4: Status 404 returned error can't find the container with id 09f3d2401cd32b45c95909c41dd82c520a20f248290018219acfa92d44c80bf4 Dec 03 23:15:09.428804 master-0 kubenswrapper[36504]: I1203 23:15:09.427817 36504 generic.go:334] "Generic (PLEG): container finished" podID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerID="e2c9294bfced33a7e599035ce5f177803d1e23064772ff5dd8b38d7d1906e71a" exitCode=0 Dec 03 23:15:09.428804 master-0 kubenswrapper[36504]: I1203 23:15:09.427881 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxt4f" event={"ID":"af9bdbcc-033a-4a27-97a3-56f45def5b2e","Type":"ContainerDied","Data":"e2c9294bfced33a7e599035ce5f177803d1e23064772ff5dd8b38d7d1906e71a"} Dec 03 23:15:09.428804 master-0 kubenswrapper[36504]: I1203 23:15:09.427913 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxt4f" event={"ID":"af9bdbcc-033a-4a27-97a3-56f45def5b2e","Type":"ContainerStarted","Data":"09f3d2401cd32b45c95909c41dd82c520a20f248290018219acfa92d44c80bf4"} Dec 03 23:15:10.444235 master-0 kubenswrapper[36504]: I1203 23:15:10.444080 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxt4f" event={"ID":"af9bdbcc-033a-4a27-97a3-56f45def5b2e","Type":"ContainerStarted","Data":"baf93fd41a87a004fa09791095db473f924dca4eb22a09b2b1b00df6146af8fd"} Dec 03 23:15:11.459503 master-0 kubenswrapper[36504]: I1203 23:15:11.459434 36504 generic.go:334] "Generic (PLEG): container finished" podID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerID="baf93fd41a87a004fa09791095db473f924dca4eb22a09b2b1b00df6146af8fd" exitCode=0 Dec 03 23:15:11.459503 master-0 kubenswrapper[36504]: I1203 23:15:11.459496 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxt4f" event={"ID":"af9bdbcc-033a-4a27-97a3-56f45def5b2e","Type":"ContainerDied","Data":"baf93fd41a87a004fa09791095db473f924dca4eb22a09b2b1b00df6146af8fd"} Dec 03 23:15:12.476383 master-0 kubenswrapper[36504]: I1203 23:15:12.476203 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxt4f" event={"ID":"af9bdbcc-033a-4a27-97a3-56f45def5b2e","Type":"ContainerStarted","Data":"9052fd5a972e3a83d20839f193ea9f676583dd328225a156ff289a22bce4aa9d"} Dec 03 
23:15:12.515224 master-0 kubenswrapper[36504]: I1203 23:15:12.515083 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sxt4f" podStartSLOduration=3.048160346 podStartE2EDuration="5.515054969s" podCreationTimestamp="2025-12-03 23:15:07 +0000 UTC" firstStartedPulling="2025-12-03 23:15:09.430429521 +0000 UTC m=+3874.650201528" lastFinishedPulling="2025-12-03 23:15:11.897324144 +0000 UTC m=+3877.117096151" observedRunningTime="2025-12-03 23:15:12.500321368 +0000 UTC m=+3877.720093385" watchObservedRunningTime="2025-12-03 23:15:12.515054969 +0000 UTC m=+3877.734827136" Dec 03 23:15:17.559071 master-0 kubenswrapper[36504]: I1203 23:15:17.558976 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:17.560063 master-0 kubenswrapper[36504]: I1203 23:15:17.559091 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:17.621798 master-0 kubenswrapper[36504]: I1203 23:15:17.621710 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:18.620034 master-0 kubenswrapper[36504]: I1203 23:15:18.619874 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:18.697823 master-0 kubenswrapper[36504]: I1203 23:15:18.697711 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sxt4f"] Dec 03 23:15:20.589649 master-0 kubenswrapper[36504]: I1203 23:15:20.589523 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sxt4f" podUID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerName="registry-server" containerID="cri-o://9052fd5a972e3a83d20839f193ea9f676583dd328225a156ff289a22bce4aa9d" gracePeriod=2 Dec 03 23:15:21.098966 master-0 kubenswrapper[36504]: I1203 23:15:21.098800 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:15:21.608374 master-0 kubenswrapper[36504]: I1203 23:15:21.608259 36504 generic.go:334] "Generic (PLEG): container finished" podID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerID="9052fd5a972e3a83d20839f193ea9f676583dd328225a156ff289a22bce4aa9d" exitCode=0 Dec 03 23:15:21.608374 master-0 kubenswrapper[36504]: I1203 23:15:21.608338 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxt4f" event={"ID":"af9bdbcc-033a-4a27-97a3-56f45def5b2e","Type":"ContainerDied","Data":"9052fd5a972e3a83d20839f193ea9f676583dd328225a156ff289a22bce4aa9d"} Dec 03 23:15:22.097351 master-0 kubenswrapper[36504]: I1203 23:15:22.097069 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:15:22.348329 master-0 kubenswrapper[36504]: I1203 23:15:22.348186 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:22.455134 master-0 kubenswrapper[36504]: I1203 23:15:22.454846 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-catalog-content\") pod \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " Dec 03 23:15:22.455451 master-0 kubenswrapper[36504]: I1203 23:15:22.455341 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-utilities\") pod \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " Dec 03 23:15:22.455664 master-0 kubenswrapper[36504]: I1203 23:15:22.455636 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls57k\" (UniqueName: \"kubernetes.io/projected/af9bdbcc-033a-4a27-97a3-56f45def5b2e-kube-api-access-ls57k\") pod \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\" (UID: \"af9bdbcc-033a-4a27-97a3-56f45def5b2e\") " Dec 03 23:15:22.456383 master-0 kubenswrapper[36504]: I1203 23:15:22.456346 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-utilities" (OuterVolumeSpecName: "utilities") pod "af9bdbcc-033a-4a27-97a3-56f45def5b2e" (UID: "af9bdbcc-033a-4a27-97a3-56f45def5b2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:15:22.465014 master-0 kubenswrapper[36504]: I1203 23:15:22.464934 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af9bdbcc-033a-4a27-97a3-56f45def5b2e-kube-api-access-ls57k" (OuterVolumeSpecName: "kube-api-access-ls57k") pod "af9bdbcc-033a-4a27-97a3-56f45def5b2e" (UID: "af9bdbcc-033a-4a27-97a3-56f45def5b2e"). InnerVolumeSpecName "kube-api-access-ls57k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:15:22.563677 master-0 kubenswrapper[36504]: I1203 23:15:22.563590 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls57k\" (UniqueName: \"kubernetes.io/projected/af9bdbcc-033a-4a27-97a3-56f45def5b2e-kube-api-access-ls57k\") on node \"master-0\" DevicePath \"\"" Dec 03 23:15:22.563677 master-0 kubenswrapper[36504]: I1203 23:15:22.563661 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:15:22.567510 master-0 kubenswrapper[36504]: I1203 23:15:22.567466 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af9bdbcc-033a-4a27-97a3-56f45def5b2e" (UID: "af9bdbcc-033a-4a27-97a3-56f45def5b2e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:15:22.639302 master-0 kubenswrapper[36504]: I1203 23:15:22.639240 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sxt4f" event={"ID":"af9bdbcc-033a-4a27-97a3-56f45def5b2e","Type":"ContainerDied","Data":"09f3d2401cd32b45c95909c41dd82c520a20f248290018219acfa92d44c80bf4"} Dec 03 23:15:22.641296 master-0 kubenswrapper[36504]: I1203 23:15:22.639323 36504 scope.go:117] "RemoveContainer" containerID="9052fd5a972e3a83d20839f193ea9f676583dd328225a156ff289a22bce4aa9d" Dec 03 23:15:22.641296 master-0 kubenswrapper[36504]: I1203 23:15:22.639612 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sxt4f" Dec 03 23:15:22.667558 master-0 kubenswrapper[36504]: I1203 23:15:22.667478 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af9bdbcc-033a-4a27-97a3-56f45def5b2e-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:15:22.682407 master-0 kubenswrapper[36504]: I1203 23:15:22.680154 36504 scope.go:117] "RemoveContainer" containerID="baf93fd41a87a004fa09791095db473f924dca4eb22a09b2b1b00df6146af8fd" Dec 03 23:15:22.705968 master-0 kubenswrapper[36504]: I1203 23:15:22.705887 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sxt4f"] Dec 03 23:15:22.723937 master-0 kubenswrapper[36504]: I1203 23:15:22.723464 36504 scope.go:117] "RemoveContainer" containerID="e2c9294bfced33a7e599035ce5f177803d1e23064772ff5dd8b38d7d1906e71a" Dec 03 23:15:22.725684 master-0 kubenswrapper[36504]: I1203 23:15:22.725622 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sxt4f"] Dec 03 23:15:23.114529 master-0 kubenswrapper[36504]: I1203 23:15:23.114453 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" path="/var/lib/kubelet/pods/af9bdbcc-033a-4a27-97a3-56f45def5b2e/volumes" Dec 03 23:15:35.781822 master-0 kubenswrapper[36504]: E1203 23:15:35.777987 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:15:47.676113 master-0 kubenswrapper[36504]: I1203 23:15:47.675992 36504 scope.go:117] "RemoveContainer" containerID="19d2313995bd1b110e25023c35fca32bf181a6c40bd844c6e1adeb019b0c4c5a" Dec 03 23:16:34.096570 master-0 kubenswrapper[36504]: I1203 23:16:34.096452 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:16:35.730706 master-0 kubenswrapper[36504]: E1203 23:16:35.730597 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:16:46.096749 master-0 kubenswrapper[36504]: I1203 23:16:46.096669 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:17:35.712506 master-0 kubenswrapper[36504]: E1203 23:17:35.712335 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:17:52.096222 master-0 kubenswrapper[36504]: I1203 23:17:52.096149 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:18:00.097057 master-0 kubenswrapper[36504]: I1203 23:18:00.096997 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:18:35.716484 master-0 kubenswrapper[36504]: E1203 23:18:35.716416 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:19:00.096229 master-0 kubenswrapper[36504]: I1203 23:19:00.096144 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:19:21.096684 master-0 kubenswrapper[36504]: I1203 23:19:21.096564 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:19:35.748616 master-0 kubenswrapper[36504]: E1203 23:19:35.748530 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:20:03.099015 master-0 kubenswrapper[36504]: I1203 23:20:03.098941 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:20:21.952538 master-0 kubenswrapper[36504]: I1203 23:20:21.952456 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fnr6m"] Dec 03 23:20:21.953838 master-0 kubenswrapper[36504]: E1203 23:20:21.953813 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerName="extract-utilities" Dec 03 23:20:21.953924 master-0 kubenswrapper[36504]: I1203 23:20:21.953839 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerName="extract-utilities" Dec 03 23:20:21.953924 master-0 kubenswrapper[36504]: E1203 23:20:21.953891 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerName="extract-content" Dec 03 23:20:21.953924 master-0 kubenswrapper[36504]: I1203 23:20:21.953898 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerName="extract-content" Dec 03 23:20:21.954141 master-0 kubenswrapper[36504]: E1203 23:20:21.953970 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerName="registry-server" Dec 03 23:20:21.954141 master-0 kubenswrapper[36504]: I1203 23:20:21.953978 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerName="registry-server" Dec 03 23:20:21.954739 master-0 kubenswrapper[36504]: I1203 23:20:21.954716 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="af9bdbcc-033a-4a27-97a3-56f45def5b2e" containerName="registry-server" Dec 03 23:20:21.958745 master-0 kubenswrapper[36504]: I1203 23:20:21.958707 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:21.971226 master-0 kubenswrapper[36504]: I1203 23:20:21.971162 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnr6m"] Dec 03 23:20:22.030183 master-0 kubenswrapper[36504]: I1203 23:20:22.030122 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdb9\" (UniqueName: \"kubernetes.io/projected/bdbefe04-5585-4367-b16e-4859c2f7e703-kube-api-access-jbdb9\") pod \"certified-operators-fnr6m\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.030183 master-0 kubenswrapper[36504]: I1203 23:20:22.030192 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-catalog-content\") pod \"certified-operators-fnr6m\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.030800 master-0 kubenswrapper[36504]: I1203 23:20:22.030231 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-utilities\") pod \"certified-operators-fnr6m\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.136107 master-0 kubenswrapper[36504]: I1203 23:20:22.136041 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbdb9\" (UniqueName: \"kubernetes.io/projected/bdbefe04-5585-4367-b16e-4859c2f7e703-kube-api-access-jbdb9\") pod \"certified-operators-fnr6m\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.136388 master-0 kubenswrapper[36504]: I1203 23:20:22.136228 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-catalog-content\") pod \"certified-operators-fnr6m\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.136438 master-0 kubenswrapper[36504]: I1203 23:20:22.136391 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-utilities\") pod \"certified-operators-fnr6m\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.140906 master-0 kubenswrapper[36504]: I1203 23:20:22.138986 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-catalog-content\") pod \"certified-operators-fnr6m\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.140906 master-0 kubenswrapper[36504]: I1203 23:20:22.139496 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-utilities\") pod \"certified-operators-fnr6m\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " 
pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.161051 master-0 kubenswrapper[36504]: I1203 23:20:22.160998 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbdb9\" (UniqueName: \"kubernetes.io/projected/bdbefe04-5585-4367-b16e-4859c2f7e703-kube-api-access-jbdb9\") pod \"certified-operators-fnr6m\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.305227 master-0 kubenswrapper[36504]: I1203 23:20:22.305070 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:22.869706 master-0 kubenswrapper[36504]: I1203 23:20:22.869630 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnr6m"] Dec 03 23:20:22.871860 master-0 kubenswrapper[36504]: W1203 23:20:22.871806 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbefe04_5585_4367_b16e_4859c2f7e703.slice/crio-e04d49b73ac534dbd0f99ac84854dcfac334195bd8107c0da4f7c7bd61bc1de8 WatchSource:0}: Error finding container e04d49b73ac534dbd0f99ac84854dcfac334195bd8107c0da4f7c7bd61bc1de8: Status 404 returned error can't find the container with id e04d49b73ac534dbd0f99ac84854dcfac334195bd8107c0da4f7c7bd61bc1de8 Dec 03 23:20:23.190533 master-0 kubenswrapper[36504]: I1203 23:20:23.190451 36504 generic.go:334] "Generic (PLEG): container finished" podID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerID="08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda" exitCode=0 Dec 03 23:20:23.191282 master-0 kubenswrapper[36504]: I1203 23:20:23.190546 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr6m" event={"ID":"bdbefe04-5585-4367-b16e-4859c2f7e703","Type":"ContainerDied","Data":"08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda"} Dec 03 23:20:23.191282 master-0 kubenswrapper[36504]: I1203 23:20:23.190593 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr6m" event={"ID":"bdbefe04-5585-4367-b16e-4859c2f7e703","Type":"ContainerStarted","Data":"e04d49b73ac534dbd0f99ac84854dcfac334195bd8107c0da4f7c7bd61bc1de8"} Dec 03 23:20:23.193233 master-0 kubenswrapper[36504]: I1203 23:20:23.193170 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:20:25.220088 master-0 kubenswrapper[36504]: I1203 23:20:25.220022 36504 generic.go:334] "Generic (PLEG): container finished" podID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerID="8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4" exitCode=0 Dec 03 23:20:25.220876 master-0 kubenswrapper[36504]: I1203 23:20:25.220845 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr6m" event={"ID":"bdbefe04-5585-4367-b16e-4859c2f7e703","Type":"ContainerDied","Data":"8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4"} Dec 03 23:20:26.241539 master-0 kubenswrapper[36504]: I1203 23:20:26.240978 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr6m" event={"ID":"bdbefe04-5585-4367-b16e-4859c2f7e703","Type":"ContainerStarted","Data":"a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447"} Dec 03 23:20:26.280419 master-0 kubenswrapper[36504]: I1203 
23:20:26.280281 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fnr6m" podStartSLOduration=2.844566318 podStartE2EDuration="5.280248526s" podCreationTimestamp="2025-12-03 23:20:21 +0000 UTC" firstStartedPulling="2025-12-03 23:20:23.193077478 +0000 UTC m=+4188.412849485" lastFinishedPulling="2025-12-03 23:20:25.628759686 +0000 UTC m=+4190.848531693" observedRunningTime="2025-12-03 23:20:26.269578933 +0000 UTC m=+4191.489350940" watchObservedRunningTime="2025-12-03 23:20:26.280248526 +0000 UTC m=+4191.500020533" Dec 03 23:20:27.097761 master-0 kubenswrapper[36504]: I1203 23:20:27.097640 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:20:32.305995 master-0 kubenswrapper[36504]: I1203 23:20:32.305845 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:32.305995 master-0 kubenswrapper[36504]: I1203 23:20:32.305930 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:32.358434 master-0 kubenswrapper[36504]: I1203 23:20:32.358307 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:32.410863 master-0 kubenswrapper[36504]: I1203 23:20:32.410802 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:34.097255 master-0 kubenswrapper[36504]: I1203 23:20:34.097179 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnr6m"] Dec 03 23:20:34.354201 master-0 kubenswrapper[36504]: I1203 23:20:34.354028 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fnr6m" podUID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerName="registry-server" containerID="cri-o://a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447" gracePeriod=2 Dec 03 23:20:34.912928 master-0 kubenswrapper[36504]: I1203 23:20:34.911657 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:35.010980 master-0 kubenswrapper[36504]: I1203 23:20:35.010869 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbdb9\" (UniqueName: \"kubernetes.io/projected/bdbefe04-5585-4367-b16e-4859c2f7e703-kube-api-access-jbdb9\") pod \"bdbefe04-5585-4367-b16e-4859c2f7e703\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " Dec 03 23:20:35.011482 master-0 kubenswrapper[36504]: I1203 23:20:35.011460 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-catalog-content\") pod \"bdbefe04-5585-4367-b16e-4859c2f7e703\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " Dec 03 23:20:35.011621 master-0 kubenswrapper[36504]: I1203 23:20:35.011606 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-utilities\") pod \"bdbefe04-5585-4367-b16e-4859c2f7e703\" (UID: \"bdbefe04-5585-4367-b16e-4859c2f7e703\") " Dec 03 23:20:35.012569 master-0 kubenswrapper[36504]: I1203 23:20:35.012529 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-utilities" (OuterVolumeSpecName: "utilities") pod "bdbefe04-5585-4367-b16e-4859c2f7e703" (UID: "bdbefe04-5585-4367-b16e-4859c2f7e703"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:20:35.015568 master-0 kubenswrapper[36504]: I1203 23:20:35.015482 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdbefe04-5585-4367-b16e-4859c2f7e703-kube-api-access-jbdb9" (OuterVolumeSpecName: "kube-api-access-jbdb9") pod "bdbefe04-5585-4367-b16e-4859c2f7e703" (UID: "bdbefe04-5585-4367-b16e-4859c2f7e703"). InnerVolumeSpecName "kube-api-access-jbdb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:20:35.117874 master-0 kubenswrapper[36504]: I1203 23:20:35.117785 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:20:35.118635 master-0 kubenswrapper[36504]: I1203 23:20:35.118545 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbdb9\" (UniqueName: \"kubernetes.io/projected/bdbefe04-5585-4367-b16e-4859c2f7e703-kube-api-access-jbdb9\") on node \"master-0\" DevicePath \"\"" Dec 03 23:20:35.202081 master-0 kubenswrapper[36504]: I1203 23:20:35.201963 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdbefe04-5585-4367-b16e-4859c2f7e703" (UID: "bdbefe04-5585-4367-b16e-4859c2f7e703"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:20:35.223255 master-0 kubenswrapper[36504]: I1203 23:20:35.223185 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdbefe04-5585-4367-b16e-4859c2f7e703-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:20:35.371658 master-0 kubenswrapper[36504]: I1203 23:20:35.371585 36504 generic.go:334] "Generic (PLEG): container finished" podID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerID="a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447" exitCode=0 Dec 03 23:20:35.371658 master-0 kubenswrapper[36504]: I1203 23:20:35.371665 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr6m" event={"ID":"bdbefe04-5585-4367-b16e-4859c2f7e703","Type":"ContainerDied","Data":"a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447"} Dec 03 23:20:35.371658 master-0 kubenswrapper[36504]: I1203 23:20:35.371708 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr6m" event={"ID":"bdbefe04-5585-4367-b16e-4859c2f7e703","Type":"ContainerDied","Data":"e04d49b73ac534dbd0f99ac84854dcfac334195bd8107c0da4f7c7bd61bc1de8"} Dec 03 23:20:35.372122 master-0 kubenswrapper[36504]: I1203 23:20:35.371720 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnr6m" Dec 03 23:20:35.372122 master-0 kubenswrapper[36504]: I1203 23:20:35.371731 36504 scope.go:117] "RemoveContainer" containerID="a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447" Dec 03 23:20:35.405508 master-0 kubenswrapper[36504]: I1203 23:20:35.405440 36504 scope.go:117] "RemoveContainer" containerID="8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4" Dec 03 23:20:35.426163 master-0 kubenswrapper[36504]: I1203 23:20:35.426077 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnr6m"] Dec 03 23:20:35.442120 master-0 kubenswrapper[36504]: I1203 23:20:35.441973 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fnr6m"] Dec 03 23:20:35.455303 master-0 kubenswrapper[36504]: I1203 23:20:35.455222 36504 scope.go:117] "RemoveContainer" containerID="08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda" Dec 03 23:20:35.495836 master-0 kubenswrapper[36504]: I1203 23:20:35.495758 36504 scope.go:117] "RemoveContainer" containerID="a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447" Dec 03 23:20:35.496552 master-0 kubenswrapper[36504]: E1203 23:20:35.496484 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447\": container with ID starting with a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447 not found: ID does not exist" containerID="a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447" Dec 03 23:20:35.496631 master-0 kubenswrapper[36504]: I1203 23:20:35.496541 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447"} err="failed to get container status \"a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447\": rpc error: code = NotFound desc = could not find container 
\"a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447\": container with ID starting with a2d94e7a1c4f603fa98096a7c2323c35e8988acf19cf74a1b51edbf0c02f9447 not found: ID does not exist" Dec 03 23:20:35.496631 master-0 kubenswrapper[36504]: I1203 23:20:35.496572 36504 scope.go:117] "RemoveContainer" containerID="8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4" Dec 03 23:20:35.496983 master-0 kubenswrapper[36504]: E1203 23:20:35.496957 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4\": container with ID starting with 8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4 not found: ID does not exist" containerID="8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4" Dec 03 23:20:35.497056 master-0 kubenswrapper[36504]: I1203 23:20:35.496984 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4"} err="failed to get container status \"8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4\": rpc error: code = NotFound desc = could not find container \"8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4\": container with ID starting with 8c103a0214970e02646fa6921f1e7b3a2e01dc46752380a1b747bb042a5b56e4 not found: ID does not exist" Dec 03 23:20:35.497056 master-0 kubenswrapper[36504]: I1203 23:20:35.496999 36504 scope.go:117] "RemoveContainer" containerID="08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda" Dec 03 23:20:35.497518 master-0 kubenswrapper[36504]: E1203 23:20:35.497443 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda\": container with ID starting with 08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda not found: ID does not exist" containerID="08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda" Dec 03 23:20:35.497598 master-0 kubenswrapper[36504]: I1203 23:20:35.497538 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda"} err="failed to get container status \"08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda\": rpc error: code = NotFound desc = could not find container \"08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda\": container with ID starting with 08890580afc8ef9d72eb8b199d1f617097884adc12b198c7f0a586c247bbcbda not found: ID does not exist" Dec 03 23:20:35.716905 master-0 kubenswrapper[36504]: E1203 23:20:35.716739 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:20:37.114083 master-0 kubenswrapper[36504]: I1203 23:20:37.114004 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdbefe04-5585-4367-b16e-4859c2f7e703" path="/var/lib/kubelet/pods/bdbefe04-5585-4367-b16e-4859c2f7e703/volumes" Dec 03 
23:21:28.099175 master-0 kubenswrapper[36504]: I1203 23:21:28.099067 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:21:33.606799 master-0 kubenswrapper[36504]: I1203 23:21:33.605816 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-57n6x"] Dec 03 23:21:33.607650 master-0 kubenswrapper[36504]: E1203 23:21:33.606807 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerName="extract-utilities" Dec 03 23:21:33.607650 master-0 kubenswrapper[36504]: I1203 23:21:33.606833 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerName="extract-utilities" Dec 03 23:21:33.607650 master-0 kubenswrapper[36504]: E1203 23:21:33.606957 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerName="extract-content" Dec 03 23:21:33.607650 master-0 kubenswrapper[36504]: I1203 23:21:33.606973 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerName="extract-content" Dec 03 23:21:33.607650 master-0 kubenswrapper[36504]: E1203 23:21:33.606994 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerName="registry-server" Dec 03 23:21:33.607650 master-0 kubenswrapper[36504]: I1203 23:21:33.607002 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerName="registry-server" Dec 03 23:21:33.607650 master-0 kubenswrapper[36504]: I1203 23:21:33.607442 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdbefe04-5585-4367-b16e-4859c2f7e703" containerName="registry-server" Dec 03 23:21:33.610437 master-0 kubenswrapper[36504]: I1203 23:21:33.610398 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.629963 master-0 kubenswrapper[36504]: I1203 23:21:33.619900 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57n6x"] Dec 03 23:21:33.734098 master-0 kubenswrapper[36504]: I1203 23:21:33.734025 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29fe3734-fb6a-4303-801d-656fdecc8a16-catalog-content\") pod \"community-operators-57n6x\" (UID: \"29fe3734-fb6a-4303-801d-656fdecc8a16\") " pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.734820 master-0 kubenswrapper[36504]: I1203 23:21:33.734795 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf9zl\" (UniqueName: \"kubernetes.io/projected/29fe3734-fb6a-4303-801d-656fdecc8a16-kube-api-access-wf9zl\") pod \"community-operators-57n6x\" (UID: \"29fe3734-fb6a-4303-801d-656fdecc8a16\") " pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.735114 master-0 kubenswrapper[36504]: I1203 23:21:33.734989 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29fe3734-fb6a-4303-801d-656fdecc8a16-utilities\") pod \"community-operators-57n6x\" (UID: \"29fe3734-fb6a-4303-801d-656fdecc8a16\") " pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.839166 master-0 kubenswrapper[36504]: I1203 23:21:33.839084 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29fe3734-fb6a-4303-801d-656fdecc8a16-catalog-content\") pod \"community-operators-57n6x\" (UID: \"29fe3734-fb6a-4303-801d-656fdecc8a16\") " pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.839524 master-0 kubenswrapper[36504]: I1203 23:21:33.839203 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf9zl\" (UniqueName: \"kubernetes.io/projected/29fe3734-fb6a-4303-801d-656fdecc8a16-kube-api-access-wf9zl\") pod \"community-operators-57n6x\" (UID: \"29fe3734-fb6a-4303-801d-656fdecc8a16\") " pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.839524 master-0 kubenswrapper[36504]: I1203 23:21:33.839266 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29fe3734-fb6a-4303-801d-656fdecc8a16-utilities\") pod \"community-operators-57n6x\" (UID: \"29fe3734-fb6a-4303-801d-656fdecc8a16\") " pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.839874 master-0 kubenswrapper[36504]: I1203 23:21:33.839826 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29fe3734-fb6a-4303-801d-656fdecc8a16-catalog-content\") pod \"community-operators-57n6x\" (UID: \"29fe3734-fb6a-4303-801d-656fdecc8a16\") " pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.839955 master-0 kubenswrapper[36504]: I1203 23:21:33.839907 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29fe3734-fb6a-4303-801d-656fdecc8a16-utilities\") pod \"community-operators-57n6x\" (UID: \"29fe3734-fb6a-4303-801d-656fdecc8a16\") " 
pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.860841 master-0 kubenswrapper[36504]: I1203 23:21:33.860702 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf9zl\" (UniqueName: \"kubernetes.io/projected/29fe3734-fb6a-4303-801d-656fdecc8a16-kube-api-access-wf9zl\") pod \"community-operators-57n6x\" (UID: \"29fe3734-fb6a-4303-801d-656fdecc8a16\") " pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:33.977562 master-0 kubenswrapper[36504]: I1203 23:21:33.977495 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:34.578946 master-0 kubenswrapper[36504]: I1203 23:21:34.578878 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57n6x"] Dec 03 23:21:34.627807 master-0 kubenswrapper[36504]: I1203 23:21:34.627709 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57n6x" event={"ID":"29fe3734-fb6a-4303-801d-656fdecc8a16","Type":"ContainerStarted","Data":"c39c5cbc2320c54205fe27ab93a365777cc616a310e0d7ae86beed25d76d0084"} Dec 03 23:21:35.644612 master-0 kubenswrapper[36504]: I1203 23:21:35.644553 36504 generic.go:334] "Generic (PLEG): container finished" podID="29fe3734-fb6a-4303-801d-656fdecc8a16" containerID="c0d32caa79c584fbd1f0bc9a84c1db9d87fddcae117ef63c4ce82159d18ea2f0" exitCode=0 Dec 03 23:21:35.645312 master-0 kubenswrapper[36504]: I1203 23:21:35.644671 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57n6x" event={"ID":"29fe3734-fb6a-4303-801d-656fdecc8a16","Type":"ContainerDied","Data":"c0d32caa79c584fbd1f0bc9a84c1db9d87fddcae117ef63c4ce82159d18ea2f0"} Dec 03 23:21:35.730584 master-0 kubenswrapper[36504]: E1203 23:21:35.730516 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:21:36.097290 master-0 kubenswrapper[36504]: I1203 23:21:36.097092 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:21:39.727350 master-0 kubenswrapper[36504]: I1203 23:21:39.727258 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57n6x" event={"ID":"29fe3734-fb6a-4303-801d-656fdecc8a16","Type":"ContainerStarted","Data":"93ad156fe8c6514ea2d5a9140919b5be1e5aed7a1a1d2b9a0ae208b984bd1a1a"} Dec 03 23:21:40.743135 master-0 kubenswrapper[36504]: I1203 23:21:40.743046 36504 generic.go:334] "Generic (PLEG): container finished" podID="29fe3734-fb6a-4303-801d-656fdecc8a16" containerID="93ad156fe8c6514ea2d5a9140919b5be1e5aed7a1a1d2b9a0ae208b984bd1a1a" exitCode=0 Dec 03 23:21:40.743135 master-0 kubenswrapper[36504]: I1203 23:21:40.743130 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57n6x" event={"ID":"29fe3734-fb6a-4303-801d-656fdecc8a16","Type":"ContainerDied","Data":"93ad156fe8c6514ea2d5a9140919b5be1e5aed7a1a1d2b9a0ae208b984bd1a1a"} Dec 03 23:21:41.758629 master-0 kubenswrapper[36504]: I1203 23:21:41.758530 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57n6x" event={"ID":"29fe3734-fb6a-4303-801d-656fdecc8a16","Type":"ContainerStarted","Data":"6882178e2cdc7ec436ac007c31a811aa34deafd598425c085b819a61723ce6ca"} Dec 03 23:21:41.783465 master-0 kubenswrapper[36504]: I1203 23:21:41.783327 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-57n6x" podStartSLOduration=3.280474476 podStartE2EDuration="8.783291946s" podCreationTimestamp="2025-12-03 23:21:33 +0000 UTC" firstStartedPulling="2025-12-03 23:21:35.653138821 +0000 UTC m=+4260.872910828" lastFinishedPulling="2025-12-03 23:21:41.155956291 +0000 UTC m=+4266.375728298" observedRunningTime="2025-12-03 23:21:41.779333093 +0000 UTC m=+4266.999105100" watchObservedRunningTime="2025-12-03 23:21:41.783291946 +0000 UTC m=+4267.003063943" Dec 03 23:21:43.978916 master-0 kubenswrapper[36504]: I1203 23:21:43.978829 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:43.978916 master-0 kubenswrapper[36504]: I1203 23:21:43.978901 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:44.036343 master-0 kubenswrapper[36504]: I1203 23:21:44.036254 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:54.031043 master-0 kubenswrapper[36504]: I1203 23:21:54.030970 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-57n6x" Dec 03 23:21:54.127682 master-0 kubenswrapper[36504]: I1203 23:21:54.127572 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57n6x"] Dec 03 23:21:54.189877 master-0 kubenswrapper[36504]: I1203 23:21:54.189812 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k98b2"] Dec 03 23:21:54.190200 master-0 kubenswrapper[36504]: I1203 23:21:54.190164 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k98b2" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="registry-server" 
containerID="cri-o://a4cced4dc8dcac3e7254d9fd507acb0e83f426face2c2e3f878759a96cd3bd74" gracePeriod=2 Dec 03 23:21:54.945302 master-0 kubenswrapper[36504]: I1203 23:21:54.945246 36504 generic.go:334] "Generic (PLEG): container finished" podID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerID="a4cced4dc8dcac3e7254d9fd507acb0e83f426face2c2e3f878759a96cd3bd74" exitCode=0 Dec 03 23:21:54.945581 master-0 kubenswrapper[36504]: I1203 23:21:54.945317 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k98b2" event={"ID":"e403ab42-1840-4292-a37c-a8d4feeb54ca","Type":"ContainerDied","Data":"a4cced4dc8dcac3e7254d9fd507acb0e83f426face2c2e3f878759a96cd3bd74"} Dec 03 23:21:55.579853 master-0 kubenswrapper[36504]: I1203 23:21:55.578579 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k98b2" Dec 03 23:21:55.622313 master-0 kubenswrapper[36504]: I1203 23:21:55.622254 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkg95\" (UniqueName: \"kubernetes.io/projected/e403ab42-1840-4292-a37c-a8d4feeb54ca-kube-api-access-tkg95\") pod \"e403ab42-1840-4292-a37c-a8d4feeb54ca\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " Dec 03 23:21:55.622605 master-0 kubenswrapper[36504]: I1203 23:21:55.622517 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-catalog-content\") pod \"e403ab42-1840-4292-a37c-a8d4feeb54ca\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " Dec 03 23:21:55.623172 master-0 kubenswrapper[36504]: I1203 23:21:55.623143 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-utilities\") pod \"e403ab42-1840-4292-a37c-a8d4feeb54ca\" (UID: \"e403ab42-1840-4292-a37c-a8d4feeb54ca\") " Dec 03 23:21:55.623723 master-0 kubenswrapper[36504]: I1203 23:21:55.623684 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-utilities" (OuterVolumeSpecName: "utilities") pod "e403ab42-1840-4292-a37c-a8d4feeb54ca" (UID: "e403ab42-1840-4292-a37c-a8d4feeb54ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:21:55.625471 master-0 kubenswrapper[36504]: I1203 23:21:55.625285 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:21:55.626655 master-0 kubenswrapper[36504]: I1203 23:21:55.626595 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e403ab42-1840-4292-a37c-a8d4feeb54ca-kube-api-access-tkg95" (OuterVolumeSpecName: "kube-api-access-tkg95") pod "e403ab42-1840-4292-a37c-a8d4feeb54ca" (UID: "e403ab42-1840-4292-a37c-a8d4feeb54ca"). InnerVolumeSpecName "kube-api-access-tkg95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:21:55.676369 master-0 kubenswrapper[36504]: I1203 23:21:55.676287 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e403ab42-1840-4292-a37c-a8d4feeb54ca" (UID: "e403ab42-1840-4292-a37c-a8d4feeb54ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:21:55.729077 master-0 kubenswrapper[36504]: I1203 23:21:55.728915 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkg95\" (UniqueName: \"kubernetes.io/projected/e403ab42-1840-4292-a37c-a8d4feeb54ca-kube-api-access-tkg95\") on node \"master-0\" DevicePath \"\"" Dec 03 23:21:55.729395 master-0 kubenswrapper[36504]: I1203 23:21:55.729369 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e403ab42-1840-4292-a37c-a8d4feeb54ca-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:21:55.967903 master-0 kubenswrapper[36504]: I1203 23:21:55.967496 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k98b2" event={"ID":"e403ab42-1840-4292-a37c-a8d4feeb54ca","Type":"ContainerDied","Data":"27cf3d4301968602620f0710474b0cd1874a47ae80ca26e646bde5b1b38a2e9d"} Dec 03 23:21:55.967903 master-0 kubenswrapper[36504]: I1203 23:21:55.967573 36504 scope.go:117] "RemoveContainer" containerID="a4cced4dc8dcac3e7254d9fd507acb0e83f426face2c2e3f878759a96cd3bd74" Dec 03 23:21:55.967903 master-0 kubenswrapper[36504]: I1203 23:21:55.967652 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k98b2" Dec 03 23:21:55.999952 master-0 kubenswrapper[36504]: I1203 23:21:55.999914 36504 scope.go:117] "RemoveContainer" containerID="7458eacc3a5edc54c5cf843060c75af4d4324f599075c78c8fcbd5a674afd301" Dec 03 23:21:56.043132 master-0 kubenswrapper[36504]: I1203 23:21:56.043067 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k98b2"] Dec 03 23:21:56.061233 master-0 kubenswrapper[36504]: I1203 23:21:56.060859 36504 scope.go:117] "RemoveContainer" containerID="60c50abe2ec8c7459d390c08606126e779403c662dcf37b0171073aa9b774934" Dec 03 23:21:56.068291 master-0 kubenswrapper[36504]: I1203 23:21:56.068202 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k98b2"] Dec 03 23:21:57.112165 master-0 kubenswrapper[36504]: I1203 23:21:57.112083 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" path="/var/lib/kubelet/pods/e403ab42-1840-4292-a37c-a8d4feeb54ca/volumes" Dec 03 23:22:35.728354 master-0 kubenswrapper[36504]: E1203 23:22:35.728247 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:22:56.096592 master-0 kubenswrapper[36504]: I1203 23:22:56.096521 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may 
not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:22:58.096907 master-0 kubenswrapper[36504]: I1203 23:22:58.096813 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:23:35.720936 master-0 kubenswrapper[36504]: E1203 23:23:35.720873 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:24:24.095954 master-0 kubenswrapper[36504]: I1203 23:24:24.095744 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:24:26.096345 master-0 kubenswrapper[36504]: I1203 23:24:26.096280 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:24:35.719645 master-0 kubenswrapper[36504]: E1203 23:24:35.719560 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:25:32.097026 master-0 kubenswrapper[36504]: I1203 23:25:32.096938 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:25:35.729419 master-0 kubenswrapper[36504]: E1203 23:25:35.729324 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:25:40.096428 master-0 kubenswrapper[36504]: I1203 23:25:40.096329 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:26:35.719882 master-0 kubenswrapper[36504]: E1203 23:26:35.719812 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: I1203 23:26:35.942877 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c5pp2"] Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: E1203 23:26:35.943762 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="extract-utilities" Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: I1203 23:26:35.943804 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="extract-utilities" Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: E1203 23:26:35.943844 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="registry-server" Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: I1203 23:26:35.943855 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="registry-server" Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: E1203 23:26:35.943897 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="extract-content" Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: I1203 23:26:35.943905 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="extract-content" Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: I1203 23:26:35.944251 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="extract-utilities" Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: I1203 23:26:35.944321 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="registry-server" Dec 03 23:26:35.944988 master-0 kubenswrapper[36504]: I1203 23:26:35.944349 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e403ab42-1840-4292-a37c-a8d4feeb54ca" containerName="extract-content" Dec 03 23:26:35.946757 master-0 kubenswrapper[36504]: I1203 23:26:35.946675 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:35.962759 master-0 kubenswrapper[36504]: I1203 23:26:35.959663 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5pp2"] Dec 03 23:26:36.043518 master-0 kubenswrapper[36504]: I1203 23:26:36.043406 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-catalog-content\") pod \"redhat-marketplace-c5pp2\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:36.044117 master-0 kubenswrapper[36504]: I1203 23:26:36.043565 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-utilities\") pod \"redhat-marketplace-c5pp2\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:36.044117 master-0 kubenswrapper[36504]: I1203 23:26:36.043749 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxmhr\" (UniqueName: \"kubernetes.io/projected/95a96fc0-edb1-45b4-9853-bb9028b758f1-kube-api-access-kxmhr\") pod \"redhat-marketplace-c5pp2\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:36.148064 master-0 kubenswrapper[36504]: I1203 23:26:36.147393 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-utilities\") pod \"redhat-marketplace-c5pp2\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:36.148064 master-0 kubenswrapper[36504]: I1203 23:26:36.147737 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxmhr\" (UniqueName: \"kubernetes.io/projected/95a96fc0-edb1-45b4-9853-bb9028b758f1-kube-api-access-kxmhr\") pod \"redhat-marketplace-c5pp2\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:36.148064 master-0 kubenswrapper[36504]: I1203 23:26:36.147924 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-catalog-content\") pod \"redhat-marketplace-c5pp2\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:36.148490 master-0 kubenswrapper[36504]: I1203 23:26:36.148110 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-utilities\") pod \"redhat-marketplace-c5pp2\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:36.148490 master-0 kubenswrapper[36504]: I1203 23:26:36.148444 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-catalog-content\") pod \"redhat-marketplace-c5pp2\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " 
pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:36.745817 master-0 kubenswrapper[36504]: I1203 23:26:36.745697 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxmhr\" (UniqueName: \"kubernetes.io/projected/95a96fc0-edb1-45b4-9853-bb9028b758f1-kube-api-access-kxmhr\") pod \"redhat-marketplace-c5pp2\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:36.905474 master-0 kubenswrapper[36504]: I1203 23:26:36.905391 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:37.408389 master-0 kubenswrapper[36504]: W1203 23:26:37.402124 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a96fc0_edb1_45b4_9853_bb9028b758f1.slice/crio-bcc5c4d33ec8cda7e795db3ec43fddd5a4c843ca80879875a8cf40af6ef0786f WatchSource:0}: Error finding container bcc5c4d33ec8cda7e795db3ec43fddd5a4c843ca80879875a8cf40af6ef0786f: Status 404 returned error can't find the container with id bcc5c4d33ec8cda7e795db3ec43fddd5a4c843ca80879875a8cf40af6ef0786f Dec 03 23:26:37.408389 master-0 kubenswrapper[36504]: I1203 23:26:37.407303 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5pp2"] Dec 03 23:26:37.784444 master-0 kubenswrapper[36504]: E1203 23:26:37.784347 36504 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a96fc0_edb1_45b4_9853_bb9028b758f1.slice/crio-d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a96fc0_edb1_45b4_9853_bb9028b758f1.slice/crio-conmon-d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad.scope\": RecentStats: unable to find data in memory cache]" Dec 03 23:26:38.377005 master-0 kubenswrapper[36504]: I1203 23:26:38.376936 36504 generic.go:334] "Generic (PLEG): container finished" podID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerID="d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad" exitCode=0 Dec 03 23:26:38.377382 master-0 kubenswrapper[36504]: I1203 23:26:38.377053 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5pp2" event={"ID":"95a96fc0-edb1-45b4-9853-bb9028b758f1","Type":"ContainerDied","Data":"d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad"} Dec 03 23:26:38.377512 master-0 kubenswrapper[36504]: I1203 23:26:38.377492 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5pp2" event={"ID":"95a96fc0-edb1-45b4-9853-bb9028b758f1","Type":"ContainerStarted","Data":"bcc5c4d33ec8cda7e795db3ec43fddd5a4c843ca80879875a8cf40af6ef0786f"} Dec 03 23:26:38.380352 master-0 kubenswrapper[36504]: I1203 23:26:38.379904 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:26:39.433833 master-0 kubenswrapper[36504]: I1203 23:26:39.433605 36504 generic.go:334] "Generic (PLEG): container finished" podID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerID="8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9" exitCode=0 Dec 03 23:26:39.433833 master-0 kubenswrapper[36504]: I1203 23:26:39.433686 36504 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5pp2" event={"ID":"95a96fc0-edb1-45b4-9853-bb9028b758f1","Type":"ContainerDied","Data":"8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9"} Dec 03 23:26:40.450520 master-0 kubenswrapper[36504]: I1203 23:26:40.450431 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5pp2" event={"ID":"95a96fc0-edb1-45b4-9853-bb9028b758f1","Type":"ContainerStarted","Data":"bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735"} Dec 03 23:26:40.478583 master-0 kubenswrapper[36504]: I1203 23:26:40.478469 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c5pp2" podStartSLOduration=3.878117838 podStartE2EDuration="5.478444617s" podCreationTimestamp="2025-12-03 23:26:35 +0000 UTC" firstStartedPulling="2025-12-03 23:26:38.379816656 +0000 UTC m=+4563.599588663" lastFinishedPulling="2025-12-03 23:26:39.980143445 +0000 UTC m=+4565.199915442" observedRunningTime="2025-12-03 23:26:40.475796544 +0000 UTC m=+4565.695568561" watchObservedRunningTime="2025-12-03 23:26:40.478444617 +0000 UTC m=+4565.698216624" Dec 03 23:26:43.097164 master-0 kubenswrapper[36504]: I1203 23:26:43.097110 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:26:46.906228 master-0 kubenswrapper[36504]: I1203 23:26:46.906114 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:46.907843 master-0 kubenswrapper[36504]: I1203 23:26:46.907802 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:46.963583 master-0 kubenswrapper[36504]: I1203 23:26:46.963516 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:47.646717 master-0 kubenswrapper[36504]: I1203 23:26:47.646657 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:47.723470 master-0 kubenswrapper[36504]: I1203 23:26:47.722164 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5pp2"] Dec 03 23:26:49.592004 master-0 kubenswrapper[36504]: I1203 23:26:49.591908 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c5pp2" podUID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerName="registry-server" containerID="cri-o://bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735" gracePeriod=2 Dec 03 23:26:50.284698 master-0 kubenswrapper[36504]: I1203 23:26:50.284640 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:50.422349 master-0 kubenswrapper[36504]: I1203 23:26:50.422212 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-utilities\") pod \"95a96fc0-edb1-45b4-9853-bb9028b758f1\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " Dec 03 23:26:50.422703 master-0 kubenswrapper[36504]: I1203 23:26:50.422685 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxmhr\" (UniqueName: \"kubernetes.io/projected/95a96fc0-edb1-45b4-9853-bb9028b758f1-kube-api-access-kxmhr\") pod \"95a96fc0-edb1-45b4-9853-bb9028b758f1\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " Dec 03 23:26:50.422869 master-0 kubenswrapper[36504]: I1203 23:26:50.422855 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-catalog-content\") pod \"95a96fc0-edb1-45b4-9853-bb9028b758f1\" (UID: \"95a96fc0-edb1-45b4-9853-bb9028b758f1\") " Dec 03 23:26:50.423886 master-0 kubenswrapper[36504]: I1203 23:26:50.423737 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-utilities" (OuterVolumeSpecName: "utilities") pod "95a96fc0-edb1-45b4-9853-bb9028b758f1" (UID: "95a96fc0-edb1-45b4-9853-bb9028b758f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:26:50.424589 master-0 kubenswrapper[36504]: I1203 23:26:50.424562 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:26:50.427438 master-0 kubenswrapper[36504]: I1203 23:26:50.427332 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a96fc0-edb1-45b4-9853-bb9028b758f1-kube-api-access-kxmhr" (OuterVolumeSpecName: "kube-api-access-kxmhr") pod "95a96fc0-edb1-45b4-9853-bb9028b758f1" (UID: "95a96fc0-edb1-45b4-9853-bb9028b758f1"). InnerVolumeSpecName "kube-api-access-kxmhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:26:50.462659 master-0 kubenswrapper[36504]: I1203 23:26:50.462499 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95a96fc0-edb1-45b4-9853-bb9028b758f1" (UID: "95a96fc0-edb1-45b4-9853-bb9028b758f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:26:50.528218 master-0 kubenswrapper[36504]: I1203 23:26:50.528123 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxmhr\" (UniqueName: \"kubernetes.io/projected/95a96fc0-edb1-45b4-9853-bb9028b758f1-kube-api-access-kxmhr\") on node \"master-0\" DevicePath \"\"" Dec 03 23:26:50.528218 master-0 kubenswrapper[36504]: I1203 23:26:50.528229 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a96fc0-edb1-45b4-9853-bb9028b758f1-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:26:50.610887 master-0 kubenswrapper[36504]: I1203 23:26:50.610817 36504 generic.go:334] "Generic (PLEG): container finished" podID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerID="bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735" exitCode=0 Dec 03 23:26:50.611612 master-0 kubenswrapper[36504]: I1203 23:26:50.610893 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5pp2" event={"ID":"95a96fc0-edb1-45b4-9853-bb9028b758f1","Type":"ContainerDied","Data":"bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735"} Dec 03 23:26:50.611612 master-0 kubenswrapper[36504]: I1203 23:26:50.610944 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c5pp2" event={"ID":"95a96fc0-edb1-45b4-9853-bb9028b758f1","Type":"ContainerDied","Data":"bcc5c4d33ec8cda7e795db3ec43fddd5a4c843ca80879875a8cf40af6ef0786f"} Dec 03 23:26:50.611612 master-0 kubenswrapper[36504]: I1203 23:26:50.610969 36504 scope.go:117] "RemoveContainer" containerID="bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735" Dec 03 23:26:50.611983 master-0 kubenswrapper[36504]: I1203 23:26:50.611929 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c5pp2" Dec 03 23:26:50.642834 master-0 kubenswrapper[36504]: I1203 23:26:50.642763 36504 scope.go:117] "RemoveContainer" containerID="8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9" Dec 03 23:26:50.676124 master-0 kubenswrapper[36504]: I1203 23:26:50.675984 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5pp2"] Dec 03 23:26:50.693407 master-0 kubenswrapper[36504]: I1203 23:26:50.693354 36504 scope.go:117] "RemoveContainer" containerID="d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad" Dec 03 23:26:50.694229 master-0 kubenswrapper[36504]: I1203 23:26:50.694175 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c5pp2"] Dec 03 23:26:50.736295 master-0 kubenswrapper[36504]: I1203 23:26:50.736251 36504 scope.go:117] "RemoveContainer" containerID="bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735" Dec 03 23:26:50.737164 master-0 kubenswrapper[36504]: E1203 23:26:50.737112 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735\": container with ID starting with bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735 not found: ID does not exist" containerID="bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735" Dec 03 23:26:50.737258 master-0 kubenswrapper[36504]: I1203 23:26:50.737181 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735"} err="failed to get container status \"bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735\": rpc error: code = NotFound desc = could not find container \"bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735\": container with ID starting with bd8257c78ea58bc4f3fd8ee6030eb76fd214fa4625828a64e9747b88e8583735 not found: ID does not exist" Dec 03 23:26:50.737258 master-0 kubenswrapper[36504]: I1203 23:26:50.737219 36504 scope.go:117] "RemoveContainer" containerID="8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9" Dec 03 23:26:50.737613 master-0 kubenswrapper[36504]: E1203 23:26:50.737578 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9\": container with ID starting with 8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9 not found: ID does not exist" containerID="8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9" Dec 03 23:26:50.737700 master-0 kubenswrapper[36504]: I1203 23:26:50.737611 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9"} err="failed to get container status \"8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9\": rpc error: code = NotFound desc = could not find container \"8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9\": container with ID starting with 8412a23f5bc647a90618422a7664926938115af93452022f6c40582a1a5ceef9 not found: ID does not exist" Dec 03 23:26:50.737700 master-0 kubenswrapper[36504]: I1203 23:26:50.737632 36504 scope.go:117] "RemoveContainer" 
containerID="d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad" Dec 03 23:26:50.738009 master-0 kubenswrapper[36504]: E1203 23:26:50.737984 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad\": container with ID starting with d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad not found: ID does not exist" containerID="d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad" Dec 03 23:26:50.738088 master-0 kubenswrapper[36504]: I1203 23:26:50.738012 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad"} err="failed to get container status \"d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad\": rpc error: code = NotFound desc = could not find container \"d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad\": container with ID starting with d3c83722be13f49e4455a1b541526ca0e8ad2a296914d2fa462d73d98c3d8cad not found: ID does not exist" Dec 03 23:26:51.114007 master-0 kubenswrapper[36504]: I1203 23:26:51.113856 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a96fc0-edb1-45b4-9853-bb9028b758f1" path="/var/lib/kubelet/pods/95a96fc0-edb1-45b4-9853-bb9028b758f1/volumes" Dec 03 23:27:03.096845 master-0 kubenswrapper[36504]: I1203 23:27:03.096741 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:27:35.727214 master-0 kubenswrapper[36504]: E1203 23:27:35.727131 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:28:02.096441 master-0 kubenswrapper[36504]: I1203 23:28:02.096361 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:28:17.105802 master-0 kubenswrapper[36504]: I1203 23:28:17.101719 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:28:35.740186 master-0 kubenswrapper[36504]: E1203 23:28:35.740119 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:29:19.095162 master-0 kubenswrapper[36504]: I1203 23:29:19.095083 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:29:20.097016 master-0 kubenswrapper[36504]: I1203 23:29:20.096944 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:29:35.733098 master-0 kubenswrapper[36504]: E1203 23:29:35.733033 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:30:00.198173 master-0 kubenswrapper[36504]: I1203 23:30:00.197083 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks"] Dec 03 23:30:00.198173 master-0 kubenswrapper[36504]: E1203 23:30:00.198149 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerName="extract-utilities" Dec 03 23:30:00.198173 master-0 kubenswrapper[36504]: I1203 23:30:00.198176 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerName="extract-utilities" Dec 03 23:30:00.199131 master-0 kubenswrapper[36504]: E1203 23:30:00.198221 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerName="registry-server" Dec 03 23:30:00.199131 master-0 kubenswrapper[36504]: I1203 23:30:00.198229 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerName="registry-server" Dec 03 23:30:00.199131 master-0 kubenswrapper[36504]: E1203 23:30:00.198326 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerName="extract-content" Dec 03 23:30:00.199131 master-0 kubenswrapper[36504]: I1203 23:30:00.198338 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerName="extract-content" Dec 03 23:30:00.199131 master-0 kubenswrapper[36504]: I1203 23:30:00.198683 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a96fc0-edb1-45b4-9853-bb9028b758f1" containerName="registry-server" Dec 03 23:30:00.206422 master-0 kubenswrapper[36504]: I1203 23:30:00.200230 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.206422 master-0 kubenswrapper[36504]: I1203 23:30:00.203278 36504 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-nwpql" Dec 03 23:30:00.206422 master-0 kubenswrapper[36504]: I1203 23:30:00.203296 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 23:30:00.244912 master-0 kubenswrapper[36504]: I1203 23:30:00.240467 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks"] Dec 03 23:30:00.265310 master-0 kubenswrapper[36504]: I1203 23:30:00.264813 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-config-volume\") pod \"collect-profiles-29413410-h4vks\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.265585 master-0 kubenswrapper[36504]: I1203 23:30:00.265386 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-secret-volume\") pod \"collect-profiles-29413410-h4vks\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.265585 master-0 kubenswrapper[36504]: I1203 23:30:00.265443 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hgg9\" (UniqueName: \"kubernetes.io/projected/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-kube-api-access-4hgg9\") pod \"collect-profiles-29413410-h4vks\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.374951 master-0 kubenswrapper[36504]: I1203 23:30:00.374861 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-secret-volume\") pod \"collect-profiles-29413410-h4vks\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.375298 master-0 kubenswrapper[36504]: I1203 23:30:00.375035 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hgg9\" (UniqueName: \"kubernetes.io/projected/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-kube-api-access-4hgg9\") pod \"collect-profiles-29413410-h4vks\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.375812 master-0 kubenswrapper[36504]: I1203 23:30:00.375757 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-config-volume\") pod \"collect-profiles-29413410-h4vks\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.378016 master-0 kubenswrapper[36504]: I1203 23:30:00.377827 36504 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-config-volume\") pod \"collect-profiles-29413410-h4vks\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.386379 master-0 kubenswrapper[36504]: I1203 23:30:00.386280 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-secret-volume\") pod \"collect-profiles-29413410-h4vks\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.404593 master-0 kubenswrapper[36504]: I1203 23:30:00.404521 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hgg9\" (UniqueName: \"kubernetes.io/projected/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-kube-api-access-4hgg9\") pod \"collect-profiles-29413410-h4vks\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:00.547323 master-0 kubenswrapper[36504]: I1203 23:30:00.547156 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:01.069698 master-0 kubenswrapper[36504]: I1203 23:30:01.069637 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks"] Dec 03 23:30:01.844701 master-0 kubenswrapper[36504]: I1203 23:30:01.844547 36504 generic.go:334] "Generic (PLEG): container finished" podID="ece28a14-e4b7-4f9a-8365-9a87dfbe0be3" containerID="0cc3ad6cfc90883df86a454186204aac022b437cefcacf810e74ae99270ac184" exitCode=0 Dec 03 23:30:01.844701 master-0 kubenswrapper[36504]: I1203 23:30:01.844640 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" event={"ID":"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3","Type":"ContainerDied","Data":"0cc3ad6cfc90883df86a454186204aac022b437cefcacf810e74ae99270ac184"} Dec 03 23:30:01.844701 master-0 kubenswrapper[36504]: I1203 23:30:01.844687 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" event={"ID":"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3","Type":"ContainerStarted","Data":"6bc91b6b59e9d8718a99cf10234650588867b65a427d8103097af1af1e38546c"} Dec 03 23:30:03.668504 master-0 kubenswrapper[36504]: I1203 23:30:03.668448 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:03.823353 master-0 kubenswrapper[36504]: I1203 23:30:03.823203 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hgg9\" (UniqueName: \"kubernetes.io/projected/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-kube-api-access-4hgg9\") pod \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " Dec 03 23:30:03.823640 master-0 kubenswrapper[36504]: I1203 23:30:03.823479 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-config-volume\") pod \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " Dec 03 23:30:03.823640 master-0 kubenswrapper[36504]: I1203 23:30:03.823599 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-secret-volume\") pod \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\" (UID: \"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3\") " Dec 03 23:30:03.824382 master-0 kubenswrapper[36504]: I1203 23:30:03.824245 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-config-volume" (OuterVolumeSpecName: "config-volume") pod "ece28a14-e4b7-4f9a-8365-9a87dfbe0be3" (UID: "ece28a14-e4b7-4f9a-8365-9a87dfbe0be3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:30:03.825907 master-0 kubenswrapper[36504]: I1203 23:30:03.825792 36504 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-config-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 23:30:03.828872 master-0 kubenswrapper[36504]: I1203 23:30:03.828712 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-kube-api-access-4hgg9" (OuterVolumeSpecName: "kube-api-access-4hgg9") pod "ece28a14-e4b7-4f9a-8365-9a87dfbe0be3" (UID: "ece28a14-e4b7-4f9a-8365-9a87dfbe0be3"). InnerVolumeSpecName "kube-api-access-4hgg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:30:03.830259 master-0 kubenswrapper[36504]: I1203 23:30:03.830187 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ece28a14-e4b7-4f9a-8365-9a87dfbe0be3" (UID: "ece28a14-e4b7-4f9a-8365-9a87dfbe0be3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:30:03.879634 master-0 kubenswrapper[36504]: I1203 23:30:03.879535 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" event={"ID":"ece28a14-e4b7-4f9a-8365-9a87dfbe0be3","Type":"ContainerDied","Data":"6bc91b6b59e9d8718a99cf10234650588867b65a427d8103097af1af1e38546c"} Dec 03 23:30:03.879634 master-0 kubenswrapper[36504]: I1203 23:30:03.879620 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bc91b6b59e9d8718a99cf10234650588867b65a427d8103097af1af1e38546c" Dec 03 23:30:03.880108 master-0 kubenswrapper[36504]: I1203 23:30:03.879747 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-h4vks" Dec 03 23:30:03.929144 master-0 kubenswrapper[36504]: I1203 23:30:03.929063 36504 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-secret-volume\") on node \"master-0\" DevicePath \"\"" Dec 03 23:30:03.929144 master-0 kubenswrapper[36504]: I1203 23:30:03.929128 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hgg9\" (UniqueName: \"kubernetes.io/projected/ece28a14-e4b7-4f9a-8365-9a87dfbe0be3-kube-api-access-4hgg9\") on node \"master-0\" DevicePath \"\"" Dec 03 23:30:04.784795 master-0 kubenswrapper[36504]: I1203 23:30:04.784096 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z"] Dec 03 23:30:04.799794 master-0 kubenswrapper[36504]: I1203 23:30:04.797855 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413365-mbz2z"] Dec 03 23:30:05.116322 master-0 kubenswrapper[36504]: I1203 23:30:05.116255 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab5d874-2936-4181-a579-3b3da719081f" path="/var/lib/kubelet/pods/9ab5d874-2936-4181-a579-3b3da719081f/volumes" Dec 03 23:30:20.098888 master-0 kubenswrapper[36504]: I1203 23:30:20.096531 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:30:23.097357 master-0 kubenswrapper[36504]: I1203 23:30:23.097259 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:30:35.746263 master-0 kubenswrapper[36504]: E1203 23:30:35.746200 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:30:48.296804 master-0 kubenswrapper[36504]: I1203 23:30:48.296698 36504 scope.go:117] "RemoveContainer" containerID="8c6ae18dd798993682f984a97ff8a834c1c36f8a63cb3a7dababb0bcd39b3a1d" Dec 03 23:31:14.494972 master-0 kubenswrapper[36504]: I1203 23:31:14.494870 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s25rq"] Dec 03 23:31:14.496238 master-0 kubenswrapper[36504]: E1203 23:31:14.495961 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ece28a14-e4b7-4f9a-8365-9a87dfbe0be3" containerName="collect-profiles" Dec 03 23:31:14.496238 master-0 kubenswrapper[36504]: I1203 23:31:14.495990 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ece28a14-e4b7-4f9a-8365-9a87dfbe0be3" containerName="collect-profiles" Dec 03 23:31:14.496376 master-0 kubenswrapper[36504]: I1203 23:31:14.496357 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ece28a14-e4b7-4f9a-8365-9a87dfbe0be3" containerName="collect-profiles" Dec 03 23:31:14.499397 master-0 kubenswrapper[36504]: I1203 23:31:14.499358 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.527739 master-0 kubenswrapper[36504]: I1203 23:31:14.515355 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s25rq"] Dec 03 23:31:14.592747 master-0 kubenswrapper[36504]: I1203 23:31:14.592662 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-utilities\") pod \"certified-operators-s25rq\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.593063 master-0 kubenswrapper[36504]: I1203 23:31:14.592805 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sjff\" (UniqueName: \"kubernetes.io/projected/d2a9e70c-e645-4b6d-b522-45dac740244e-kube-api-access-8sjff\") pod \"certified-operators-s25rq\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.593063 master-0 kubenswrapper[36504]: I1203 23:31:14.592976 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-catalog-content\") pod \"certified-operators-s25rq\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.696065 master-0 kubenswrapper[36504]: I1203 23:31:14.695961 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sjff\" (UniqueName: \"kubernetes.io/projected/d2a9e70c-e645-4b6d-b522-45dac740244e-kube-api-access-8sjff\") pod \"certified-operators-s25rq\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.696352 master-0 kubenswrapper[36504]: I1203 23:31:14.696279 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-catalog-content\") pod \"certified-operators-s25rq\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.696505 master-0 kubenswrapper[36504]: I1203 23:31:14.696475 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-utilities\") pod \"certified-operators-s25rq\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.697254 master-0 kubenswrapper[36504]: I1203 23:31:14.697192 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-catalog-content\") pod \"certified-operators-s25rq\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.697346 master-0 kubenswrapper[36504]: I1203 23:31:14.697226 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-utilities\") pod \"certified-operators-s25rq\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " 
pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.713957 master-0 kubenswrapper[36504]: I1203 23:31:14.713878 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sjff\" (UniqueName: \"kubernetes.io/projected/d2a9e70c-e645-4b6d-b522-45dac740244e-kube-api-access-8sjff\") pod \"certified-operators-s25rq\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:14.837568 master-0 kubenswrapper[36504]: I1203 23:31:14.837412 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:15.397733 master-0 kubenswrapper[36504]: I1203 23:31:15.397652 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s25rq"] Dec 03 23:31:15.405069 master-0 kubenswrapper[36504]: W1203 23:31:15.404997 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2a9e70c_e645_4b6d_b522_45dac740244e.slice/crio-7480fff257dfcab1ed9ca5c53fa0b50ad221ddbfd0725fe75d41818118817c23 WatchSource:0}: Error finding container 7480fff257dfcab1ed9ca5c53fa0b50ad221ddbfd0725fe75d41818118817c23: Status 404 returned error can't find the container with id 7480fff257dfcab1ed9ca5c53fa0b50ad221ddbfd0725fe75d41818118817c23 Dec 03 23:31:15.996259 master-0 kubenswrapper[36504]: I1203 23:31:15.996080 36504 generic.go:334] "Generic (PLEG): container finished" podID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerID="65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899" exitCode=0 Dec 03 23:31:15.996259 master-0 kubenswrapper[36504]: I1203 23:31:15.996167 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s25rq" event={"ID":"d2a9e70c-e645-4b6d-b522-45dac740244e","Type":"ContainerDied","Data":"65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899"} Dec 03 23:31:15.996259 master-0 kubenswrapper[36504]: I1203 23:31:15.996215 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s25rq" event={"ID":"d2a9e70c-e645-4b6d-b522-45dac740244e","Type":"ContainerStarted","Data":"7480fff257dfcab1ed9ca5c53fa0b50ad221ddbfd0725fe75d41818118817c23"} Dec 03 23:31:18.024508 master-0 kubenswrapper[36504]: I1203 23:31:18.024429 36504 generic.go:334] "Generic (PLEG): container finished" podID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerID="ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97" exitCode=0 Dec 03 23:31:18.024508 master-0 kubenswrapper[36504]: I1203 23:31:18.024497 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s25rq" event={"ID":"d2a9e70c-e645-4b6d-b522-45dac740244e","Type":"ContainerDied","Data":"ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97"} Dec 03 23:31:20.057567 master-0 kubenswrapper[36504]: I1203 23:31:20.057505 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s25rq" event={"ID":"d2a9e70c-e645-4b6d-b522-45dac740244e","Type":"ContainerStarted","Data":"6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063"} Dec 03 23:31:20.094029 master-0 kubenswrapper[36504]: I1203 23:31:20.093667 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s25rq" podStartSLOduration=3.61058175 
podStartE2EDuration="6.093630934s" podCreationTimestamp="2025-12-03 23:31:14 +0000 UTC" firstStartedPulling="2025-12-03 23:31:15.999196335 +0000 UTC m=+4841.218968362" lastFinishedPulling="2025-12-03 23:31:18.482245529 +0000 UTC m=+4843.702017546" observedRunningTime="2025-12-03 23:31:20.079387437 +0000 UTC m=+4845.299159464" watchObservedRunningTime="2025-12-03 23:31:20.093630934 +0000 UTC m=+4845.313402941" Dec 03 23:31:24.096047 master-0 kubenswrapper[36504]: I1203 23:31:24.095866 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:31:24.838200 master-0 kubenswrapper[36504]: I1203 23:31:24.838037 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:24.840182 master-0 kubenswrapper[36504]: I1203 23:31:24.840133 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:24.929868 master-0 kubenswrapper[36504]: I1203 23:31:24.929804 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:25.200560 master-0 kubenswrapper[36504]: I1203 23:31:25.200492 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:25.282796 master-0 kubenswrapper[36504]: I1203 23:31:25.282693 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s25rq"] Dec 03 23:31:27.170169 master-0 kubenswrapper[36504]: I1203 23:31:27.170072 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s25rq" podUID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerName="registry-server" containerID="cri-o://6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063" gracePeriod=2 Dec 03 23:31:27.843666 master-0 kubenswrapper[36504]: I1203 23:31:27.843603 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:27.993289 master-0 kubenswrapper[36504]: I1203 23:31:27.993140 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-utilities\") pod \"d2a9e70c-e645-4b6d-b522-45dac740244e\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " Dec 03 23:31:27.993289 master-0 kubenswrapper[36504]: I1203 23:31:27.993283 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sjff\" (UniqueName: \"kubernetes.io/projected/d2a9e70c-e645-4b6d-b522-45dac740244e-kube-api-access-8sjff\") pod \"d2a9e70c-e645-4b6d-b522-45dac740244e\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " Dec 03 23:31:27.993743 master-0 kubenswrapper[36504]: I1203 23:31:27.993705 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-catalog-content\") pod \"d2a9e70c-e645-4b6d-b522-45dac740244e\" (UID: \"d2a9e70c-e645-4b6d-b522-45dac740244e\") " Dec 03 23:31:27.995135 master-0 kubenswrapper[36504]: I1203 23:31:27.994829 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-utilities" (OuterVolumeSpecName: "utilities") pod "d2a9e70c-e645-4b6d-b522-45dac740244e" (UID: "d2a9e70c-e645-4b6d-b522-45dac740244e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:31:27.999889 master-0 kubenswrapper[36504]: I1203 23:31:27.999831 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a9e70c-e645-4b6d-b522-45dac740244e-kube-api-access-8sjff" (OuterVolumeSpecName: "kube-api-access-8sjff") pod "d2a9e70c-e645-4b6d-b522-45dac740244e" (UID: "d2a9e70c-e645-4b6d-b522-45dac740244e"). InnerVolumeSpecName "kube-api-access-8sjff". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:31:28.058033 master-0 kubenswrapper[36504]: I1203 23:31:28.057838 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2a9e70c-e645-4b6d-b522-45dac740244e" (UID: "d2a9e70c-e645-4b6d-b522-45dac740244e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:31:28.102893 master-0 kubenswrapper[36504]: I1203 23:31:28.101158 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:31:28.102893 master-0 kubenswrapper[36504]: I1203 23:31:28.101216 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sjff\" (UniqueName: \"kubernetes.io/projected/d2a9e70c-e645-4b6d-b522-45dac740244e-kube-api-access-8sjff\") on node \"master-0\" DevicePath \"\"" Dec 03 23:31:28.102893 master-0 kubenswrapper[36504]: I1203 23:31:28.101228 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2a9e70c-e645-4b6d-b522-45dac740244e-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:31:28.186810 master-0 kubenswrapper[36504]: I1203 23:31:28.186737 36504 generic.go:334] "Generic (PLEG): container finished" podID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerID="6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063" exitCode=0 Dec 03 23:31:28.187541 master-0 kubenswrapper[36504]: I1203 23:31:28.186826 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s25rq" event={"ID":"d2a9e70c-e645-4b6d-b522-45dac740244e","Type":"ContainerDied","Data":"6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063"} Dec 03 23:31:28.187541 master-0 kubenswrapper[36504]: I1203 23:31:28.186876 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s25rq" Dec 03 23:31:28.187541 master-0 kubenswrapper[36504]: I1203 23:31:28.186915 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s25rq" event={"ID":"d2a9e70c-e645-4b6d-b522-45dac740244e","Type":"ContainerDied","Data":"7480fff257dfcab1ed9ca5c53fa0b50ad221ddbfd0725fe75d41818118817c23"} Dec 03 23:31:28.187541 master-0 kubenswrapper[36504]: I1203 23:31:28.186948 36504 scope.go:117] "RemoveContainer" containerID="6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063" Dec 03 23:31:28.223171 master-0 kubenswrapper[36504]: I1203 23:31:28.223120 36504 scope.go:117] "RemoveContainer" containerID="ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97" Dec 03 23:31:28.261321 master-0 kubenswrapper[36504]: I1203 23:31:28.261111 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s25rq"] Dec 03 23:31:28.265935 master-0 kubenswrapper[36504]: I1203 23:31:28.265893 36504 scope.go:117] "RemoveContainer" containerID="65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899" Dec 03 23:31:28.278684 master-0 kubenswrapper[36504]: I1203 23:31:28.278592 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s25rq"] Dec 03 23:31:28.343640 master-0 kubenswrapper[36504]: I1203 23:31:28.343571 36504 scope.go:117] "RemoveContainer" containerID="6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063" Dec 03 23:31:28.344341 master-0 kubenswrapper[36504]: E1203 23:31:28.344297 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063\": container with ID starting with 
6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063 not found: ID does not exist" containerID="6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063" Dec 03 23:31:28.344433 master-0 kubenswrapper[36504]: I1203 23:31:28.344347 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063"} err="failed to get container status \"6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063\": rpc error: code = NotFound desc = could not find container \"6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063\": container with ID starting with 6b90fcb4f72272c1df81720469dbc55861baccdf0c567afed543554918c1c063 not found: ID does not exist" Dec 03 23:31:28.344433 master-0 kubenswrapper[36504]: I1203 23:31:28.344427 36504 scope.go:117] "RemoveContainer" containerID="ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97" Dec 03 23:31:28.345006 master-0 kubenswrapper[36504]: E1203 23:31:28.344945 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97\": container with ID starting with ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97 not found: ID does not exist" containerID="ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97" Dec 03 23:31:28.345086 master-0 kubenswrapper[36504]: I1203 23:31:28.344996 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97"} err="failed to get container status \"ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97\": rpc error: code = NotFound desc = could not find container \"ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97\": container with ID starting with ba40be22e074108107da00ff3470b4c378fd38fa08e549b1736f9fc42e36be97 not found: ID does not exist" Dec 03 23:31:28.345086 master-0 kubenswrapper[36504]: I1203 23:31:28.345020 36504 scope.go:117] "RemoveContainer" containerID="65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899" Dec 03 23:31:28.345681 master-0 kubenswrapper[36504]: E1203 23:31:28.345640 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899\": container with ID starting with 65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899 not found: ID does not exist" containerID="65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899" Dec 03 23:31:28.345747 master-0 kubenswrapper[36504]: I1203 23:31:28.345684 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899"} err="failed to get container status \"65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899\": rpc error: code = NotFound desc = could not find container \"65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899\": container with ID starting with 65bc55b9560538a02a742d5717d29661a56d143f7c6f5d11fc474a6f0b9a0899 not found: ID does not exist" Dec 03 23:31:29.111907 master-0 kubenswrapper[36504]: I1203 23:31:29.111821 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a9e70c-e645-4b6d-b522-45dac740244e" 
path="/var/lib/kubelet/pods/d2a9e70c-e645-4b6d-b522-45dac740244e/volumes" Dec 03 23:31:33.377899 master-0 kubenswrapper[36504]: I1203 23:31:33.377793 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9hj8v"] Dec 03 23:31:33.379199 master-0 kubenswrapper[36504]: E1203 23:31:33.378734 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerName="extract-utilities" Dec 03 23:31:33.379199 master-0 kubenswrapper[36504]: I1203 23:31:33.378945 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerName="extract-utilities" Dec 03 23:31:33.379199 master-0 kubenswrapper[36504]: E1203 23:31:33.379010 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerName="registry-server" Dec 03 23:31:33.379199 master-0 kubenswrapper[36504]: I1203 23:31:33.379021 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerName="registry-server" Dec 03 23:31:33.379199 master-0 kubenswrapper[36504]: E1203 23:31:33.379123 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerName="extract-content" Dec 03 23:31:33.379199 master-0 kubenswrapper[36504]: I1203 23:31:33.379136 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerName="extract-content" Dec 03 23:31:33.381528 master-0 kubenswrapper[36504]: I1203 23:31:33.380297 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a9e70c-e645-4b6d-b522-45dac740244e" containerName="registry-server" Dec 03 23:31:33.383874 master-0 kubenswrapper[36504]: I1203 23:31:33.383753 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.402173 master-0 kubenswrapper[36504]: I1203 23:31:33.402096 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hj8v"] Dec 03 23:31:33.480043 master-0 kubenswrapper[36504]: I1203 23:31:33.479843 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-catalog-content\") pod \"community-operators-9hj8v\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.481281 master-0 kubenswrapper[36504]: I1203 23:31:33.480404 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-utilities\") pod \"community-operators-9hj8v\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.481281 master-0 kubenswrapper[36504]: I1203 23:31:33.480682 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqtmh\" (UniqueName: \"kubernetes.io/projected/9036f786-2706-448d-a19a-0cdf6836bd5e-kube-api-access-mqtmh\") pod \"community-operators-9hj8v\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.584466 master-0 kubenswrapper[36504]: I1203 23:31:33.583492 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-catalog-content\") pod \"community-operators-9hj8v\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.584466 master-0 kubenswrapper[36504]: I1203 23:31:33.583674 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-utilities\") pod \"community-operators-9hj8v\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.584466 master-0 kubenswrapper[36504]: I1203 23:31:33.583764 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqtmh\" (UniqueName: \"kubernetes.io/projected/9036f786-2706-448d-a19a-0cdf6836bd5e-kube-api-access-mqtmh\") pod \"community-operators-9hj8v\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.584914 master-0 kubenswrapper[36504]: I1203 23:31:33.584485 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-catalog-content\") pod \"community-operators-9hj8v\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.584914 master-0 kubenswrapper[36504]: I1203 23:31:33.584531 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-utilities\") pod \"community-operators-9hj8v\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " 
pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.609185 master-0 kubenswrapper[36504]: I1203 23:31:33.608761 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqtmh\" (UniqueName: \"kubernetes.io/projected/9036f786-2706-448d-a19a-0cdf6836bd5e-kube-api-access-mqtmh\") pod \"community-operators-9hj8v\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:33.768418 master-0 kubenswrapper[36504]: I1203 23:31:33.768253 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:34.288465 master-0 kubenswrapper[36504]: I1203 23:31:34.288384 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9hj8v"] Dec 03 23:31:35.329712 master-0 kubenswrapper[36504]: I1203 23:31:35.328333 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hj8v" event={"ID":"9036f786-2706-448d-a19a-0cdf6836bd5e","Type":"ContainerStarted","Data":"960a6721af0dedc507e778c859b64b2ab7ff834c01d5d1ef2f3cc2d6bf76ba75"} Dec 03 23:31:35.725728 master-0 kubenswrapper[36504]: E1203 23:31:35.725654 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:31:36.344719 master-0 kubenswrapper[36504]: I1203 23:31:36.344644 36504 generic.go:334] "Generic (PLEG): container finished" podID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerID="afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4" exitCode=0 Dec 03 23:31:36.344719 master-0 kubenswrapper[36504]: I1203 23:31:36.344717 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hj8v" event={"ID":"9036f786-2706-448d-a19a-0cdf6836bd5e","Type":"ContainerDied","Data":"afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4"} Dec 03 23:31:37.933999 master-0 kubenswrapper[36504]: I1203 23:31:37.933909 36504 trace.go:236] Trace[535094279]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (03-Dec-2025 23:31:36.919) (total time: 1014ms): Dec 03 23:31:37.933999 master-0 kubenswrapper[36504]: Trace[535094279]: [1.014577168s] [1.014577168s] END Dec 03 23:31:38.375346 master-0 kubenswrapper[36504]: I1203 23:31:38.375241 36504 generic.go:334] "Generic (PLEG): container finished" podID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerID="c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618" exitCode=0 Dec 03 23:31:38.375346 master-0 kubenswrapper[36504]: I1203 23:31:38.375318 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hj8v" event={"ID":"9036f786-2706-448d-a19a-0cdf6836bd5e","Type":"ContainerDied","Data":"c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618"} Dec 03 23:31:38.389691 master-0 kubenswrapper[36504]: I1203 23:31:38.389571 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:31:39.393021 master-0 kubenswrapper[36504]: I1203 23:31:39.392935 
36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hj8v" event={"ID":"9036f786-2706-448d-a19a-0cdf6836bd5e","Type":"ContainerStarted","Data":"85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af"} Dec 03 23:31:39.420528 master-0 kubenswrapper[36504]: I1203 23:31:39.420439 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9hj8v" podStartSLOduration=3.959929069 podStartE2EDuration="6.420410346s" podCreationTimestamp="2025-12-03 23:31:33 +0000 UTC" firstStartedPulling="2025-12-03 23:31:36.348265525 +0000 UTC m=+4861.568037532" lastFinishedPulling="2025-12-03 23:31:38.808746802 +0000 UTC m=+4864.028518809" observedRunningTime="2025-12-03 23:31:39.416437171 +0000 UTC m=+4864.636209198" watchObservedRunningTime="2025-12-03 23:31:39.420410346 +0000 UTC m=+4864.640182353" Dec 03 23:31:40.098740 master-0 kubenswrapper[36504]: I1203 23:31:40.098655 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:31:43.768547 master-0 kubenswrapper[36504]: I1203 23:31:43.768477 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:43.768547 master-0 kubenswrapper[36504]: I1203 23:31:43.768546 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:43.827718 master-0 kubenswrapper[36504]: I1203 23:31:43.827635 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:44.520523 master-0 kubenswrapper[36504]: I1203 23:31:44.520464 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:45.079602 master-0 kubenswrapper[36504]: I1203 23:31:45.079516 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hj8v"] Dec 03 23:31:46.501017 master-0 kubenswrapper[36504]: I1203 23:31:46.500917 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9hj8v" podUID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerName="registry-server" containerID="cri-o://85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af" gracePeriod=2 Dec 03 23:31:47.163166 master-0 kubenswrapper[36504]: I1203 23:31:47.163101 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:47.300051 master-0 kubenswrapper[36504]: I1203 23:31:47.299901 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqtmh\" (UniqueName: \"kubernetes.io/projected/9036f786-2706-448d-a19a-0cdf6836bd5e-kube-api-access-mqtmh\") pod \"9036f786-2706-448d-a19a-0cdf6836bd5e\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " Dec 03 23:31:47.300561 master-0 kubenswrapper[36504]: I1203 23:31:47.300543 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-catalog-content\") pod \"9036f786-2706-448d-a19a-0cdf6836bd5e\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " Dec 03 23:31:47.300722 master-0 kubenswrapper[36504]: I1203 23:31:47.300699 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-utilities\") pod \"9036f786-2706-448d-a19a-0cdf6836bd5e\" (UID: \"9036f786-2706-448d-a19a-0cdf6836bd5e\") " Dec 03 23:31:47.302707 master-0 kubenswrapper[36504]: I1203 23:31:47.302655 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-utilities" (OuterVolumeSpecName: "utilities") pod "9036f786-2706-448d-a19a-0cdf6836bd5e" (UID: "9036f786-2706-448d-a19a-0cdf6836bd5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:31:47.303327 master-0 kubenswrapper[36504]: I1203 23:31:47.303279 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9036f786-2706-448d-a19a-0cdf6836bd5e-kube-api-access-mqtmh" (OuterVolumeSpecName: "kube-api-access-mqtmh") pod "9036f786-2706-448d-a19a-0cdf6836bd5e" (UID: "9036f786-2706-448d-a19a-0cdf6836bd5e"). InnerVolumeSpecName "kube-api-access-mqtmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:31:47.356341 master-0 kubenswrapper[36504]: I1203 23:31:47.355330 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9036f786-2706-448d-a19a-0cdf6836bd5e" (UID: "9036f786-2706-448d-a19a-0cdf6836bd5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:31:47.404620 master-0 kubenswrapper[36504]: I1203 23:31:47.404553 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:31:47.404620 master-0 kubenswrapper[36504]: I1203 23:31:47.404606 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9036f786-2706-448d-a19a-0cdf6836bd5e-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:31:47.404620 master-0 kubenswrapper[36504]: I1203 23:31:47.404619 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqtmh\" (UniqueName: \"kubernetes.io/projected/9036f786-2706-448d-a19a-0cdf6836bd5e-kube-api-access-mqtmh\") on node \"master-0\" DevicePath \"\"" Dec 03 23:31:47.520414 master-0 kubenswrapper[36504]: I1203 23:31:47.520340 36504 generic.go:334] "Generic (PLEG): container finished" podID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerID="85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af" exitCode=0 Dec 03 23:31:47.521317 master-0 kubenswrapper[36504]: I1203 23:31:47.520413 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hj8v" event={"ID":"9036f786-2706-448d-a19a-0cdf6836bd5e","Type":"ContainerDied","Data":"85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af"} Dec 03 23:31:47.521317 master-0 kubenswrapper[36504]: I1203 23:31:47.520469 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9hj8v" event={"ID":"9036f786-2706-448d-a19a-0cdf6836bd5e","Type":"ContainerDied","Data":"960a6721af0dedc507e778c859b64b2ab7ff834c01d5d1ef2f3cc2d6bf76ba75"} Dec 03 23:31:47.521317 master-0 kubenswrapper[36504]: I1203 23:31:47.520502 36504 scope.go:117] "RemoveContainer" containerID="85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af" Dec 03 23:31:47.521511 master-0 kubenswrapper[36504]: I1203 23:31:47.521493 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9hj8v" Dec 03 23:31:47.549236 master-0 kubenswrapper[36504]: I1203 23:31:47.549182 36504 scope.go:117] "RemoveContainer" containerID="c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618" Dec 03 23:31:47.579638 master-0 kubenswrapper[36504]: I1203 23:31:47.579533 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9hj8v"] Dec 03 23:31:47.597748 master-0 kubenswrapper[36504]: I1203 23:31:47.596280 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9hj8v"] Dec 03 23:31:47.597748 master-0 kubenswrapper[36504]: I1203 23:31:47.596597 36504 scope.go:117] "RemoveContainer" containerID="afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4" Dec 03 23:31:47.642930 master-0 kubenswrapper[36504]: I1203 23:31:47.642860 36504 scope.go:117] "RemoveContainer" containerID="85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af" Dec 03 23:31:47.644711 master-0 kubenswrapper[36504]: E1203 23:31:47.644660 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af\": container with ID starting with 85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af not found: ID does not exist" containerID="85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af" Dec 03 23:31:47.644856 master-0 kubenswrapper[36504]: I1203 23:31:47.644709 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af"} err="failed to get container status \"85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af\": rpc error: code = NotFound desc = could not find container \"85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af\": container with ID starting with 85f35352912c457b4ddaeffbca66d17ea294684b4ae828c7985722bb346ed7af not found: ID does not exist" Dec 03 23:31:47.644856 master-0 kubenswrapper[36504]: I1203 23:31:47.644742 36504 scope.go:117] "RemoveContainer" containerID="c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618" Dec 03 23:31:47.645382 master-0 kubenswrapper[36504]: E1203 23:31:47.645315 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618\": container with ID starting with c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618 not found: ID does not exist" containerID="c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618" Dec 03 23:31:47.645438 master-0 kubenswrapper[36504]: I1203 23:31:47.645390 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618"} err="failed to get container status \"c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618\": rpc error: code = NotFound desc = could not find container \"c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618\": container with ID starting with c8ed183bb955945ba7c50cabc79f5608935344d9cfd1956b57dcfc23922ef618 not found: ID does not exist" Dec 03 23:31:47.645438 master-0 kubenswrapper[36504]: I1203 23:31:47.645432 36504 scope.go:117] "RemoveContainer" 
containerID="afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4" Dec 03 23:31:47.646324 master-0 kubenswrapper[36504]: E1203 23:31:47.646275 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4\": container with ID starting with afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4 not found: ID does not exist" containerID="afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4" Dec 03 23:31:47.646324 master-0 kubenswrapper[36504]: I1203 23:31:47.646312 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4"} err="failed to get container status \"afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4\": rpc error: code = NotFound desc = could not find container \"afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4\": container with ID starting with afbb5ac4fe946a35a0c8de66552d01e947292e755e9a9d04f40b78afd97220e4 not found: ID does not exist" Dec 03 23:31:49.118107 master-0 kubenswrapper[36504]: I1203 23:31:49.118018 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9036f786-2706-448d-a19a-0cdf6836bd5e" path="/var/lib/kubelet/pods/9036f786-2706-448d-a19a-0cdf6836bd5e/volumes" Dec 03 23:32:35.718107 master-0 kubenswrapper[36504]: E1203 23:32:35.718013 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:32:43.096534 master-0 kubenswrapper[36504]: I1203 23:32:43.096078 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:32:46.095991 master-0 kubenswrapper[36504]: I1203 23:32:46.095903 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:33:35.723526 master-0 kubenswrapper[36504]: E1203 23:33:35.723440 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:33:51.096576 master-0 kubenswrapper[36504]: I1203 23:33:51.096497 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:33:52.274600 master-0 kubenswrapper[36504]: I1203 23:33:52.274522 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nsn8m"] Dec 03 23:33:52.275334 master-0 kubenswrapper[36504]: E1203 23:33:52.275257 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerName="extract-utilities" Dec 03 23:33:52.275334 master-0 kubenswrapper[36504]: I1203 23:33:52.275276 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerName="extract-utilities" Dec 03 23:33:52.275334 master-0 kubenswrapper[36504]: E1203 23:33:52.275321 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerName="registry-server" Dec 03 23:33:52.275334 master-0 kubenswrapper[36504]: I1203 23:33:52.275331 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerName="registry-server" Dec 03 23:33:52.275468 master-0 kubenswrapper[36504]: E1203 23:33:52.275356 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerName="extract-content" Dec 03 23:33:52.275468 master-0 kubenswrapper[36504]: I1203 23:33:52.275364 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerName="extract-content" Dec 03 23:33:52.276066 master-0 kubenswrapper[36504]: I1203 23:33:52.276036 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="9036f786-2706-448d-a19a-0cdf6836bd5e" containerName="registry-server" Dec 03 23:33:52.279151 master-0 kubenswrapper[36504]: I1203 23:33:52.279116 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.294510 master-0 kubenswrapper[36504]: I1203 23:33:52.294451 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nsn8m"] Dec 03 23:33:52.361797 master-0 kubenswrapper[36504]: I1203 23:33:52.355474 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-catalog-content\") pod \"redhat-operators-nsn8m\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.361797 master-0 kubenswrapper[36504]: I1203 23:33:52.355713 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rf5l\" (UniqueName: \"kubernetes.io/projected/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-kube-api-access-9rf5l\") pod \"redhat-operators-nsn8m\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.361797 master-0 kubenswrapper[36504]: I1203 23:33:52.356126 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-utilities\") pod \"redhat-operators-nsn8m\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.459585 master-0 kubenswrapper[36504]: I1203 23:33:52.459493 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-utilities\") pod \"redhat-operators-nsn8m\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.460095 master-0 kubenswrapper[36504]: I1203 23:33:52.459655 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-catalog-content\") pod \"redhat-operators-nsn8m\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.460095 master-0 kubenswrapper[36504]: I1203 23:33:52.459986 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rf5l\" (UniqueName: \"kubernetes.io/projected/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-kube-api-access-9rf5l\") pod \"redhat-operators-nsn8m\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.460540 master-0 kubenswrapper[36504]: I1203 23:33:52.460484 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-utilities\") pod \"redhat-operators-nsn8m\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.460593 master-0 kubenswrapper[36504]: I1203 23:33:52.460513 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-catalog-content\") pod \"redhat-operators-nsn8m\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.477301 
master-0 kubenswrapper[36504]: I1203 23:33:52.477239 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rf5l\" (UniqueName: \"kubernetes.io/projected/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-kube-api-access-9rf5l\") pod \"redhat-operators-nsn8m\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:52.640921 master-0 kubenswrapper[36504]: I1203 23:33:52.640493 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:33:53.171014 master-0 kubenswrapper[36504]: I1203 23:33:53.170945 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nsn8m"] Dec 03 23:33:53.784211 master-0 kubenswrapper[36504]: I1203 23:33:53.784045 36504 generic.go:334] "Generic (PLEG): container finished" podID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerID="9a53c60a08f8af8a241dd1568ba36d1a515a7f17132000abb5e06f3d2c340c8c" exitCode=0 Dec 03 23:33:53.784211 master-0 kubenswrapper[36504]: I1203 23:33:53.784115 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsn8m" event={"ID":"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa","Type":"ContainerDied","Data":"9a53c60a08f8af8a241dd1568ba36d1a515a7f17132000abb5e06f3d2c340c8c"} Dec 03 23:33:53.784211 master-0 kubenswrapper[36504]: I1203 23:33:53.784156 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsn8m" event={"ID":"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa","Type":"ContainerStarted","Data":"b452f7ef7f141f519c9d714833a0f7d6e818ecf7c567f0bfd5a9b433865117ca"} Dec 03 23:33:54.800350 master-0 kubenswrapper[36504]: I1203 23:33:54.800190 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsn8m" event={"ID":"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa","Type":"ContainerStarted","Data":"ce8ee0e65cd84b2115ceec4dc61ffa056cd0156c909053b25ece65802c936e27"} Dec 03 23:33:56.832262 master-0 kubenswrapper[36504]: I1203 23:33:56.832071 36504 generic.go:334] "Generic (PLEG): container finished" podID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerID="ce8ee0e65cd84b2115ceec4dc61ffa056cd0156c909053b25ece65802c936e27" exitCode=0 Dec 03 23:33:56.832262 master-0 kubenswrapper[36504]: I1203 23:33:56.832175 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsn8m" event={"ID":"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa","Type":"ContainerDied","Data":"ce8ee0e65cd84b2115ceec4dc61ffa056cd0156c909053b25ece65802c936e27"} Dec 03 23:33:57.850700 master-0 kubenswrapper[36504]: I1203 23:33:57.850499 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsn8m" event={"ID":"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa","Type":"ContainerStarted","Data":"659da20a5afda805b1b0a06005d6118c8d1cdeae7f3f52c6f6ac16a125d9a886"} Dec 03 23:33:57.878338 master-0 kubenswrapper[36504]: I1203 23:33:57.878218 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nsn8m" podStartSLOduration=2.376857507 podStartE2EDuration="5.878189314s" podCreationTimestamp="2025-12-03 23:33:52 +0000 UTC" firstStartedPulling="2025-12-03 23:33:53.787507022 +0000 UTC m=+4999.007279029" lastFinishedPulling="2025-12-03 23:33:57.288838829 +0000 UTC m=+5002.508610836" observedRunningTime="2025-12-03 23:33:57.872711063 +0000 UTC m=+5003.092483080" 
watchObservedRunningTime="2025-12-03 23:33:57.878189314 +0000 UTC m=+5003.097961321" Dec 03 23:33:59.097357 master-0 kubenswrapper[36504]: I1203 23:33:59.097199 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:34:02.641296 master-0 kubenswrapper[36504]: I1203 23:34:02.641127 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:34:02.642926 master-0 kubenswrapper[36504]: I1203 23:34:02.642873 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:34:02.692267 master-0 kubenswrapper[36504]: I1203 23:34:02.692204 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:34:03.013378 master-0 kubenswrapper[36504]: I1203 23:34:03.013257 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:34:03.112854 master-0 kubenswrapper[36504]: I1203 23:34:03.112788 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nsn8m"] Dec 03 23:34:04.973855 master-0 kubenswrapper[36504]: I1203 23:34:04.973760 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nsn8m" podUID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerName="registry-server" containerID="cri-o://659da20a5afda805b1b0a06005d6118c8d1cdeae7f3f52c6f6ac16a125d9a886" gracePeriod=2 Dec 03 23:34:05.989578 master-0 kubenswrapper[36504]: I1203 23:34:05.989506 36504 generic.go:334] "Generic (PLEG): container finished" podID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerID="659da20a5afda805b1b0a06005d6118c8d1cdeae7f3f52c6f6ac16a125d9a886" exitCode=0 Dec 03 23:34:05.989578 master-0 kubenswrapper[36504]: I1203 23:34:05.989572 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsn8m" event={"ID":"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa","Type":"ContainerDied","Data":"659da20a5afda805b1b0a06005d6118c8d1cdeae7f3f52c6f6ac16a125d9a886"} Dec 03 23:34:06.624419 master-0 kubenswrapper[36504]: I1203 23:34:06.624351 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:34:06.700845 master-0 kubenswrapper[36504]: I1203 23:34:06.700675 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rf5l\" (UniqueName: \"kubernetes.io/projected/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-kube-api-access-9rf5l\") pod \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " Dec 03 23:34:06.700845 master-0 kubenswrapper[36504]: I1203 23:34:06.700845 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-utilities\") pod \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " Dec 03 23:34:06.701224 master-0 kubenswrapper[36504]: I1203 23:34:06.700940 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-catalog-content\") pod \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\" (UID: \"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa\") " Dec 03 23:34:06.707060 master-0 kubenswrapper[36504]: I1203 23:34:06.706979 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-utilities" (OuterVolumeSpecName: "utilities") pod "c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" (UID: "c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:34:06.714193 master-0 kubenswrapper[36504]: I1203 23:34:06.714101 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-kube-api-access-9rf5l" (OuterVolumeSpecName: "kube-api-access-9rf5l") pod "c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" (UID: "c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa"). InnerVolumeSpecName "kube-api-access-9rf5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:34:06.806002 master-0 kubenswrapper[36504]: I1203 23:34:06.805926 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rf5l\" (UniqueName: \"kubernetes.io/projected/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-kube-api-access-9rf5l\") on node \"master-0\" DevicePath \"\"" Dec 03 23:34:06.806002 master-0 kubenswrapper[36504]: I1203 23:34:06.805990 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:34:06.822433 master-0 kubenswrapper[36504]: I1203 23:34:06.822353 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" (UID: "c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:34:06.909296 master-0 kubenswrapper[36504]: I1203 23:34:06.909222 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:34:07.010629 master-0 kubenswrapper[36504]: I1203 23:34:07.010455 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nsn8m" event={"ID":"c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa","Type":"ContainerDied","Data":"b452f7ef7f141f519c9d714833a0f7d6e818ecf7c567f0bfd5a9b433865117ca"} Dec 03 23:34:07.010629 master-0 kubenswrapper[36504]: I1203 23:34:07.010548 36504 scope.go:117] "RemoveContainer" containerID="659da20a5afda805b1b0a06005d6118c8d1cdeae7f3f52c6f6ac16a125d9a886" Dec 03 23:34:07.010629 master-0 kubenswrapper[36504]: I1203 23:34:07.010592 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nsn8m" Dec 03 23:34:07.036728 master-0 kubenswrapper[36504]: I1203 23:34:07.036669 36504 scope.go:117] "RemoveContainer" containerID="ce8ee0e65cd84b2115ceec4dc61ffa056cd0156c909053b25ece65802c936e27" Dec 03 23:34:07.062150 master-0 kubenswrapper[36504]: I1203 23:34:07.062085 36504 scope.go:117] "RemoveContainer" containerID="9a53c60a08f8af8a241dd1568ba36d1a515a7f17132000abb5e06f3d2c340c8c" Dec 03 23:34:07.115836 master-0 kubenswrapper[36504]: I1203 23:34:07.115736 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nsn8m"] Dec 03 23:34:07.130683 master-0 kubenswrapper[36504]: I1203 23:34:07.129968 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nsn8m"] Dec 03 23:34:09.118316 master-0 kubenswrapper[36504]: I1203 23:34:09.118237 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" path="/var/lib/kubelet/pods/c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa/volumes" Dec 03 23:34:23.888264 master-0 kubenswrapper[36504]: I1203 23:34:23.888204 36504 trace.go:236] Trace[1220418747]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (03-Dec-2025 23:34:22.825) (total time: 1063ms): Dec 03 23:34:23.888264 master-0 kubenswrapper[36504]: Trace[1220418747]: [1.063116037s] [1.063116037s] END Dec 03 23:34:35.726034 master-0 kubenswrapper[36504]: E1203 23:34:35.725972 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:34:52.096296 master-0 kubenswrapper[36504]: I1203 23:34:52.096120 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:35:16.096418 master-0 kubenswrapper[36504]: I1203 23:35:16.096344 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:35:35.728362 master-0 kubenswrapper[36504]: E1203 23:35:35.728284 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:36:15.135846 master-0 kubenswrapper[36504]: I1203 23:36:15.135742 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:36:35.109513 master-0 kubenswrapper[36504]: I1203 23:36:35.107432 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:36:35.722090 master-0 kubenswrapper[36504]: E1203 23:36:35.722016 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:36:51.470724 master-0 kubenswrapper[36504]: I1203 23:36:51.470501 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b9c8d"] Dec 03 23:36:51.471570 master-0 kubenswrapper[36504]: E1203 23:36:51.471508 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerName="registry-server" Dec 03 23:36:51.471570 master-0 kubenswrapper[36504]: I1203 23:36:51.471531 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerName="registry-server" Dec 03 23:36:51.471673 master-0 kubenswrapper[36504]: E1203 23:36:51.471620 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerName="extract-content" Dec 03 23:36:51.471673 master-0 kubenswrapper[36504]: I1203 23:36:51.471629 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerName="extract-content" Dec 03 23:36:51.471761 master-0 kubenswrapper[36504]: E1203 23:36:51.471680 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerName="extract-utilities" Dec 03 23:36:51.471761 master-0 kubenswrapper[36504]: I1203 23:36:51.471693 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerName="extract-utilities" Dec 03 23:36:51.472176 master-0 kubenswrapper[36504]: I1203 23:36:51.472144 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9f0af9b-0293-4db4-9ba2-464f4c3ed9fa" containerName="registry-server" Dec 03 23:36:51.475340 master-0 kubenswrapper[36504]: I1203 23:36:51.475283 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:51.527433 master-0 kubenswrapper[36504]: I1203 23:36:51.527353 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9c8d"] Dec 03 23:36:51.548791 master-0 kubenswrapper[36504]: I1203 23:36:51.547495 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-utilities\") pod \"redhat-marketplace-b9c8d\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:51.548791 master-0 kubenswrapper[36504]: I1203 23:36:51.547628 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-catalog-content\") pod \"redhat-marketplace-b9c8d\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:51.548791 master-0 kubenswrapper[36504]: I1203 23:36:51.547691 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s86v\" (UniqueName: \"kubernetes.io/projected/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-kube-api-access-2s86v\") pod \"redhat-marketplace-b9c8d\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:51.653957 master-0 kubenswrapper[36504]: I1203 23:36:51.652552 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-utilities\") pod \"redhat-marketplace-b9c8d\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:51.653957 master-0 kubenswrapper[36504]: I1203 23:36:51.652881 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-catalog-content\") pod \"redhat-marketplace-b9c8d\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:51.653957 master-0 kubenswrapper[36504]: I1203 23:36:51.652946 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s86v\" (UniqueName: \"kubernetes.io/projected/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-kube-api-access-2s86v\") pod \"redhat-marketplace-b9c8d\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:51.653957 master-0 kubenswrapper[36504]: I1203 23:36:51.653218 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-utilities\") pod \"redhat-marketplace-b9c8d\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:51.653957 master-0 kubenswrapper[36504]: I1203 23:36:51.653596 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-catalog-content\") pod \"redhat-marketplace-b9c8d\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " 
pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:51.950785 master-0 kubenswrapper[36504]: I1203 23:36:51.942603 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s86v\" (UniqueName: \"kubernetes.io/projected/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-kube-api-access-2s86v\") pod \"redhat-marketplace-b9c8d\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:52.117718 master-0 kubenswrapper[36504]: I1203 23:36:52.117641 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:36:52.642401 master-0 kubenswrapper[36504]: I1203 23:36:52.642323 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9c8d"] Dec 03 23:36:53.612582 master-0 kubenswrapper[36504]: I1203 23:36:53.612410 36504 generic.go:334] "Generic (PLEG): container finished" podID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerID="366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7" exitCode=0 Dec 03 23:36:53.612582 master-0 kubenswrapper[36504]: I1203 23:36:53.612502 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9c8d" event={"ID":"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9","Type":"ContainerDied","Data":"366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7"} Dec 03 23:36:53.612582 master-0 kubenswrapper[36504]: I1203 23:36:53.612579 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9c8d" event={"ID":"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9","Type":"ContainerStarted","Data":"845bae05d4845831e43d9e2736aafa00e61030aba8f8670894d720b8e6ae3d8f"} Dec 03 23:36:53.615618 master-0 kubenswrapper[36504]: I1203 23:36:53.615570 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:36:54.630947 master-0 kubenswrapper[36504]: I1203 23:36:54.630847 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9c8d" event={"ID":"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9","Type":"ContainerStarted","Data":"44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0"} Dec 03 23:36:55.651138 master-0 kubenswrapper[36504]: I1203 23:36:55.651064 36504 generic.go:334] "Generic (PLEG): container finished" podID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerID="44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0" exitCode=0 Dec 03 23:36:55.651695 master-0 kubenswrapper[36504]: I1203 23:36:55.651143 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9c8d" event={"ID":"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9","Type":"ContainerDied","Data":"44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0"} Dec 03 23:36:56.668978 master-0 kubenswrapper[36504]: I1203 23:36:56.668899 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9c8d" event={"ID":"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9","Type":"ContainerStarted","Data":"f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd"} Dec 03 23:36:56.696074 master-0 kubenswrapper[36504]: I1203 23:36:56.695962 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b9c8d" podStartSLOduration=3.2344644049999998 podStartE2EDuration="5.695927984s" 
podCreationTimestamp="2025-12-03 23:36:51 +0000 UTC" firstStartedPulling="2025-12-03 23:36:53.615440171 +0000 UTC m=+5178.835212178" lastFinishedPulling="2025-12-03 23:36:56.07690373 +0000 UTC m=+5181.296675757" observedRunningTime="2025-12-03 23:36:56.690162643 +0000 UTC m=+5181.909934660" watchObservedRunningTime="2025-12-03 23:36:56.695927984 +0000 UTC m=+5181.915699991" Dec 03 23:37:02.118466 master-0 kubenswrapper[36504]: I1203 23:37:02.118345 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:37:02.118466 master-0 kubenswrapper[36504]: I1203 23:37:02.118434 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:37:02.196924 master-0 kubenswrapper[36504]: I1203 23:37:02.196844 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:37:02.804227 master-0 kubenswrapper[36504]: I1203 23:37:02.804135 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:37:02.887428 master-0 kubenswrapper[36504]: I1203 23:37:02.887350 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9c8d"] Dec 03 23:37:04.775945 master-0 kubenswrapper[36504]: I1203 23:37:04.775847 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b9c8d" podUID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerName="registry-server" containerID="cri-o://f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd" gracePeriod=2 Dec 03 23:37:05.438075 master-0 kubenswrapper[36504]: I1203 23:37:05.437941 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:37:05.471437 master-0 kubenswrapper[36504]: I1203 23:37:05.471352 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s86v\" (UniqueName: \"kubernetes.io/projected/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-kube-api-access-2s86v\") pod \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " Dec 03 23:37:05.471437 master-0 kubenswrapper[36504]: I1203 23:37:05.471429 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-catalog-content\") pod \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " Dec 03 23:37:05.492973 master-0 kubenswrapper[36504]: I1203 23:37:05.492687 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" (UID: "9f0bc48e-27ba-4179-b8b2-f0a370fa96f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:37:05.501702 master-0 kubenswrapper[36504]: I1203 23:37:05.501635 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-kube-api-access-2s86v" (OuterVolumeSpecName: "kube-api-access-2s86v") pod "9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" (UID: "9f0bc48e-27ba-4179-b8b2-f0a370fa96f9"). InnerVolumeSpecName "kube-api-access-2s86v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:37:05.576103 master-0 kubenswrapper[36504]: I1203 23:37:05.575859 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-utilities\") pod \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\" (UID: \"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9\") " Dec 03 23:37:05.576883 master-0 kubenswrapper[36504]: I1203 23:37:05.576818 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-utilities" (OuterVolumeSpecName: "utilities") pod "9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" (UID: "9f0bc48e-27ba-4179-b8b2-f0a370fa96f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:37:05.577551 master-0 kubenswrapper[36504]: I1203 23:37:05.577513 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s86v\" (UniqueName: \"kubernetes.io/projected/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-kube-api-access-2s86v\") on node \"master-0\" DevicePath \"\"" Dec 03 23:37:05.577551 master-0 kubenswrapper[36504]: I1203 23:37:05.577544 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:37:05.577639 master-0 kubenswrapper[36504]: I1203 23:37:05.577554 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:37:05.794272 master-0 kubenswrapper[36504]: I1203 23:37:05.794172 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9c8d" event={"ID":"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9","Type":"ContainerDied","Data":"f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd"} Dec 03 23:37:05.794272 master-0 kubenswrapper[36504]: I1203 23:37:05.794264 36504 scope.go:117] "RemoveContainer" containerID="f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd" Dec 03 23:37:05.795218 master-0 kubenswrapper[36504]: I1203 23:37:05.794190 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9c8d" Dec 03 23:37:05.795218 master-0 kubenswrapper[36504]: I1203 23:37:05.794085 36504 generic.go:334] "Generic (PLEG): container finished" podID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerID="f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd" exitCode=0 Dec 03 23:37:05.795218 master-0 kubenswrapper[36504]: I1203 23:37:05.794412 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9c8d" event={"ID":"9f0bc48e-27ba-4179-b8b2-f0a370fa96f9","Type":"ContainerDied","Data":"845bae05d4845831e43d9e2736aafa00e61030aba8f8670894d720b8e6ae3d8f"} Dec 03 23:37:05.822837 master-0 kubenswrapper[36504]: I1203 23:37:05.822757 36504 scope.go:117] "RemoveContainer" containerID="44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0" Dec 03 23:37:05.854051 master-0 kubenswrapper[36504]: I1203 23:37:05.853820 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9c8d"] Dec 03 23:37:05.859262 master-0 kubenswrapper[36504]: I1203 23:37:05.859157 36504 scope.go:117] "RemoveContainer" containerID="366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7" Dec 03 23:37:05.872672 master-0 kubenswrapper[36504]: I1203 23:37:05.872580 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9c8d"] Dec 03 23:37:05.914976 master-0 kubenswrapper[36504]: I1203 23:37:05.914884 36504 scope.go:117] "RemoveContainer" containerID="f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd" Dec 03 23:37:05.915810 master-0 kubenswrapper[36504]: E1203 23:37:05.915590 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd\": container with ID starting with f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd not found: ID does not exist" containerID="f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd" Dec 03 23:37:05.915810 master-0 kubenswrapper[36504]: I1203 23:37:05.915625 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd"} err="failed to get container status \"f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd\": rpc error: code = NotFound desc = could not find container \"f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd\": container with ID starting with f54dad9a5f2fb7d232e5bba1cc7c7ea5b5b034fbb75b53692daecc5fb3ebddfd not found: ID does not exist" Dec 03 23:37:05.915810 master-0 kubenswrapper[36504]: I1203 23:37:05.915650 36504 scope.go:117] "RemoveContainer" containerID="44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0" Dec 03 23:37:05.916286 master-0 kubenswrapper[36504]: E1203 23:37:05.916261 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0\": container with ID starting with 44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0 not found: ID does not exist" containerID="44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0" Dec 03 23:37:05.916349 master-0 kubenswrapper[36504]: I1203 23:37:05.916285 36504 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0"} err="failed to get container status \"44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0\": rpc error: code = NotFound desc = could not find container \"44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0\": container with ID starting with 44ace9784fdae4772dadf8bddcac663018570f55d26b7bb39d450ab2ad6093c0 not found: ID does not exist" Dec 03 23:37:05.916349 master-0 kubenswrapper[36504]: I1203 23:37:05.916300 36504 scope.go:117] "RemoveContainer" containerID="366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7" Dec 03 23:37:05.916899 master-0 kubenswrapper[36504]: E1203 23:37:05.916823 36504 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7\": container with ID starting with 366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7 not found: ID does not exist" containerID="366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7" Dec 03 23:37:05.917054 master-0 kubenswrapper[36504]: I1203 23:37:05.917008 36504 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7"} err="failed to get container status \"366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7\": rpc error: code = NotFound desc = could not find container \"366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7\": container with ID starting with 366c2cfe64d2c4e9dcf61fdb7274d03abb3ab978e9c06c39f556c9517576a7c7 not found: ID does not exist" Dec 03 23:37:07.111540 master-0 kubenswrapper[36504]: I1203 23:37:07.111460 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" path="/var/lib/kubelet/pods/9f0bc48e-27ba-4179-b8b2-f0a370fa96f9/volumes" Dec 03 23:37:35.719380 master-0 kubenswrapper[36504]: E1203 23:37:35.719294 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:37:41.096514 master-0 kubenswrapper[36504]: I1203 23:37:41.096385 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:38:05.096318 master-0 kubenswrapper[36504]: I1203 23:38:05.096166 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:38:35.727927 master-0 kubenswrapper[36504]: E1203 23:38:35.727838 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:38:48.097028 master-0 kubenswrapper[36504]: I1203 23:38:48.096730 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:39:10.096506 master-0 kubenswrapper[36504]: I1203 23:39:10.096424 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:39:35.736959 master-0 kubenswrapper[36504]: E1203 23:39:35.736881 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:40:04.097485 master-0 kubenswrapper[36504]: I1203 23:40:04.097412 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:40:35.127402 master-0 kubenswrapper[36504]: I1203 23:40:35.127347 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:40:35.743363 master-0 kubenswrapper[36504]: E1203 23:40:35.743305 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:41:25.834456 master-0 kubenswrapper[36504]: I1203 23:41:25.834394 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z89xv"] Dec 03 23:41:25.839097 master-0 kubenswrapper[36504]: E1203 23:41:25.839045 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerName="registry-server" Dec 03 23:41:25.839458 master-0 kubenswrapper[36504]: I1203 23:41:25.839441 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerName="registry-server" Dec 03 23:41:25.839748 master-0 kubenswrapper[36504]: E1203 23:41:25.839730 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerName="extract-utilities" Dec 03 23:41:25.839866 master-0 kubenswrapper[36504]: I1203 23:41:25.839851 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerName="extract-utilities" Dec 03 23:41:25.840885 master-0 kubenswrapper[36504]: E1203 23:41:25.840865 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerName="extract-content" Dec 03 23:41:25.840991 master-0 kubenswrapper[36504]: I1203 23:41:25.840976 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerName="extract-content" Dec 03 23:41:25.843484 master-0 kubenswrapper[36504]: I1203 23:41:25.843438 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0bc48e-27ba-4179-b8b2-f0a370fa96f9" containerName="registry-server" Dec 03 23:41:25.848097 master-0 kubenswrapper[36504]: I1203 23:41:25.848048 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:25.897264 master-0 kubenswrapper[36504]: I1203 23:41:25.897169 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z89xv"] Dec 03 23:41:25.947213 master-0 kubenswrapper[36504]: I1203 23:41:25.947144 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l96g7\" (UniqueName: \"kubernetes.io/projected/ab6b7cae-a521-40a2-8bbd-da138b86cd48-kube-api-access-l96g7\") pod \"certified-operators-z89xv\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:25.947544 master-0 kubenswrapper[36504]: I1203 23:41:25.947282 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-catalog-content\") pod \"certified-operators-z89xv\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:25.947544 master-0 kubenswrapper[36504]: I1203 23:41:25.947485 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-utilities\") pod \"certified-operators-z89xv\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:26.051058 master-0 kubenswrapper[36504]: I1203 23:41:26.050976 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l96g7\" (UniqueName: \"kubernetes.io/projected/ab6b7cae-a521-40a2-8bbd-da138b86cd48-kube-api-access-l96g7\") pod \"certified-operators-z89xv\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:26.051350 master-0 kubenswrapper[36504]: I1203 23:41:26.051083 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-catalog-content\") pod \"certified-operators-z89xv\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:26.051350 master-0 kubenswrapper[36504]: I1203 23:41:26.051255 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-utilities\") pod \"certified-operators-z89xv\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:26.051956 master-0 kubenswrapper[36504]: I1203 23:41:26.051920 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-catalog-content\") pod \"certified-operators-z89xv\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:26.052045 master-0 kubenswrapper[36504]: I1203 23:41:26.051995 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-utilities\") pod \"certified-operators-z89xv\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " 
pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:26.072104 master-0 kubenswrapper[36504]: I1203 23:41:26.072030 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l96g7\" (UniqueName: \"kubernetes.io/projected/ab6b7cae-a521-40a2-8bbd-da138b86cd48-kube-api-access-l96g7\") pod \"certified-operators-z89xv\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:26.236228 master-0 kubenswrapper[36504]: I1203 23:41:26.236144 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:26.910364 master-0 kubenswrapper[36504]: W1203 23:41:26.910276 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab6b7cae_a521_40a2_8bbd_da138b86cd48.slice/crio-f1a76285dab6b13550ac0613d7f6d1ce9f44fdadb58e45257bf424ca9a43edc1 WatchSource:0}: Error finding container f1a76285dab6b13550ac0613d7f6d1ce9f44fdadb58e45257bf424ca9a43edc1: Status 404 returned error can't find the container with id f1a76285dab6b13550ac0613d7f6d1ce9f44fdadb58e45257bf424ca9a43edc1 Dec 03 23:41:26.917003 master-0 kubenswrapper[36504]: I1203 23:41:26.916922 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z89xv"] Dec 03 23:41:27.096337 master-0 kubenswrapper[36504]: I1203 23:41:27.096125 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:41:27.858279 master-0 kubenswrapper[36504]: I1203 23:41:27.858218 36504 generic.go:334] "Generic (PLEG): container finished" podID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerID="e960d19bd787ec35c629a0debe6bf2d68cea2948224654c7e604fae228eb3199" exitCode=0 Dec 03 23:41:27.858568 master-0 kubenswrapper[36504]: I1203 23:41:27.858286 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z89xv" event={"ID":"ab6b7cae-a521-40a2-8bbd-da138b86cd48","Type":"ContainerDied","Data":"e960d19bd787ec35c629a0debe6bf2d68cea2948224654c7e604fae228eb3199"} Dec 03 23:41:27.858568 master-0 kubenswrapper[36504]: I1203 23:41:27.858326 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z89xv" event={"ID":"ab6b7cae-a521-40a2-8bbd-da138b86cd48","Type":"ContainerStarted","Data":"f1a76285dab6b13550ac0613d7f6d1ce9f44fdadb58e45257bf424ca9a43edc1"} Dec 03 23:41:28.891489 master-0 kubenswrapper[36504]: I1203 23:41:28.891190 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z89xv" event={"ID":"ab6b7cae-a521-40a2-8bbd-da138b86cd48","Type":"ContainerStarted","Data":"1f2760e71b2f569e26be9c8fdfd1509ef1ecda96c8ff2395898d6a51897a01c5"} Dec 03 23:41:29.908510 master-0 kubenswrapper[36504]: I1203 23:41:29.908404 36504 generic.go:334] "Generic (PLEG): container finished" podID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerID="1f2760e71b2f569e26be9c8fdfd1509ef1ecda96c8ff2395898d6a51897a01c5" exitCode=0 Dec 03 23:41:29.909357 master-0 kubenswrapper[36504]: I1203 23:41:29.908618 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z89xv" 
event={"ID":"ab6b7cae-a521-40a2-8bbd-da138b86cd48","Type":"ContainerDied","Data":"1f2760e71b2f569e26be9c8fdfd1509ef1ecda96c8ff2395898d6a51897a01c5"} Dec 03 23:41:30.926704 master-0 kubenswrapper[36504]: I1203 23:41:30.926632 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z89xv" event={"ID":"ab6b7cae-a521-40a2-8bbd-da138b86cd48","Type":"ContainerStarted","Data":"ae4e3b8bb1a91414400c1f23706c4c93aae6798b42a14f12af1c876376f527e5"} Dec 03 23:41:30.971331 master-0 kubenswrapper[36504]: I1203 23:41:30.971235 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z89xv" podStartSLOduration=3.472672072 podStartE2EDuration="5.971203267s" podCreationTimestamp="2025-12-03 23:41:25 +0000 UTC" firstStartedPulling="2025-12-03 23:41:27.861359051 +0000 UTC m=+5453.081131058" lastFinishedPulling="2025-12-03 23:41:30.359890226 +0000 UTC m=+5455.579662253" observedRunningTime="2025-12-03 23:41:30.952275595 +0000 UTC m=+5456.172047602" watchObservedRunningTime="2025-12-03 23:41:30.971203267 +0000 UTC m=+5456.190975274" Dec 03 23:41:32.954175 master-0 kubenswrapper[36504]: I1203 23:41:32.954105 36504 generic.go:334] "Generic (PLEG): container finished" podID="e8222600-745f-4d3f-87a9-9a8607c75bbf" containerID="c7630d8bb8b76d2bd906fc5c17d79414c4e1c50143de9d066111121f7484d0f7" exitCode=1 Dec 03 23:41:32.954175 master-0 kubenswrapper[36504]: I1203 23:41:32.954175 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-s01-scenario-tests" event={"ID":"e8222600-745f-4d3f-87a9-9a8607c75bbf","Type":"ContainerDied","Data":"c7630d8bb8b76d2bd906fc5c17d79414c4e1c50143de9d066111121f7484d0f7"} Dec 03 23:41:34.508438 master-0 kubenswrapper[36504]: I1203 23:41:34.508342 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 23:41:34.656646 master-0 kubenswrapper[36504]: I1203 23:41:34.656480 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-workdir\") pod \"e8222600-745f-4d3f-87a9-9a8607c75bbf\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " Dec 03 23:41:34.656646 master-0 kubenswrapper[36504]: I1203 23:41:34.656616 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-ca-certs\") pod \"e8222600-745f-4d3f-87a9-9a8607c75bbf\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " Dec 03 23:41:34.657059 master-0 kubenswrapper[36504]: I1203 23:41:34.657027 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config-secret\") pod \"e8222600-745f-4d3f-87a9-9a8607c75bbf\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " Dec 03 23:41:34.657202 master-0 kubenswrapper[36504]: I1203 23:41:34.657174 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-temporary\") pod \"e8222600-745f-4d3f-87a9-9a8607c75bbf\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " Dec 03 23:41:34.657631 master-0 kubenswrapper[36504]: I1203 23:41:34.657434 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") pod \"e8222600-745f-4d3f-87a9-9a8607c75bbf\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " Dec 03 23:41:34.657631 master-0 kubenswrapper[36504]: I1203 23:41:34.657490 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-config-data\") pod \"e8222600-745f-4d3f-87a9-9a8607c75bbf\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " Dec 03 23:41:34.657631 master-0 kubenswrapper[36504]: I1203 23:41:34.657607 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config\") pod \"e8222600-745f-4d3f-87a9-9a8607c75bbf\" (UID: \"e8222600-745f-4d3f-87a9-9a8607c75bbf\") " Dec 03 23:41:34.658920 master-0 kubenswrapper[36504]: I1203 23:41:34.658869 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-config-data" (OuterVolumeSpecName: "config-data") pod "e8222600-745f-4d3f-87a9-9a8607c75bbf" (UID: "e8222600-745f-4d3f-87a9-9a8607c75bbf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:34.659613 master-0 kubenswrapper[36504]: I1203 23:41:34.659575 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e8222600-745f-4d3f-87a9-9a8607c75bbf" (UID: "e8222600-745f-4d3f-87a9-9a8607c75bbf"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:34.660649 master-0 kubenswrapper[36504]: I1203 23:41:34.660616 36504 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-temporary\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:34.660649 master-0 kubenswrapper[36504]: I1203 23:41:34.660643 36504 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-config-data\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:34.664906 master-0 kubenswrapper[36504]: I1203 23:41:34.664824 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e8222600-745f-4d3f-87a9-9a8607c75bbf" (UID: "e8222600-745f-4d3f-87a9-9a8607c75bbf"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:34.694315 master-0 kubenswrapper[36504]: I1203 23:41:34.694241 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e8222600-745f-4d3f-87a9-9a8607c75bbf" (UID: "e8222600-745f-4d3f-87a9-9a8607c75bbf"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:34.706757 master-0 kubenswrapper[36504]: I1203 23:41:34.706652 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e8222600-745f-4d3f-87a9-9a8607c75bbf" (UID: "e8222600-745f-4d3f-87a9-9a8607c75bbf"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:34.715298 master-0 kubenswrapper[36504]: I1203 23:41:34.715225 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43" (OuterVolumeSpecName: "test-operator-logs") pod "e8222600-745f-4d3f-87a9-9a8607c75bbf" (UID: "e8222600-745f-4d3f-87a9-9a8607c75bbf"). InnerVolumeSpecName "pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 23:41:34.760344 master-0 kubenswrapper[36504]: I1203 23:41:34.755012 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e8222600-745f-4d3f-87a9-9a8607c75bbf" (UID: "e8222600-745f-4d3f-87a9-9a8607c75bbf"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:34.770233 master-0 kubenswrapper[36504]: I1203 23:41:34.770140 36504 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config-secret\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:34.770233 master-0 kubenswrapper[36504]: I1203 23:41:34.770259 36504 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") on node \"master-0\" " Dec 03 23:41:34.770233 master-0 kubenswrapper[36504]: I1203 23:41:34.770279 36504 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8222600-745f-4d3f-87a9-9a8607c75bbf-openstack-config\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:34.770233 master-0 kubenswrapper[36504]: I1203 23:41:34.770292 36504 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e8222600-745f-4d3f-87a9-9a8607c75bbf-test-operator-ephemeral-workdir\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:34.770233 master-0 kubenswrapper[36504]: I1203 23:41:34.770307 36504 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e8222600-745f-4d3f-87a9-9a8607c75bbf-ca-certs\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:34.810376 master-0 kubenswrapper[36504]: I1203 23:41:34.810167 36504 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 23:41:34.810930 master-0 kubenswrapper[36504]: I1203 23:41:34.810912 36504 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104" (UniqueName: "kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43") on node "master-0" Dec 03 23:41:34.873965 master-0 kubenswrapper[36504]: I1203 23:41:34.873898 36504 reconciler_common.go:293] "Volume detached for volume \"pvc-b6aa9b1f-d5c4-4eb3-b294-3bcee2bb8104\" (UniqueName: \"kubernetes.io/csi/topolvm.io^d446a673-7d0b-4209-b74d-593574e40d43\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:34.983180 master-0 kubenswrapper[36504]: I1203 23:41:34.983016 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-s01-scenario-tests" event={"ID":"e8222600-745f-4d3f-87a9-9a8607c75bbf","Type":"ContainerDied","Data":"37f9b36a7be5412cd584afc5ea391f52bdda5743188eb4148f6f1a8e3f6d273b"} Dec 03 23:41:34.983564 master-0 kubenswrapper[36504]: I1203 23:41:34.983538 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37f9b36a7be5412cd584afc5ea391f52bdda5743188eb4148f6f1a8e3f6d273b" Dec 03 23:41:34.983797 master-0 kubenswrapper[36504]: I1203 23:41:34.983103 36504 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-s01-scenario-tests" Dec 03 23:41:35.721964 master-0 kubenswrapper[36504]: E1203 23:41:35.721907 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:41:36.237709 master-0 kubenswrapper[36504]: I1203 23:41:36.237628 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:36.237709 master-0 kubenswrapper[36504]: I1203 23:41:36.237718 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:36.298894 master-0 kubenswrapper[36504]: I1203 23:41:36.298755 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:37.056802 master-0 kubenswrapper[36504]: I1203 23:41:37.056712 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:37.096265 master-0 kubenswrapper[36504]: I1203 23:41:37.096101 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-monitoring/monitoring-plugin-688c48b94d-mj6wx" secret="" err="secret \"default-dockercfg-dg9ph\" not found" Dec 03 23:41:37.121592 master-0 kubenswrapper[36504]: I1203 23:41:37.121514 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z89xv"] Dec 03 23:41:39.041300 master-0 kubenswrapper[36504]: I1203 23:41:39.041196 36504 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z89xv" podUID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerName="registry-server" containerID="cri-o://ae4e3b8bb1a91414400c1f23706c4c93aae6798b42a14f12af1c876376f527e5" gracePeriod=2 Dec 03 23:41:40.109158 master-0 kubenswrapper[36504]: I1203 23:41:40.108857 36504 generic.go:334] "Generic (PLEG): container finished" podID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerID="ae4e3b8bb1a91414400c1f23706c4c93aae6798b42a14f12af1c876376f527e5" exitCode=0 Dec 03 23:41:40.109158 master-0 kubenswrapper[36504]: I1203 23:41:40.108920 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z89xv" event={"ID":"ab6b7cae-a521-40a2-8bbd-da138b86cd48","Type":"ContainerDied","Data":"ae4e3b8bb1a91414400c1f23706c4c93aae6798b42a14f12af1c876376f527e5"} Dec 03 23:41:40.109158 master-0 kubenswrapper[36504]: I1203 23:41:40.108958 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z89xv" event={"ID":"ab6b7cae-a521-40a2-8bbd-da138b86cd48","Type":"ContainerDied","Data":"f1a76285dab6b13550ac0613d7f6d1ce9f44fdadb58e45257bf424ca9a43edc1"} Dec 03 23:41:40.109158 master-0 kubenswrapper[36504]: I1203 23:41:40.108972 36504 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1a76285dab6b13550ac0613d7f6d1ce9f44fdadb58e45257bf424ca9a43edc1" Dec 03 23:41:40.156002 master-0 kubenswrapper[36504]: I1203 23:41:40.155298 36504 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:40.206567 master-0 kubenswrapper[36504]: I1203 23:41:40.206474 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-catalog-content\") pod \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " Dec 03 23:41:40.206843 master-0 kubenswrapper[36504]: I1203 23:41:40.206575 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l96g7\" (UniqueName: \"kubernetes.io/projected/ab6b7cae-a521-40a2-8bbd-da138b86cd48-kube-api-access-l96g7\") pod \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " Dec 03 23:41:40.207217 master-0 kubenswrapper[36504]: I1203 23:41:40.207134 36504 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-utilities\") pod \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\" (UID: \"ab6b7cae-a521-40a2-8bbd-da138b86cd48\") " Dec 03 23:41:40.210396 master-0 kubenswrapper[36504]: I1203 23:41:40.210352 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-utilities" (OuterVolumeSpecName: "utilities") pod "ab6b7cae-a521-40a2-8bbd-da138b86cd48" (UID: "ab6b7cae-a521-40a2-8bbd-da138b86cd48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:40.211000 master-0 kubenswrapper[36504]: I1203 23:41:40.210947 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab6b7cae-a521-40a2-8bbd-da138b86cd48-kube-api-access-l96g7" (OuterVolumeSpecName: "kube-api-access-l96g7") pod "ab6b7cae-a521-40a2-8bbd-da138b86cd48" (UID: "ab6b7cae-a521-40a2-8bbd-da138b86cd48"). InnerVolumeSpecName "kube-api-access-l96g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:40.266524 master-0 kubenswrapper[36504]: I1203 23:41:40.266433 36504 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab6b7cae-a521-40a2-8bbd-da138b86cd48" (UID: "ab6b7cae-a521-40a2-8bbd-da138b86cd48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:40.310375 master-0 kubenswrapper[36504]: I1203 23:41:40.310316 36504 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-catalog-content\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:40.310375 master-0 kubenswrapper[36504]: I1203 23:41:40.310370 36504 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l96g7\" (UniqueName: \"kubernetes.io/projected/ab6b7cae-a521-40a2-8bbd-da138b86cd48-kube-api-access-l96g7\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:40.310375 master-0 kubenswrapper[36504]: I1203 23:41:40.310382 36504 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab6b7cae-a521-40a2-8bbd-da138b86cd48-utilities\") on node \"master-0\" DevicePath \"\"" Dec 03 23:41:41.124293 master-0 kubenswrapper[36504]: I1203 23:41:41.124233 36504 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z89xv" Dec 03 23:41:41.184482 master-0 kubenswrapper[36504]: I1203 23:41:41.184402 36504 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z89xv"] Dec 03 23:41:41.201779 master-0 kubenswrapper[36504]: I1203 23:41:41.201678 36504 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z89xv"] Dec 03 23:41:43.129845 master-0 kubenswrapper[36504]: I1203 23:41:43.129735 36504 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" path="/var/lib/kubelet/pods/ab6b7cae-a521-40a2-8bbd-da138b86cd48/volumes" Dec 03 23:41:48.336635 master-0 kubenswrapper[36504]: I1203 23:41:48.335152 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2bns8/must-gather-6hdjj"] Dec 03 23:41:48.336635 master-0 kubenswrapper[36504]: E1203 23:41:48.335950 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8222600-745f-4d3f-87a9-9a8607c75bbf" containerName="tempest-tests-tests-runner" Dec 03 23:41:48.336635 master-0 kubenswrapper[36504]: I1203 23:41:48.335968 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8222600-745f-4d3f-87a9-9a8607c75bbf" containerName="tempest-tests-tests-runner" Dec 03 23:41:48.336635 master-0 kubenswrapper[36504]: E1203 23:41:48.336000 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerName="extract-content" Dec 03 23:41:48.336635 master-0 kubenswrapper[36504]: I1203 23:41:48.336008 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerName="extract-content" Dec 03 23:41:48.336635 master-0 kubenswrapper[36504]: E1203 23:41:48.336086 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerName="extract-utilities" Dec 03 23:41:48.336635 master-0 kubenswrapper[36504]: I1203 23:41:48.336096 36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerName="extract-utilities" Dec 03 23:41:48.336635 master-0 kubenswrapper[36504]: E1203 23:41:48.336119 36504 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerName="registry-server" Dec 03 23:41:48.336635 master-0 kubenswrapper[36504]: I1203 23:41:48.336124 
36504 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerName="registry-server" Dec 03 23:41:48.337978 master-0 kubenswrapper[36504]: I1203 23:41:48.337940 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab6b7cae-a521-40a2-8bbd-da138b86cd48" containerName="registry-server" Dec 03 23:41:48.338020 master-0 kubenswrapper[36504]: I1203 23:41:48.337984 36504 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8222600-745f-4d3f-87a9-9a8607c75bbf" containerName="tempest-tests-tests-runner" Dec 03 23:41:48.340803 master-0 kubenswrapper[36504]: I1203 23:41:48.340726 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2bns8/must-gather-6hdjj" Dec 03 23:41:48.347160 master-0 kubenswrapper[36504]: I1203 23:41:48.345505 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2bns8"/"kube-root-ca.crt" Dec 03 23:41:48.347916 master-0 kubenswrapper[36504]: I1203 23:41:48.345834 36504 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2bns8"/"openshift-service-ca.crt" Dec 03 23:41:48.375131 master-0 kubenswrapper[36504]: I1203 23:41:48.375060 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2bns8/must-gather-qn29q"] Dec 03 23:41:48.378754 master-0 kubenswrapper[36504]: I1203 23:41:48.378708 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2bns8/must-gather-qn29q" Dec 03 23:41:48.409369 master-0 kubenswrapper[36504]: I1203 23:41:48.409291 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2bns8/must-gather-6hdjj"] Dec 03 23:41:48.413858 master-0 kubenswrapper[36504]: I1203 23:41:48.413692 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd-must-gather-output\") pod \"must-gather-6hdjj\" (UID: \"8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd\") " pod="openshift-must-gather-2bns8/must-gather-6hdjj" Dec 03 23:41:48.416579 master-0 kubenswrapper[36504]: I1203 23:41:48.414162 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a631f930-56f5-4f20-aed0-fe6a80a2a5cf-must-gather-output\") pod \"must-gather-qn29q\" (UID: \"a631f930-56f5-4f20-aed0-fe6a80a2a5cf\") " pod="openshift-must-gather-2bns8/must-gather-qn29q" Dec 03 23:41:48.416579 master-0 kubenswrapper[36504]: I1203 23:41:48.414890 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fzkq\" (UniqueName: \"kubernetes.io/projected/a631f930-56f5-4f20-aed0-fe6a80a2a5cf-kube-api-access-7fzkq\") pod \"must-gather-qn29q\" (UID: \"a631f930-56f5-4f20-aed0-fe6a80a2a5cf\") " pod="openshift-must-gather-2bns8/must-gather-qn29q" Dec 03 23:41:48.416579 master-0 kubenswrapper[36504]: I1203 23:41:48.415052 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m8r8\" (UniqueName: \"kubernetes.io/projected/8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd-kube-api-access-8m8r8\") pod \"must-gather-6hdjj\" (UID: \"8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd\") " pod="openshift-must-gather-2bns8/must-gather-6hdjj" Dec 03 23:41:48.439956 master-0 kubenswrapper[36504]: I1203 23:41:48.439866 36504 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2bns8/must-gather-qn29q"] Dec 03 23:41:48.519280 master-0 kubenswrapper[36504]: I1203 23:41:48.519212 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd-must-gather-output\") pod \"must-gather-6hdjj\" (UID: \"8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd\") " pod="openshift-must-gather-2bns8/must-gather-6hdjj" Dec 03 23:41:48.520110 master-0 kubenswrapper[36504]: I1203 23:41:48.519325 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a631f930-56f5-4f20-aed0-fe6a80a2a5cf-must-gather-output\") pod \"must-gather-qn29q\" (UID: \"a631f930-56f5-4f20-aed0-fe6a80a2a5cf\") " pod="openshift-must-gather-2bns8/must-gather-qn29q" Dec 03 23:41:48.520110 master-0 kubenswrapper[36504]: I1203 23:41:48.519463 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fzkq\" (UniqueName: \"kubernetes.io/projected/a631f930-56f5-4f20-aed0-fe6a80a2a5cf-kube-api-access-7fzkq\") pod \"must-gather-qn29q\" (UID: \"a631f930-56f5-4f20-aed0-fe6a80a2a5cf\") " pod="openshift-must-gather-2bns8/must-gather-qn29q" Dec 03 23:41:48.520110 master-0 kubenswrapper[36504]: I1203 23:41:48.519518 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m8r8\" (UniqueName: \"kubernetes.io/projected/8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd-kube-api-access-8m8r8\") pod \"must-gather-6hdjj\" (UID: \"8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd\") " pod="openshift-must-gather-2bns8/must-gather-6hdjj" Dec 03 23:41:48.520110 master-0 kubenswrapper[36504]: I1203 23:41:48.520046 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a631f930-56f5-4f20-aed0-fe6a80a2a5cf-must-gather-output\") pod \"must-gather-qn29q\" (UID: \"a631f930-56f5-4f20-aed0-fe6a80a2a5cf\") " pod="openshift-must-gather-2bns8/must-gather-qn29q" Dec 03 23:41:48.520110 master-0 kubenswrapper[36504]: I1203 23:41:48.520092 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd-must-gather-output\") pod \"must-gather-6hdjj\" (UID: \"8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd\") " pod="openshift-must-gather-2bns8/must-gather-6hdjj" Dec 03 23:41:48.537977 master-0 kubenswrapper[36504]: I1203 23:41:48.537829 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fzkq\" (UniqueName: \"kubernetes.io/projected/a631f930-56f5-4f20-aed0-fe6a80a2a5cf-kube-api-access-7fzkq\") pod \"must-gather-qn29q\" (UID: \"a631f930-56f5-4f20-aed0-fe6a80a2a5cf\") " pod="openshift-must-gather-2bns8/must-gather-qn29q" Dec 03 23:41:48.542815 master-0 kubenswrapper[36504]: I1203 23:41:48.542728 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m8r8\" (UniqueName: \"kubernetes.io/projected/8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd-kube-api-access-8m8r8\") pod \"must-gather-6hdjj\" (UID: \"8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd\") " pod="openshift-must-gather-2bns8/must-gather-6hdjj" Dec 03 23:41:48.670450 master-0 kubenswrapper[36504]: I1203 23:41:48.670361 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2bns8/must-gather-6hdjj" Dec 03 23:41:48.707276 master-0 kubenswrapper[36504]: I1203 23:41:48.707166 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2bns8/must-gather-qn29q" Dec 03 23:41:49.159582 master-0 kubenswrapper[36504]: I1203 23:41:49.159519 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2bns8/must-gather-6hdjj"] Dec 03 23:41:49.334374 master-0 kubenswrapper[36504]: I1203 23:41:49.334167 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bns8/must-gather-6hdjj" event={"ID":"8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd","Type":"ContainerStarted","Data":"ee15d12e17d5b3963b0a138cc473b8ca2f935a2e3708022fa07ef834a743ba33"} Dec 03 23:41:49.369889 master-0 kubenswrapper[36504]: I1203 23:41:49.369818 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2bns8/must-gather-qn29q"] Dec 03 23:41:49.389721 master-0 kubenswrapper[36504]: W1203 23:41:49.389613 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda631f930_56f5_4f20_aed0_fe6a80a2a5cf.slice/crio-dca9dbd3f62a057545ebf4b0ef9567c9ed8c86542a49f7124a4765636a2959fe WatchSource:0}: Error finding container dca9dbd3f62a057545ebf4b0ef9567c9ed8c86542a49f7124a4765636a2959fe: Status 404 returned error can't find the container with id dca9dbd3f62a057545ebf4b0ef9567c9ed8c86542a49f7124a4765636a2959fe Dec 03 23:41:50.352626 master-0 kubenswrapper[36504]: I1203 23:41:50.352542 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bns8/must-gather-qn29q" event={"ID":"a631f930-56f5-4f20-aed0-fe6a80a2a5cf","Type":"ContainerStarted","Data":"dca9dbd3f62a057545ebf4b0ef9567c9ed8c86542a49f7124a4765636a2959fe"} Dec 03 23:41:51.376950 master-0 kubenswrapper[36504]: I1203 23:41:51.376878 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bns8/must-gather-qn29q" event={"ID":"a631f930-56f5-4f20-aed0-fe6a80a2a5cf","Type":"ContainerStarted","Data":"98efc419ff45bcf42dca20c922039f4fd73bc4b8d0b0f18190480563805a63b4"} Dec 03 23:41:51.377606 master-0 kubenswrapper[36504]: I1203 23:41:51.376966 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bns8/must-gather-qn29q" event={"ID":"a631f930-56f5-4f20-aed0-fe6a80a2a5cf","Type":"ContainerStarted","Data":"96bd6f208c7b89e9e835c63d60c7a15cf1532abefa39d0b8cf6227d035914417"} Dec 03 23:41:51.409683 master-0 kubenswrapper[36504]: I1203 23:41:51.409564 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2bns8/must-gather-qn29q" podStartSLOduration=2.423832991 podStartE2EDuration="3.409530569s" podCreationTimestamp="2025-12-03 23:41:48 +0000 UTC" firstStartedPulling="2025-12-03 23:41:49.392954358 +0000 UTC m=+5474.612726365" lastFinishedPulling="2025-12-03 23:41:50.378651936 +0000 UTC m=+5475.598423943" observedRunningTime="2025-12-03 23:41:51.398600127 +0000 UTC m=+5476.618372134" watchObservedRunningTime="2025-12-03 23:41:51.409530569 +0000 UTC m=+5476.629302576" Dec 03 23:41:52.631076 master-0 kubenswrapper[36504]: I1203 23:41:52.631022 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7c49fbfc6f-xg98g_0e9427b8-d62c-45f7-97d0-1f7667ff27aa/cluster-version-operator/0.log" Dec 03 23:41:54.195482 master-0 kubenswrapper[36504]: I1203 23:41:54.195435 
36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-7c49fbfc6f-xg98g_0e9427b8-d62c-45f7-97d0-1f7667ff27aa/cluster-version-operator/1.log" Dec 03 23:41:54.498534 master-0 kubenswrapper[36504]: I1203 23:41:54.497449 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bns8/must-gather-6hdjj" event={"ID":"8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd","Type":"ContainerStarted","Data":"3c9628f151ec0c077ddfa4950321d9a5279a00e71c830cb0b0ba770d00ccbd64"} Dec 03 23:41:55.512471 master-0 kubenswrapper[36504]: I1203 23:41:55.512415 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bns8/must-gather-6hdjj" event={"ID":"8de63d2c-3fb5-42a2-b24d-d046c6c6f6fd","Type":"ContainerStarted","Data":"40a41030bb4a654c1e4b3c91aa6507d3026521397e9ed42af254c750829051e2"} Dec 03 23:41:55.791865 master-0 kubenswrapper[36504]: I1203 23:41:55.789490 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2bns8/must-gather-6hdjj" podStartSLOduration=3.038182666 podStartE2EDuration="7.789462543s" podCreationTimestamp="2025-12-03 23:41:48 +0000 UTC" firstStartedPulling="2025-12-03 23:41:49.171698933 +0000 UTC m=+5474.391470940" lastFinishedPulling="2025-12-03 23:41:53.92297881 +0000 UTC m=+5479.142750817" observedRunningTime="2025-12-03 23:41:55.786423608 +0000 UTC m=+5481.006195615" watchObservedRunningTime="2025-12-03 23:41:55.789462543 +0000 UTC m=+5481.009234540" Dec 03 23:41:56.265716 master-0 kubenswrapper[36504]: E1203 23:41:56.265648 36504 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:45370->192.168.32.10:39335: write tcp 192.168.32.10:45370->192.168.32.10:39335: write: broken pipe Dec 03 23:41:56.900791 master-0 kubenswrapper[36504]: I1203 23:41:56.898629 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-qjfzg_ae3ea3bf-06f5-4649-b689-bc1d92f6d0a3/nmstate-console-plugin/0.log" Dec 03 23:41:56.986790 master-0 kubenswrapper[36504]: I1203 23:41:56.985831 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-q66gg_f9d2fce1-d9f0-4d7b-b18a-e2dd0b941003/nmstate-handler/0.log" Dec 03 23:41:57.013561 master-0 kubenswrapper[36504]: I1203 23:41:57.011353 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mct9g_eed33862-ae60-48df-b4ef-8406a4928db2/nmstate-metrics/0.log" Dec 03 23:41:57.029809 master-0 kubenswrapper[36504]: I1203 23:41:57.026373 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-mct9g_eed33862-ae60-48df-b4ef-8406a4928db2/kube-rbac-proxy/0.log" Dec 03 23:41:57.059444 master-0 kubenswrapper[36504]: I1203 23:41:57.059186 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-db99x_16ea3a17-93f8-4f62-9776-301fd54be174/nmstate-operator/0.log" Dec 03 23:41:57.082563 master-0 kubenswrapper[36504]: I1203 23:41:57.082321 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-ph6w5_451399c6-0304-4409-a5a7-26a242268011/nmstate-webhook/0.log" Dec 03 23:41:57.388791 master-0 kubenswrapper[36504]: I1203 23:41:57.382305 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-7lk9d_4bf325e2-1bc1-4f4d-bccc-99757f834915/controller/0.log" Dec 03 
23:41:57.397781 master-0 kubenswrapper[36504]: I1203 23:41:57.395708 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-7lk9d_4bf325e2-1bc1-4f4d-bccc-99757f834915/kube-rbac-proxy/0.log" Dec 03 23:41:57.459796 master-0 kubenswrapper[36504]: I1203 23:41:57.459053 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/controller/0.log" Dec 03 23:41:59.730270 master-0 kubenswrapper[36504]: I1203 23:41:59.730196 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/frr/0.log" Dec 03 23:41:59.749117 master-0 kubenswrapper[36504]: I1203 23:41:59.743345 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/reloader/0.log" Dec 03 23:41:59.765802 master-0 kubenswrapper[36504]: I1203 23:41:59.765301 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/frr-metrics/0.log" Dec 03 23:41:59.805025 master-0 kubenswrapper[36504]: I1203 23:41:59.804985 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/kube-rbac-proxy/0.log" Dec 03 23:41:59.816685 master-0 kubenswrapper[36504]: I1203 23:41:59.816639 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/kube-rbac-proxy-frr/0.log" Dec 03 23:41:59.830961 master-0 kubenswrapper[36504]: I1203 23:41:59.830913 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/cp-frr-files/0.log" Dec 03 23:41:59.857502 master-0 kubenswrapper[36504]: I1203 23:41:59.855678 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/cp-reloader/0.log" Dec 03 23:41:59.873459 master-0 kubenswrapper[36504]: I1203 23:41:59.873318 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/cp-metrics/0.log" Dec 03 23:41:59.899794 master-0 kubenswrapper[36504]: I1203 23:41:59.898435 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-jh949_bb20fc19-0922-413a-b811-36351794d540/frr-k8s-webhook-server/0.log" Dec 03 23:41:59.975348 master-0 kubenswrapper[36504]: I1203 23:41:59.975291 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78597876b7-mbz6f_d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d/manager/0.log" Dec 03 23:41:59.994969 master-0 kubenswrapper[36504]: I1203 23:41:59.994844 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5664d6fd-phlzw_8676b196-ec39-41e6-938c-ad3f66a9b83e/webhook-server/0.log" Dec 03 23:42:00.597604 master-0 kubenswrapper[36504]: I1203 23:42:00.597424 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fnjz4_24153b70-6280-405c-b1d0-8bdd936bd646/speaker/0.log" Dec 03 23:42:00.609658 master-0 kubenswrapper[36504]: I1203 23:42:00.609605 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fnjz4_24153b70-6280-405c-b1d0-8bdd936bd646/kube-rbac-proxy/0.log" Dec 03 23:42:01.963049 master-0 
kubenswrapper[36504]: I1203 23:42:01.962992 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcdctl/0.log" Dec 03 23:42:01.998491 master-0 kubenswrapper[36504]: I1203 23:42:01.998424 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2bns8/master-0-debug-7ql7s"] Dec 03 23:42:02.001723 master-0 kubenswrapper[36504]: I1203 23:42:02.001687 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" Dec 03 23:42:02.158799 master-0 kubenswrapper[36504]: I1203 23:42:02.158596 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prmqm\" (UniqueName: \"kubernetes.io/projected/10eb5f18-31af-4b8c-b595-f6e776c4646e-kube-api-access-prmqm\") pod \"master-0-debug-7ql7s\" (UID: \"10eb5f18-31af-4b8c-b595-f6e776c4646e\") " pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" Dec 03 23:42:02.159117 master-0 kubenswrapper[36504]: I1203 23:42:02.158974 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10eb5f18-31af-4b8c-b595-f6e776c4646e-host\") pod \"master-0-debug-7ql7s\" (UID: \"10eb5f18-31af-4b8c-b595-f6e776c4646e\") " pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" Dec 03 23:42:02.265896 master-0 kubenswrapper[36504]: I1203 23:42:02.265179 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10eb5f18-31af-4b8c-b595-f6e776c4646e-host\") pod \"master-0-debug-7ql7s\" (UID: \"10eb5f18-31af-4b8c-b595-f6e776c4646e\") " pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" Dec 03 23:42:02.265896 master-0 kubenswrapper[36504]: I1203 23:42:02.265317 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prmqm\" (UniqueName: \"kubernetes.io/projected/10eb5f18-31af-4b8c-b595-f6e776c4646e-kube-api-access-prmqm\") pod \"master-0-debug-7ql7s\" (UID: \"10eb5f18-31af-4b8c-b595-f6e776c4646e\") " pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" Dec 03 23:42:02.267806 master-0 kubenswrapper[36504]: I1203 23:42:02.266419 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10eb5f18-31af-4b8c-b595-f6e776c4646e-host\") pod \"master-0-debug-7ql7s\" (UID: \"10eb5f18-31af-4b8c-b595-f6e776c4646e\") " pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" Dec 03 23:42:02.316798 master-0 kubenswrapper[36504]: I1203 23:42:02.314066 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prmqm\" (UniqueName: \"kubernetes.io/projected/10eb5f18-31af-4b8c-b595-f6e776c4646e-kube-api-access-prmqm\") pod \"master-0-debug-7ql7s\" (UID: \"10eb5f18-31af-4b8c-b595-f6e776c4646e\") " pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" Dec 03 23:42:02.347796 master-0 kubenswrapper[36504]: I1203 23:42:02.346810 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" Dec 03 23:42:02.457800 master-0 kubenswrapper[36504]: I1203 23:42:02.455747 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd/0.log" Dec 03 23:42:02.530915 master-0 kubenswrapper[36504]: I1203 23:42:02.528730 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-metrics/0.log" Dec 03 23:42:02.542474 master-0 kubenswrapper[36504]: I1203 23:42:02.542410 36504 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:42:02.564786 master-0 kubenswrapper[36504]: I1203 23:42:02.561686 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-readyz/0.log" Dec 03 23:42:02.612809 master-0 kubenswrapper[36504]: I1203 23:42:02.611762 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-rev/0.log" Dec 03 23:42:02.645797 master-0 kubenswrapper[36504]: I1203 23:42:02.641865 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/setup/0.log" Dec 03 23:42:02.673793 master-0 kubenswrapper[36504]: I1203 23:42:02.671631 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-ensure-env-vars/0.log" Dec 03 23:42:02.694801 master-0 kubenswrapper[36504]: I1203 23:42:02.693549 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-resources-copy/0.log" Dec 03 23:42:02.709798 master-0 kubenswrapper[36504]: I1203 23:42:02.708472 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" event={"ID":"10eb5f18-31af-4b8c-b595-f6e776c4646e","Type":"ContainerStarted","Data":"1fd4b287d223c50d0ebfa1857bc54c995d013503ba378a390301d9efc52e1b67"} Dec 03 23:42:02.769806 master-0 kubenswrapper[36504]: I1203 23:42:02.767241 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_3497f5dd-4c6f-4108-a948-481cef475ba9/installer/0.log" Dec 03 23:42:02.826913 master-0 kubenswrapper[36504]: I1203 23:42:02.821618 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_70e52a8c-7f9e-47fa-85ca-41f90dcb9747/installer/0.log" Dec 03 23:42:03.141705 master-0 kubenswrapper[36504]: I1203 23:42:03.137396 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-65b5c9bf6b-s2qsn_8ad757cc-9780-4081-89ba-415ce2751b01/oauth-openshift/0.log" Dec 03 23:42:04.162801 master-0 kubenswrapper[36504]: I1203 23:42:04.160895 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/assisted-installer_assisted-installer-controller-q7jjz_87c3edb2-12e8-45b0-99ac-9a794dd2881d/assisted-installer-controller/0.log" Dec 03 23:42:04.834850 master-0 kubenswrapper[36504]: I1203 23:42:04.831953 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-fqnsm_785612fc-3f78-4f1a-bc83-7afe5d3b8056/authentication-operator/3.log" Dec 03 23:42:04.872299 master-0 kubenswrapper[36504]: I1203 23:42:04.870960 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication-operator_authentication-operator-7479ffdf48-fqnsm_785612fc-3f78-4f1a-bc83-7afe5d3b8056/authentication-operator/4.log" Dec 03 23:42:05.064794 master-0 kubenswrapper[36504]: I1203 23:42:05.064424 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq"] Dec 03 23:42:05.069031 master-0 kubenswrapper[36504]: I1203 23:42:05.068916 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.092925 master-0 kubenswrapper[36504]: I1203 23:42:05.089247 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq"] Dec 03 23:42:05.193733 master-0 kubenswrapper[36504]: I1203 23:42:05.193636 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-proc\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.194482 master-0 kubenswrapper[36504]: I1203 23:42:05.193827 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-sys\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.194482 master-0 kubenswrapper[36504]: I1203 23:42:05.193867 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtk97\" (UniqueName: \"kubernetes.io/projected/429bed46-48f1-4785-8139-09fb796f03bc-kube-api-access-rtk97\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.194482 master-0 kubenswrapper[36504]: I1203 23:42:05.193961 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-podres\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.194482 master-0 kubenswrapper[36504]: I1203 23:42:05.194055 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-lib-modules\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.298372 master-0 kubenswrapper[36504]: I1203 23:42:05.298204 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-sys\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.298372 master-0 kubenswrapper[36504]: I1203 23:42:05.298291 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rtk97\" (UniqueName: \"kubernetes.io/projected/429bed46-48f1-4785-8139-09fb796f03bc-kube-api-access-rtk97\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.298816 master-0 kubenswrapper[36504]: I1203 23:42:05.298392 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-sys\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.298816 master-0 kubenswrapper[36504]: I1203 23:42:05.298495 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-podres\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.298816 master-0 kubenswrapper[36504]: I1203 23:42:05.298415 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-podres\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.299005 master-0 kubenswrapper[36504]: I1203 23:42:05.298948 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-lib-modules\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.300076 master-0 kubenswrapper[36504]: I1203 23:42:05.299570 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-proc\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.300076 master-0 kubenswrapper[36504]: I1203 23:42:05.299955 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-proc\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.300221 master-0 kubenswrapper[36504]: I1203 23:42:05.300109 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/429bed46-48f1-4785-8139-09fb796f03bc-lib-modules\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.321914 master-0 kubenswrapper[36504]: I1203 23:42:05.321845 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtk97\" (UniqueName: \"kubernetes.io/projected/429bed46-48f1-4785-8139-09fb796f03bc-kube-api-access-rtk97\") pod \"perf-node-gather-daemonset-qlmxq\" (UID: \"429bed46-48f1-4785-8139-09fb796f03bc\") " 
pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:05.482563 master-0 kubenswrapper[36504]: I1203 23:42:05.481551 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:06.046799 master-0 kubenswrapper[36504]: I1203 23:42:06.032496 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-54f97f57-xq6ch_698e6d87-1a58-493c-8b69-d22c89d26ac5/router/5.log" Dec 03 23:42:06.046799 master-0 kubenswrapper[36504]: I1203 23:42:06.035037 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-54f97f57-xq6ch_698e6d87-1a58-493c-8b69-d22c89d26ac5/router/4.log" Dec 03 23:42:06.121077 master-0 kubenswrapper[36504]: W1203 23:42:06.117877 36504 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod429bed46_48f1_4785_8139_09fb796f03bc.slice/crio-01eb392d34ab468d78bace0660a349f233b0dad95911c013224b2673a9429418 WatchSource:0}: Error finding container 01eb392d34ab468d78bace0660a349f233b0dad95911c013224b2673a9429418: Status 404 returned error can't find the container with id 01eb392d34ab468d78bace0660a349f233b0dad95911c013224b2673a9429418 Dec 03 23:42:06.123326 master-0 kubenswrapper[36504]: I1203 23:42:06.122449 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq"] Dec 03 23:42:06.811209 master-0 kubenswrapper[36504]: I1203 23:42:06.811129 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" event={"ID":"429bed46-48f1-4785-8139-09fb796f03bc","Type":"ContainerStarted","Data":"aede6a749d8cbacaa8619d155cf52e46b7795feae2d849d8f661e50061a791fe"} Dec 03 23:42:06.811209 master-0 kubenswrapper[36504]: I1203 23:42:06.811202 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" event={"ID":"429bed46-48f1-4785-8139-09fb796f03bc","Type":"ContainerStarted","Data":"01eb392d34ab468d78bace0660a349f233b0dad95911c013224b2673a9429418"} Dec 03 23:42:06.812049 master-0 kubenswrapper[36504]: I1203 23:42:06.811352 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:06.856946 master-0 kubenswrapper[36504]: I1203 23:42:06.856758 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" podStartSLOduration=1.856723445 podStartE2EDuration="1.856723445s" podCreationTimestamp="2025-12-03 23:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:06.831226897 +0000 UTC m=+5492.050998924" watchObservedRunningTime="2025-12-03 23:42:06.856723445 +0000 UTC m=+5492.076495452" Dec 03 23:42:07.029325 master-0 kubenswrapper[36504]: I1203 23:42:07.029184 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-67d47fb995-88vr2_246b7846-0dfd-43a8-bcfa-81e7435060dc/oauth-apiserver/0.log" Dec 03 23:42:07.050298 master-0 kubenswrapper[36504]: I1203 23:42:07.050219 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-67d47fb995-88vr2_246b7846-0dfd-43a8-bcfa-81e7435060dc/fix-audit-permissions/0.log" Dec 03 23:42:07.824959 master-0 
kubenswrapper[36504]: I1203 23:42:07.824645 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-7f88444875-kb5rx_858384f3-5741-4e67-8669-2eb2b2dcaf7f/kube-rbac-proxy/0.log" Dec 03 23:42:07.861293 master-0 kubenswrapper[36504]: I1203 23:42:07.861218 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-7f88444875-kb5rx_858384f3-5741-4e67-8669-2eb2b2dcaf7f/cluster-autoscaler-operator/0.log" Dec 03 23:42:07.891029 master-0 kubenswrapper[36504]: I1203 23:42:07.890961 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-7f88444875-kb5rx_858384f3-5741-4e67-8669-2eb2b2dcaf7f/cluster-autoscaler-operator/1.log" Dec 03 23:42:07.928467 master-0 kubenswrapper[36504]: I1203 23:42:07.923375 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/4.log" Dec 03 23:42:07.928467 master-0 kubenswrapper[36504]: I1203 23:42:07.923899 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/cluster-baremetal-operator/3.log" Dec 03 23:42:07.946850 master-0 kubenswrapper[36504]: I1203 23:42:07.944586 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5fdc576499-q9tf6_fa9b5917-d4f3-4372-a200-45b57412f92f/baremetal-kube-rbac-proxy/0.log" Dec 03 23:42:07.967209 master-0 kubenswrapper[36504]: I1203 23:42:07.967138 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-jlq49_ba624ed0-32cc-4c87-81a5-708a8a8a7f88/control-plane-machine-set-operator/1.log" Dec 03 23:42:07.973069 master-0 kubenswrapper[36504]: I1203 23:42:07.970759 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-66f4cc99d4-jlq49_ba624ed0-32cc-4c87-81a5-708a8a8a7f88/control-plane-machine-set-operator/2.log" Dec 03 23:42:07.998600 master-0 kubenswrapper[36504]: I1203 23:42:07.998488 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-7486ff55f-w9xk2_8b56c318-09b7-47f0-a7bf-32eb96e836ca/kube-rbac-proxy/0.log" Dec 03 23:42:08.025921 master-0 kubenswrapper[36504]: I1203 23:42:08.025800 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-7486ff55f-w9xk2_8b56c318-09b7-47f0-a7bf-32eb96e836ca/machine-api-operator/1.log" Dec 03 23:42:08.026528 master-0 kubenswrapper[36504]: I1203 23:42:08.026501 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-7486ff55f-w9xk2_8b56c318-09b7-47f0-a7bf-32eb96e836ca/machine-api-operator/0.log" Dec 03 23:42:09.528861 master-0 kubenswrapper[36504]: I1203 23:42:09.528750 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/cluster-cloud-controller-manager/0.log" Dec 03 23:42:09.533873 master-0 kubenswrapper[36504]: I1203 23:42:09.532622 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/cluster-cloud-controller-manager/1.log" Dec 03 23:42:09.564450 master-0 kubenswrapper[36504]: I1203 23:42:09.564395 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/config-sync-controllers/0.log" Dec 03 23:42:09.566435 master-0 kubenswrapper[36504]: I1203 23:42:09.566394 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/config-sync-controllers/1.log" Dec 03 23:42:09.589275 master-0 kubenswrapper[36504]: I1203 23:42:09.589198 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-6c74dddbfb-f2rzx_def52ba3-77c1-4e0c-8a0d-44ff4d677607/kube-rbac-proxy/0.log" Dec 03 23:42:11.844154 master-0 kubenswrapper[36504]: I1203 23:42:11.841315 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-7c4dc67499-jhd6n_3a7e0eea-3da8-43de-87bc-d10231e7c239/kube-rbac-proxy/0.log" Dec 03 23:42:11.885328 master-0 kubenswrapper[36504]: I1203 23:42:11.885237 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-7c4dc67499-jhd6n_3a7e0eea-3da8-43de-87bc-d10231e7c239/cloud-credential-operator/0.log" Dec 03 23:42:13.702832 master-0 kubenswrapper[36504]: I1203 23:42:13.702489 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/4.log" Dec 03 23:42:13.707099 master-0 kubenswrapper[36504]: I1203 23:42:13.706935 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-config-operator/5.log" Dec 03 23:42:13.721832 master-0 kubenswrapper[36504]: I1203 23:42:13.721342 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-68c95b6cf5-2cs5d_e50b85a6-7767-4fca-8133-8243bdd85e5d/openshift-api/0.log" Dec 03 23:42:14.748463 master-0 kubenswrapper[36504]: I1203 23:42:14.748399 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77df56447c-s242z_9d022890-8679-440b-bfb1-dbda6ee771f0/console-operator/0.log" Dec 03 23:42:15.526795 master-0 kubenswrapper[36504]: I1203 23:42:15.526546 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2bns8/perf-node-gather-daemonset-qlmxq" Dec 03 23:42:15.599741 master-0 kubenswrapper[36504]: I1203 23:42:15.599657 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6b7b7b5c-jwfc2_8cb7ccb1-c763-4f80-8d15-f26be3cf795b/console/0.log" Dec 03 23:42:15.681799 master-0 kubenswrapper[36504]: I1203 23:42:15.681713 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6f5db8559b-xzb5p_716a95ab-7d31-41ac-ad14-f756f17b3179/download-server/0.log" Dec 03 23:42:16.594663 master-0 kubenswrapper[36504]: I1203 
23:42:16.594426 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-hv5z8_6e96335e-1866-41c8-b128-b95e783a9be4/cluster-storage-operator/2.log" Dec 03 23:42:16.599895 master-0 kubenswrapper[36504]: I1203 23:42:16.598928 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-f84784664-hv5z8_6e96335e-1866-41c8-b128-b95e783a9be4/cluster-storage-operator/3.log" Dec 03 23:42:16.623489 master-0 kubenswrapper[36504]: I1203 23:42:16.623423 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/5.log" Dec 03 23:42:16.624206 master-0 kubenswrapper[36504]: I1203 23:42:16.624069 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-86897dd478-g4ldp_28c42112-a09e-4b7a-b23b-c06bef69cbfb/snapshot-controller/4.log" Dec 03 23:42:16.663751 master-0 kubenswrapper[36504]: I1203 23:42:16.663687 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-l9q2j_add88bf0-c88d-427d-94bb-897e088a1378/csi-snapshot-controller-operator/1.log" Dec 03 23:42:16.665541 master-0 kubenswrapper[36504]: I1203 23:42:16.665476 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-7b795784b8-l9q2j_add88bf0-c88d-427d-94bb-897e088a1378/csi-snapshot-controller-operator/0.log" Dec 03 23:42:17.560593 master-0 kubenswrapper[36504]: I1203 23:42:17.560519 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-6b7bcd6566-qcg9x_a9a3f403-a742-4977-901a-cf4a8eb7df5a/dns-operator/0.log" Dec 03 23:42:17.580738 master-0 kubenswrapper[36504]: I1203 23:42:17.580688 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-6b7bcd6566-qcg9x_a9a3f403-a742-4977-901a-cf4a8eb7df5a/kube-rbac-proxy/0.log" Dec 03 23:42:18.423589 master-0 kubenswrapper[36504]: I1203 23:42:18.422801 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9skcn_54767c36-ca29-4c91-9a8a-9699ecfa4afb/dns/0.log" Dec 03 23:42:18.449797 master-0 kubenswrapper[36504]: I1203 23:42:18.446953 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9skcn_54767c36-ca29-4c91-9a8a-9699ecfa4afb/kube-rbac-proxy/0.log" Dec 03 23:42:18.498306 master-0 kubenswrapper[36504]: I1203 23:42:18.498006 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4dx8h_66aa2598-f4b6-4d3a-9623-aeb707e4912b/dns-node-resolver/0.log" Dec 03 23:42:19.378397 master-0 kubenswrapper[36504]: I1203 23:42:19.378255 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm_82055cfc-b4ce-4a00-a51d-141059947693/etcd-operator/3.log" Dec 03 23:42:19.388929 master-0 kubenswrapper[36504]: I1203 23:42:19.388872 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-7978bf889c-w8hsm_82055cfc-b4ce-4a00-a51d-141059947693/etcd-operator/4.log" Dec 03 23:42:20.080482 master-0 kubenswrapper[36504]: I1203 23:42:20.075868 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" event={"ID":"10eb5f18-31af-4b8c-b595-f6e776c4646e","Type":"ContainerStarted","Data":"b9953e46533c0a19357ee080121f545b695c2b1a699e87e93eec1940414cfa86"} Dec 03 23:42:20.118473 master-0 kubenswrapper[36504]: I1203 23:42:20.117395 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2bns8/master-0-debug-7ql7s" podStartSLOduration=2.68112289 podStartE2EDuration="19.11735762s" podCreationTimestamp="2025-12-03 23:42:01 +0000 UTC" firstStartedPulling="2025-12-03 23:42:02.542269599 +0000 UTC m=+5487.762041606" lastFinishedPulling="2025-12-03 23:42:18.978504319 +0000 UTC m=+5504.198276336" observedRunningTime="2025-12-03 23:42:20.099314566 +0000 UTC m=+5505.319086573" watchObservedRunningTime="2025-12-03 23:42:20.11735762 +0000 UTC m=+5505.337129847" Dec 03 23:42:20.174461 master-0 kubenswrapper[36504]: I1203 23:42:20.174391 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcdctl/0.log" Dec 03 23:42:20.782920 master-0 kubenswrapper[36504]: I1203 23:42:20.782860 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd/0.log" Dec 03 23:42:20.800283 master-0 kubenswrapper[36504]: I1203 23:42:20.800203 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-metrics/0.log" Dec 03 23:42:20.822721 master-0 kubenswrapper[36504]: I1203 23:42:20.822646 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-readyz/0.log" Dec 03 23:42:20.862246 master-0 kubenswrapper[36504]: I1203 23:42:20.862187 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-rev/0.log" Dec 03 23:42:20.882681 master-0 kubenswrapper[36504]: I1203 23:42:20.882639 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/setup/0.log" Dec 03 23:42:20.897363 master-0 kubenswrapper[36504]: I1203 23:42:20.897304 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-ensure-env-vars/0.log" Dec 03 23:42:20.923463 master-0 kubenswrapper[36504]: I1203 23:42:20.923401 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_4dd8b778e190b1975a0a8fad534da6dd/etcd-resources-copy/0.log" Dec 03 23:42:20.994248 master-0 kubenswrapper[36504]: I1203 23:42:20.994207 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_3497f5dd-4c6f-4108-a948-481cef475ba9/installer/0.log" Dec 03 23:42:21.055381 master-0 kubenswrapper[36504]: I1203 23:42:21.055167 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_70e52a8c-7f9e-47fa-85ca-41f90dcb9747/installer/0.log" Dec 03 23:42:22.205095 master-0 kubenswrapper[36504]: I1203 23:42:22.205041 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-65dc4bcb88-2vvsj_39f0e973-7864-4842-af8e-47718ab1804c/cluster-image-registry-operator/0.log" Dec 03 23:42:22.219871 master-0 kubenswrapper[36504]: I1203 23:42:22.219805 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-65dc4bcb88-2vvsj_39f0e973-7864-4842-af8e-47718ab1804c/cluster-image-registry-operator/1.log" Dec 03 23:42:22.235845 master-0 kubenswrapper[36504]: I1203 23:42:22.235758 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jsp5g_688984b6-3531-488d-a769-908bb88fa342/node-ca/0.log" Dec 03 23:42:22.921383 master-0 kubenswrapper[36504]: I1203 23:42:22.921312 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/5.log" Dec 03 23:42:22.927194 master-0 kubenswrapper[36504]: I1203 23:42:22.927159 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/ingress-operator/6.log" Dec 03 23:42:22.942581 master-0 kubenswrapper[36504]: I1203 23:42:22.942535 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-85dbd94574-2hxlh_0869de9b-6f5b-4c31-81ad-02a9c8888193/kube-rbac-proxy/0.log" Dec 03 23:42:23.842005 master-0 kubenswrapper[36504]: I1203 23:42:23.841452 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qsfnw_7c8ec36d-9179-40ab-a448-440b4501b3e0/serve-healthcheck-canary/0.log" Dec 03 23:42:24.932206 master-0 kubenswrapper[36504]: I1203 23:42:24.932131 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-59d99f9b7b-x4tfh_578b2d03-b8b3-4c75-adde-73899c472ad7/insights-operator/1.log" Dec 03 23:42:24.958303 master-0 kubenswrapper[36504]: I1203 23:42:24.958233 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-59d99f9b7b-x4tfh_578b2d03-b8b3-4c75-adde-73899c472ad7/insights-operator/2.log" Dec 03 23:42:25.687463 master-0 kubenswrapper[36504]: I1203 23:42:25.687282 36504 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qn7z7"] Dec 03 23:42:25.691016 master-0 kubenswrapper[36504]: I1203 23:42:25.690970 36504 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:25.709014 master-0 kubenswrapper[36504]: I1203 23:42:25.708923 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qn7z7"] Dec 03 23:42:25.810141 master-0 kubenswrapper[36504]: I1203 23:42:25.810070 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20725dc2-d8b9-4685-b0b5-d4cef442f4d5-catalog-content\") pod \"community-operators-qn7z7\" (UID: \"20725dc2-d8b9-4685-b0b5-d4cef442f4d5\") " pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:25.810584 master-0 kubenswrapper[36504]: I1203 23:42:25.810554 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtmc\" (UniqueName: \"kubernetes.io/projected/20725dc2-d8b9-4685-b0b5-d4cef442f4d5-kube-api-access-bhtmc\") pod \"community-operators-qn7z7\" (UID: \"20725dc2-d8b9-4685-b0b5-d4cef442f4d5\") " pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:25.810792 master-0 kubenswrapper[36504]: I1203 23:42:25.810752 36504 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20725dc2-d8b9-4685-b0b5-d4cef442f4d5-utilities\") pod \"community-operators-qn7z7\" (UID: \"20725dc2-d8b9-4685-b0b5-d4cef442f4d5\") " pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:25.921170 master-0 kubenswrapper[36504]: I1203 23:42:25.921098 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20725dc2-d8b9-4685-b0b5-d4cef442f4d5-catalog-content\") pod \"community-operators-qn7z7\" (UID: \"20725dc2-d8b9-4685-b0b5-d4cef442f4d5\") " pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:25.922159 master-0 kubenswrapper[36504]: I1203 23:42:25.922141 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhtmc\" (UniqueName: \"kubernetes.io/projected/20725dc2-d8b9-4685-b0b5-d4cef442f4d5-kube-api-access-bhtmc\") pod \"community-operators-qn7z7\" (UID: \"20725dc2-d8b9-4685-b0b5-d4cef442f4d5\") " pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:25.922291 master-0 kubenswrapper[36504]: I1203 23:42:25.922276 36504 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20725dc2-d8b9-4685-b0b5-d4cef442f4d5-utilities\") pod \"community-operators-qn7z7\" (UID: \"20725dc2-d8b9-4685-b0b5-d4cef442f4d5\") " pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:25.923031 master-0 kubenswrapper[36504]: I1203 23:42:25.923011 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20725dc2-d8b9-4685-b0b5-d4cef442f4d5-utilities\") pod \"community-operators-qn7z7\" (UID: \"20725dc2-d8b9-4685-b0b5-d4cef442f4d5\") " pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:25.923141 master-0 kubenswrapper[36504]: I1203 23:42:25.922075 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20725dc2-d8b9-4685-b0b5-d4cef442f4d5-catalog-content\") pod \"community-operators-qn7z7\" (UID: \"20725dc2-d8b9-4685-b0b5-d4cef442f4d5\") " 
pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:25.954102 master-0 kubenswrapper[36504]: I1203 23:42:25.953877 36504 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhtmc\" (UniqueName: \"kubernetes.io/projected/20725dc2-d8b9-4685-b0b5-d4cef442f4d5-kube-api-access-bhtmc\") pod \"community-operators-qn7z7\" (UID: \"20725dc2-d8b9-4685-b0b5-d4cef442f4d5\") " pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:26.033810 master-0 kubenswrapper[36504]: I1203 23:42:26.033012 36504 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:26.966041 master-0 kubenswrapper[36504]: I1203 23:42:26.965965 36504 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qn7z7"] Dec 03 23:42:27.242453 master-0 kubenswrapper[36504]: I1203 23:42:27.242273 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qn7z7" event={"ID":"20725dc2-d8b9-4685-b0b5-d4cef442f4d5","Type":"ContainerStarted","Data":"ef8e80a449098537d59463d27c838e512d78f3a322cf78d6bdefbfce02586b13"} Dec 03 23:42:27.414215 master-0 kubenswrapper[36504]: I1203 23:42:27.414104 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bdfca75d-cacc-4a25-83c9-a1fe7f2f140a/alertmanager/0.log" Dec 03 23:42:27.436172 master-0 kubenswrapper[36504]: I1203 23:42:27.436093 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bdfca75d-cacc-4a25-83c9-a1fe7f2f140a/config-reloader/0.log" Dec 03 23:42:27.454303 master-0 kubenswrapper[36504]: I1203 23:42:27.454230 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bdfca75d-cacc-4a25-83c9-a1fe7f2f140a/kube-rbac-proxy-web/0.log" Dec 03 23:42:27.477819 master-0 kubenswrapper[36504]: I1203 23:42:27.477749 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bdfca75d-cacc-4a25-83c9-a1fe7f2f140a/kube-rbac-proxy/0.log" Dec 03 23:42:27.502712 master-0 kubenswrapper[36504]: I1203 23:42:27.502533 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bdfca75d-cacc-4a25-83c9-a1fe7f2f140a/kube-rbac-proxy-metric/0.log" Dec 03 23:42:27.524146 master-0 kubenswrapper[36504]: I1203 23:42:27.523864 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bdfca75d-cacc-4a25-83c9-a1fe7f2f140a/prom-label-proxy/0.log" Dec 03 23:42:27.542536 master-0 kubenswrapper[36504]: I1203 23:42:27.542467 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_bdfca75d-cacc-4a25-83c9-a1fe7f2f140a/init-config-reloader/0.log" Dec 03 23:42:27.895762 master-0 kubenswrapper[36504]: I1203 23:42:27.895715 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-69cc794c58-vns7s_bebd69d2-5b0f-4b66-8722-d6861eba3e12/cluster-monitoring-operator/0.log" Dec 03 23:42:27.921733 master-0 kubenswrapper[36504]: I1203 23:42:27.921679 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7dcc7f9bd6-kldf9_2d592f19-c7b9-4b29-9ca2-848572067908/kube-state-metrics/0.log" Dec 03 23:42:28.201985 master-0 kubenswrapper[36504]: I1203 23:42:28.201932 36504 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7dcc7f9bd6-kldf9_2d592f19-c7b9-4b29-9ca2-848572067908/kube-rbac-proxy-main/0.log" Dec 03 23:42:28.219267 master-0 kubenswrapper[36504]: I1203 23:42:28.219202 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7dcc7f9bd6-kldf9_2d592f19-c7b9-4b29-9ca2-848572067908/kube-rbac-proxy-self/0.log" Dec 03 23:42:28.267938 master-0 kubenswrapper[36504]: I1203 23:42:28.267874 36504 generic.go:334] "Generic (PLEG): container finished" podID="20725dc2-d8b9-4685-b0b5-d4cef442f4d5" containerID="cc8c339f5563031f441ac1e4cd42550f55b2385354e8a82b20ace5c1d2761036" exitCode=0 Dec 03 23:42:28.268165 master-0 kubenswrapper[36504]: I1203 23:42:28.267947 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qn7z7" event={"ID":"20725dc2-d8b9-4685-b0b5-d4cef442f4d5","Type":"ContainerDied","Data":"cc8c339f5563031f441ac1e4cd42550f55b2385354e8a82b20ace5c1d2761036"} Dec 03 23:42:28.559985 master-0 kubenswrapper[36504]: I1203 23:42:28.559897 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-576975bfdb-gd855_1993ad4d-95fc-4af9-9ce3-e29112a1a3a4/metrics-server/0.log" Dec 03 23:42:28.693592 master-0 kubenswrapper[36504]: I1203 23:42:28.693504 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-688c48b94d-mj6wx_abe93580-7514-4a69-a8af-c11a0f3c8f8a/monitoring-plugin/0.log" Dec 03 23:42:28.781549 master-0 kubenswrapper[36504]: I1203 23:42:28.781384 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nkjnl_bcbec7ef-0b98-4346-8c6b-c5fa37e90286/node-exporter/0.log" Dec 03 23:42:28.861852 master-0 kubenswrapper[36504]: I1203 23:42:28.861784 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nkjnl_bcbec7ef-0b98-4346-8c6b-c5fa37e90286/kube-rbac-proxy/0.log" Dec 03 23:42:28.877683 master-0 kubenswrapper[36504]: I1203 23:42:28.877623 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nkjnl_bcbec7ef-0b98-4346-8c6b-c5fa37e90286/init-textfile/0.log" Dec 03 23:42:28.903090 master-0 kubenswrapper[36504]: I1203 23:42:28.903012 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-57cbc648f8-rhf8p_d0be52f3-b318-4630-b4da-f3c4a57d5818/kube-rbac-proxy-main/0.log" Dec 03 23:42:28.925296 master-0 kubenswrapper[36504]: I1203 23:42:28.925233 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-57cbc648f8-rhf8p_d0be52f3-b318-4630-b4da-f3c4a57d5818/kube-rbac-proxy-self/0.log" Dec 03 23:42:29.023133 master-0 kubenswrapper[36504]: I1203 23:42:29.023058 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-57cbc648f8-rhf8p_d0be52f3-b318-4630-b4da-f3c4a57d5818/openshift-state-metrics/0.log" Dec 03 23:42:29.069636 master-0 kubenswrapper[36504]: I1203 23:42:29.069556 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac5d9240-cb7d-4714-9891-dd602555b7c1/prometheus/0.log" Dec 03 23:42:29.088175 master-0 kubenswrapper[36504]: I1203 23:42:29.088092 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac5d9240-cb7d-4714-9891-dd602555b7c1/config-reloader/0.log" Dec 03 
23:42:29.112070 master-0 kubenswrapper[36504]: I1203 23:42:29.112007 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac5d9240-cb7d-4714-9891-dd602555b7c1/thanos-sidecar/0.log" Dec 03 23:42:29.165001 master-0 kubenswrapper[36504]: I1203 23:42:29.164903 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac5d9240-cb7d-4714-9891-dd602555b7c1/kube-rbac-proxy-web/0.log" Dec 03 23:42:29.188140 master-0 kubenswrapper[36504]: I1203 23:42:29.188100 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac5d9240-cb7d-4714-9891-dd602555b7c1/kube-rbac-proxy/0.log" Dec 03 23:42:29.212514 master-0 kubenswrapper[36504]: I1203 23:42:29.212442 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac5d9240-cb7d-4714-9891-dd602555b7c1/kube-rbac-proxy-thanos/0.log" Dec 03 23:42:29.235619 master-0 kubenswrapper[36504]: I1203 23:42:29.235554 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ac5d9240-cb7d-4714-9891-dd602555b7c1/init-config-reloader/0.log" Dec 03 23:42:29.278312 master-0 kubenswrapper[36504]: I1203 23:42:29.278271 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-565bdcb8-7s9vg_d6fafa97-812d-4588-95f8-7c4d85f53098/prometheus-operator/0.log" Dec 03 23:42:29.296909 master-0 kubenswrapper[36504]: I1203 23:42:29.296853 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-565bdcb8-7s9vg_d6fafa97-812d-4588-95f8-7c4d85f53098/kube-rbac-proxy/0.log" Dec 03 23:42:29.314921 master-0 kubenswrapper[36504]: I1203 23:42:29.314868 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-6d4cbfb4b-rqszb_922419d4-b528-472e-8215-4a55a96dab08/prometheus-operator-admission-webhook/0.log" Dec 03 23:42:29.347201 master-0 kubenswrapper[36504]: I1203 23:42:29.347125 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-55c7ddc498-h4b86_4af62c74-a1a5-497c-80d8-9cb916e8e762/telemeter-client/0.log" Dec 03 23:42:29.363544 master-0 kubenswrapper[36504]: I1203 23:42:29.363471 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-55c7ddc498-h4b86_4af62c74-a1a5-497c-80d8-9cb916e8e762/reload/0.log" Dec 03 23:42:29.391821 master-0 kubenswrapper[36504]: I1203 23:42:29.391748 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-55c7ddc498-h4b86_4af62c74-a1a5-497c-80d8-9cb916e8e762/kube-rbac-proxy/0.log" Dec 03 23:42:29.427128 master-0 kubenswrapper[36504]: I1203 23:42:29.427079 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f59c45d65-v6hb2_d0e75f01-426f-4c0d-8398-76bd3edc06cb/thanos-query/0.log" Dec 03 23:42:29.446301 master-0 kubenswrapper[36504]: I1203 23:42:29.446210 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f59c45d65-v6hb2_d0e75f01-426f-4c0d-8398-76bd3edc06cb/kube-rbac-proxy-web/0.log" Dec 03 23:42:29.469333 master-0 kubenswrapper[36504]: I1203 23:42:29.469247 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f59c45d65-v6hb2_d0e75f01-426f-4c0d-8398-76bd3edc06cb/kube-rbac-proxy/0.log" Dec 03 
23:42:29.486459 master-0 kubenswrapper[36504]: I1203 23:42:29.486388 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f59c45d65-v6hb2_d0e75f01-426f-4c0d-8398-76bd3edc06cb/prom-label-proxy/0.log" Dec 03 23:42:29.507123 master-0 kubenswrapper[36504]: I1203 23:42:29.507048 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f59c45d65-v6hb2_d0e75f01-426f-4c0d-8398-76bd3edc06cb/kube-rbac-proxy-rules/0.log" Dec 03 23:42:29.523920 master-0 kubenswrapper[36504]: I1203 23:42:29.523872 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f59c45d65-v6hb2_d0e75f01-426f-4c0d-8398-76bd3edc06cb/kube-rbac-proxy-metrics/0.log" Dec 03 23:42:30.357142 master-0 kubenswrapper[36504]: I1203 23:42:30.356965 36504 generic.go:334] "Generic (PLEG): container finished" podID="20725dc2-d8b9-4685-b0b5-d4cef442f4d5" containerID="15832dfd4dcc9988d6430418535ec2469ffbe34f3faa1f6271290a7dcb785361" exitCode=0 Dec 03 23:42:30.357142 master-0 kubenswrapper[36504]: I1203 23:42:30.357056 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qn7z7" event={"ID":"20725dc2-d8b9-4685-b0b5-d4cef442f4d5","Type":"ContainerDied","Data":"15832dfd4dcc9988d6430418535ec2469ffbe34f3faa1f6271290a7dcb785361"} Dec 03 23:42:32.255489 master-0 kubenswrapper[36504]: I1203 23:42:32.254791 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-7lk9d_4bf325e2-1bc1-4f4d-bccc-99757f834915/controller/0.log" Dec 03 23:42:32.272949 master-0 kubenswrapper[36504]: I1203 23:42:32.271147 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-7lk9d_4bf325e2-1bc1-4f4d-bccc-99757f834915/kube-rbac-proxy/0.log" Dec 03 23:42:32.301634 master-0 kubenswrapper[36504]: I1203 23:42:32.301577 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/controller/0.log" Dec 03 23:42:33.408623 master-0 kubenswrapper[36504]: I1203 23:42:33.408541 36504 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qn7z7" event={"ID":"20725dc2-d8b9-4685-b0b5-d4cef442f4d5","Type":"ContainerStarted","Data":"7f59726c93ecf2c9831d06d90219a54df7661e566bd4f977f9848555a2f0bbf1"} Dec 03 23:42:33.454528 master-0 kubenswrapper[36504]: I1203 23:42:33.454424 36504 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qn7z7" podStartSLOduration=4.782599564 podStartE2EDuration="8.454384776s" podCreationTimestamp="2025-12-03 23:42:25 +0000 UTC" firstStartedPulling="2025-12-03 23:42:28.27076462 +0000 UTC m=+5513.490536617" lastFinishedPulling="2025-12-03 23:42:31.942549822 +0000 UTC m=+5517.162321829" observedRunningTime="2025-12-03 23:42:33.437761876 +0000 UTC m=+5518.657533903" watchObservedRunningTime="2025-12-03 23:42:33.454384776 +0000 UTC m=+5518.674156783" Dec 03 23:42:34.420630 master-0 kubenswrapper[36504]: I1203 23:42:34.420536 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/frr/0.log" Dec 03 23:42:34.442645 master-0 kubenswrapper[36504]: I1203 23:42:34.442236 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/reloader/0.log" Dec 03 23:42:34.458332 master-0 
kubenswrapper[36504]: I1203 23:42:34.458282 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/frr-metrics/0.log" Dec 03 23:42:34.471406 master-0 kubenswrapper[36504]: I1203 23:42:34.471369 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/kube-rbac-proxy/0.log" Dec 03 23:42:34.493007 master-0 kubenswrapper[36504]: I1203 23:42:34.492922 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/kube-rbac-proxy-frr/0.log" Dec 03 23:42:34.508273 master-0 kubenswrapper[36504]: I1203 23:42:34.508223 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/cp-frr-files/0.log" Dec 03 23:42:34.531079 master-0 kubenswrapper[36504]: I1203 23:42:34.531007 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/cp-reloader/0.log" Dec 03 23:42:34.553661 master-0 kubenswrapper[36504]: I1203 23:42:34.553602 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qb4bc_25f6f30f-28f4-4f42-861c-b413ab0691df/cp-metrics/0.log" Dec 03 23:42:34.579123 master-0 kubenswrapper[36504]: I1203 23:42:34.579061 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-jh949_bb20fc19-0922-413a-b811-36351794d540/frr-k8s-webhook-server/0.log" Dec 03 23:42:34.611426 master-0 kubenswrapper[36504]: I1203 23:42:34.611346 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-78597876b7-mbz6f_d1a2be4c-5ab1-4f5d-b2ab-a161805d9a3d/manager/0.log" Dec 03 23:42:34.637464 master-0 kubenswrapper[36504]: I1203 23:42:34.637281 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5664d6fd-phlzw_8676b196-ec39-41e6-938c-ad3f66a9b83e/webhook-server/0.log" Dec 03 23:42:35.109464 master-0 kubenswrapper[36504]: I1203 23:42:35.108264 36504 kubelet_pods.go:1000] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openshift-image-registry/node-ca-jsp5g" secret="" err="secret \"node-ca-dockercfg-l4d7p\" not found" Dec 03 23:42:35.114377 master-0 kubenswrapper[36504]: I1203 23:42:35.114308 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fnjz4_24153b70-6280-405c-b1d0-8bdd936bd646/speaker/0.log" Dec 03 23:42:35.152978 master-0 kubenswrapper[36504]: I1203 23:42:35.152923 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-fnjz4_24153b70-6280-405c-b1d0-8bdd936bd646/kube-rbac-proxy/0.log" Dec 03 23:42:35.762627 master-0 kubenswrapper[36504]: E1203 23:42:35.759949 36504 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d5d61a_c5de_4619_9afb_7fad63ba0525.slice/crio-1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Error finding container 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49: Status 404 returned error can't find the container with id 1f7bf306356a7a093361697828d618941d149ab1e54ea3ea7cde2cc33e39ef49 Dec 03 23:42:36.035331 master-0 kubenswrapper[36504]: I1203 23:42:36.035163 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:36.035331 master-0 kubenswrapper[36504]: I1203 23:42:36.035246 36504 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:36.105621 master-0 kubenswrapper[36504]: I1203 23:42:36.105553 36504 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qn7z7" Dec 03 23:42:37.555628 master-0 kubenswrapper[36504]: I1203 23:42:37.555551 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-96glt_b7f68d19-71d4-4129-a575-3ee57fa53493/cluster-node-tuning-operator/1.log" Dec 03 23:42:37.556488 master-0 kubenswrapper[36504]: I1203 23:42:37.556009 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-bbd9b9dff-96glt_b7f68d19-71d4-4129-a575-3ee57fa53493/cluster-node-tuning-operator/0.log" Dec 03 23:42:37.587303 master-0 kubenswrapper[36504]: I1203 23:42:37.587242 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-fvghq_b4e6d3a1-b3a7-4c7f-b76c-84ca2c957b23/tuned/0.log" Dec 03 23:42:39.356378 master-0 kubenswrapper[36504]: I1203 23:42:39.356327 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-jxw8c_08432be8-0086-48d2-a93d-7a474e96749d/kube-apiserver-operator/2.log" Dec 03 23:42:39.429798 master-0 kubenswrapper[36504]: I1203 23:42:39.427231 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-5b557b5f57-jxw8c_08432be8-0086-48d2-a93d-7a474e96749d/kube-apiserver-operator/3.log" Dec 03 23:42:40.302413 master-0 kubenswrapper[36504]: I1203 23:42:40.302357 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_4c2364d3-47b2-4784-9c42-76bf2547b797/installer/0.log" Dec 03 23:42:40.328664 master-0 kubenswrapper[36504]: I1203 23:42:40.328607 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_8d60f02e-1803-461e-9606-667d91fcae14/installer/0.log" Dec 03 23:42:40.360720 master-0 kubenswrapper[36504]: I1203 23:42:40.360644 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-5-master-0_5c8c7291-3150-46a5-9d14-57a23bb51cc0/installer/0.log" Dec 03 23:42:40.391795 master-0 kubenswrapper[36504]: I1203 23:42:40.389894 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-7-master-0_0e0ffab3-4f70-4bc8-9464-598c0668a4b8/installer/0.log" Dec 03 23:42:40.908006 master-0 kubenswrapper[36504]: I1203 23:42:40.907849 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7035982c3bafe5c31d1c2515ed3c2b7b/kube-apiserver/0.log" Dec 03 23:42:40.924866 master-0 kubenswrapper[36504]: I1203 23:42:40.924762 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7035982c3bafe5c31d1c2515ed3c2b7b/kube-apiserver-cert-syncer/0.log" Dec 03 23:42:40.946092 master-0 kubenswrapper[36504]: I1203 23:42:40.946034 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7035982c3bafe5c31d1c2515ed3c2b7b/kube-apiserver-cert-regeneration-controller/0.log" Dec 03 23:42:40.964753 master-0 kubenswrapper[36504]: I1203 23:42:40.964496 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7035982c3bafe5c31d1c2515ed3c2b7b/kube-apiserver-insecure-readyz/0.log" Dec 03 23:42:40.987534 master-0 kubenswrapper[36504]: I1203 23:42:40.987478 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7035982c3bafe5c31d1c2515ed3c2b7b/kube-apiserver-check-endpoints/0.log" Dec 03 23:42:41.005080 master-0 kubenswrapper[36504]: I1203 23:42:41.005028 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_7035982c3bafe5c31d1c2515ed3c2b7b/setup/0.log" Dec 03 23:42:42.365006 master-0 kubenswrapper[36504]: I1203 23:42:42.362611 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/kube-rbac-proxy/0.log" Dec 03 23:42:42.388210 master-0 kubenswrapper[36504]: I1203 23:42:42.387508 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/1.log" Dec 03 23:42:42.396601 master-0 kubenswrapper[36504]: I1203 23:42:42.394819 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-754cfd84-bnstw_9f458bae-f6b0-4bdc-a9fb-cdfdd51c1cf6/manager/2.log" Dec 03 23:42:43.266518 master-0 kubenswrapper[36504]: I1203 23:42:43.265715 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-pmtjz_7f06b54a-85b9-4d0a-b544-0d591bec73cf/cert-manager-controller/0.log" Dec 03 23:42:43.306610 master-0 kubenswrapper[36504]: I1203 23:42:43.305898 36504 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-vrzdf_05c0c9d6-6747-4c9c-946c-7e3eb71388a2/cert-manager-cainjector/0.log" Dec 03 23:42:43.323759 master-0 kubenswrapper[36504]: I1203 23:42:43.323706 36504 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-vnp9x_387eac4b-3ef7-4290-822f-fcf198ad50b9/cert-manager-webhook/0.log"